
Tuesday, June 1, 2021

Microservices in ASP.NET

Microservices are the latest significant shift in modern software development. Let's learn some of the tools and related design patterns by building a simplified e-commerce website with ASP.NET Core and Docker.
Photo by Adi Goldstein on Unsplash

For some time we've been discussing tools and technologies adjacent to microservices on this blog. Not randomly though. Most of these posts derive from my open-source project aspnet-microservices, a simple (yet complicated 😉) distributed application built primarily with .NET Core and Docker. While still a work in progress, the project demonstrates important concepts in distributed architectures.

What's included in the project

This project uses popular tools such as ASP.NET Core, Docker, RabbitMQ, MongoDB, MySQL and Redis (the full list appears under Technologies Used below).
On the administrative side, the project also includes admin interfaces such as Adminer, Mongo Express, Redis Commander and Grafana.

Disclaimer

When you create a sample microservice-based application, you need to deal with complexity and make tough choices. For the aspnet-microservices application, I deliberately chose to balance complexity and architecture by reducing the emphasis on design patterns and focusing on the development of the services themselves. The project was built to serve as an introduction and a starting point for those looking forward to working with Docker, Compose and microservices.

This project is not production-ready! Check Areas for Improvement for more information.

Microservices included in this project

So far, the project consists of the following services:

  • Web: the frontend for our e-commerce application;
  • Catalog: provides catalog information for the web store;
  • Newsletter: accepts user emails and stores them in the newsletter database for future use;
  • Order: provides order features for the web store;
  • Account: provides account services (login, account creation, etc) for the web store;
  • Recommendation: provides simple recommendations based on previous purchases;
  • Notification: sends email notifications upon certain events in the system;
  • Payment: simulates a fake payment provider;
  • Shipping: simulates a fake shipping provider;

Technologies Used

The technologies used were cherry-picked from the most commonly used by the community. I chose to favour open-source alternatives over proprietary (or commercially-oriented) ones. You'll find in this bundle:
  • ASP.NET Core: as the base of our microservices;
  • Docker and Docker Compose: to build and run containers;
  • MySQL: serving as a relational database for some microservices;
  • MongoDB: serving as the catalog database for the Catalog microservice;
  • Redis: serving as a distributed caching store for the Web microservice;
  • RabbitMQ: serving as the queue/communication layer over which our services will communicate;
  • MassTransit: the interface between our apps and RabbitMQ, supporting asynchronous communication between them (see the sketch after this list);
  • Dapper: lightweight ORM used to simplify interaction with the MySQL database;
  • SendGrid: used to send emails from our Notification service as described on a previous post;
  • Vue.js and Axios: to build the frontend of the Web microservice on a simple and powerful JavaScript framework.
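
To give an idea of how the services communicate through MassTransit, here's a minimal sketch of publishing an event (the OrderSubmitted contract and OrderPublisher class are illustrative, not the project's actual code):

using System;
using System.Threading.Tasks;
using MassTransit;

// illustrative message contract shared between publisher and consumers
public class OrderSubmitted
{
    public Guid OrderId { get; set; }
    public decimal Total { get; set; }
}

public class OrderPublisher
{
    private readonly IPublishEndpoint _publishEndpoint;

    public OrderPublisher(IPublishEndpoint publishEndpoint)
    {
        _publishEndpoint = publishEndpoint;
    }

    // publishes the event to RabbitMQ via MassTransit; any service that
    // registered a consumer for OrderSubmitted receives its own copy
    public Task NotifyOrderSubmitted(Guid orderId, decimal total)
    {
        return _publishEndpoint.Publish(new OrderSubmitted { OrderId = orderId, Total = total });
    }
}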

Conventions and Design Considerations

Among others, you'll find in this project that:
  • The Web microservice serves as the frontend for our e-commerce application and implements the API Gateway / BFF design patterns, routing requests from the user to other services over an internal Docker network;
  • Web caches catalog data in a Redis data store. Feel free to use Redis Commander to inspect or delete cached entries if you need to.
  • Each microservice has its own database isolating its state from external services. MongoDB and MySQL were chosen as the main databases due to their popularity.
  • All services were implemented as ASP.NET Core web apps exposing the endpoints /help and /ping so they can be inspected and observed automatically by the running engine (see the sketch after this list).
  • No special logging infrastructure was added. Logs can be easily accessed via docker logs or indexed by a different application if you so desire.
  • Microservices communicate between themselves via Pub/Sub and asynchronous request/response using MassTransit and RabbitMQ.
  • The Notification microservice will eventually send emails. This project was tested with SendGrid, but other SMTP servers should work from within or outside the containers.
  • Monitoring is experimental and includes Grafana sourcing its data from a Prometheus backend.
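
As a rough illustration of the /ping and /help convention above, here's a minimal sketch of how such endpoints could be exposed in an ASP.NET Core controller (the controller name and messages are illustrative; each service in the repo has its own implementation):

using Microsoft.AspNetCore.Mvc;

[ApiController]
public class HealthController : ControllerBase
{
    // GET /ping - lets the running engine (or a human) verify the service is alive
    [HttpGet("/ping")]
    public IActionResult Ping() => Ok("pong");

    // GET /help - returns a short description of what the service does
    [HttpGet("/help")]
    public IActionResult Help() => Ok("Catalog service: provides catalog information for the web store.");
}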

Technical Requirements

To run this project on your machine, please make sure you have Docker and Docker Compose installed.

If you want to develop, extend or modify it, I'd also suggest having Visual Studio 2019 and the .NET Core SDK installed.

Running the microservices

So let's quickly learn how to load and build our own microservices.

Initializing the project

Get your copy by cloning the project:
git clone https://github.com/hd9/aspnet-microservices

Next, open the solution src/AspNetMicroservices.sln with Visual Studio 2019. Since code is always the best documentation, the easiest way to understand the containers and their configurations is by reading the src/docker-compose.yml file.

Debugging with Visual Studio

Building and debugging with Visual Studio 2019 is straightforward. Simply open the AspNetMicroservices.sln solution from the src folder, build and run the project as debug (F5). Next, run the dependencies (Redis, MongoDB, RabbitMQ and MySQL) by issuing the below command from the src folder:

docker-compose -f docker-compose.debug.yml up

Running the services with Docker Compose

In order to run the services you'll need Docker and Docker Compose installed on your machine. Type the command below from the src folder on a terminal to start all services:
docker-compose up
Then to stop them:
docker-compose down
To remove everything, run:
docker-compose down -v
To run a specific service, do:
docker-compose up <service-name>
As soon as you run your services, Compose should start emitting logs for each service on the console:
The output of our docker-compose command

You can also query individual logs for services as usual with docker logs <svc-name>. For example:

~> docker logs src_catalog_1
info: CatalogSvc.Startup[0]
      DB Settings: ConnStr: mongodb://catalog-db:27017, Db: catalog, Collection: products
info: Microsoft.Hosting.Lifetime[0]
      Now listening on: http://[::]:80
info: Microsoft.Hosting.Lifetime[0]
      Application started. Press Ctrl+C to shut down.
info: Microsoft.Hosting.Lifetime[0]
      Hosting environment: Production
info: Microsoft.Hosting.Lifetime[0]
      Content root path: /app

Database Initialization

Database initialization is automatically handled by Compose. Check the docker-compose.yml file to understand how that happens. You'll find examples on how to initialize both MySQL and MongoDB.

Dockerfiles

Each microservice contains a Dockerfile in its respective root and understanding them should be straightforward. If you've never written a Dockerfile before, consider reading the official documentation.

Docker Compose

There are two docker-compose files in the solution. Their use is described below:
  • docker-compose.yml: this is the main Compose file. Running this file means you won't be able to access some of the services as they'll not be exposed.
  • docker-compose.debug.yml: this is the file you should run if you want to debug the microservices from Visual Studio. This file only contains the dependencies (Redis, MySQL, RabbitMQ, Mongo + admin interfaces) you'll need to use when debugging.

Accessing our App

If the application booted up correctly, go to http://localhost:8000 to access it. You should see a simple catalog and some other widgets. Go ahead and try to create an account. Just make sure that you have the settings correctly configured on your docker-compose.yml file:
Our simple e-commerce website. As most things, its beauty is in the details 😊.

    Admin Interfaces

You'll also have admin interfaces available for our services (Adminer, Mongo Express, Redis Commander and Grafana); see the Quick Reference section below for their addresses.
    I won't go over the details about each of these apps. Feel free to explore on your own.

    Monitoring

Experimental monitoring is available with Grafana, Prometheus and cAdvisor. Open Grafana at http://localhost:3000/, log in with admin | admin, select the Docker dashboard and you should see metrics for the services similar to:

    Grafana capturing and emitting telemetry about our microservices.

    Quick Reference

As a summary, the microservices are configured to run at the addresses and ports defined in src/docker-compose.yml (the Web frontend, for example, is exposed at http://localhost:8000).

The management tools (Redis Commander, Grafana, etc.) are also exposed on localhost; Grafana, for example, is available at http://localhost:3000/.

    And you can access the databases at:
    • MySQL databases: use Adminer at http://localhost:8010/, enter the server name (e.g. order-db for the order microservice) and use root | todo as username/password.
    • MongoDB: use MongoExpress at: http://localhost:8011/. No username/password is required.

    Final Thoughts

On this post I introduced my open-source project aspnet-microservices. This application was built to present the foundations of Docker, Compose and microservices to the .NET community and hopefully serves as an intuitive guide for those starting in this area.

Microservices are the latest significant shift in modern development and require learning lots (really, lots!) of new technologies and design patterns. This project is far from complete and should not be used in production, as it lacks basic cross-cutting concerns that any production-ready project would need. I deliberately omitted them for simplicity. For more information, check the project's README on GitHub.

    Feel free to play with it and above all, learn and have fun!

    Source Code

As always, the source code is available on GitHub at: github.com/hd9/aspnet-microservices.

    Monday, April 20, 2020

    How to profile ASP.NET apps using Application Insights

    Application Insights can monitor, log, alert and even help us understand performance problems with our apps.
    Photo by Marc-Olivier Jodoin on Unsplash
We've been discussing AppInsights in depth on this blog and, to complete the series, I'd like to discuss the performance features it offers. On the previous posts, we learned how to collect, suppress and monitor our applications using AppInsights data.

    On this post let's understand how to use the performance features to identify and fix performance problems with our app.

    What's profiling?

    Wikipedia defines profiling as:
    a form of dynamic program analysis that measures, for example, the space (memory) or time complexity of a program, the usage of particular instructions, or the frequency and duration of function calls. Most commonly, profiling information serves to aid program optimization.
Profilers usually monitor:
    • Memory
    • CPU
    • Disk IO
    • Network IO
    • Latency
    • Speed of the application
    • Access to resources
    • Databases
    • etc

    Profiling ASP.NET Applications

ASP.NET developers have multiple ways of profiling their web applications, and there are awesome tools out there that you should definitely use. But today we'll focus on what we can do to inspect our deployed application using Application Insights.

    How can Application Insights help

Azure Application Insights collects telemetry from your application to help analyze its operation and performance. You can use this information to identify problems that may be occurring or to identify improvements that would most impact users. This tutorial takes you through the process of analyzing the performance of both the server components of your application and the client's perspective, so you understand how to:
    • Identify the performance of server-side operations
    • Analyze server operations to determine the root cause of slow performance
    • Identify slowest client-side operations
    • Analyze details of page views using query language

    Using performance instrumentation to identify slow resources

Let's illustrate how to detect performance bottlenecks in our app with some code. The code for this exercise is available on my GitHub. You can quickly get it by running:
    git clone https://github.com/hd9/aspnet-ai.git
    cd aspnet-ai
    git checkout performance
    # insert your AppInsights instrumentation key on appSettings.Development.json
    dotnet run
This project contains a few endpoints that we'll use to simulate slow operations (a simplified sketch of one follows the list):
    • SlowPage - async, 3s to load, throws exception
    • VerySlowPage - async, 8s to load
    • CpuHeavyPage - sync, loops over 1 million results with 25ms of interval
    • DiskHeavyPage - sync, writing 1000 lines to a file
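
To give an idea of what these test endpoints look like, here's a simplified sketch of an artificially slow action (illustrative only; the real implementations live in the repository):

using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

public class PerfDemoController : Controller
{
    // simulates a slow page by delaying the response for ~3 seconds
    public async Task<IActionResult> SlowPage()
    {
        await Task.Delay(TimeSpan.FromSeconds(3));
        return Content("This page took about 3 seconds to load");
    }
}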
Run those endpoints a few times and get back to Azure. We should have some data there.

    Performance Tools in AppInsights

Our AppInsights resource in Azure greets us with an overview page that already shows consolidated information about failed requests, server response time, server requests and availability:

Now, click on the Performance section. Out of the box, AppInsights has already captured previous requests and shows a consolidated view. Look below to see our endpoints sorted by duration:

You should also have access to an Overall panel where you can see requests over time:
There's also good stuff on the End-to-end transaction details widget:

    For example, we could click on a given request and  get additional information about it:

    Tracing

We now know which are the slowest pages on our site; let's now try to understand why. Essentially, we have two options:
1. use the AppInsights telemetry API (as in this example);
2. integrate directly with your logging provider, in this case using System.Diagnostics.Trace.

    Tracing with AppInsights SDK

Tracing with the AppInsights SDK is done via the TrackTrace method of the TelemetryClient class and is as simple as:
    public IActionResult Index()
    {
        _telemetry.TrackTrace("Loading the Index page");
        return View();
    }

    Tracing with System.Diagnostics.Trace

    Tracing with System.Diagnostics.Trace is also not complicated but requires the NuGet package Microsoft.ApplicationInsights.TraceListener. For more information regarding other logging providers, please check this page. Let's start by installing it with:
    dotnet add package Microsoft.ApplicationInsights.TraceListener --version 2.13.0

    C:\src\aspnet-ai\src>dotnet add package Microsoft.ApplicationInsights.TraceListener --version 2.13.0
      Writing C:\Users\bruno.hildenbrand\AppData\Local\Temp\tmpB909.tmp
    info : Adding PackageReference for package 'Microsoft.ApplicationInsights.TraceListener' into project 'C:\src\aspnet-ai\src\aspnet-ai.csproj'.
    info : Restoring packages for C:\src\aspnet-ai\src\aspnet-ai.csproj...
    (...)
    info : Installing Microsoft.ApplicationInsights 2.13.0.
    info : Installing Microsoft.ApplicationInsights.TraceListener 2.13.0.
    info : Package 'Microsoft.ApplicationInsights.TraceListener' is compatible with all the specified frameworks in project 'C:\src\aspnet-ai\src\aspnet-ai.csproj'.
    info : PackageReference for package 'Microsoft.ApplicationInsights.TraceListener' version '2.13.0' added to file 'C:\src\aspnet-ai\src\aspnet-ai.csproj'.
    info : Committing restore...
    info : Writing assets file to disk. Path: C:\src\aspnet-ai\src\obj\project.assets.json
    log  : Restore completed in 4.18 sec for C:\src\aspnet-ai\src\aspnet-ai.csproj.
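
With the package installed, calls made through System.Diagnostics.Trace can flow to Application Insights as trace telemetry. Here's a minimal sketch (depending on your setup you may need to register the listener explicitly, as in the first lines; the message is just an example):

using System.Diagnostics;
using Microsoft.ApplicationInsights.TraceListener;

// register the listener (this can also be done via configuration)
Trace.Listeners.Add(new ApplicationInsightsTraceListener());

// this call is forwarded to Application Insights as a trace item
Trace.TraceInformation("Loading the Index page");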

    Reviewing the results

    Back in Azure we should now see more information about the performance of the pages:
    And more importantly, we can verify that our traces (in green) were correctly logged:

Where to go from here

If you used the tools cited above, you should now have a lot of information to understand how your application performs in production. What's next?

We took two important steps here: we identified the slowest pages and added trace information to them. From here, it's up to you. Start by identifying the slowest endpoints and adding extra telemetry to them. The root cause could be a specific query in your app or even an external resource. The point is, each situation is unique and goes beyond the scope of this post. But you have the essentials: which pages, methods and even calls take the longest. On that note, I'd recommend adding custom telemetry data so you have a real, reproducible scenario.
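
As an example of such custom telemetry, here's a hedged sketch of tracking a custom event with properties through the TelemetryClient (the event name and properties are illustrative):

using System.Collections.Generic;

// _telemetry is an injected TelemetryClient instance
_telemetry.TrackEvent("ReportGenerated", new Dictionary<string, string>
{
    { "ReportType", "Sales" },
    { "TriggeredBy", "SlowPage" }
});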

    Conclusion

On this post, the last in our discussion about AppInsights, we reviewed how Application Insights can be used to understand, quantify and report on the performance of our apps. Once again, AppInsights proves to be an essential tool for developers using Azure.

    More about AppInsights

    For more information, consider reading my previous articles about App Insights:
    1. Adding Application Insights telemetry to your ASP.NET Core website
    2. Suppressing Application Insights telemetry on .NET applications
    3. Monitoring ASP.NET applications using Application Insights and Azure Alerts


    Monday, February 11, 2019

    How to copy data between Azure databases

    What is the simplest way to move data between Azure Databases? Let's explore the options.
Whenever I get a requirement, the first thing I ask myself is: what is the simplest way to solve this problem? This time, I was asked to migrate an Azure SQL table to production. Let's see how we can do it.

    bcp 

One simple option would definitely be bcp, so why not? I downloaded it and, during the installation, was surprised with the message below. Since I don't like installing things for a one-off usage, I decided to search for an even simpler solution.

    PolyBase


PolyBase is a feature available in SQL Server 2016+ and Azure SQL databases that allows data integration between distributed data sources. Here's how Microsoft describes it:
    PolyBase does not require you to install additional software. You query external data by using the same T-SQL syntax used to query a database table. The support actions implemented by PolyBase all happen transparently.

    And this is a simple representation on how it works:
PolyBase seems like a good candidate for transferring data between Azure databases because it allows us to:
    • create a connection to your target server
    • create an external data source on the server we're working on
    • create an external table: a virtual representation of your remote table
    • insert/select/join normally as if the remote table were in the same database

    Using Polybase

So let's migrate this data in three simple steps.

    Step 1 - Create an external data source

An external data source is a virtual representation of your remote database. With it, you'll be able to access your remote database locally. The first thing we'll need to do is create a SCOPED CREDENTIAL to store the remote data source address/credentials in your local SQL Database.

    Here's how you create them:

    -- You are required to create a scoped credential if your server doesn't have one yet
    CREATE DATABASE SCOPED CREDENTIAL RemoteCred
    WITH IDENTITY = '<remote-username>', SECRET = '<remote-password>';

    -- The external data source is a virtual representation of your remote database
    CREATE EXTERNAL DATA SOURCE RemoteDataSource
    WITH
    (
        TYPE=RDBMS,
        LOCATION='<remote-server-address>',
        DATABASE_NAME='<remote-db-name>',
        CREDENTIAL= RemoteCred
    );

    Note that for the CREATE DATABASE SCOPED CREDENTIAL to work, we have to have MASTER KEY ENCRYPTION set in the database. If it isn't created on your server, you can create one with:
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<a-very-strong-password>';

    Step 2 - Create an external table

    An external table is a virtual representation of your remote table. We can create one with:
    CREATE EXTERNAL TABLE [dbo].[RemoteTbl] (
        --your column definitions here as a normal create table
        --note that Identity fields are not allowed
    )
    WITH
    (
        DATA_SOURCE = RemoteDataSource,
        SCHEMA_NAME = '<remote-schema-name>',
        OBJECT_NAME = '<remote-table-name>'
    )

    Once created, we can already select from that table and see results from the remote host in our current connection.

    Step 3 -  Insert your records

The last and final step for me was to move the data between servers. In my case, a simple SELECT INTO solved my problem:
    select *
    into [dbo].[YourLocalTable]
    from [dbo].[RemoteTbl]
Note that you can also insert into an existing local table using INSERT INTO ... SELECT:
    -- if columns match
    INSERT INTO dbo.<dest-table>
    SELECT <col1>, <col2>, <coln>
      FROM dbo.<source-table>
     WHERE <condition>

    -- specific columns
    INSERT INTO dbo.<dest-table>
      (<col1>, <col2>)
    SELECT <col1>, <col2>
      FROM dbo.<source-table>
     WHERE <condition>

    Other Operations

Step 3 above illustrates how to move data between servers. In fact, after you've created your external table, you can join, query, and do almost everything else as if the remote database were present on the source server.

    Conclusion

On this post we reviewed how we can easily migrate data between Azure databases using SQL Server's PolyBase feature. There's a lot more on this topic and I encourage you to research and learn more about it. Hope it helps.

    Source Code

    As always, the full source code for this article is available on my GitHub.


    See Also

For other posts about Azure on this blog, please click here.

    Monday, July 23, 2018

    Patching RavenDB Metadata

    On this post, let's understand how simple it is to patch the metadata of your documents using RavenDB Studio
    On a previous post we talked about how to query metadata using RavenDB. Now, let's take a look at some ways to patch our metadata in RavenDB.

    Patching Records with RavenDB

Patching records in RavenDB can be easy or less easy =). The easy way is doing it from RavenDB Studio, where you can patch by collection or index and write some JavaScript for it.

    Patching using RavenDB Studio

    Here's a simple example on how I can patch the Name and IsActivated fields in my Users collection:

    Patching metadata using RavenDB Studio

But you may be required to patch the metadata. How do you do it? The answer: using the @metadata dictionary. Let's take a look.

    Final Thoughts

    In the example above, we're updating the document metadata property "Raven-Clr-Type" with "MyApp.Core.User, MyApp.Core":

    this["@metadata"]["Raven-Clr-Type"] = "MyApp.Core.User, MyApp.Core";

    You can obviously patch whatever you want in your metadata. In the previous example, I was patching the namespace of an entity that was refactored in my application.
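
If you prefer doing this from code instead of the Studio, the C# client session also exposes document metadata. Here's a hedged sketch, assuming an open session and a User document (the document id is illustrative):

// load the document, grab its metadata dictionary and patch the CLR type
var user = session.Load<User>("users/1-A");
var metadata = session.Advanced.GetMetadataFor(user);
metadata["Raven-Clr-Type"] = "MyApp.Core.User, MyApp.Core";
session.SaveChanges(); // persists the metadata change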

    Hope it helps!


    Monday, June 4, 2018

    Installing and Running RavenDB on Windows and Linux

    Let's install and run RavenDB on Windows and Linux and learn how it works.

On a previous post I introduced RavenDB on this blog. On this one, let's review how to install and run a standalone RavenDB instance on our machine.

    On this post we will cover:
    1. Installing and running on Windows;
    2. Installing and running on Linux;
    3. Using the RavenDB console tool;
    4. Creating a new database;

    Downloading RavenDB

    First off, navigate to the RavenDB downloads page and download the server version for the environment you're working on. The currently supported platforms are: Windows, Linux, OSX, Raspberry PI and Docker.

To download your image, select Server, Stable and the appropriate version for your environment. Accept the terms and click on the .ZIP Package download button to download the image to your disk.

    Running standalone RavenDB on Windows

    On Windows, once the download is completed, extract all those files in a folder and you'll see two PowerShell files: run.ps1 and setup-as-service.ps1.

Open a PowerShell terminal, cd into the folder where you extracted your files and run .\run.ps1. You'll then see the output that the RavenDB service emits when running as a standalone instance:
    A new window will open for you where you'll need to configure a cluster and/or security. For now, let's skip the cluster configuration and go with the Unsecure option. 

    This configuration is enough for this demo and simple development efforts. Clicking on it, RavenDB Studio will open on the default Url: http://127.0.0.1:8080/studio/index.html.
    That's it! The standalone instance is running and you can start testing RavenDB on your Windows box.

Installing RavenDB on Windows

To install it on your machine, open the PowerShell terminal as an administrator and run the setup-as-service.ps1 script.

    If all goes well, you'll  have to accept the user license agreement and proceed with the instance configuration.

    Note that:
    • the installation requires administrator privileges
    • it will use port 8080 for "unsecure" installs or 443 for secure ones (options selected during the installation)

    Configuring the new Instance

Once installed, you'll have to configure your instance as shown in the image below. For a development setup, you should be good with the Unsecure option.
Clicking on it will prompt you for HTTP/TCP ports and an IP address. Leaving them empty will use the defaults. Click "Restart Server" and RavenDB should be installed.

    The RavenDB Service

    Once installed, on Windows, we can view the service status using the Get-Service Powershell cmdlet:
    Get-Service -Name RavenDB

    For more information, please visit: https://ravendb.net/docs/article-page/4.0/csharp/start/installation/setup-wizard

    Running standalone RavenDB on Linux

    The Linux installation is similar: download your Linux image from the RavenDB downloads page, unzip it and run the script. Let's see how it works.

Download the image by selecting Linux x64 from the downloads page (in my case, using Firefox):
Once downloaded, extract the .tar.bz2 package to a local folder:

Cd into that folder and run run.sh. You should then see:

    Installing RavenDB on Linux

    Installing RavenDB on Linux is very similar to Windows. You run the run.sh shell script and select the installation option on the command line.

    The RavenDB Console

Once installed, basic manipulation of the server can be done either through the Raven UI (RavenDB Studio) or through the console opened previously.

    For example, when I type help on my shell in Fedora, I get:
From the console, you can do things like restarting or shutting down the server, exporting/importing data, reading logs and viewing server stats. Just type the commands shown. For example, to shut the instance down, type: shutdown.

    Creating a Database

The final step needed before touching code is to create a database. To do that, go to Databases -> New Database:
Enter a DB name (for example, "Blog") and click "Create":
Clicking on that database takes us to its page. From there we can view our documents (records), create, query, patch, view logs, stats, etc.:

    Next Steps

Now that RavenDB is running and the database is created, the next step is to start interacting with it. You can use either RavenDB Studio or the client API (with C#, Java, Python, etc.). For more details, check my simple introduction to RavenDB.
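
As a starting point, here's a minimal sketch of connecting to the local instance with the C# client and storing a document in the "Blog" database created above (the BlogPost class is illustrative):

using Raven.Client.Documents;

public class BlogPost
{
    public string Title { get; set; }
}

public class Program
{
    public static void Main()
    {
        using (var store = new DocumentStore
        {
            Urls = new[] { "http://127.0.0.1:8080" }, // the default unsecure address
            Database = "Blog"                          // the database created above
        })
        {
            store.Initialize();

            using (var session = store.OpenSession())
            {
                session.Store(new BlogPost { Title = "Hello, RavenDB!" });
                session.SaveChanges(); // persists the document to the server
            }
        }
    }
}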

    Don't forget that RavenDB is also available on the cloud. Check the article An in depth review of the RavenDB Cloud for more information.

    Conclusion

Hope this post serves as a simple introduction on how to install RavenDB on Windows and Linux boxes. For more information, check the official documentation.

    Monday, May 21, 2018

    Seven Databases in Seven Weeks, 2nd Edition

The book is an interesting overview of databases being used in different fields throughout the world.
    You may or may not have heard about polyglot persistence. The fact is that more and more, software projects are making use of different technologies. And when it comes to the database world, the number of options is immense: it can be relational, document or columnar databases, key-value stores, graph databases, not to mention other cloud infrastructure options like service buses, blobs, storage accounts and queues.

What to use? Where? And how does that affect developers?

    Choosing a database is perhaps one of the most important architectural decisions a developer can make. In that regard, I'd like to recommend a very interesting book that addresses some of that discussion: Seven Databases in Seven Weeks, 2nd Edition.

    Why you should read this book

    Because the book:
    • provides practical and conceptual introductions to Redis, Neo4j, CouchDB, MongoDB, HBase, PostgreSQL, and DynamoDB
    • introduces you to different technologies, encouraging you to run your own experiments
    • revisits important topics like the CAP theorem
    • gives you an overview of what's out there so you can choose the best tool for the job
    • lets you explore some of the cutting-edge databases available, from traditional relational databases to newer NoSQL approaches
    • helps you make informed decisions about challenging data storage problems
    • tackles a real-world problem that highlights the concepts and features that make each database shine

    Conclusion

    Whether you're a programmer building the next big thing, a data scientist seeking solutions to thorny problems, or a technology enthusiast venturing into new territory, you will find something to inspire you in this book.


    Monday, March 12, 2018

    Package Management in .NET Core

When developing .NET applications, developers use NuGet, Visual Studio and PowerShell. But what about .NET Core?
    Photo by Christian Wiediger on Unsplash
In development, it's common to import and reuse routines from external libraries. That code is often bundled into packages/libraries that can contain compiled code (as DLLs) along with other content needed by the projects that consume these packages (CSS, images, JavaScript, etc.). If you use .NET, you have already dealt with NuGet one way or another. But how does that work in .NET Core?

    NuGet and .NET Core

NuGet remains strong in .NET Core as it's the default tool to download packages in the .NET world. Nuget.org is where most of these packages (free or not) are hosted and can be downloaded. If you have Visual Studio installed on your machine, you probably already have nuget.exe installed, since VS simply wraps NuGet when managing the packages of your projects. With .NET Core, however, package maintenance is done using the new .NET Core CLI.

    Managing NuGet Packages

    In order to build, push, modify and manage nuget packages, we need to use the nuget.exe tool. If you're interested in managing your own packages, please take a look at this documentation.

    Installing nuget.exe

As nuget.exe itself is not included with any version of Visual Studio, if you want to use it, download it from nuget.org/downloads. Please download the latest version.
Don't forget to add it to your %PATH% so it's accessible from the command prompt everywhere.

NuGet Package Explorer

    I also recommend installing NuGet Package Explorer for a friendly nuget.exe user interface. It will help you understand your package, its dependencies and metadata. NuGet Package Explorer looks like this:

    The Global Packages Folder

Since NuGet 3, there is a packages folder shared by all the projects you work with. Packages are downloaded and stored in the %userprofile%\.nuget\packages folder, so you will only have one copy of each package on your machine. The same package can be reused by new projects, saving time and bandwidth, especially on build servers.

The following command shows you where the global packages folder is located on your system:
    nuget locals global-packages -list

Adding NuGet packages to your .NET Core project

Adding packages to your solution can be done in two ways: by using Visual Studio or from the command line.

    Adding packages using the command line:

    I'm a strong supporter of using the CLI. So let's start with it first. In your project folder, run the following command:
    dotnet add package <package-name>
    As an example, this is how I added jQuery to an old JS project:

    Adding packages using Visual Studio:

Adding with VS is as simple as right-clicking the web project name -> Manage NuGet Packages:

    Restoring packages

To finish, you can also restore packages using the command line. In your solution folder, run:
    dotnet restore

    Conclusion

On this post we reviewed how to manage NuGet packages using both Visual Studio and the new .NET Core CLI. Since most developers are used to Visual Studio and the experience won't change much between .NET Framework and .NET Core projects, I emphasized how to use the CLI. Using the command line is a great way to learn your tools, write scripts, automate, and manage your packages more quickly.


      About the Author

      Bruno Hildenbrand      
      Principal Architect, HildenCo Solutions.