Showing posts with label Javascript.

Wednesday, April 1, 2020

Monitor ASP.NET applications using Application Insights and Azure Alerts

Using Application Insights? Learn how to trigger custom alerts for your application.
Photo by Hugo Jehanne on Unsplash

On this third article about Application Insights, we will review how to create custom alerts that trigger based on telemetry emitted by our applications. Creating Alerts on Azure is simple, cheap, fast and adds a lot of value to our operations.

Azure Alerts

But what are Azure Alerts? According to Microsoft:
Azure Alerts proactively notify you when important conditions are found in your monitoring data. They allow you to identify and address issues before the users of your system notice them.
With Azure Alerts we can create custom alerts based on metrics or logs using as data sources:
  • Metric values
  • Log search queries
  • Activity log events
  • Health of the underlying Azure platform
  • Tests for website availability
  • and more...

Customization

The level of customization is fantastic. Alert rules can be customized with:
  • Target Resource: Defines the scope and signals available for alerting. A target can be any Azure resource. For example: a virtual machine, a storage account or an Application Insights resource
  • Signal: Emitted by the target resource, can be of the following types: metric, activity log, Application Insights, and log.
  • Criteria: A combination of signal and logic. For example, percentage CPU, server response time, result count of a query, etc.
  • Alert Name: A specific name for the alert rule configured by the user.
  • Alert Description: A description for the alert rule configured by the user.
  • Severity: The severity of the alert when the rule is met. Severity can range from 0 to 4 where 0=Critical, 4=Verbose.
  • Action: A specific action taken when the alert is fired. For more information, see Action Groups.

Can I use Application Insights data and Azure Alerts?

Application Insights can be used as a source for your alerts. Once you have your app hooked up with Application Insights, creating an alert is simple. We will review how later on this post.

Creating an Alert

Okay so let's jump to the example. In order to understand how that works, we will:
  1. Create and configure an alert in Azure targeting our Application Insights instance;
  2. Change our application by creating a slow endpoint that will be used on this exercise.

To start, go to the Azure Portal and find the Application Insights instance your app is hooked to and click on the Alerts icon under Monitoring. (Don't know how to do it? Check this article where it's covered in detail).
And click on Manage alert rules and New alert rule to create a new one:

Configuring our alert

You should now be on the Create rule page. Here we will specify a resource, a condition, who should be notified, notes and severity. I'll use the same aspnet-ai AppInsights instance as in the previous posts:

Specifying a Condition

For this demo, we will be tracking response time so for the Configure section, select Server response time:

Setting Alert logic

On the Alert logic step we specify an operator and a threshold value. I want to track requests that take more than 1s (1000ms), evaluating every minute and aggregating data up to 5 minutes:

Action group

On the Add action group tab we specify who should be notified. For now, I'll send it only to myself:

Email, SMS and Voice

If you want to be alerted by Email and/or SMS, enter them below. We'll need them to confirm that notifications are sent to our email and phone:

Confirming and Reviewing our Alert

After confirming, saving and deploying your alert, you should see a summary like:

Testing the Alerts

With our alert deployed, let's do some code. If you want to follow along, download the source for this article and switch to the alerts branch by doing:
git clone https://github.com/hd9/aspnet-ai.git
cd aspnet-ai
git checkout alerts

# insert your AppInsights instrumentation key on appSettings.Development.json
dotnet run
That code contains the endpoint SlowPage to simulate slow pages (and exceptions). To test it, make sure you have correctly set your instrumentation key and send a request to the endpoint at https://localhost:5001/home/slowpage. It should throw an exception after 3s:
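The exact implementation is in the repository, but a minimal sketch of what such an action could look like (assuming a method inside the template's HomeController) is:
public async Task<IActionResult> SlowPage()
{
    // simulate a slow operation (3 seconds), enough to cross our 1000ms alert threshold
    await Task.Delay(3000);

    // then fail, so AppInsights also captures an exception for this request
    throw new Exception("SlowPage simulation failed");
}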

Reviewing the Exception in AppInsights

It may take up to 5 minutes to get the notifications. In the meantime, we can explore how AppInsights tracked our exception by going to the Exceptions tab. This is what was captured by default for us:
Clicking on the SlowPage link, I can see details about the error:

So let's quickly discuss the above information. I added an exception on that endpoint because I also wanted to highlight that, without any extra line of code, the exception was automatically tracked for us. And look how much information we have there for free! Yet another reason to use these resources proactively as much as possible.

Getting the Alert

Okay but where's our alert? If you remember, we configured our alerts to track intervals of 5 minutes. That's good for sev3 alerts but probably too much for sev1. So, after those long 5 minutes, you should get an email alert describing the failure:
If you registered your phone number, you should also get an SMS. I got both with the SMS arriving a few seconds before the email.

Reviewing Alerts

After the first alert is sent, you should have a consolidated view under your AppInsights instance where you can see previously sent alerts grouped by severity. You can even click on them to get information and telemetry related to that event. Beautiful!

Types of Signals

Before we end, I'd like to show some of the signals that are available for alerts. You can use any of those (plus custom metrics) to create an alert for your cloud service.

Conclusion

Application Insights is an excellent tool to monitor, inspect, profile and alert on failures of your cloud resources. And given that it's extremely customizable, it's a must if you're running services on Azure.

More about AppInsights

Want to know more about Application Insights? Consider reading the following articles:
  1. Adding Application Insights telemetry to your ASP.NET Core website
  2. How to suppress Application Insights telemetry
  3. How to profile ASP.NET apps using Application Insights

See Also

For more posts about AppInsights, please click here.

Monday, March 16, 2020

How to suppress Application Insights telemetry

Application Insights is an excellent tool to capture telemetry for your application but it may be too verbose. On this post we will review how to exclude unnecessary requests.
Photo by Michael Dziedzic on Unsplash

On a previous post we learned how to add Application Insights telemetry to ASP.NET applications. We also saw that a lot of unnecessary telemetry was being sent. How can we filter some of it out?

On this post we will explain how to suppress telemetry by using AppInsights' DependencyTelemetry and ITelemetryProcessor.

Understanding telemetry pre-processing

You can write and configure plug-ins for the Application Insights API to customize how telemetry is enriched and processed before it's sent to the Application Insights service. The main ways are:
  1. Sampling - reduces the volume of telemetry without affecting statistics. Keeps together related data points so we can navigate between points when diagnosing a problem.
  2. Filtering with Telemetry Processors - filters out telemetry before it is sent to the server. For example, you could reduce the volume of telemetry by excluding requests from robots. Filtering is a more basic approach to reducing traffic than sampling. It allows you more control over what is transmitted, but affects your statistics.
  3. Telemetry Initializers - lets you add or modify properties to any telemetry sent from your app. For example, you could add calculated values; or version numbers by which to filter the data in the portal.

Suppressing Dependencies

So let's create a custom filter that will suppress some of that telemetry. For example, below I show all the telemetry that was automatically sent from my application to Azure with just one line of code. Our code will suppress the data shown in yellow.

Creating a Telemetry Processor

There are different types of telemetry, with RequestTelemetry and DependencyTelemetry being the most common. These classes contain properties that are useful for our logic. So let's create a filter that excludes requests to favicon.ico, bootstrap, jquery, site.css and site.js.

To get rid of some of these requests, we will implement a custom filter that I named SuppressStaticResourcesFilter to ignore requests to those static resources. Our telemetry processor should implement the ITelemetryProcessor interface. Note the logic to exclude requests in the Process method:
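The full filter is in the repository linked below; here's a minimal sketch of it, assuming we only need to inspect RequestTelemetry and match the request URL against a list of static resources:
using System;
using System.Linq;
using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;

public class SuppressStaticResourcesFilter : ITelemetryProcessor
{
    // static resources we don't want telemetry for
    private static readonly string[] ExcludedResources =
        { "favicon.ico", "bootstrap", "jquery", "site.css", "site.js" };

    private readonly ITelemetryProcessor _next;

    public SuppressStaticResourcesFilter(ITelemetryProcessor next)
    {
        _next = next;
    }

    public void Process(ITelemetry item)
    {
        // suppress requests targeting any of the excluded static resources
        if (item is RequestTelemetry request &&
            ExcludedResources.Any(r =>
                request.Url?.ToString().Contains(r, StringComparison.OrdinalIgnoreCase) == true))
        {
            return; // do not forward to the next processor in the chain
        }

        _next.Process(item);
    }
}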

Registering a Telemetry Processor

Now, we just need to wire it up on the initialization of our app. As stated on this document, the initialization is different for ASP.NET Core and ASP.NET MVC. Let's take a look at each of them.

The registration of a telemetry processor in ASP.NET Core is done in Startup.cs:
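A minimal sketch of that registration, using the AddApplicationInsightsTelemetryProcessor extension method from the SDK:
public void ConfigureServices(IServiceCollection services)
{
    services.AddApplicationInsightsTelemetry();

    // register our custom telemetry processor
    services.AddApplicationInsightsTelemetryProcessor<SuppressStaticResourcesFilter>();

    services.AddControllersWithViews();
}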
Configuring a telemetry processor on ASP.NET is done in Global.asax:
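For classic ASP.NET, a minimal sketch, assuming the configuration runs inside Application_Start, would be:
protected void Application_Start()
{
    // add our filter to the telemetry processor chain
    var builder = TelemetryConfiguration.Active.DefaultTelemetrySink.TelemetryProcessorChainBuilder;
    builder.Use((next) => new SuppressStaticResourcesFilter(next));
    builder.Build();

    // ... the rest of your application initialization (routes, bundles, etc.)
}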
Wait a couple of minutes for the telemetry to show up online. You should no longer see those requests targeting static resources:

How many filters?

Well, that's up to you. You could create as many filters as you want. The only change would be in the initialization. For ASP.NET Core web apps, it would look like:
public void ConfigureServices(IServiceCollection services)
{
    services.AddApplicationInsightsTelemetry();
    services.AddApplicationInsightsTelemetryProcessor<SuppressStaticResourcesFilter>();
    services.AddApplicationInsightsTelemetryProcessor<YourOtherFilter1>();
    services.AddApplicationInsightsTelemetryProcessor<YourOtherFilter2>();
    services.AddControllersWithViews();
}
For classic ASP.NET MVC Web, the initialization would be:
var builder = TelemetryConfiguration.Active.DefaultTelemetrySink.TelemetryProcessorChainBuilder;
builder.Use((next) => new SuppressStaticResourcesFilter(next));
builder.Use((next) => new YourOtherFilter1(next));
builder.Use((next) => new YourOtherFilter2(next));
builder.Build();

Enabling based on LogLevel

To finish up, we could extend the solution above by enabling the telemetry processors based on your LogLevel. For example, the code below doesn't register the suppression processor when LogLevel == debug:
public void ConfigureServices(IServiceCollection services)
{
    services.AddApplicationInsightsTelemetry();

    // logLevel could be read, for example, from Configuration["Logging:LogLevel:Default"]
    if (logLevel != "debug")
    {
        services.AddApplicationInsightsTelemetryProcessor<SuppressStaticResourcesFilter>();
    }

    services.AddControllersWithViews();
}

Conclusion

On this post we extended the discussion on How to add Application Insights telemetry to ASP.NET websites and reviewed how to suppress some of that telemetry. I hope it's clear how to register your telemetry processor and how/when to use RequestTelemetry and DependencyTelemetry.

If you're using Azure, AppInsights may be a valuable asset for your organization as it provides a simple, fast, centralized and customizable service to access your logs. For more information about AppInsights, check the official documentation.

More about AppInsights

Want to know more about Application Insights? Consider reading previous articles on the same topic:
  1. Adding Application Insights telemetry to your ASP.NET Core website
  2. Monitor ASP.NET applications using Application Insights and Azure Alerts
  3. How to profile ASP.NET apps using Application Insights

Source Code

The source code used on this article is available on GitHub on the telemetry branch.

See Also

For more posts about .NET Core, please click here.

Monday, March 2, 2020

Adding Application Insights to an ASP.NET Core website

Application Insights may be an excellent resource for your applications and services running on Azure. Read to understand why.
Photo by SOCIAL.CUT on Unsplash

On a previous post, we discussed how to build and deploy WebJobs on Azure. Given that today it's common to have multiple instances of our apps and services (like WebJobs) running at the same time on the cloud, it's important to plan and carefully implement our logging strategy. Turns out that on Azure, the best way to do so is by using Application Insights.

On this post we will learn:
  • What's Application Insights (AppInsights)
  • What do we gain by using AppInsights
  • How to add AppInsights to an ASP.NET Core website
  • How to create and configure AppInsights on Azure
  • How to send telemetry to our Azure AppInsights instance
  • How to interpret AppInsights data on Azure

What is Application Insights

So first, let's understand what's Application Insights. According to Microsoft:
Application Insights is an extensible Application Performance Management (APM) service for developers and DevOps professionals. Use it to monitor your live applications. It will automatically detect performance anomalies, and includes powerful analytics tools to help you diagnose issues and to understand what users actually do with your app. It's designed to help you continuously improve performance and usability. It works for apps on a wide variety of platforms including .NET, Node.js and Java EE, hosted on-premises, hybrid, or any public cloud.

Services provided by AppInsights

Below I list some services provided by AppInsights:
  • HTTP request rates, response times, and success rates
  • Dependency (HTTP & SQL) call rates, response times, and success rates
  • Exception traces from both server and client
  • Diagnostic log traces
  • Page view counts, user and session counts, browser load times, and exceptions
  • AJAX call rates, response times, and success rates
  • Server performance counters
  • Custom client and server telemetry
  • Segmentation by client location, browser version, OS version, server instance, custom dimensions, and more
  • Availability tests
Along with the preceding, there are associated diagnostic and analytics tools available for alerting and monitoring with various different customizable metrics. With its own query language and customizable dashboards, Application Insights is an excellent tool for any cloud service.

Why AppInsights?

Let's review then why and when we should use AppInsights. I usually recommend it if:
  • Your project runs on the cloud
  • You run services on Azure
  • You want to gain insights on how your application runs on the cloud
  • You want to consolidate telemetry
  • You're building an App Service, Cloud Service, WebJob or Azure Functions
  • You have a moderately modern application
Desktop apps could also benefit from AppInsights, but I'd only recommend it if the developers want to learn from the telemetry submitted by their users. Remember that an active connection needs to exist for the data to be pushed to Azure.

How does Application Insights work?

Once plugged to your application, AppInsights will send automatic and custom telemetry to Azure and will be available from multiple sources. The diagram below summarizes how it works:
Source: Microsoft Docs

Building our App

So let's now build our app. This post focuses on .NET Core but you could also target .NET Framework. I will also try to use .NET Core's CLI as much as possible, not only because it helps us solidify our knowledge of the tool but also because I'm running .NET Core 3.0, which is still not supported by Visual Studio 2017.

Scaffolding a simple ASP.NET Core MVC WebApp using the CLI

So let's scaffold a simple ASP.NET MVC web app using the CLI. Open a Windows Terminal, navigate to the folder where you store your projects and type:
C:\src>dotnet new mvc -n aspnet-ai
The template "ASP.NET Core Web App (Model-View-Controller)" was created successfully.
This template contains technologies from parties other than Microsoft, see https://aka.ms/aspnetcore/3.1-third-party-notices for details.

Processing post-creation actions...
Running 'dotnet restore' on aspnet-ai\aspnet-ai.csproj...
  Restore completed in 136.74 ms for C:\src\aspnet-ai\aspnet-ai.csproj.

Restore succeeded.
In case you're interested in knowing which options you have available for any project, use the --help flag. For example, to view the options for a React project, type:
dotnet new react --help
The --help flag can also be used to get information about different operations including scaffolding console, Windows forms, etc:
dotnet new --help

Testing our App

Running our app is as simple as typing dotnet run on the command line. The CLI then builds and runs our app:
C:\src\aspnet-ai>dotnet run
info: Microsoft.Hosting.Lifetime[0]
      Now listening on: https://localhost:5001
info: Microsoft.Hosting.Lifetime[0]
      Now listening on: http://localhost:5000
info: Microsoft.Hosting.Lifetime[0]
      Application started. Press Ctrl+C to shut down.
info: Microsoft.Hosting.Lifetime[0]
      Hosting environment: Development
info: Microsoft.Hosting.Lifetime[0]
      Content root path: C:\src\aspnet-ai
We can confirm the site is running by navigating to https://localhost:5001:

Adding AppInsights telemetry to our website

With the app running, let's add a reference to the Microsoft.ApplicationInsights.AspNetCore NuGet package in our application. Back in the terminal, type dotnet add package Microsoft.ApplicationInsights.AspNetCore:

C:\src\aspnet-ai>dotnet add package Microsoft.ApplicationInsights.AspNetCore
  Writing C:\Users\bruno.hildenbrand\AppData\Local\Temp\tmp3FE8.tmp
info : Adding PackageReference for package 'Microsoft.ApplicationInsights.AspNetCore' into project 'C:\src\aspnet-ai\aspnet-ai.csproj'.
info : Restoring packages for C:\src\aspnet-ai\aspnet-ai.csproj...

(...)

info : Package 'Microsoft.ApplicationInsights.AspNetCore' is compatible with all the specified frameworks in project 'C:\src\aspnet-ai\aspnet-ai.csproj'.
info : PackageReference for package 'Microsoft.ApplicationInsights.AspNetCore' version '2.12.1' added to file 'C:\src\aspnet-ai\aspnet-ai.csproj'.
info : Committing restore...
info : Writing assets file to disk. Path: C:\src\aspnet-ai\obj\project.assets.json
log  : Restore completed in 8.68 sec for C:\src\aspnet-ai\aspnet-ai.csproj.
To confirm that the reference was successfully added to our project, check the contents of the ItemGroup/PackageReference section:
C:\src\aspnet-ai>more aspnet-ai.csproj
<Project Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <TargetFramework>netcoreapp3.1</TargetFramework>
    <RootNamespace>aspnet_ai</RootNamespace>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Microsoft.ApplicationInsights.AspNetCore" Version="2.12.1" />
  </ItemGroup>
</Project>

Initializing the AppInsights Framework

Next, we initialize AppInsights by adding the following call on your Startup class:
public void ConfigureServices(IServiceCollection services)
{
    // enable Application Insights telemetry
    services.AddApplicationInsightsTelemetry();
}

Creating an Application Insights resource

Let's now get started with AppInsights on Azure. On this section we will understand how to create an AppInsights instance on Azure and how to integrate it with our .NET Core app. The first thing to do to upload telemetry to AppInsights in Azure is to create a resource for it. In Azure, click Create Resource and type Application Insights. Enter the requested information, for example:
Once created, open your AppInsights resource and copy the instrumentation key (upper-right corner) as we'll need it in the next step.

I like to create dashboards for each of my projects in Azure and also pin the related AppInsights resources to them.

Setting the Instrumentation Key

Then, we need to configure a telemetry key in our appSettings.json file. Just paste the snippet below above the "Logging" section replacing <appInsights-key> with your Instrumentation Key:
"ApplicationInsights": {
    "InstrumentationKey": "<appInsights-key>"
}
Tip: You could also replace that via the command line if you're running a more capable OS 😊:
sed -i 's/<appInsights-key>/your-instrumentation-key/' *.json
Run your application again. Your data should be available online in a couple of minutes. Let's now try to understand some of that data.

Understanding the telemetry

If everything was correctly configured, you should already see telemetry on Azure. Wondering why? It's because AppInsights monitors a lot of things by default. To view your data, click on the Search tab of your AppInsights resource and click Refresh:
It's also important to remember the main categories your data will fall into:
  • Request - a regular request to a resource on your site
  • Exception - an exception on your site
  • View - a custom page view
  • Dependency - a call to an external resource (such as a SQL database, mail server, cache, etc.)
  • Availability - lists availability requests to your app (another AppInsights feature)
Let's now create custom telemetry to understand how it works.

Tracking Custom Events

To create custom telemetry we should use the TrackEvent method from the TelemetryClient class. TelemetryClient is required to post telemetry data to Azure. For an ASP.NET Core project, the TelemetryClient can be injected in our controller with:
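A minimal sketch, assuming the default HomeController from the MVC template:
using Microsoft.ApplicationInsights;
using Microsoft.AspNetCore.Mvc;

public class HomeController : Controller
{
    private readonly TelemetryClient _telemetry;

    // TelemetryClient is registered by AddApplicationInsightsTelemetry()
    // so it can simply be injected via the constructor
    public HomeController(TelemetryClient telemetry)
    {
        _telemetry = telemetry;
    }
}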

Then, we use our telemetry client to post custom events with:
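For example, a hypothetical action that tracks the "Privacy requested" event mentioned later in this post could look like:
public IActionResult Privacy()
{
    // posts a custom event to Application Insights
    _telemetry.TrackEvent("Privacy requested");

    return View();
}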

Reviewing our custom telemetry

In the above code we used the TrackEvent method of the TelemetryClient. That means that we're telling Azure to classify our telemetry as a Custom Event. The result of that event is shown in yellow below:
Clicking on the event shows the transaction:
And clicking on Custom Event we see more info. Note that we can also add custom metrics to our events:

Tracking Exceptions

A common solution to tracking exceptions would be leveraging the global exception handler. In .NET Core 3.0 that's done with:
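The repository has the exact code; one possible sketch, assuming the exception handler middleware is mapped to /Home/Error (the template default) and the TelemetryClient we injected earlier, is:
using Microsoft.AspNetCore.Diagnostics;

public IActionResult Error()
{
    // the exception handler middleware (app.UseExceptionHandler("/Home/Error"))
    // exposes the original exception through this feature
    var feature = HttpContext.Features.Get<IExceptionHandlerPathFeature>();
    if (feature?.Error != null)
    {
        // track it explicitly (unhandled exceptions are also captured automatically)
        _telemetry.TrackException(feature.Error);
    }

    return View();
}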
Then we can review the result as shown below:

Common Questions

I think we covered the most essential bits with respect to AppInsights and ASP.NET Core web apps. Let's now review related and commonly asked questions regarding this implementation.

Suppressing Telemetry

You may be thinking that we have a lot more data than simply those Privacy requested logs I added to the code. As mentioned, AppInsights automatically monitors lots of events in our applications. Yes, that's awesome but it also brings drawbacks. We'll see in a future post how to suppress unnecessary data from our telemetry.

Where's my data?

If you don't see your data, please wait a little longer. It usually takes some time (up to 5 minutes) to get your telemetry online. Check also your Instrumentation Key. Remember, the SDK won't crash your application so debug to confirm it's working before deploying.

Deleting telemetry

You got your data on AppInsights and you realize that there's a lot there that you'd like to delete. How do we do it? We don't. Microsoft explains in Data collection, retention, and storage in Application Insights that the data cannot be deleted. And it makes sense. You should think of your telemetry as event sourcing. It's also better to have too much than none.

But in case you really need it, a simple workaround would be creating another instance and using it.

Conclusion

On this post we reviewed how to add, configure, customize and inspect Application Insights telemetry from ASP.NET Core web applications and Azure. AppInsights is a fantastic tool to capture telemetry from your application if you're using Azure. I hope that this post demoed the most essential aspects of AppInsights including what's necessary within Azure.

What could we do next?

Looking for other ideas? Please consider:
  1. Plugging alerts on some of our events: we could configure Azure Alerts hooked into our application insights telemetry to alert on specific events. While this is out of scope for this post, I urge you to take a look at the feature. It's very nice indeed.
  2. Suppressing telemetry: you may think that you got more data than you need. Try to extend this example by suppressing some of the auto-submitted telemetry;
  3. Adding AppInsights to other types of applications: as discussed, we could also leverage AppInsights in everything that can communicate to the cloud - including our console applications, windows services and Windows Form apps.
  4. .NET Framework: this example was designed for .NET Core and won't run on .NET Framework. Despite the major changes in the API, the functionality on Azure and use of TelemetryClient class remains the same.
  5. .NET Core 3.0 and VS17: this example was written against the version 3.0 .NET Core framework and as of this post was not friendly with my Visual Studio 2017. Run it from the command line as instructed above as it will fail from VS17.

More about AppInsights

Want to know more about Application Insights? Consider reading the following articles:
  1. How to suppress Application Insights telemetry
  2. Monitor ASP.NET applications using Application Insights and Azure Alerts
  3. How to profile ASP.NET apps using Application Insights

Source Code

As always, the source code used on this post is available on GitHub.

Monday, February 3, 2020

How to enable ASP.NET error pages using Azure Serial Console

It's possible to enable ASP.NET error pages on Azure by using the new Azure Serial Console. Let's see how.
By default, ASP.NET web applications running on a remote server set the customErrors property to "RemoteOnly". That means that, unless you're running on the local server, you won't be able to view the original error and the stack trace related to it. And that's a good thing! A lot of successful hacks derive from understanding the exception messages and working around them.

But what if you're testing a new server, a new deployment process or just released a new feature and need to enable the error pages very quickly? Well, if you're using Azure, you can use Azure Serial Console to do the job. No SSHing, no RDPing or uploading of configurations to the remote environment. Let's see how.

Azure Serial Console

Today we will use Azure Serial Console. According to Microsoft:
The Serial Console in the Azure portal provides access to a text-based console for virtual machines (VMs) and virtual machine scale set instances running either Linux or Windows. This serial connection connects to the ttyS0 or COM1 serial port of the VM or virtual machine scale set instance, providing access independent of the network or operating system state. The serial console can only be accessed by using the Azure portal and is allowed only for those users who have an access role of Contributor or higher to the VM or virtual machine scale set.
In other words, Azure Serial Console is a nice, simple and accessible tool that can be run from the Azure portal allowing us to interact with our cloud resources including our Azure App Services.

Accessing the console

To access the console for your web application, first we find our Azure App Service in the Portal by clicking on App Services:
Selecting the web site we want to open:
And click on Console on the Development Tools section. You should then see a shell similar to:

Using the Console

Now the fun part. We are ready to interact with our App Service directly from that shell. For starters, let's get some help:
The above screenshot shows some of the administrative commands available on the system. Most of them are standard DOS command prompt utilities that you probably used on your Windows box but never cared to learn. So what can we do?

Linux Tools on Azure Serial Console

Turns out that Redmond is bending to the accessibility, ubiquity and power of the POSIX / open source tools used and loved by system administrators, such as ls, diff, cat, ps, more, less, echo, grep, sed and others. So before jumping to the solution, let's review what we can do with some of these tools.
Example 1: a better dir with ls
Example 2: Creating and appending content to files using echo, pipes and cat
Example 3: getting disk information with df
Example 4: viewing mounted partitions with mount
Example 5: Displaying differences between files using diff
Example 6: Getting kernel information using uname
Example 7: Even curl and scp are available!

Disabling Custom Errors

Okay, back to our problem. If you know some ASP.NET, you know that the trick is to modify the customErrors element (ASP.NET Settings Schema) and set its mode attribute to Off. So let's see how we can change that configuration using a command line tool.

Backing up

Obviously we want to back up our web.config first:
cp web.config web.config.orig

Using sed to replace configuration

Now, we will use sed (a tool available on the GNU operating system that Linux hackers can't live without) to change the setting directly from the console. I'm a sed geek and use it extensively in a Hugo project I've been working on (thousands of markdown files). Together with Go, the i3 window manager, Vim, ranger and grep, my Fedora workstation becomes an ideal development environment. Now, back to .NET...

Testing the Patch

We can safely test if our changes will work by typing:
sed 's/RemoteOnly/Off/' web.config

Applying the Patch

So how do we switch our customErrors element from RemoteOnly to Off? The solution is this simple one-liner:
sed -i 's/RemoteOnly/Off/' web.config

Switching Back

Now, obviously we may want to switch back. That's why it was important to backup your web.config before. We can switch back by replacing the changed web.config with the original:
rm web.config
mv web.config.orig web.config
Or by running sed again, this time with the parameters inverted:
sed -i 's/Off/RemoteOnly/' web.config

Security Considerations

I hope I don't need to repeat that it's unsafe to leave detailed error pages enabled on your cloud services. Even if they are simply a playground, there are risks of malicious users pivoting to different services (like your database) and accessing confidential data. Please disable them as soon as possible.

What about Kudu?

Yes, Azure Kudu allows editing files on a remote Azure App Service by using a WYSIWYG editor. However, we can't count on that always, everywhere. Remember, with the transition to a microservice-based architecture, more and more of our apps will run on serverless and containerized environments, meaning tools like that won't always be available. So the tip presented on this post will definitely stand the test of time! 😉

Final Thoughts

Wow, that seems like a long post for such a small hack, but I felt the need to stress certain things here:
  1. Developers shouldn't be afraid to use the terminal - I see this pattern especially with Microsoft developers assuming that there should always be a button to do something. The more you use the terminal, the more confident you'll be with the tools you're using regardless of where you are. 
  2. Microsoft is moving towards Linux and you should too - The GNU tools are an invaluable asset to know. Once you know how to use them better, you'll realize that your toolset grows and you get more creative and get things done faster. Plus, the ability to pipe output between them yields unlimited possibilities. Don't know where to start? WSL is the best way to learn Linux on Windows 10.
  3. Be creative, use the best tool for the job - choose wisely the tools you use. Very frequently the command line is the fastest (and simplest) way to accomplish most of your workflow. And it can be automated!

Conclusion

The Azure Serial Console can be a powerful tool to help you manage, inspect, debug and run quick commands against your Azure App Service and your Virtual Machines. And combined with the Linux tools it becomes even more powerful!

And you, what's your favorite hack?

Monday, January 28, 2019

A simple chat room in Vue.Js

Let's see how to build a simple chat room using Vue.js and why it's awesome.

Vue.js is awesome. I've used other JavaScript libs throughout the years and, as a long time Scrum master and developer, I used to do a technical retrospective after each project. In my experience, when compared to Vue.js, other libs are heavy, slow or over-complicated. Just do a quick search to learn why people love it.

On this post, I want to demo how elegant and simple Vue is, so I built a very simple chat room frontend using Vue.js, Bootstrap and a few CDNs. You can find the source code here. It's one simple HTML page. Just download it and open it in your browser.

The source code

You can find the code for a simple chat room below. A little more than 30 lines of code to build this frontend. Wow, that's efficiency.


Posting Messages to the Room

Of course, we would like to send messages to the chat room. It's as simple as running the code below on your Developer Tools (F12) console:
app.messages.push({name: 'Some Name', msg: 'Test123', tm: (new Date()).toLocaleTimeString(), color: 'red'});
See that new messages are automatically added at the bottom of the room with very little effort. Simple, clean, elegant.

Future Enhancements

Out of the box, the code above simulates a chat room frontend for just one user. Yes, you can still hack it through the console to add other users (by pushing to the participants array) but it's not usable yet. One interesting exercise would be enhancing the above code with WebSockets so that multiple users can connect to the same room and chat through it.

I already have that code written with SignalR Core and an ASP.NET Core web app as the backend and will soon open source it. Stay tuned.

Source Code

Vue did all the work. All the rest is available on my GitHub.

Monday, November 26, 2018

JavaScript caching best practices in ASP.NET

Looking to improve the performance of your site with a few simple tips? Read to understand.

One of the first things a developer should do when inspecting the performance of their ASP.NET website is to understand how the static assets of the site are served and if they are correctly being cached by the visitor's browser. On this post let's review a solution that works well, is built into the ASP.NET framework and helps you reduce costs and improve the performance of your site.

How caching works

Before getting to the solution, it's important to understand how web browsers cache static resources. In order to improve performance and reduce network usage, browsers usually cache JavaScript and other external resources a web page may depend on. However, under certain circumstances, modifications to those files won't be automatically detected by the browser, resulting in users seeing stale content.

Approaching the Problem

A common solution to this problem is to add query string parameters to the referenced JavaScript files. When that suffix changes, the browser treats the new URL as a new resource (even if it's the same file name), issuing a new request and refreshing the content. Whenever that resource changes, the query string changes and the browser automatically refreshes the resource, ensuring users always see the most up to date content. Also note how we can use that approach for all linked resources: JavaScript files, CSS, images and even icons.

So how do we ensure that our pages output references as shown above? In ASP.NET, the solution to that is to use bundles.

Bundling and Minification

According to Microsoft,
Bundling and minification are two techniques you can use in ASP.NET 4.5 to improve request load time. Bundling and minification improves load time by reducing the number of requests to the server and reducing the size of requested assets (such as CSS and JavaScript.)
Another benefit of bundling is that, by default, the ASP.NET framework comes with a built-in caching feature:
Bundles set the HTTP Expires Header one year from when the bundle is created. If you navigate to a previously viewed page, Fiddler shows IE does not make a conditional request for the bundle, that is, there are no HTTP GET requests from IE for the bundles and no HTTP 304 responses from the server. You can force IE to make a conditional request for each bundle with the F5 key (resulting in a HTTP 304 response for each bundle). You can force a full refresh by using ^F5 (resulting in a HTTP 200 response for each bundle.)

Bundling in practice

Before reviewing the actual solution, let’s compare a page serving bundled content and a page showing unbundled content and understand how the browser behaves on each of them.

Reviewing a request with bundled content

Let's take a look at a request using bundles to understand this a little better:
From the above, note that:
  • in yellow, the auto-added suffix. We'll see later why this is there and how to do it ourselves;
  • in red, a confirmation that that resource was cached by the browser and loaded locally, not issuing any request to the server;
  • the time column on the right shows the time spent by the browser to load that resource. For most files it was 0ms (i.e., served from cache);
Now if you expand the bundle request, we can see that the cache is set to 1 year from the request date, meaning that the file won't be refreshed for the next 365 days:

Serving unbundled content

Now if you take a look at a request to a page that has unbundled content, you'll see that the resources are also being cached (column Size). However, individual requests are still being issued for each file referenced causing two problems:
  1. In HTTP 1.1, there's a virtual limit of max 6 concurrent requests per domain. In that case, after the 6th request is issued, the others wait until a new slot is available
  2. Because file names don't change after each deploy, the browser will reuse the cached version even if the file was modified (by seeing the same file name, it assumes the resource didn't change). Our problem is that updates to those files won't be automatically refreshed, making users see stale content.

Bundling in ASP.NET

The ASP.NET framework already provides bundling, and simply using bundles and referencing them in our pages solves the issue. In ASP.NET we do that by registering our bundles in the Global.asax.cs file, in the Application_Start method:
protected void Application_Start()
{
    // ...
    BundleConfig.RegisterBundles(BundleTable.Bundles);
    // ...
}
Where the RegisterBundles method looks like:
public static void RegisterBundles(BundleCollection bundles)
{
    bundles.Add(new StyleBundle("~/Content/css")
        .Include("~/Content/images/icon-fonts/entypo/entypo.css")
        .Include("~/Content/site.css"));
}
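And to output those references in our pages, we render the bundles from the Razor layout (for example, _Layout.cshtml). The ~/bundles/js script bundle below is a hypothetical one you would register in RegisterBundles just like the style bundle above:
@* _Layout.cshtml *@
@Styles.Render("~/Content/css")

@* hypothetical script bundle, registered the same way as the style bundle *@
@Scripts.Render("~/bundles/js")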
Now, build and deploy your site. Just don't forget to remove debug="true" from the compilation element in your web.config:
<compilation targetFramework="4.7.2" />
Whenever that content changes or a new deploy happens, the querystring value for each bundle is automatically modified by the framework, ensuring that users always get your latest JavaScript, optimized, without the risk of serving an obsolete version.

Conclusion

On this post we reviewed some practices for increasing the performance of your site. Caching, bundling and minification are standard practices that developers should consider to increase the performance of their ASP.NET websites.

Monday, November 12, 2018

Windows Subsystem for Linux, the best way to learn Linux on Windows

Want to learn Linux but don't know how/where to start? WSL may be a good option.
In 2018, Microsoft released the Windows Subsystem for Linux (WSL). WSL lets developers run the GNU/Linux shell on a Windows 10 PC, a very convenient way to access the beloved tools, utilities and services Linux offers without the overhead of a VM.
WSL is also the best way to learn Linux on Windows!

About WSL

Currently WSL supports Ubuntu, Debian, SUSE and Kali distributions and can:
  • run bash shell scripts
  • run GNU/Linux command-line applications such as vim, emacs and tmux
  • run programming languages like JavaScript, Node.js, Ruby, Python, Golang, Rust, C/C++, C# & F#, etc.
  • run background services like SSH, MySQL, Apache and lighttpd
  • install additional software using your distribution's package manager
  • invoke Windows applications
  • access your Windows filesystem

Installing WSL on Windows 10

Installing WSL is covered by Microsoft in this article and is as easy as two steps.

Step 1 - Run a Powershell Command

On your Windows PC, you will need to run this PowerShell script as Administrator (shift + right-click):
Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux
After the installation ends, restart your PC.

Step 2 - Install WSL from the Windows Store

After the reboot, WSL can be installed through the Windows Store. To open the Windows Store on your Windows 10, click:
Start -> Type Store -> Click on the Windows Store:
Then type "Linux" in the search box and you should get results similar to this:

Click on the icon, accept the terms and Windows will download and install WSL for you.

Running WSL

After the installation starts, you will be prompted to enter a username and password. Once done, you'll get a cool Linux terminal to start playing with. You can even have multiple distros installed on your Windows 10 machine. On mine, I installed Debian and Ubuntu.

Using the Terminal

Okay, so now that we have access to our Linux shell, what should we do next? Let's go through these use cases:
  • Accessing my Windows files
  • Accessing internet resources
  • Installing software

Accessing Windows Files

WSL mounts your Windows files on the /mnt/c mount point. To verify on yours, type mount at the command prompt and look for C: in the output. Your Windows files should be there.
In case you don't know Linux, listing files is done with ls. This is the content of my C drive as seen from WSL:

Accessing the Internet

Your WSL instance should have access to the internet. Testing the internet is as simple as doing a ping to Google:
You can also verify your network info with ifconfig:
 

Installing Software

Installing software on Ubuntu/Debian is done with the apt command. For example, this is how we search for packages:
To install packages, use apt-get install. For example, to install Ruby on the Ubuntu WSL, run the command below:
sudo apt-get install ruby-full

Using git

We can leverage apt and install git with:
sudo apt-get install git
... # apt installs git
git --help # to get help
And I'd recommend learning to use it from the terminal. Atlassian has an excellent tutorial to learn git.

Getting Help

Need help? The man tool is there to help you. For example, we could run the command below to get help on git:
man git

Additional tip: try the new Windows Terminal

And if you want to invest more time in your WSL, I'd suggest that you install the new Windows Terminal. Download the latest release from GitHub and install it on your box. It's very customizable and contains shells for WSL, PowerShell, Azure CLI and the traditional Windows command prompt.

What's next?

Now that you know how to locate your files, have access to the internet and have installed some software, keep exploring. The conclusion below has a few recommendations on what to tackle next.

Conclusion

Congratulations! You have WSL installed on your machine and a Linux terminal to start playing with. Now what? The first thing I'd recommend is to get comfortable with basic system commands, understand the filesystem, learn to add/remove software and run administrative tasks on the terminal. WSL is perfect for users who want to learn Linux and for those who spend a lot of time on Windows but need access to a Linux terminal.

If you want to know more about my setup, here's why I use Fedora Linux with the fantastic i3 window manager on the desktop and CentOS on servers. Happy hacking!

    About the Author

    Bruno Hildenbrand      
    Principal Architect, HildenCo Solutions.