Visual Studio Application Insights – application availability and uptime monitoring for free!

I know there are already apps out there that perform application uptime monitoring (and I’ve had to build one or two myself in the past), but now Microsoft have released (in preview) a way to monitor your on-premises and cloud applications using Visual Studio Online – Application Insights. Best of all, it’s free!

From what I can tell thus far, it not only lets you monitor server metrics such as CPU and network utilization, but also lets you run uptime tests, either by pinging application urls that you define, or by running Visual Studio web tests you’ve already written and exported to file.

Setting up Application Insights for The Kangaroo Court (or any Azure web role)

After reading about Application Insights today on Brian Harry’s blog, I thought why not try implementing it on The Kangaroo Court? FYI – The Kangaroo Court is a site hosted on Windows Azure, using a web role as part of a cloud service, so I won’t be covering the steps you’d go through for an on-premises application.

Below are the steps to go through to get it up & running:

  1. Get a preview invitation code – I got mine from http://blogs.msdn.com/b/bharry/archive/2013/11/16/another-application-insights-invitation-code.aspx
  2. Go to the Visual Studio Online site, hit the Try Application Insights link, then enter your invitation code (which hopefully is still valid)
  3. Follow the wizard – I selected the option for apps hosted in an Azure cloud service – and then download the two files as explained in the wizard
  4. Copy the downloaded files into Visual Studio as explained in the wizard
  5. Update your service configuration files. Note that the wizard only said I needed to update my ServiceConfiguration.Cloud.cscfg file; however, I had to update both the Cloud and Local files. (If you watch the video provided, it shows the user updating both anyway, so no problem there.) Note also that if you already have a ConfigurationSettings or Startup section defined in your file, you just need to merge the new configuration settings and startup task from the wizard’s snippet into those existing sections, rather than copying over the entire ConfigurationSettings and Startup sections. I initially copied the whole sections over, and as expected, Visual Studio told me things were broken!
  6. Check in and redeploy the cloud service to Azure, then wait 10 minutes or so
  7. Go to the Application Insights page on the Visual Studio website. My url looked like this: https://my-username.visualstudio.com/_appanalytics/_overviews/dashboards.
  8. Check out the data that should be streaming in! (If there’s no data there, wait a while longer, and make sure that your cloud service is up & running too.)

[Image: Application Insights overview with data streaming in]

I particularly like the Dashboards tab, which allowed me to set up a dedicated dashboard for The Kangaroo Court, showing key metrics. I could then drill down into the Performance tab if I needed more detail. See the screenshot below for my example dashboard:

[Image: example dashboard]

Monitoring site uptime and availability

The next step was to set up some uptime monitoring, so I could make sure The Kangaroo Court website was up & running, and that if it wasn’t, I would know about it. As an aside: in my view this is an often-overlooked yet essential feature for any website. Most developers I know (and certainly business stakeholders) would rather find out their site is down before any customers do, instead of discovering hours later that it’s been down all along and they’ve lost business because of it. This is particularly true for sites where the only way a customer can report an error is via a Contact Us page, which is pretty useless if the site isn’t available in the first place.

With that in mind, I went to the Availability tab and added a new ‘Synthetic monitor’. By default the first one you create is called my_first_test, and I couldn’t figure out how to rename it. I also couldn’t find a way to add additional tests, so I simply had to update the single test to suit my scenario (pinging the Kangaroo Court homepage), as shown below (note that my email isn’t shown):

[Image: synthetic monitor settings]

As you can see, I set it up to ping the site every five minutes from two locations:

  • One in Sydney
  • One in the US

If both locations fail within a ten-minute window, I’ll be sent an email notifying me that the site is down, so I can take appropriate action. Awesome!
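Under the hood, a url ping test is conceptually very simple. Here’s a rough sketch of the idea (purely illustrative, not what Application Insights actually runs internally, and the url is a placeholder):

using System;
using System.Net.Http;

class UptimeCheck
{
    static void Main()
    {
        // Request the page and treat a non-success status code
        // (or a timeout) as "site down".
        using (var client = new HttpClient { Timeout = TimeSpan.FromSeconds(30) })
        {
            var response = client.GetAsync("http://example.com/").Result;
            Console.WriteLine(response.IsSuccessStatusCode ? "Up" : "Down");
        }
    }
}

The clever part is everything around that: running the check on a schedule from multiple geographic locations, and only alerting when enough of them fail at once.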

Well that’s all I’ve done thus far, but as you can see, Application Insights provides some pretty cool features out of the box, and I haven’t even dug too deep into it yet. I’ll continue to explore and hopefully post more soon. Till then – enjoy!

PS: If you have a sports team with a fines system, or are thinking of starting one – please do check out The Kangaroo Court


Azure finally gets a job scheduler!

Yes, I know they already had scheduling available in Mobile Services, but now it looks like job scheduling (a.k.a. cron jobs) will be one of the standard features in Windows Azure – woohoo!

As per the documentation – http://www.windowsazure.com/en-us/services/scheduler/ – the scheduler can be used to:

  • Perform GET/POST requests against your application urls and/or
  • Send messages to Azure Service Bus for your worker roles to act upon (receiving side sketched below)
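For the second option, the receiving end in a worker role could look roughly like this (just a sketch; the connection string and queue name are hypothetical, and you’d normally receive in a loop rather than once):

using Microsoft.ServiceBus.Messaging;

class ScheduledJobWorker
{
    static void Main()
    {
        // Pick up a message the scheduler dropped onto a service bus queue.
        var client = QueueClient.CreateFromConnectionString(
            "Endpoint=sb://my-namespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...",
            "scheduled-jobs");

        BrokeredMessage message = client.Receive();
        if (message != null)
        {
            // ... do the scheduled work described by the message ...
            message.Complete(); // remove it from the queue
        }
    }
}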

I’ve been waiting for this feature for ages, as I’ve had to rely on other Cron services such as SetCronJob for this sort of functionality in the past (where the Cron service calls one of my website urls, which then performs the work and returns a 200 status code for success). Whilst SetCronJob is great, it’s nice to be able to perform the scheduling within Azure, so it can sit alongside the other components of whatever service you’re building. 
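For the first option, the kind of endpoint I point a cron service at looks something like this sketch (the controller and action names are hypothetical):

using System.Web.Mvc;

public class JobsController : Controller
{
    // A scheduler GETs e.g. /jobs/cleanup; a 200 response signals success,
    // anything else signals failure for the scheduler to retry or alert on.
    [HttpGet]
    public ActionResult Cleanup()
    {
        // ... perform the scheduled work here (e.g. purge stale records) ...
        return new HttpStatusCodeResult(200, "Job completed");
    }
}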

Currently the scheduling service is in preview and available via the REST API (there’s a NuGet package for it), but no doubt it will soon be added to the management portal.

Thanks for adding this, Microsoft!

LocalDB and SQL Profiler with SSMS 2012

I’m not sure why, but for some time now I thought it wasn’t possible to run SQL Server Profiler against LocalDB databases to see the SQL queries being issued. To my surprise, I found out that it actually is possible after all, provided you have SQL Server Management Studio 2012!

All you have to do when connecting with SQL Server Profiler 2012 is connect to the instance named (localdb)\v11.0.
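If you want to sanity-check from code that your app is talking to that same instance, here’s a quick sketch (assuming System.Data.SqlClient):

using System;
using System.Data.SqlClient;

class LocalDbCheck
{
    static void Main()
    {
        // Connect to the same automatic LocalDB instance that
        // SQL Server Profiler 2012 can attach to.
        using (var connection = new SqlConnection(@"Server=(localdb)\v11.0;Integrated Security=true;"))
        {
            connection.Open();
            Console.WriteLine(connection.ServerVersion); // e.g. 11.00.xxxx
        }
    }
}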

Thanks to http://www.stratospher.es/blog/post/connecting-to-localdb-with-sql-server-management-studio-2012-ssms for pointing out this neat trick.

 

Keeping sensitive config settings secret with Azure Websites and GitHub

During my foray into Azure Websites as part of the Windows Azure Developer Challenge, I came up with what I think is a useful pattern for keeping your sensitive config settings secret when using Azure Websites and GitHub. It allows you to:

  • Access them when debugging locally
  • Access them when deploying to the cloud
  • Stop them getting into source control on GitHub where others can see them

You can read all about it at http://www.codeproject.com/Articles/602146/Keeping-sensitive-config-settings-secret-with-Azur 
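The short version, as a rough sketch rather than the full pattern from the article (the key name below is hypothetical): your code only ever reads settings through ConfigurationManager, so it doesn’t care whether the value came from a local override file that’s excluded from source control, or from an app setting configured in the Azure portal, which overrides web.config at runtime.

using System.Configuration;

public static class Secrets
{
    // "SendGridPassword" is a hypothetical key. Locally its value can live in
    // an override file that's .gitignore'd; on Azure Websites it comes from
    // the portal's app settings, which take precedence over web.config.
    public static string SendGridPassword
    {
        get { return ConfigurationManager.AppSettings["SendGridPassword"]; }
    }
}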

Don’t forget to vote for the article if you like it. Also, keep an eye on my daily progress in the challenge at http://www.codeproject.com/Articles/584534/YouConf-Your-Live-Online-Conferencing-Tool

Azure Developer Challenge – YouConf – Day 12 (May 10)

Today I spent most of my time writing up the final article content for challenge two. I also implemented the SignalR functionality for keeping the live video url up to date, as described below.

SignalR

Keeping the live video feed url up to date with SignalR

SignalR is a great tool for providing real-time updates to clients, and the JabbR chat site provides a great example of how to harness this technology. For the live conference page, I used SignalR in a similar way to dotNetConf, to ensure that if a conference presenter updated the Google Hangout id for the conference, viewers would receive the updated url without having to refresh their page.

To get started, I installed the SignalR NuGet package as below:

[Image: installing the SignalR NuGet package]

I then set about building a SignalR hub and client. My main issue was how to push the notification to my SignalR hub from the ConferenceController. To give some context, here’s my YouConfHub class:

public class YouConfHub : Hub
{
    public Task UpdateConferenceVideoUrl(string conferenceHashTag, string url)
    {
        //Only update the clients for the specific conference
        return Clients.Group(conferenceHashTag).updateConferenceVideoUrl(url);
    }
    public Task Join(string conferenceHashTag)
    {
        return Groups.Add(Context.ConnectionId, conferenceHashTag);
    }
}  

and my client JavaScript code:

<script type="text/javascript" src="http://www.codeproject.com/ajax.aspnetcdn.com/ajax/signalr/jquery.signalr-.0.1.min.js"></script>
<script type="text/javascript">
// <![CDATA[$.signalR || document.write('<scr' + 'ipt src="~/scripts/jquery.signalr-1.0.1.min.js")></sc' + 'ript>');
// ]]>
</script>
<script type="text/javascript" src="~/signalr/hubs"></script>
<script type="text/javascript">
        $(function () {
            $.connection.hub.logging = true;
            var youConfHub = $.connection.youConfHub;
            youConfHub.client.updateConferenceVideoUrl = function (hangoutId) {
                $("#video iframe").attr("src",
                   "http://youtube.com/embed/" + hangoutId + "?autoplay=1");
            };
            var joinGroup = function () {
                youConfHub.server.join("@Model.HashTag");
            }
            //Once connected, join the group for the current conference.
            $.connection.hub.start(function () {
                joinGroup();
            });
            $.connection.hub.disconnected(function () {
                setTimeout(function () {
                    $.connection.hub.start();
                }, 5000);
            });
        });

   </script>

See the UpdateConferenceVideoUrl method in the Hub? I wanted to call that from my ConferenceController when a user updated the conference hangout id/url, and thought I could do so by getting an instance of the Hub and calling the method on it. E.g.

var context = GlobalHost.ConnectionManager.GetHubContext<YouConfHub>();
context.UpdateConferenceVideoUrl("[conference hashtag]", "[new hangout id]");

Sadly, it turns out that you can’t actually call methods on the hub from outside the hub pipeline 😦 You can, however, call methods on the hub’s clients and groups. So, in my conference controller’s Edit method, I was able to use the following code to notify all clients for the specific conference that they should update their url:

if (existingConference.HangoutId != conference.HangoutId)
{
    //User has changed the conference hangout id, so notify any listeners/viewers
    // out there if they're watching (e.g. during the live conference streaming)
    var context = GlobalHost.ConnectionManager.GetHubContext<YouConfHub>();
    context.Clients.Group(conference.HashTag).updateConferenceVideoUrl(conference.HangoutId);
}

Not too bad in the end eh?

Article progress

My article is now almost complete, with just a few touch-ups required. I’ll probably spend the next day or two tidying up the site’s CSS, JavaScript etc. and making sure I haven’t missed anything!

Azure Developer Challenge – YouConf – Day 11 (May 9th)

Quite a bit to report on today….

Setting up a Drupal blog website

Since about day 3 I’d been thinking of moving my daily progress posts into a separate blog, as there’s enough information to make some of them worth an entire entry. I was also aware that one of the competition points for this section was around what we do with our other nine websites. So I figured I’d see if it really was as easy to set up a blog as they made out in http://www.windowsazure.com/en-us/develop/php/tutorials/website-from-gallery/.

That tutorial, and a few others I found, set up WordPress blogs, so I thought why not try a different application to make things a bit more interesting? In the end I went with Drupal, as an old workmate of mine used to rave about it, and I used the WordPress tutorial above as a rough guide. Here’s what I did:

  1. Selected the Web Sites node in the Azure management portal and clicked New
  2. Selected Compute > Web site > From gallery
  3. Selected Acquia Drupal 7 (Note that later I realized there were specific blog applications, so if doing this again I would use one of those…)
    [Image: selecting Acquia Drupal 7 from the gallery]
  4. Chose the url for my blog – youconfblog – and opted to create a new MySQL database.
    [Image: choosing the blog url and database]
  5. Followed the rest of the prompts and provided my email address etc, and let it complete. I was then able to browse to my vanilla Drupal installation at http://youconfblog.azurewebsites.net/
  6. I then wanted to install a blog theme, and I found a nice looking one at http://drupal.org/project/responsive_blog
  7. To install it on my site, I found the .tar.gz url for the installation package – http://ftp.drupal.org/files/projects/responsive_blog-7.x-1.6.tar.gz – and in the admin section of my Drupal site, selected Appearance from the top menu, then Install new theme.
  8. I provided the url to the responsive blog package, and then let Drupal do its thing and complete the installation.
  9. I then configured the theme by going to its Settings page, adding my own YouConfBlog logo, and disabling the slideshow on the homepage.
    [Image: configuring the theme]

And now I have a nicely themed Drupal site! http://youconfblog.azurewebsites.net

I then added a couple of blog entries for days one & two, by copying & pasting the HTML from my CodeProject article into the blog entries.

Wait a minute, aren’t we supposed to avoid duplication?

After getting my second day’s progress post into my Drupal site, I realized that if I were to copy & paste all the articles:

  1. It could take a while
  2. I’d have to do the same in future for all my other daily progress updates
  3. If I changed one, I’d have to update the other
  4. I wouldn’t be keeping in line with the CodeProject terms, which discourage you from posting content from CodeProject elsewhere
  5. I might make it harder for the judges to assess my article, as now they’d have to look in two places

In light of the above, I left my two initial blog posts intact, and decided that for now I’ll only post updates in my CodeProject article. After all, the goal of setting up the blog was to see if it really was as easy as others had made out (whilst learning along the way), which indeed it was. I’ll leave the blog in place though, as it deserves to be part of my entry for challenge two as one of the other nine websites.

Error Logging

Usually one of the first things I do when creating a project is set up error logging. Sometimes it’s to a text file, sometimes to XML, sometimes to a database, depending on the application requirements. My favourite logging framework for .NET web apps is Elmah, as it takes care of catching unhandled exceptions and logging them to a local directory right out of the box. It has an extension for MVC too, which is awesome.

Elmah allows you to specify, in your web.config, the route url you’d like to use for viewing errors. It also allows you to restrict access to the log viewer page if needed, using an authorization filter, so you can specify which user roles should have access. At this stage I haven’t implemented membership, so I can’t restrict access via roles; I’m therefore going to leave remote access to the logs off (which it is by default). For part 3, when I implement membership, I’ll update this. Note that for any production application I’d never leave the error log page open to the public, as it would give away far too much to anyone who happens to come snooping.

Right – to set up Elmah logging, I did the following:

  1. Opened the NuGet package manager for my YouConf project in Visual Studio, and searched for Elmah as below
    [Image: Elmah packages in the NuGet package manager]
  2. Selected the Elmah.MVC package and installed it. This added an <elmah> section to my web.config, and also some appSettings for configuring Elmah.
  3. Opened up my web.config and (using the ol’ security-through-obfuscation mantra) updated my elmah.mvc.route appSetting value to be a long, complicated url: superdupersecretlogdirectorythatwillbeprotectedonceweimplementregistrationwithSqlandsimplemembership
  4. Fired up the local debugger and navigated to http://localhost:60539/superdupersecretlogdirectorythatwillbeprotectedonceweimplementregistrationwithSqlandsimplemembership
  5. Voila – we have an error viewer!
    [Image: the Elmah log viewer]
  6. Now if I trigger an error by going to a dodgy url, e.g. http://localhost:60539///, I should see an error appear in my list.
    [Image: the error in the Elmah log viewer]
  7. And voila – there it is!
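As an aside, you don’t have to rely on unhandled exceptions: Elmah’s ErrorSignal API lets you log exceptions you’ve caught yourself. A quick sketch (the controller is hypothetical):

using System;
using System.Web.Mvc;
using Elmah;

public class DemoController : Controller
{
    public ActionResult Boom()
    {
        try
        {
            // ... something that might fail ...
            throw new InvalidOperationException("Demo failure");
        }
        catch (Exception ex)
        {
            // Hand the exception to Elmah without letting it kill the request;
            // it shows up in the log viewer just like an unhandled one.
            ErrorSignal.FromCurrentContext().Raise(ex);
            return Content("Logged!");
        }
    }
}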
Logging to persistent storage

By default Elmah logs exceptions in memory, which is great when you’re developing, but not so good when you deploy to another environment and want to store your errors so you can analyze them later. So, how do we set up persistent storage?

In the past I’ve used a local XML file, which is really easy to configure in Elmah by adding the following line to the <elmah> section of your web.config:

<elmah>
  <errorLog type="Elmah.XmlFileErrorLog, Elmah" logPath="~/App_Data" />
</elmah>

This is fine if you’re working on a single server, or can log to a SAN or similar and then aggregate your log files for analysis. However, in our case we’re deploying to Azure, which means there are no guarantees that our site will stay on a single server for its whole lifetime. Not to mention that the site will be cleared each time we redeploy, along with any local log files. So what can we do?

One option is to set up Local Storage in our Azure instance. This gives us access to persistent storage that won’t be affected by things like web role recycles or redeployments. To use this, we would need to:

  1. Set up local storage as per the following article: http://msdn.microsoft.com/en-us/library/windowsazure/ee758708.aspx
  2. Configure our error logger to use this directory instead of App_Data (see the sketch after this list).
  3. Sit back and relax
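Resolving that directory at runtime would be a one-liner; a sketch, assuming a LocalStorage resource has been declared in the service definition (the resource name ErrorLogs is hypothetical):

using Microsoft.WindowsAzure.ServiceRuntime;

public static class ErrorLogPaths
{
    // Resolve the root path of the LocalStorage resource declared
    // in ServiceDefinition.csdef.
    public static string GetErrorLogPath()
    {
        return RoleEnvironment.GetLocalResource("ErrorLogs").RootPath;
    }
}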

The above solution would work fine; however, since I’m already using Azure Table storage, I thought why not use it for storing errors as well? After some Googling I came upon a package for using table storage with Elmah (the original is linked in the code below), but upon downloading the code I realized it wasn’t up to date with the Azure Storage v2 SDK. It was easy to modify though, with the end result being the class below.

using System;
using System.Collections;
using System.Linq;
using Elmah;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;

namespace YouConf.Infrastructure.Logging
{
    /// <summary>
    /// Based on http://www.wadewegner.com/2011/08/using-elmah-in-windows-azure-with-table-storage/
    /// Updated for the Azure Storage v2 SDK
    /// </summary>
    public class TableErrorLog : ErrorLog
    {
        private readonly string connectionString;
        public const string TableName = "Errors";

        private CloudTableClient GetTableClient()
        {
            // Use the connection string passed to the constructor if we have
            // one, otherwise fall back to the StorageConnectionString setting.
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
                connectionString ?? CloudConfigurationManager.GetSetting("StorageConnectionString"));
            return storageAccount.CreateCloudTableClient();
        }

        private CloudTable GetTable(string tableName)
        {
            return GetTableClient().GetTableReference(tableName);
        }
        public override ErrorLogEntry GetError(string id)
        {
            var table = GetTable(TableName);
            // All errors share the table name as their partition key, so
            // retrieve by partition key plus row key (the error id).
            TableOperation retrieveOperation = TableOperation.Retrieve<ErrorEntity>(TableName, id);
            TableResult retrievedResult = table.Execute(retrieveOperation);
            if (retrievedResult.Result == null)
            {
                return null;
            }
            return new ErrorLogEntry(this, id,
                ErrorXml.DecodeString(((ErrorEntity)retrievedResult.Result).SerializedError));
        }
        public override int GetErrors(int pageIndex, int pageSize, IList errorEntryList)
        {
            var count = 0;
            var table = GetTable(TableName);
            TableQuery<ErrorEntity> query = new TableQuery<ErrorEntity>()
            .Where(TableQuery.GenerateFilterCondition(
              "PartitionKey", QueryComparisons.Equal, TableName))
            .Take((pageIndex + 1) * pageSize);
            //NOTE: Ideally we'd use a continuation token
            // for paging, as currently we're retrieving all errors back
            //then paging in-memory. Running out of time though
            // so have to leave it as-is for now (which is how it was originally)
            var errors = table.ExecuteQuery(query)
                .Skip(pageIndex * pageSize);
            foreach (var error in errors)
            {
                errorEntryList.Add(new ErrorLogEntry(this, error.RowKey,
                    ErrorXml.DecodeString(error.SerializedError)));
                count += 1;
            }
            return count;
        }
        public override string Log(Error error)
        {
            var entity = new ErrorEntity(error);
            var table = GetTable(TableName);
            TableOperation upsertOperation = TableOperation.InsertOrReplace(entity);
            table.Execute(upsertOperation);
            return entity.RowKey;
        }
        public TableErrorLog(IDictionary config)
        {
            Initialize();
        }
        public TableErrorLog(string connectionString)
        {
            this.connectionString = connectionString;
            Initialize();
        }
        void Initialize()
        {
            // Make sure the Errors table exists before we try to use it.
            GetTable(TableName).CreateIfNotExists();
        }
    }
    public class ErrorEntity : TableEntity
    {
        public string SerializedError { get; set; }

        public ErrorEntity() { }

        public ErrorEntity(Error error)
            : base(TableErrorLog.TableName,
              (DateTime.MaxValue.Ticks - DateTime.UtcNow.Ticks).ToString("d19"))
        {
            // The base call sets PartitionKey and RowKey; the reverse-tick
            // row key means the newest errors sort first.
            this.SerializedError = ErrorXml.EncodeString(error);
        }
    }
} 

This will log all errors to the Errors table in Azure table storage, and also take care of reading them back out again.

I also had to update my web.config to use the new logger class as follows:

<elmah>
    <errorLog type="YouConf.Infrastructure.Logging.TableErrorLog, YouConf" />
</elmah>

Now if I generate an error I’ll still see it on the Elmah log view page, but I can also see it in my table storage. I’m using dev storage locally, so I can fire up the wonderful Azure Storage Explorer and view my Errors table as shown below:

[Image: errors in Azure Storage Explorer]

and also on-screen:

[Image: the Elmah log viewer backed by table storage]

Lovely!