Visual Studio Application Insights – Application Availability and uptime monitoring for free!

I know there are already apps out there that perform application uptime monitoring (and I’ve had to build one or two myself in the past), but now Microsoft have released (in preview) a way to monitor your on-premise and cloud applications using Visual Studio Online – Application Insights. Best of all, it’s free!

From what I can tell thus far, it allows you to not only monitor server metrics such as CPU and network utilization, but also run uptime tests, either by pinging application URLs (as defined by you) or by running Visual Studio web tests you’ve already written and exported to file.

Setting up application insights for The Kangaroo Court (or any Azure web role)

After reading about application insights today on Brian Harry’s blog, I thought why not try implementing it on The Kangaroo Court? FYI – The Kangaroo Court is a site hosted on Windows Azure, using a web role as part of a cloud service, so I won’t be covering the steps you’d go through for an on-premise application.

Below are the steps to go through to get it up & running:

  1. Get a preview invitation code – I got mine from
  2. Go to the Visual Studio online site, hit the Try application insights link, then enter your invitation code (which hopefully is still valid)
  3. Follow the wizard – I selected the hosted in Azure cloud service option – and then download the two files as explained in the wizard
  4. Copy the downloaded files into Visual Studio as explained in the wizard
  5. Update your ServiceConfiguration files. Note that the wizard only said I needed to update one file; however, I had to update both the cloud and local files. (The video provided shows the user updating both anyway, so no problem there.) Note also that if you already have a ConfigurationSettings or Startup section defined in your file, you simply need to add the new configuration settings and startup task from the wizard’s code into those existing sections, rather than copying over the entire ConfigurationSettings and Startup sections. I initially copied the whole sections over, and as expected, Visual Studio told me things were broken!
  6. Check in and redeploy the cloud service to Azure, then wait 10 minutes or so
  7. Go to the Application Insights page on the Visual Studio website. My url looked like this:
  8. Check out the data that should be streaming in! (If there’s no data there, wait a while longer, and make sure that your cloud service is up & running too.)
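To make step 5 above concrete, here’s the shape of the merge; the setting and task names below are placeholders I’ve made up for illustration, not the exact entries the wizard generates, so use the ones it gives you:

```xml
<!-- ServiceConfiguration.cscfg: add the wizard's settings INSIDE the existing
     ConfigurationSettings element (setting names are placeholders). -->
<ConfigurationSettings>
  <Setting name="MyExistingSetting" value="..." />
  <Setting name="AppInsights.SettingFromWizard" value="..." />
</ConfigurationSettings>

<!-- ServiceDefinition.csdef: likewise, merge the wizard's startup task into
     any existing Startup element rather than adding a second Startup section. -->
<Startup>
  <Task commandLine="ExistingStartup.cmd" executionContext="limited" taskType="simple" />
  <Task commandLine="WizardProvidedTask.cmd" executionContext="elevated" taskType="simple" />
</Startup>
```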


I particularly like the Dashboards tab, which allowed me to set up a dedicated dashboard for the Kangaroo Court showing key metrics. I could then drill down into the Performance tab if I needed more information. See the screenshot below for my example dashboard:

Monitoring site uptime and availability

The next step was to set up some uptime monitoring, so I could make sure The Kangaroo Court website was up & running, and that if it wasn’t, I would know about it. As an aside: in my view this is an often-overlooked yet essential feature for any website. Most developers I know (and certainly business stakeholders) would rather know their site was down BEFORE any customers did, instead of discovering hours later that it had been down and they’d lost business because of it. This is particularly true for sites where the only way a customer can report an error is via a Contact Us page, which is pretty useless if the site isn’t available in the first place.

With that in mind, I went to the Availability tab and added a new ‘Synthetic monitor’. By default the first one you create is called my_first_test, and I couldn’t figure out how to rename it. I also couldn’t find a way to add additional tests, so I simply updated the single test to suit my scenario (pinging the Kangaroo Court homepage) as shown below (note that my email isn’t shown):


As you can see, I set it up to ping the site every five minutes from two locations:

  • One in Sydney
  • One in the US

If both locations fail within a ten-minute window, I will be sent an email notifying me that the site is down, so I can take appropriate action. Awesome!
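As a sketch of the alerting rule just described (this is purely illustrative; it is not how Application Insights is actually implemented): alert when failures from at least a given number of locations land inside the same time window.

```javascript
// Decide whether an availability alert should fire, given failure reports
// per monitoring location. Rule: alert when at least `minLocations` distinct
// locations report a failure inside the same `windowMs` window.
function shouldAlert(failures, minLocations, windowMs) {
  // failures: array of { location: string, time: number (ms since epoch) }
  const sorted = [...failures].sort((a, b) => a.time - b.time);
  for (const f of sorted) {
    const windowEnd = f.time + windowMs;
    const locations = new Set(
      sorted
        .filter((x) => x.time >= f.time && x.time <= windowEnd)
        .map((x) => x.location)
    );
    if (locations.size >= minLocations) return true;
  }
  return false;
}
```

With the setup above (Sydney and the US both pinging every five minutes), Sydney failing at minute 0 and the US at minute 3 would trip a two-location, ten-minute rule.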

Well that’s all I’ve done thus far, but as you can see, Application Insights provides some pretty cool features out of the box, and I haven’t even dug too deep into it yet. I’ll continue to explore and hopefully post more soon. Till then – enjoy!

Ps. If you have a sports team with a fines system, or are thinking of starting one – please do check out The Kangaroo Court


Azure finally gets a job scheduler!

Yes, I know they already had scheduling available in Mobile Services, but now it looks like job scheduling (a.k.a. cron jobs) will be one of the standard features in Windows Azure – woohoo!

As per the documentation, the scheduler can be used to:

  • Perform GET/POST requests against your application URLs, and/or
  • Send messages to Azure Service Bus for your worker roles to act upon

I’ve been waiting for this feature for ages, as I’ve had to rely on other Cron services such as SetCronJob for this sort of functionality in the past (where the Cron service calls one of my website urls, which then performs the work and returns a 200 status code for success). Whilst SetCronJob is great, it’s nice to be able to perform the scheduling within Azure, so it can sit alongside the other components of whatever service you’re building. 
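The cron-endpoint pattern described above can be sketched as a tiny request handler (names and shape are illustrative, not from any real site): the scheduler hits a URL, the handler runs the job, and a 200 status signals success.

```javascript
// Map a scheduler's GET request to a registered job and report the outcome
// via an HTTP-style status code, as a cron service expects.
function handleCronRequest(path, jobs) {
  const job = jobs[path];
  if (!job) return { status: 404, body: 'Unknown job' };
  try {
    job(); // do the actual work (send emails, clean up data, etc.)
    return { status: 200, body: 'OK' }; // success: cron service records a pass
  } catch (e) {
    return { status: 500, body: String(e) }; // failure: cron service can alert
  }
}
```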

Currently the scheduling service is in preview and available via the REST API (there’s a NuGet package for it), but no doubt it will soon be added to the management portal.

Thanks for adding this, Microsoft!

LocalDB and SQL Profiler with SSMS 2012

I’m not sure why, but for some time now I thought it wasn’t possible to run SQL Server Profiler against LocalDB databases to see the SQL queries being issued. To my surprise, I found out that it actually is possible if you have SQL Server Management Studio 2012!

All you have to do when connecting with SQL Server Profiler 2012 is connect to the instance named (localdb)\v11.0.

Thanks to for pointing this neat trick out


Responsive design – part one – tables

One of my recent tasks involved updating The Kangaroo Court to be usable on mobile devices, including phones and tablets. We already had a separate mobile site; however, I didn’t want to maintain it anymore. Why, you ask?

  • It only offered a small subset of the functionality available on the main site
  • It required additional maintenance to keep it up to date with the main site whenever I made changes
  • I wanted to enhance my knowledge of responsive design

In this series I’ll look at some of the main tasks involved in making the site responsive, starting with how I approached what was initially a bit of a showstopper – tables! The Kangaroo Court has a large number of screens with tables, such as the games list (as shown below), the fines report, and others.


As you can see, there are a number of columns, and each contains enough data that I couldn’t simply leave the tables as they were; the columns would become very narrow on small screens. At first I thought I could simply hide non-essential columns on mobile devices; however, this would have the downside that not all data would be available to mobile users, which wouldn’t be much of an improvement on the existing solution (a separate mobile site). Thankfully there are already some really cool solutions for making tables responsive, and I’ve listed the ones I looked at below.

  • – cool idea, but would force a lot of vertical scrolling in tables with lots of rows/columns
  • Media table – uses javascript to show/hide columns depending on the size of the device, and allows the user to also choose which columns to show/hide.
  • FooTable – hides non-essential columns on smaller devices, and allows the user to expand the table row to show the hidden data.

I decided to use FooTable, as it looked simple to configure, with an end result that was simply beautiful to behold. I’d recommend reading the article on CSS-Tricks, as it gives a good rundown of how FooTable works and outlines some alternative approaches the author had already reviewed. FooTable also has plugins that allow filtering and sorting on the client side, which I thought would be useful.

Adding FooTable – the steps

Downloading the source code

To get started, I first went to GitHub and downloaded the source code for FooTable –

Adding the necessary files to our web project in Visual Studio

I unzipped the downloaded file, and copied the footable.js and footable.sortable.js files into my /scripts/ folder in Visual Studio. I then copied the footable-0.1.css and footable.sortable-0.1.css into a footable folder within my content folder.

I also renamed the images folder in the downloaded file to be footable, and then copied the whole folder into the images folder in my Visual Studio project, as shown below:


Finally, I updated the image paths in the two CSS files to point to the /images/footable/ folder that I’d created earlier.

Updating our html

Next I had to update the html for each table that I wanted to use FooTable on.

As per the FooTable documentation, FooTable allows you to use data-attributes to specify which columns should always be displayed, and which should be hidden on tablets or mobile. By default, the breakpoints that define phone and tablet are 480px and 1024px respectively, but you can customize those if you need to.
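FooTable’s column-hiding rule can be sketched in plain JavaScript (simplified for illustration; the real plugin reads these values from data-hide attributes and its breakpoints option):

```javascript
// FooTable's default breakpoints: a named breakpoint maps to a max width.
const defaultBreakpoints = { phone: 480, tablet: 1024 };

// A column marked data-hide="phone" is hidden at phone widths;
// data-hide="phone,tablet" is hidden at either breakpoint.
function isColumnHidden(dataHide, width, breakpoints = defaultBreakpoints) {
  if (!dataHide) return false; // no data-hide attribute: always shown
  return dataHide
    .split(',')
    .some((name) => width <= (breakpoints[name.trim()] || 0));
}
```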

I marked up the code for the Games list page as follows (note the data-attributes on the <th> elements, and the footable class on the <table> element):

<table class="footable">
    <thead>
        <tr>
            <th data-class="expand" style="width: 20%">Venue</th>
            <th style="width: 15%">
                @Html.DisplayNameFor(model => model.Date)
            </th>
            <th style="width: 20%">
                @Html.DisplayNameFor(model => model.Opponent)
            </th>
            <th data-hide="phone"></th>
        </tr>
    </thead>
    <tbody>
    @foreach (var item in Model)
    {
        <tr>
            <td>@Html.DisplayFor(modelItem => item.FieldName)</td>
            <td>@item.Date.ToString("ddd, dd MMM yyyy")</td>
            <td>@Html.DisplayFor(modelItem => item.Opponent)</td>
            <td class="actions">
                @if ((bool)ViewBag.CurrentUserCanEdit)
                {
                    @Html.ActionLink("Edit", "Edit", new { id = item.ID })
                    @Html.ActionLink("Delete", "Delete", new { id = item.ID })
                    @Html.ActionLink("Add fine", MVC.Fines.Create(item.ID, Request.Url.PathAndQuery))
                    @Html.ActionLink("Bulk fine", MVC.Fines.BulkCreate(item.ID, Request.Url.PathAndQuery))
                }
                @Html.ActionLink("View Fines", MVC.Game.Fines(item.ID))
            </td>
        </tr>
    }
    </tbody>
</table>

Adding javascript to turn our tables into footables

This one was easy – I simply updated my main.js file to apply FooTable to any table with the footable class:

//Any footable will be made responsive
$(function () {
    $("#content table.footable").footable();
});

Customising the CSS

Finally, I updated some of the CSS rules in the footable-0.1.css file with colours that matched the existing Kangaroo Court scheme, for example:


.footable > tfoot > tr > th, .footable > tfoot > tr > td {
  background-color: #e2e2e2;
  background-image: -webkit-gradient(linear, left top, left bottom, from(#f5eeee), to(#e2e2e2));
  background-image: -webkit-linear-gradient(top, #f5eeee, #e2e2e2);
  background-image: -moz-linear-gradient(top, #f5eeee, #e2e2e2);
  background-image: -ms-linear-gradient(top, #f5eeee, #e2e2e2);
  background-image: -o-linear-gradient(top, #f5eeee, #e2e2e2);
  background-image: linear-gradient(to bottom, #f5eeee, #e2e2e2);
  -webkit-box-shadow: 0 1px 0 rgba(255,255,255,.8) inset;
  -moz-box-shadow: 0 1px 0 rgba(255,255,255,.8) inset;
  box-shadow: 0 1px 0 rgba(255,255,255,.8) inset;
  border-top: 1px solid #ccc;
  text-shadow: 0 1px 0 rgba(255,255,255,.5);
  padding: 10px;
}

Viewing the Results

Now for the exciting part – the results! On the desktop:


and at a screen width of 320px (e.g. mobile) once I’d selected the first game:



As you can see, it was pretty easy to add FooTable, and the end result is fantastic.

What’s next?

In the next article I’ll look at adding sorting with FooTable, and also some of the other issues I had to resolve while making the site responsive.


Keeping sensitive config settings secret with Azure Websites and GitHub

During my foray into Azure web sites as part of the Windows Azure Developer challenge, I came up with what I think is a useful pattern for keeping your sensitive config settings secret when using Azure websites and GitHub. It allows you to:

  • Access them when debugging locally
  • Access them when deploying to the cloud
  • Stop them getting into source control on GitHub where others can see them

You can read all about it at 

Don’t forget to vote for the article if you like it. Also, keep an eye on my daily progress in the challenge at

Azure Developer Challenge – YouConf – My article for Challenge Two

Since I’m going to have to update my article for Challenge 3 (and hence remove a lot of the content from challenge two) I thought I’d post the whole contents of my article from challenge two (minus the daily progress reports) here for posterity. Here goes!



Building on Scott Hanselman’s excellent post about hosting a conference in the cloud, why not make the ability to host and stream a conference available to everyone? We’re going to use the same principle as dotnetConf, but build on it so that anyone can create their own conference with speakers and presentations, then record and stream it to a live audience.


When I visited the dotNetConf site and saw Scott Hanselman’s blog post about it, I thought that this could be useful to a much wider audience, particularly smaller groups such as .Net User groups who might like to record and stream their presentations. Having seen the Azure developer contest brief, I figured it would be a good chance to learn more about Azure, MVC, and the .Net technology stack. Hence my entry into this competition.

How will Azure Benefit me?

Azure will allow me to focus on what I do best – development – and not have to worry about the intricacies of infrastructure or hosting concerns. It provides a robust, scalable platform which should allow me to scale-up my app/site as needed, and hopefully stay in control of costs through increased visibility. This will allow me to evolve the application rapidly and deploy to the cloud automatically using the built-in integration with TFS/Git. I’ve often found myself in situations where in order to release code I have to:

  1. Build and test it locally
  2. Prepare a release package
  3. Give it to someone else and prepare installation instructions, then sit with them as they do the deployment
  4. Discover that a file was missing from the release package and start the process all over again

With Azure and automated deployments I should be able to streamline the development process so I can make and deploy changes rapidly, with much less overhead than in the past.

Challenge Two – Build a website


For this challenge I’m going to implement the ideas discussed earlier, and try to get a fully functional site up & running in Azure. I’ll provide details of the discoveries I make, and issues encountered, in the sections below. Note that I’ll also be recording daily progress as I go, in the History section of this article. For more detail on some of the daily items I covered, please read that section as I’ll be referring to it in other parts of this article.

If you’d like to skip ahead and view the end-product, you can do so at
You can also view all source code, including history, at my GitHub repository –


For challenge one, the application has a number of initial goals:

  • Allow users to create conferences, including both presentation and speaker details
  • Give them a nice SEO-friendly url that they can direct their audience to, so the audience can view conference and session details before the conference begins
  • Provide attractive pages for audiences to view conference details
  • Provide an embedded Google Hangout video feed on either the conference page, or an auxiliary page, so users can view presentations in realtime (and also view the relevant YouTube videos once the conference has finished).
  • When users are viewing a conference live, ensure they always have the most up-to-date feed url by using SignalR to push updates directly to their browser
  • Allow users to chat and interact with each other, and also the presenter, via a chat window on the same page as the live video feed
  • Implement some basic responsive design features (although not to the point of perfection as it takes a long time, and I have to do that in challenge 5!)
  • Technical – Allow me to push changes directly to Azure from my source-control repository without having to prepare a release package, with the additional option of deploying from my local machine if needed
  • Technical – Implement error logging with logs stored in some form of persistent storage
  • Financial – Try and minimize hosting costs by reducing the amount of outbound data where possible, and only scaling up when necessary
  • Plus one more little secret, which you’ll have to read to the end of this section to find out….

Let’s get started!

Creating the website

The first thing I needed was a website. I opened up Visual Studio 2012, and followed along with the following tutorial on how to build an MVC4 website, naming my project/solution YouConf. Note that since I’m not using SQL for this part of the competition I left the membership parts out (by commenting out the entire AccountController class so it doesn’t try to initialize the membership database). Whilst this means that users won’t be able to register, they will still be able to create and edit conferences, it’s just that they will all be publicly available for editing. More detail on this is in my daily progress report.

Once I had it building locally, the next step was to get it into Azure. To do this, I went to the Azure Management Portal, selected the Web Sites node, and hit the New button. I wanted the url to start with YouConf, so I entered youconf in the url field, and selected West US as the region since it’s closest to me (I’m in New Zealand!) as per the screenshot below:


Once I’d hit the Create site button I had a new site up & running just like that!

Next up I wanted to deploy to it, which required me to download the publish profile and import it into Visual Studio. To do so, I clicked on my YouConf site in the Azure Management Portal, then selected the Download the publish profile link. This opened up a window with the publish profile details, which I saved locally.

I then right-clicked on my YouConf web project in Visual Studio, and hit Publish. In the Publish dialog, I selected Import publish profile, and selected the .publishsettings file I’d saved earlier. I validated the connection using the button, chose Web Deploy as the publishing option, hit Next, and in the Settings section chose Release as the Configuration. I hit Next again, then hit Publish, and after about a minute I was able to browse my site in Azure. Now wasn’t that easy?!


Source Control Integration

Next up was getting source-control in place so that it would deploy automatically to Azure. I chose to use Git, mainly because I haven’t used it before and thought this would be a good opportunity to learn about it. I also wanted to be able to have a publicly-available source repository available for anyone to view, and having seen GitHub being used for this by others, thought I’d give it a go. Make no mistake, I love TFS, and use it on every other project, but for this I really wanted to push myself (although Azure makes it so easy that this wasn’t quite the case as you’ll see).

In order to get this working, I downloaded the Git explorer and set up a local youconf repository. I committed my changes locally, then synced them to GitHub using the explorer. My Git repository is publicly available if you’d like to see the code.

Rather than pushing local changes directly to Azure, I wanted them first to go to GitHub so they’d be visible to anyone else who might want to have a poke around. To accomplish this I followed the steps in this article under the heading “Deploy files from a repository web site like BitBucket, CodePlex, Dropbox, GitHub, or Mercurial“.

*IMPORTANT* After publishing my changes to GitHub I realised that I’d included all of my publish profile files as well, which contained some sensitive Azure settings (not good). To remove them, I did a quick search and found a helpful article. The commands I ran in the Git shell were as follows:


I also added an entry to my .gitignore file so that I wouldn’t accidentally checkin anything in the Publish profile folder again:
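My exact entry isn’t shown here, but a typical rule for keeping publish profiles and publish settings out of a repository looks something like this (patterns assumed; adjust to wherever your profile files actually live):

```
# Keep Azure publish settings and VS publish profiles out of source control
*.publishsettings
*.pubxml
```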


After fixing those, I clicked on my website in the Azure portal, clicked the link under Integrate Source Control, and followed the steps, selecting my youconf repository in GitHub. About 20 seconds later – voila! – my site had been deployed to Azure from GitHub. Seriously, how easy was that?! It took next to no time, and left me free to focus on development, as I’d set out to do from the beginning.

Building the application

From here on, most of my time was spent on building the functionality of the web app, which as I mentioned earlier was an MVC 4 web application. I started building some basic forms for creating/editing/deleting a conference, and was faced with my next challenge – where to store data? I wanted persistent storage with fast access, and an easy API to use. Since SQL wasn’t available (till challenge 3), Azure Table Storage seemed like the logical option. See this daily progress update for more on why I chose this.

Azure Table Storage, so many options….

As per this daily progress update, I got set up and read about partition and row keys, and found a very helpful article on the topic. There are plenty of tutorials available about Azure Table storage, which helped, and I created a table by following one.

Azure allows you to use the storage emulator when developing locally, and then update your settings for Azure so that your app will use Azure Table storage when deployed to the cloud. I added the following line to my appsettings in web.config to tell Azure to use the development storage account locally:

<add key="StorageConnectionString" value="UseDevelopmentStorage=true" /> 
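To illustrate how that single setting drives both environments (a sketch in JavaScript for clarity; the real code is C# using the Azure storage client): locally the value is the magic emulator string, while in the cloud it’s a real connection string from which the account name can be parsed. The well-known emulator account name is devstoreaccount1.

```javascript
// Interpret a StorageConnectionString value: either the storage emulator
// shortcut, or a real key=value;key=value connection string.
function resolveStorageAccount(connectionString) {
  if (connectionString === 'UseDevelopmentStorage=true') {
    // The emulator always uses this well-known local account name
    return { emulator: true, accountName: 'devstoreaccount1' };
  }
  const parts = {};
  for (const pair of connectionString.split(';')) {
    const i = pair.indexOf('=');
    if (i > 0) parts[pair.slice(0, i)] = pair.slice(i + 1); // keep '=' in keys' values
  }
  return { emulator: false, accountName: parts.AccountName };
}
```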

I created a YouConfDataContext class (link to GitHub) and accessed this connection string using the following code:

CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));

Things seemed to be going well, but once I tried to save a conference I soon realized that I didn’t understand table storage quite as well as I’d thought! Basically, I planned to store each conference, including speakers and presentations, as a single table entity, so that I could store/retrieve each conference in one go (as you would in a document-oriented database). I started out writing the Conference class as below:

public class Conference
{
    public Conference()
    {
        Presentations = new List<Presentation>();
        Speakers = new List<Speaker>();
    }

    public string HashTag { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public IList<Presentation> Presentations { get; set; }
    public IList<Speaker> Speakers { get; set; }
    public string Abstract { get; set; }
    public DateTime StartDate { get; set; }
    public DateTime EndTime { get; set; }
    public string TimeZone { get; set; }
}

When I tried to save one of these, though, I ran into a bit of a roadblock… Unfortunately a table entity can only store primitive properties, not child collections or complex child objects. DOH! So, how could I work around this? I found a number of options:

  • Store each object type as a separate entity, e.g. Conference, Speaker, and Presentation each get their own rows in the table. I wasn’t too keen on this, as it seemed like more work than it was worth; plus it seemed far more efficient to retrieve the whole conference in one hit rather than retrieving each entity separately and combining them in the UI.
  • FatEntities – this looked very thorough, although I don’t think it was up to date with the latest Azure Table storage API
  • Lucifure – this also looked like it wasn’t up to date with the latest Azure Table storage API
  • Use an object deriving from TableEntity, with a single property containing the conference serialized as a JSON string. In the end I chose this option, as it was easy to implement and allowed me to store the whole conference in a single table row. I used JSON.Net, as it’s already included in the default MVC4 project and lets me serialize/deserialize in one line.
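To show why the JSON option sidesteps the primitives-only limit (a JavaScript sketch for illustration; the actual implementation is C# with JSON.Net): the whole conference graph is flattened into one string property, so nested speakers and presentations survive the round trip even though a table row only holds primitives.

```javascript
// Wrap a conference (with nested collections) in a single-property entity.
function toTableEntity(conference) {
  return {
    PartitionKey: 'Conferences',
    RowKey: conference.hashTag, // the hashtag doubles as the row key
    Entity: JSON.stringify(conference), // nested speakers/presentations included
  };
}

// Rehydrate the full conference graph from the stored string.
function fromTableEntity(entity) {
  return JSON.parse(entity.Entity);
}
```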

Some sample code from my YouConfDataContext.cs class for doing Inserts/Updates is below:

public void UpsertConference(Conference conference)
{
    //Wrap the conference in our custom AzureTableEntity
    var table = GetTable("Conferences");
    var entity = new AzureTableEntity()
    {
        PartitionKey = "Conferences",
        RowKey = conference.HashTag,
        Entity = JsonConvert.SerializeObject(conference)
    };

    // Insert or update the conference
    TableOperation upsertOperation = TableOperation.InsertOrReplace(entity);
    table.Execute(upsertOperation);
}

where AzureTableEntity is just a wrapper class for a Table Entity:

public class AzureTableEntity : TableEntity
{
    public string Entity { get; set; }
}

An advantage of this approach is that it makes it easy to visualize conference data as well. To view my data in the Azure storage emulator, I downloaded the wonderful Azure Storage Explorer and viewed my Conferences table as shown below (note that I can see each conference serialized as JSON easily):


So now that I had my data being stored in Azure Table Storage locally, how could I get it working when deployed to the cloud? I just had to set up a storage account and update my Azure cloud settings.

I created a storage account named youconf, then copied the primary access key. I then went to the Websites section, selected my youconf site, clicked Configure, and added my StorageConnectionString to the app settings section with the following value:

DefaultEndpointsProtocol=https;AccountName=youconf;AccountKey=[Mylongaccountkey]

Now when I deployed to Azure I could save data to table storage in the cloud.

Note that I ran into an issue when updating a conference’s hashtag, as it’s also used as the row key in Azure Table storage; to make such an update, I first had to delete the existing record, then insert a new one (with the new hashtag/row key). See this daily progress report for more details.
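The hashtag-rename workaround can be sketched like this (illustrative only, with a plain object standing in for the Azure table): because the hashtag doubles as the row key, a rename is a delete of the old row followed by an insert under the new key, not an in-place update.

```javascript
// Rename a conference whose hashtag is also its row key.
function renameConference(table, oldHashTag, newHashTag) {
  const entity = table[oldHashTag];
  if (!entity) throw new Error('Conference not found: ' + oldHashTag);

  delete table[oldHashTag]; // remove the row stored under the old key

  const conference = JSON.parse(entity.Entity);
  conference.hashTag = newHashTag;

  table[newHashTag] = { // insert a fresh row under the new key
    ...entity,
    RowKey: newHashTag,
    Entity: JSON.stringify(conference),
  };
  return table[newHashTag];
}
```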

Site Features

As mentioned earlier, most of my time was spent on working with MVC and finding/fixing issues with the site as they arose, rather than having any issues with Azure itself. The following section outlines some of the application highlights, and how they address the goals described in the introduction. Note that I’ve created an example conference – WgtnDeveloperConf2013 – which contains an example presentation from dotNetConf to illustrate how the video and chat feeds work. Please feel free to add your own conference/speakers/presentations and give it a run-through.

Viewing Conferences – for participants

The conference listing page lists available conferences, and allows users to drill into the conference/speaker/presentation details if they wish. It also provides users with an SEO-friendly url for their conference, based on their chosen conference hashtag. In order to achieve this, I had to add a custom route that automatically routes matching requests to the ConferenceController, and also a route constraint to ensure that this didn’t break other controller routes. The code for adding my custom route is below (from the /App_Start/RouteConfig.cs file – abbreviated for brevity):

public static void RegisterRoutes(RouteCollection routes)
{
    routes.MapRoute(
        name: "ConferenceFriendlyUrl",
        url: "{hashTag}/{action}",
        defaults: new { controller = "Conference", action = "Details" },
        constraints: new { hashTag = new IsNotAControllerNameConstraint() }
    );
}

and the end result at


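The constraint class itself isn’t shown in the post, but in JavaScript terms its check amounts to the following (the controller names here are my guesses for illustration, not the site’s actual list): only treat a URL segment as a conference hashtag when it doesn’t collide with a controller name.

```javascript
// Controller names that must NOT be treated as conference hashtags (assumed list).
const controllerNames = ['Home', 'Conference', 'Account', 'Help'];

// A segment is a valid hashtag route only if no controller claims that name.
function isConferenceHashTag(segment) {
  return !controllerNames.some(
    (c) => c.toLowerCase() === segment.toLowerCase()
  );
}
```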
Easy to use conference management/maintenance screens

I used a number of techniques to help make it easier for those running conferences to maintain them. For example:

  • Inline tooltips using the jQuery Tools Tooltip functionality
  • jQuery Date/time picker for easy date/time selection (see daily progress report for detail)
  • Help and FAQ pages
  • Inline validation, including a dynamic lookup on the conference creation page to show whether a conference hashtag is available or not
  • A right-hand sidebar containing tips for end-users

See for an example of an edit form.

Embedded videos and Twitter Chat

Both of these involved obtaining code from Google/Twitter which created an embedded widget on the conference live page, based on the hangout id/twitter widget id associated with the conference. The dotNetConf site uses Jabbr for chat, however, I thought that I’d try and go for something that allowed for chat to be on the same page as the video feed. One of the commenters on my article suggested Twitter, which seemed like a good choice as it’s already so widely used. In the next stage I might also look at using SignalR for this if time permits.

The image below shows an example of a page with embedded video and chat (note that I used the hangout id for one of the dotNetConf videos for demonstration, and had to shrink the screenshot to fit into the CodeProject window):


Keeping the live video feed url up to date with SignalR

SignalR is a great tool for providing realtime updates to clients, and the Jabbr chat site provides a great example of how to harness this technology. For the live conference page, I used SignalR in a similar way to dotNetConf, to ensure that if a conference presenter updated the Google Hangout id for the conference, viewers would receive the updated url without having to refresh their page.

To get started, I installed the SignalR NuGet package as below:


I then set about building a SignalR hub and client. My main issue came with how to push the notification to my SignalR hub from the Conference Controller. To give some context, here’s my YouConfHub class:

public class YouConfHub : Hub
{
    public Task UpdateConferenceVideoUrl(string conferenceHashTag, string url)
    {
        //Only update the clients for the specific conference
        return Clients.Group(conferenceHashTag).updateConferenceVideoUrl(url);
    }

    public Task Join(string conferenceHashTag)
    {
        return Groups.Add(Context.ConnectionId, conferenceHashTag);
    }
}

and my client javascript code:

<script src="//"></script>
    <script>$.signalR || document.write('<scr' + 'ipt src="~/scripts/jquery.signalr-1.0.1.min.js"></sc' + 'ript>');</script>
    <script src="~/signalr/hubs" type="text/javascript"></script>
    <script type="text/javascript">
        $(function () {
            $.connection.hub.logging = true;

            var youConfHub = $.connection.youConfHub;

            youConfHub.client.updateConferenceVideoUrl = function (hangoutId) {
                $("#video iframe").attr("src", "" + hangoutId + "?autoplay=1");
            };

            //Once connected, join the group for the current conference
            //(conferenceHashTag is rendered into the page by the view)
            var joinGroup = function () {
                youConfHub.server.join(conferenceHashTag);
            };

            $.connection.hub.start().done(joinGroup);

            //If the connection drops, retry after five seconds
            $.connection.hub.disconnected(function () {
                setTimeout(function () {
                    $.connection.hub.start().done(joinGroup);
                }, 5000);
            });
        });
    </script>

See the UpdateConferenceVideoUrl method in the hub? I wanted to call that from my ConferenceController when a user updated the conference hangout id/url, and thought I could do so by getting an instance of the hub and calling the method on it, e.g.

var context = GlobalHost.ConnectionManager.GetHubContext<YouConfHub>();
context.UpdateConferenceVideoUrl("[conference hashtag]", "[new hangout id]");

Sadly, it turns out that you can’t actually call methods on the hub from outside the hub pipeline 😦 You can, however, call methods on the Hub’s clients and groups. So, in my conference controller’s edit method, I was able to use the following code to notify all viewers of the specific conference that they should update their url:

if (existingConference.HangoutId != conference.HangoutId)
{
    //User has changed the conference hangout id, so notify any listeners/viewers
    //out there if they're watching (e.g. during the live conference streaming)
    var context = GlobalHost.ConnectionManager.GetHubContext<YouConfHub>();
    context.Clients.Group(conference.HashTag).updateConferenceVideoUrl(conference.HangoutId);
}

Not too bad in the end eh?
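To make the group-scoped broadcast idea concrete, here’s a tiny plain-JavaScript sketch (no SignalR involved, and all names here are hypothetical, not from the site’s code) of what the Groups.Add call in the hub above buys you: connections register against a group key, and a broadcast to that key reaches only those connections, rather than every connected client.

```javascript
// Hypothetical sketch of SignalR-style groups: connections join a group
// keyed by conference hashtag, and a broadcast reaches only that group.
function HubGroups() {
    this.groups = {};
}
HubGroups.prototype.add = function (connectionId, groupName) {
    (this.groups[groupName] = this.groups[groupName] || []).push(connectionId);
};
HubGroups.prototype.broadcast = function (groupName, hangoutId) {
    // Return the list of deliveries rather than pushing over a real socket
    return (this.groups[groupName] || []).map(function (connectionId) {
        return { connectionId: connectionId, hangoutId: hangoutId };
    });
};

var groups = new HubGroups();
groups.add("client-1", "#confA");
groups.add("client-2", "#confA");
groups.add("client-3", "#confB");

// Only the two #confA viewers receive the new hangout id
var delivered = groups.broadcast("#confA", "new-hangout-id");
console.log(delivered.length); // 2
```

The real SignalR backplane does the delivery for you, of course; the point is simply that a group-scoped call never touches viewers of other conferences.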

Responsive design – Basic features

Responsive design is all the rage these days, and fair enough too given the proliferation of web-enabled devices out there. I won’t spend too long on this, except to say I’ve implemented a number of specific styles using media queries to make the bulk of the site look good at desktop, tablet, and mobile resolutions. There’s a huge amount of information out there about responsive design, and I found articles by the Filament Group and Smashing Magazine very helpful in both understanding and fixing some of the issues. An example of one of my media queries, for device widths below 760px (mobiles or small tablets), is below:

/*   Mobile Styles   */
@media only screen and (max-width: 760px) {
    .main-content aside.sidebar, .main-content .content {
        float: none;
        width: auto;
    }
    .main-content .content {
        padding-right: 0;
    }
}
I’ve included a screenshot below to show the homepage on a mobile device. It looks good, but there’s still work to do for future challenges….


Financial – reducing outbound traffic and scaling up only when necessary

For Azure websites, you’re only charged for outbound traffic, so it makes sense both financially and for usability to reduce the amount of bandwidth your site consumes. I used a number of techniques to achieve this:

  • CSS and Javascript bundling/minification using the System.Web.Optimization framework provided by MVC
  • Using CDN-hosted Javascript libraries where possible

For example, in the code below I try to load the jQuery library from the Microsoft Ajax CDN if possible, but if it’s not available, fallback to a local copy, which has already been minified to reduce bandwidth:

 <script src=""></script>
    <script>window.jQuery || document.write('<scr' + 'ipt src="@Scripts.Url("~/bundles/jquery")"></sc' + 'ript>');</script>

I do the same for other CSS/Javascript too – see my code on GitHub for examples.
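The fallback decision itself is simple enough to express as a pure function. A sketch (function name hypothetical) of what the inline script above is really doing:

```javascript
// Sketch of the CDN-fallback decision: if the CDN copy loaded (i.e.
// window.jQuery is defined), no extra script is needed; otherwise fall
// back to the locally-hosted, pre-minified bundle.
function scriptsToLoad(cdnLoaded, localBundleUrl) {
    return cdnLoaded ? [] : [localBundleUrl];
}

console.log(scriptsToLoad(true, "~/bundles/jquery"));  // []
console.log(scriptsToLoad(false, "~/bundles/jquery")); // [ '~/bundles/jquery' ]
```

The happy path costs nothing extra, and the unhappy path still serves a minified copy, so bandwidth stays low either way.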


Logging

Being able to log and retrieve exceptions for further analysis is key for any application, and it’s easy to set up good quality logging in Azure, along with persistent storage of the logs for detailed analysis.

I’ve written up quite a large article on how I implemented logging in this daily progress report, so please see it for further technical details. In brief, I used Elmah for error logging, with a custom logger that persists errors to Azure Table storage. This means I can view my logs both on the server and locally using Azure Storage Explorer. Awesome!
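Azure Table storage requires every row to have a PartitionKey and a RowKey, and a common pattern for log tables (an assumption on my part for illustration — see the daily progress report for the actual implementation) is to partition by date and use a reverse-timestamp RowKey so the newest errors sort first. A quick sketch of shaping such a row:

```javascript
// Hedged sketch of shaping an error for an Azure Table storage log table.
// Partitioning by date keeps each day's errors together; subtracting the
// timestamp from a large constant makes newer rows sort first, since table
// storage orders rows ascending by RowKey within a partition.
var MAX_MS = 9999999999999; // upper bound well beyond any current timestamp

function toLogEntity(error, occurredAt) {
    return {
        PartitionKey: occurredAt.toISOString().slice(0, 10), // e.g. "2013-04-21"
        RowKey: String(MAX_MS - occurredAt.getTime()),       // newest-first ordering
        Message: error.message
    };
}

var entity = toLogEntity(new Error("boom"), new Date("2013-04-21T10:00:00Z"));
console.log(entity.PartitionKey); // "2013-04-21"
```

With rows shaped like this, querying the current day’s partition returns the most recent errors at the top without any client-side sorting.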

… and the little secret – a blog!

Since about day 3 I’d been thinking of moving the posts on my daily progress into a separate blog, as there’s enough information to make some of them worth an entire entry. I was also aware that one of the competition points for this section was around what we do with our other 9 websites. So I figured I’d see if it really was as easy to set up a blog as they made out in

As with logging, the bulk of the implementation details are included in this daily progress report. I managed to get the blog up & running without too much fuss, but thought I’d better not move all my content there, as it would mean having to cross-post content, and possibly make it harder to assess my entry if content was spread across different places. Here’s a screenshot from


In Conclusion

It’s been quite an adventure thus far, but I think I’ve managed to complete what I set out to achieve for challenge two, namely getting the site up & running in Azure with source control integration, and delivering the main features required of it. I’ve used table storage both in the emulator and the cloud, and become much more familiar with the Azure platform as a whole. I’ve also gone through the process of setting up a blog, which was even easier than I thought it would be.

Finally – where are my tests? You may have noticed a distinct lack of unit tests, which I’m ashamed to say is at least partially intentional. Thus far my API has been changing so often that I felt adding tests would slow me down more than it was worth. I know this would drive TDD purists insane, but in my experience it’s sometimes helpful to wait until one’s API is more stable before adding tests, particularly when it comes to testing controllers. In addition, I’m going to be swapping out my table-based data access layer for SQL in challenge 3, so things are likely to change a lot more throughout the application. I will, however, at least add tests for my controllers at the start of challenge 3, so that I can verify I haven’t broken anything once I start adding SQL membership etc.

So what’s next?

Future challenges

For future challenges, there are a number of additional features to focus on:

  1. Adding membership and registration, so users can manage their own conferences privately. This relies on having SQL available, which ties into challenge 3 nicely
  2. Adding unit and integration tests, particularly for the controllers
  3. Adding the ability to upload speaker photos and store them in BLOB storage
  4. Adding SSL to secure the registration and authentication process
  5. Adding live feeds of the slides, possibly using SlideShare
  6. Doing further testing to ensure the site is fully responsive across desktop, tablet, and mobile devices, which will be the focus for challenge 5
  7. Adding the ability to set up reminders, either using vcards or SMS, so a user can register interest in a session and then be reminded when it’s coming up
  8. Performing further security hardening, such as preventing parameter tampering and possible XSS issues associated with MVC model-binding


Part one: I’ve registered for the free Azure offering of up to 10 websites, and just realised how generous the offer really is. Up to 10 websites!!! Hopefully we won’t need all of those, but you never know….

*I’ll try and post daily project updates, but if there are no entries for a given day, I either didn’t find time to work on the project, or was so caught-up in working on the project that I forgot to post an update.