Azure, Web & Mobile

How to fix the IPv4 loopback interface: port already in use error.

Super quick post here. Sometimes when debugging your .NET Core application on Mac, you’ll find the port won’t free up, and thus you can’t redeploy without getting the following fatal error:

Unable to start Kestrel. System.IO.IOException: Failed to bind to address http://localhost:5000 on the IPv4 loopback interface: port already in use.

To fix this, you’ll need to fire up Terminal and enter the following:

sudo lsof -i :5000

In my case, this outputted the following:

Screen Shot 2017-10-20 at 18.54.54.png

The error references the IPv4 type, which lets me quickly find the PID I'll use to kill the process. I do this with the following command:

kill -9 18057
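If you'd rather not read the lsof output by eye, the two steps can be combined into a single command. This is a sketch assuming the stuck port is 5000; swap in whatever port your app binds to:

```shell
# Find the PID(s) bound to port 5000 and kill them in one go.
# lsof -t prints only the PID, which feeds straight into kill.
kill -9 $(lsof -ti :5000)
```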

With that done, I can now get back to debugging my .NET Core web API on macOS.

Azure, Security, Web & Mobile

App Services Custom Domain, SSL & DNS

We’ve all seen tutorials which demonstrate how to deploy a simple todo list backend to Azure, but how many have you read that go on to secure it? In this post, I’m going to cover how I’m securing the Bait News v2 backend infrastructure, as well as how to configure custom domains.

Why bother?

Apple announced in 2015 that Apps and their corresponding backend servers would need to support App Transport Security (ATS).

ATS was introduced with iOS 9 as a security enhancement to ensure all connections made by apps use HTTPS. Initially slated to go into effect for all new App Store submissions from January 2017, it has since been postponed with no update on when it’ll come into effect. Although the requirement has been delayed, it’s still something all app developers should implement: it provides our users with added security, making it impossible for man-in-the-middle attacks to go unnoticed.

Historically, most developers (myself included) have opted to turn ATS off to make our lives easier. Some take a lighter touch and only disable ATS for a single domain (their backend), which is not much more secure than turning ATS off altogether. Either approach opens up your users and their data to attack and should be avoided.

So what do we need to do to secure our app? Let’s first register a domain for our backend.

Custom Domains

DNS

I’ve been using 123-Reg as my domain registrar for 10 years and continue to use them as I migrate my websites to Azure. Most domain registrars also provide some basic DNS functionality, but you would normally want to use a third-party DNS service for more advanced scenarios. In my case, I’m using 123-Reg’s DNS service and have added a number of CNAMEs pointing to Azure.

Adding records

Below are the minimum records required to enable my custom domain.

Permanent Records

Screen Shot 2017-09-02 at 15.20.53

Temporary Records

Screen Shot 2017-09-02 at 15.32.15

To get started, I added an A record pointing to the App Service instance using its IP address. You can find your App Service IP address in its Custom Domain blade within the Azure portal.

Once you’ve added the A record, you can then create the CNAME which will map www requests to your backend’s URL. You can find the destination in the Overview blade of the App Service.

Verify Domain Ownership

Azure needs to know I own the domain I’m trying to map. To prove this, I’ll add two records to my DNS settings which are the temporary records listed above.

Once I’ve added the verification CNAME records, I can save and sit tight. DNS records need to propagate across the globe, which can take up to 24 hours.
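While you wait, you can check propagation from the terminal. A minimal sketch using dig, with www.baitnews.io standing in for your own host:

```shell
# Ask DNS for the CNAME; an empty answer means the record
# hasn't propagated to your resolver yet.
dig +short CNAME www.baitnews.io
```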

This is the end result of my DNS configuration. I also created some CNAMEs to redirect traffic from subdomains to other Azure services.

Screen Shot 2017-08-17 at 11.50.36

Portal Configuration

To finish off, I need to configure the App Service Custom Domain settings.

Screen Shot 2017-09-02 at 15.53.03.png

Hit the ‘Add Hostname’ button and enter the custom domain.

Screen Shot 2017-09-02 at 15.54.03

After hitting Validate, Azure will check the DNS records to confirm the domain exists and that you own it. You should see something like this.

Screen Shot 2017-09-02 at 15.55.53

Hitting ‘Add hostname’ will complete the process of configuring a custom domain for your App Service. If you’re deploying a mobile backend, you may want to create a CNAME record which maps api.domain.net to your mobile backend whilst keeping http://www.domain.net mapped to an ASP.NET website.

Adding Security

SSL Certificates

As mentioned at the start of this post, enabling HTTPS prevents MITM attacks and ensures communication between server and client is secure. It’s pretty straightforward to enable within App Services, but much like DNS, it can take a while (this time due to human factors rather than waiting for computers to sync up).

First things first, you’ll need to purchase a certificate. I opted for 123-Reg as they provide a few options to meet most users’ requirements, and the integration with my domain management makes it a no-brainer.

I should admit that I made a mistake when I first purchased a certificate, which caused a few days of delay, so it’s important to double-check the type of certificate you’re purchasing. I had purchased a certificate that covered only http://www.baitnews.io. This meant my mobile API at api.baitnews.io couldn’t use the certificate. 123-Reg refunded the first certificate and I tried again, this time making sure to purchase a certificate that supports wildcard subdomains. You can see below that the original certificate has been revoked and the new certificate supports wildcards.

When you apply for a certificate, you’ll be provided a download which includes your certificate signing request (CSR) in PEM format. You also get the private key, which you’ll use later to create a new certificate.

Screen Shot 2017-09-02 at 16.13.32

Once you’ve been issued the certificate, you’re ready to create the new certificate you’ll use in Azure for everything. This is a pretty easy process as we can use OpenSSL on almost any platform. I’m on a Mac, but this works the same on both Windows and Linux.


openssl pkcs12 -export -out baitnews.pfx -inkey /Users/michaeljames/Downloads/SSL-CSR5/private-key.key -in /Users/michaeljames/Desktop/wildcard.cert

Variables

  • [Output file name] – What do you want to call the certificate?
  • [private-key.key path] – The location of the private key. This would have been provided when requesting the certificate.
  • [wildcard.cert path] – The location of the freshly issued certificate.

Once you press enter, you’ll need to type in a couple of passwords and then you’ll be set. It’ll look something like this:

Screen Shot 2017-09-02 at 16.38.43

You now have your certificate ready to upload to Azure. Converting certificates isn’t the easiest procedure to wrap your head around on the first few goes. If you’re worried about this step, keep in mind you can purchase SSL certificates through the Azure Portal, which skips many of the above steps! It does, however, add a small premium to the cost of securing your backend: the certificate is a little more expensive, and you’re also required to store it in Key Vault.
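Before uploading, it’s worth sanity-checking the .pfx you just exported. A quick sketch with OpenSSL (the file name and password here are from my export step; yours will differ):

```shell
# List the certificate and key bundled in the .pfx without
# writing anything to disk; a wrong password fails immediately.
openssl pkcs12 -in baitnews.pfx -info -noout -passin pass:yourpassword
```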

Binding Domains with Certificates

Let’s upload our new certificate to our App Service. To do this, head over to the SSL Certificates blade and hit ‘Upload Certificate’. You’ll need to provide the password used to create the certificate.

Screen Shot 2017-09-07 at 12.26.17.png

If successful, you’ll see your certificate has been imported and is ready to use with your custom domains.

Screen Shot 2017-09-07 at 12.29.33.png

Add Binding

The last step is to bind our SSL certificate to our custom domain. Clicking ‘Add Binding’ will allow you to select both the custom domain and the certificate from drop-downs.

Screen Shot 2017-09-07 at 12.29.40

Hitting ‘Add Binding’ will finish the process. You now have a custom domain mapped to your App Service instance that supports HTTPS. Any users visiting your backend will be greeted with the familiar green padlock in the address bar.

Screen Shot 2017-09-07 at 12.31.59.png

Wrapping Up

Adding custom domains and enabling secure connectivity between your mobile app and backend is extremely simple, and there’s no good reason not to do it (unless you’re hacking on a demo or POC).

In the next post, I’m going to cover how to expand our setup to route traffic to the nearest App Service instance.

Azure, YouTube

Creating a simple Azure backend POC for your mobile app

Most mobile apps require some form of infrastructure to function correctly. In the case of something like Instagram, they’ll likely have some blob storage for storing images and then a SQL database for storing user information like comments and likes. They’ll have a REST API which the mobile app uses to interact with these services rather than having a direct connection to each service within the backend.

Within the context of Azure, we would usually opt to use Azure App Service as our middleware/orchestration layer. Our mobile apps will connect to this layer and we can deal with ensuring the users have the correct permissions to access the data in our storage services.

In this video, I show how I created a proof-of-concept backend for Bait News. I had an Excel spreadsheet which I wanted to host in the cloud and make available to all my mobile users. To do this, I used Azure App Service Easy Tables. Watch below to find out more:

Azure, Beer Drinkin

Creating a 5 star search experience

Search is a feature that can make or break your mobile app, but it can be incredibly difficult to get right. In this blog post, I’m going to share how I’m solving search with Beer Drinkin.

There are many options for us developers looking to implement search in our projects. Some of us may decide to use LINQ and Entity Framework to look through a table, and the more adventurous may opt to stand up an instance of Elasticsearch, which requires a lot of work to set up and maintain. For Beer Drinkin, I’m using Microsoft’s Azure Search service as it has proved easy to configure and requires zero maintenance.

The reason that Beer Drinkin uses Azure Search is simple: the BreweryDB search functionality is too limited for my needs. One example of this is that the end point often returns zero results if the user misspells a search term. If I searched for “Duval” rather than “Duvel,” BreweryDB’s search would return zero beers. Even if I were to spell the search term correctly, BreweryDB would return all beers from the Duvel Moortgat brewery. Although this is minor, I would prefer that Maredsous 6 and Vedett Extra White not be returned as these do not have “Duvel” in the name.

Screen Shot 2015-12-21 at 15.36.14

Spelling Mistakes

Another issue with using the default search functionality of BreweryDB is its inability to deal with spelling mistakes or offer suggestions. Simple off-by-one-letter spelling mistakes yield no results, something that should be easy to resolve.

Screen Shot 2015-12-21 at 15.40.40

I’ve had sleepless nights worrying that on release, users will fail to find results due to simple spelling mistakes. One way to address spelling mistakes is to utilize a spell-checking service like WebSpellChecker.net.

The issue with a service such as WebSpellChecker is that it has no context in which to make corrections when it comes to product names, and it also doesn’t support multiple languages.

Another way to minimize spelling mistakes is to provide a list of suggestions as the user types in a search query. You’re probably familiar with this in search engines like Google and Bing. This approach to searching is intuitive to users and significantly reduces the number of spelling mistakes.

Enter Azure Search

Azure Search aims to remove the complexity of providing advanced search functionality by offering a service that does the heavy lifting of implementing a modern, feature-rich search solution. Microsoft handles all the infrastructure required to scale as your app gains more users and indexes more data. Not to mention that Azure Search supports 50 languages, drawing on technologies from multiple teams within Microsoft (such as Office and Bing). What this equates to is that Azure Search understands the languages and words of the search requests.

Some of my favorite features

Fuzzy search – Find strings that match a pattern approximately.

Proximity search – Geospatial queries. Find search targets within a certain distance of a particular point.

Term boosting –  Boosting allows me to promote results based on rules I create. One example might be to boost old stock or discounted items.

Getting Started

The first step I took was to provision an Azure Search service within the Azure Portal. I had two options for setting up the service; I could have opted for a free tier or have paid for dedicated resources. The free tier offers up to 10,000 documents and 50MB storage, which is a little limited for what I need.

Because my index already contains over 50,000 beers, I had no option but to opt for the Standard S1 service, which comes at a cool $250 per month (for Europeans, that’s €211). The fee brings a lot more power in the form of dedicated resources, and I’m able to store 25GB of data. When paying for Search, you’ll be able to scale out to 36 units, which provides plenty of room to grow.

Creating an index

Before I could take advantage of Azure Search, I needed to upload my data to be indexed. Fortunately, with the .NET SDK the Azure Search team provides, it’s exceptionally easy to interact with the service. Using the .NET library I wrote a few weeks ago, which calls BreweryDB, I was able to iterate quickly through each page of beer results and upload them in blocks to the search service.

Screen Shot 2016-01-04 at 10.18.02.png

Uploading documents

// Upload beers to the index page by page, up to 25 pages in parallel
Parallel.For(1, totalPageCount, new ParallelOptions { MaxDegreeOfParallelism = 25 }, index =>
{
    var response = client.Beers.GetAll(index).Result;
    var beersToAdd = new List<IndexedBeer>();
    foreach (var beer in response.Data)
    {
        var indexedBeer = new IndexedBeer
        {
            Id = beer.Id,
            Name = beer.Name,
            Description = beer.Description,
            BreweryDbId = beer.Id,
            BreweryId = beer?.Breweries?.FirstOrDefault()?.Id,
            BreweryName = beer?.Breweries?.FirstOrDefault()?.Name,
            AvailableId = beer.AvailableId.ToString(),
            GlassId = beer.GlasswareId.ToString(),
            Abv = beer.Abv
        };

        if (beer.Labels != null)
        {
            indexedBeer.Images = new[] { beer.Labels.Icon, beer.Labels.Medium, beer.Labels.Large };
        }
        beersToAdd.Add(indexedBeer);
    }

    // Interlocked avoids lost updates when multiple pages finish at once
    Interlocked.Increment(ref processedPageCount);
    indexClient.Documents.Index(IndexBatch.Create(beersToAdd.ToArray().Select(IndexAction.Create)));

    Console.Write($"\rAdded {beersToAdd.Count} beers to index | Page {processedPageCount} of {totalPageCount}");
});

Other data import methods

Azure Search also supports indexing data stored in Azure SQL or DocumentDB, which lets you point a crawler at your SQL table and keep the index up to date rather than manually managing the document index yourself. There are a few reasons you may not want to use a crawler, though. The biggest is that it introduces a delay between your DB changing and your search index reflecting the change: the crawler only runs on a schedule, which can result in an out-of-date index.

If you opt for the self-managed approach, you can add, remove, and edit your indexed documents yourself as the changes happen in your back end. This provides you with live search results as you know the data is always up to date. Using the crawler is an excellent way to get started with search and quickly get some data in place, but I wouldn’t consider it a good strategy for long-term use.

I mentioned earlier that the free tier is limited to 10,000 documents, which translates to 10,000 rows in a table. If your table has more than 10,000 rows, then you’ll need to purchase the Standard S1 tier.

Suggestions

Before we can use suggestions, we’ll need to ensure that we’ve created a suggester within Azure.

Screen Shot 2016-01-04 at 10.27.15.png

In the current service release, there is support for limited index schema updates. Any schema updates that would require re-indexing, such as changing field types, are not currently supported. Although existing fields cannot be modified or deleted, new fields can be added to an existing index at any time.

If you’ve not checked the suggester checkbox when creating a field, you’ll need to create a secondary field, as Azure Search doesn’t currently support editing fields. The Azure Search team recommends creating new fields if you require a change in functionality.

The simplest way to get suggestions is to use the following API.

var response = await indexClient.Documents.SuggestAsync(searchBar.Text, "nameSuggester");
foreach(var r in response)
{
    Console.WriteLine(r.Text);
}

Having fun with the suggestion API

The suggestion API provides properties for enabling fuzzy matching and hit highlighting. Let’s see how we might enable that functionality within our app.

var suggestParameters = new SuggestParameters();
suggestParameters.UseFuzzyMatching = true;
suggestParameters.Top = 25;
suggestParameters.HighlightPreTag = "[";
suggestParameters.HighlightPostTag = "]";
suggestParameters.MinimumCoverage = 100;

What do the properties do?

UseFuzzyMatching – The query will find suggestions even if there’s a substituted or missing character in the search text. While this provides a better search experience, it comes at the cost of slower operations and consumes more resources.

Top – The number of suggestions to retrieve. It must be a number between 1 and 100, with a default of 5.

HighlightPreTag – Gets or sets the tag that is prepended to hit highlights. It MUST be set with a post tag.

HighlightPostTag – Gets or sets the tag that is appended to hit highlights. It MUST be set with a pre tag.

MinimumCoverage – Represents the percentage of the index that must be covered by a suggestion query in order for the query to be reported as a success. The default is 80%.

How do the results look?

Simulator Screen Shot 4 Jan 2016, 10.39.25.png

Search

The Search API itself is even easier (assuming we don’t use filtering, which is a topic for another day).


var searchParameters = new SearchParameters { SearchMode = SearchMode.All };

var results = await indexClient.Documents.SearchAsync(searchBar.Text, searchParameters);
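For completeness, the same query can be issued over Azure Search’s REST API without the SDK. A hedged sketch with curl; the service name, index name, api-version, and query key below are placeholders for your own values:

```shell
# Query the 'beers' index directly over HTTPS. searchMode=all mirrors
# SearchMode.All from the SDK call above.
curl "https://myservice.search.windows.net/indexes/beers/docs?api-version=2015-02-28&search=duvel&searchMode=all" \
  -H "api-key: YOUR-QUERY-KEY"
```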

Special Thanks

I’d like to take a moment to thank Janusz Lembicz for helping me get started with Azure Search suggestions by answering all my questions. I appreciate your support (especially given it was on a weekend!).

Azure

Facebook Authentication with Azure

Important Note

Microsoft has recently released Azure App Service, which aims to replace Mobile Services. I will be writing an updated guide shortly.


Azure Mobile Services (AMS) is a great platform for .NET developers to build complex apps that require a backend in almost no time and at very competitive prices. I’ve been using it for about 6 months to develop a beer tracking app called BeerDrinkin.

One of the requirements of BeerDrinkin was the ability to sync account data across all of the user’s devices, and AMS makes this incredibly easy with offline sync.

I couldn’t help but feel that if I’m going to be storing data in Azure, showing what beers people love and hate, then I really should be doing something more interesting than simply syncing to a handful of devices. This is why I decided to try to build a beer recommendation engine into BeerDrinkin’s backend. The aim is to make use of Azure Machine Learning to suggest beers you might like based on your consumption history and that of your peers.

In order to build a user profile to feed into machine learning, I needed more information than the simple GUID that AMS returns when calling the FacebookLoginProvider within the ConfigOptions of a Mobile Service’s WebApiConfig class.

The information that I wanted to have about the user:

  • Email Address
  • First Name
  • Last Name
  • Gender
  • Date of Birth

I will be adding additional data about users at various points during their interaction with the app. One example is location data, and I even plan on recording local weather conditions for each beer the user checks in. With this, I can use machine learning to predict beers based on current weather conditions. This is important, as on a warm day most people will likely want a lager rather than a thick, blood-warming Belgian double.

Creating a custom LoginProvider

To fetch the extra information, I needed to do a couple of things. First things first, I needed to remove the default FacebookLoginProvider from my ConfigOptions. To do this, I called the following:


options.LoginProviders.Remove(typeof(FacebookLoginProvider));

I then went ahead and created a new class, which I named CustomFacebookLoginProvider, which importantly overrides the CreateCredentials method.

public class CustomFacebookLoginProvider : FacebookLoginProvider
{
    public CustomFacebookLoginProvider(HttpConfiguration config, IServiceTokenHandler tokenHandler)
        : base(config, tokenHandler)
    {
    }

    public override ProviderCredentials CreateCredentials(ClaimsIdentity claimsIdentity)
    {
        var accessToken = string.Empty;
        var emailAddress = string.Empty;
        foreach (var claim in claimsIdentity.Claims)
        {
            if (claim.Type == "Zumo:ProviderAccessToken")
            {
                accessToken = claim.Value;
            }

            if (claim.Type == "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress")
            {
                emailAddress = claim.Value;
            }
        }

        return base.CreateCredentials(claimsIdentity);
    }
}

I now had some basic information about the user, such as their Facebook token (which is essential for gathering more information) and their email address. I then used a third-party NuGet package to query Facebook’s Open Graph for more information about the user.

The entire method in BeerDrinkin’s Azure backend looks something like this:

public class CustomFacebookLoginProvider : FacebookLoginProvider
{
    public CustomFacebookLoginProvider(HttpConfiguration config, IServiceTokenHandler tokenHandler)
        : base(config, tokenHandler)
    {
    }

    public override ProviderCredentials CreateCredentials(ClaimsIdentity claimsIdentity)
    {
        var accessToken = string.Empty;
        var emailAddress = string.Empty;
        foreach (var claim in claimsIdentity.Claims)
        {
            if (claim.Type == "Zumo:ProviderAccessToken")
            {
                accessToken = claim.Value;
            }

            if (claim.Type == "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress")
            {
                emailAddress = claim.Value;
            }
        }

        if (string.IsNullOrEmpty(accessToken))
            return null;

        var client = new FacebookClient(accessToken);
        dynamic user = client.Get("me");

        DateTime dateOfBirth;
        DateTime.TryParse(user.birthday, out dateOfBirth);

        // Keeping userItem for the moment but may well kill it. I was going to separate
        // userItem (public info) from accountItem (private info).
        var userItem = new UserItem
        {
            Id = user.id,
        };

        var accountItem = new AccountItem
        {
            Id = userItem.Id,
            Email = emailAddress,
            FirstName = user.first_name,
            LastName = user.last_name,
            IsMale = user.gender == "male",
            DateOfBirth = dateOfBirth,
            AvatarUrl = $"https://graph.facebook.com/{userItem.Id}/picture?type=large"
        };

        var context = new BeerDrinkinContext();
        if (context.UserItems.FirstOrDefault(x => x.Id == userItem.Id) != null)
            return base.CreateCredentials(claimsIdentity);

        context.AccountItems.Add(accountItem);
        context.UserItems.Add(userItem);
        context.SaveChanges();

        return base.CreateCredentials(claimsIdentity);
    }
}

Using the CustomFacebookLoginProvider

In order to use my implementation of the login provider, I needed to go ahead and add it to the ConfigOptions.

options.LoginProviders.Add(typeof(CustomFacebookLoginProvider));

Scopes

One thing I forgot when I first implemented this approach was to update my scopes. By default, Facebook won’t provide the information I require. In order to have Azure Mobile Services pass the correct request to Facebook, I needed to log into my Azure management portal and add MS_FacebookScope to my App Settings. The exact scope I’m requesting is email user_birthday user_friends.

Disclaimer

BeerDrinkin (especially the backend) is worked on mostly whilst ‘testing’ the app (drinking beer). Some of the code is horrible and needs to be refactored. The above code could 100% do with a tidy-up, but it works, so I’ve left it as is. The project is on GitHub, so please do contribute if you can.