Xamarin with macOS 10.14 (Mojave)

It’s that time of year again when we all ask ourselves “should I install this beta software on my devices and risk my development setup?”. If you only have one iPhone and Mac, it can be difficult to decide when it’s the right time to install the latest and greatest offerings from Apple, for fear of breaking your development environment.

This year I’ve not needed to be so worried about breaking my environment as my current projects see me developing more with ASP.NET and Swift than Xamarin, so I went ahead and downloaded both iOS 12 and macOS 10.14 as soon as I could.

Screenshot 2018-06-05 at 00.03.18

How does it run?
First impressions of running macOS Mojave and Xamarin are better than expected! I fired up Visual Studio for Mac and opened the workshop myself and Robin-Manuel Thiel created to see if I could get the iOS app to build using Xcode 9. The good news is that it works without any modification assuming you had a working setup before upgrading.

With that said, I have experienced some crashes when resizing VS4Mac, but issues are expected at this stage.

Screenshot 2018-06-05 at 00.26.15

Advice
If you can avoid it, don’t update just yet if your day-to-day development requires the Xamarin tooling to work. The Xamarin engineers will need some time to test and ensure things work properly as they too have only just downloaded a copy of the OS.

On the other hand, if you simply cannot wait to play with macOS Mojave, then know that at a minimum, you can continue to build and deploy with the latest beta software from our friends in Cupertino.


– Opinions are my own and not the views of my employer

Consuming Microsoft Cognitive Services with Swift 4

This post is a direct result of a conversation with a colleague in a taxi in Madrid. We were driving to Santiago Bernabéu (the Real Madrid Stadium) to demonstrate to business leaders the power of artificial intelligence.

The conversation was around the ease of use of Cognitive Services for what we call “native native” developers. We refer to those who use Objective-C, Swift or Java as ‘native native’, as frameworks like React Native and Xamarin are also native, but we consider these “XPlat Native”. He argued that the lack of Swift SDKs prevented the adoption of our AI services such as our Vision APIs.

I maintained that all Cognitive Services APIs are well documented, and that we provide an easy-to-consume suite of REST APIs which any Swift developer worth their salt should be able to use with minimal effort.

Putting money where my mouth is

Having made such a statement, it made sense for me to test if my assertion was correct by building a sample app that integrates with Cognitive Services using Swift.

Introducing Bing Image Downloader: a fully native macOS app for downloading images from Bing, developed using Swift 4.

Screen Shot 2018-05-10 at 11.11.55.png

I’ve put the code on GitHub for you to download and play with if you’re interested in using Cognitive Services within your Swift apps, but I’ll also explain below how I went about building the app.

Where the magic happens

In the interest of good development practices, I started by creating a Protocol (C# developers should think of these as Interfaces) to define what functions the ImageSearch class will implement.

Protocol

protocol ImageServiceProtocol {
    // We will take the results and add them to a hard-coded singleton class called AppData.
    func searchForImageTerm(searchTerm : String)

    // We pass in a completion handler for processing the results of this func
    func searchForImageTerm(searchTerm : String, completion : @escaping ([ImageSearchResult]) -> ())
}

Two Implementations for one problem

I’ve made sure to include two implementations to give you options on how you’d want to interact with Cognitive Services. The approach used in the app makes use of the AppData singleton class for storing results, as well as Alamofire for handling network requests. We’ll look at this approach first.

searchForImageTerm

This is the public func, which is easiest to consume.

func searchForImageTerm(searchTerm : String) {

    //Search for images and add each result to AppData
    DispatchQueue.global(qos: .background).async {
        let totalPics = 100
        let picsPerPage = 50
        let numPages = totalPics / picsPerPage

        (0 ..< numPages)
            .compactMap { self.createUrlRequest(searchTerm: searchTerm, pageOffset: $0) }
            .forEach { self.fetchRequest(request: $0 as NSURLRequest) }

        RunLoop.current.run()
    }
}

createUrlRequest

private func createUrlRequest(searchTerm : String, pageOffset : Int) -> URLRequest {

    let encodedQuery = searchTerm.addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed)!
    let endPointUrl = "https://api.cognitive.microsoft.com/bing/v7.0/images/search"

    let mkt = "en-us"
    let imageType = "photo"
    let size = "medium" 

    // We should move these values to app settings
    let totalPics = 100
    let pageCount = 2
    let picsPerPage = totalPics / pageCount

    let url = URL(string: "\(endPointUrl)?q=\(encodedQuery)&count=\(picsPerPage)&offset=\(pageOffset * picsPerPage)&mkt=\(mkt)&imageType=\(imageType)&size=\(size)")!
        
    var request = URLRequest(url: url)
    request.setValue(apiKey, forHTTPHeaderField: "Ocp-Apim-Subscription-Key")
        
    return request
}

fetchRequest

This is where we attempt to fetch and parse the response from Bing. If we detect an error, we log it (I’m using SwiftyBeaver for logging).

If the response contains data we can decode, we’ll loop through and add each result to our AppData singleton instance.

private func fetchRequest(request : NSURLRequest) {
    //This task is responsible for downloading a page of results
    let task = URLSession.shared.dataTask(with: request as URLRequest) { (data, response, error) -> Void in

        //We didn't receive a response
        guard let data = data, error == nil, response != nil else {
            self.log.error("Fetch Request returned no data : \(request.url?.absoluteString)")
            return
        }

        //Check the response code
        guard let httpResponse = response as? HTTPURLResponse,
            (200...299).contains(httpResponse.statusCode) else {
            self.handleServerError(response: response!)
            return
        }

        //Convert data to concrete type
        do {
            let decoder = JSONDecoder()
            let bingImageSearchResults = try decoder.decode(ImageResultWrapper.self, from: data)

            let imagesToAdd = bingImageSearchResults.images.filter { $0.encodingFormat != EncodingFormat.unknown }
            AppData.shared.addImages(imagesToAdd)
        } catch {
            self.log.error("Error decoding ImageResultWrapper : \(error)")
            self.log.debug("Corrupted Base64 Data: \(data.base64EncodedString())")
        }
    }

    //Tasks are created in a paused state. We want to resume to start the fetch.
    task.resume()
}

Option two (with no 3rd party dependencies)

As a .NET developer, the next approach threw me for a while and took a little bit of reading about closures to fully grasp. With this approach, I wanted to return an array of the ImageSearchResult type, but this proved not to be the best approach. Instead, I needed to pass in a function that can handle the array of results.

// Search for images with a completion handler for processing the result array
func searchForImageTerm(searchTerm : String, completion : @escaping ([ImageSearchResult]) -> ()) {

    //Because Cognitive Services requires a subscription key, we need to create a URLRequest
    //to pass into the dataTask method of a URLSession instance.
    let request = createUrlRequest(searchTerm: searchTerm, pageOffset: 0)

    //This task is responsible for downloading a page of results
    let task = URLSession.shared.dataTask(with: request, completionHandler: { (data, response, error) -> Void in

        //We didn't receive a response
        guard let data = data, error == nil, response != nil else {
            print("something is wrong with the fetch")
            return
        }

        //Check the response code
        guard let httpResponse = response as? HTTPURLResponse,
            (200...299).contains(httpResponse.statusCode) else {
            self.handleServerError(response: response!)
            completion([ImageSearchResult]())
            return
        }

        //Convert data to concrete type
        do {
            let decoder = JSONDecoder()
            let bingImageSearchResults = try decoder.decode(ImageResultWrapper.self, from: data)

            //We use a closure to pass back our results.
            completion(bingImageSearchResults.images)

        } catch { self.log.error("Decoding ImageResultWrapper \(error)") }
    })
    task.resume()
}

Wrapping Up

You can find the full project on my GitHub page, which contains everything you need to build your own copy of this app (maybe for iOS rather than macOS?).

If you have any questions, then please don’t hesitate to comment or email me!

 

Updated Resilient Networking with Xamarin

Rob Gibbons wrote a fantastic blog post back in 2015 on how best to write the network request layer for your Xamarin apps. I’ve personally used this approach many times, but I felt that it needed updating for 2018, so here it is: a slightly updated approach to resilient networking services with Xamarin. And when I say ‘slightly updated’, I honestly mean it’s a minor change!

we-dont-throw

Refit

For those of you who are familiar with Rob’s approach, he pulls together a few libraries to create a robust networking layer. One of the critical elements of his strategy is the use of Refit. Refit is a REST library which allows us to interact with remote APIs with minimal boiler-plate code. It makes heavy use of generics and abstractions to define our REST API calls as C# interfaces, which are then used with an instance of HttpClient to handle all the requests. All serialisation is dealt with for us! I still believe Refit to be a great library to use, so we’ll keep it as the core of this pattern.

Let’s have a look at an example interface for use with Refit.

public interface IBeerServiceAPI
{
    [Get("/beer/")]
    Task<List<Beer>> GetBeers();
}

We use attributes to define the request type as well as its path (relative to the HttpClient's base URL).

We then define what we expect back from the API and leave Refit to handle making the call, deserialising the response and handing it back to us as a concrete type.
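
For reference, the concrete type is just a plain C# model that Refit deserialises the JSON into. A minimal Beer class might look something like this (the property names below are assumptions and need to match the JSON your API actually returns):

public class Beer
{
    //These properties are assumptions - they should mirror the JSON returned by your API
    public string Id { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
    public double Abv { get; set; }
}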

To expand on this, we can add many more types of requests.

[Get("/beer/{id}/")]
Task<Beer> GetBeerById(string id);

[Post("/beer/")]
Task<Beer> CreateBeer([Body] Beer beer);

[Delete("/beer/{id}/")]
Task DeleteBeer(string id);

[Put("/beer/{id}/")]
Task UpdateBeer(string id, [Body] Beer beer);

We can now use the interface to make calls to our remote endpoint. I usually place these methods within a class that is unique to the service I’m calling. In this example, it’d be a “BeersService”.

//Create new beer item
public async Task<Beer> CreateBeerAsync(Beer beer)
 {
    var apiInstance = RestService.For<IBeerServiceAPI>(Helpers.Constants.BaseUrl);
    return await apiInstance.CreateBeer(beer);
}

//Get by ID
public async Task<Beer> GetBeerByIdAsync(string id)
{
    var apiInstance = RestService.For<IBeerServiceAPI>(Helpers.Constants.BaseUrl);
    return await apiInstance.GetBeerById(id);
}

That’s all it takes for us to start interacting with a remote API. If you’re wondering how to test this, it’s incredibly easy to swap out implementations with mock services when using this architecture!
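
To illustrate that point, here’s a rough sketch of how a mock could be swapped in, assuming your view models depend on an IBeersService abstraction rather than the concrete BeersService (both names here are hypothetical):

public interface IBeersService
{
    Task<List<Beer>> GetBeersAsync();
}

//A hypothetical mock for unit tests - no network access required
public class MockBeersService : IBeersService
{
    public Task<List<Beer>> GetBeersAsync()
    {
        var beers = new List<Beer>
        {
            new Beer { Id = "1", Name = "Duvel" },
            new Beer { Id = "2", Name = "Vedett Extra White" }
        };
        return Task.FromResult(beers);
    }
}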

Resiliency

Building a resilient networking service requires a few things. We need to understand what our current connectivity looks like, as well as find a solution for caching data locally to ensure our app still ‘works’ in offline situations.

We can achieve both of these tasks by leveraging packages from Motz. He’s created a plugin for checking connectivity status, as well as a library for caching.

Let’s first take a look at connectivity status.

You’ll want to add the Connectivity Plugin NuGet package to every client project in the solution as well as the PCL. The following platforms are supported:

  • Xamarin.iOS
  • tvOS (Xamarin)
  • Xamarin.Android
  • Windows 10 UWP
  • Xamarin.Mac
  • .NET 4.5/WPF
  • .NET Core
  • Samsung Tizen

To use the connectivity plugin, we can simply make the following call:

var isConnected = CrossConnectivity.Current.IsConnected;
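
You can also react to connectivity changes rather than polling IsConnected each time. A minimal sketch, assuming you subscribe somewhere long-lived such as your App class:

CrossConnectivity.Current.ConnectivityChanged += (sender, args) =>
{
    //args.IsConnected tells us whether the device has just come online or gone offline
    System.Diagnostics.Debug.WriteLine($"Connectivity changed. Connected: {args.IsConnected}");
};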

Caching

Now that we can check for connectivity, we can detect when we’re offline. Let’s have a look at how to handle that.

public async Task<List<Beer>> GetBeersAsync()
{
    //Handle online/offline scenario
    if (!CrossConnectivity.Current.IsConnected)
    {
        //If no connectivity, we need to fail... :(
        throw new Exception("No connectivity");
    }

    //Create an instance of the Refit RestService for the beer interface.
    var apiInstance = RestService.For<IBeerServiceAPI>(Helpers.Constants.BaseUrl);
    var beers = await apiInstance.GetBeers();

    return beers;
}

Returning no results for most requests isn’t a great solution. We can dramatically improve the user experience by keeping a cache of data to show in offline situations. To implement that, we’re going to use Monkey Cache. To use Monkey Cache, we first have to configure the ApplicationId. A folder is created on disk for your app using the ApplicationId, so you should avoid changing it.

Barrel.ApplicationId = "your_unique_name_here";

Adding Monkey Cache is super simple. First off, we want to define a key. Think of this as the collection (barrel) name. After that, we implement the necessary logic to handle caching.

public async Task<List<Beer>> GetBeersAsync()
{
    var key = "Beers";

    //Handle online/offline scenario
    if (!CrossConnectivity.Current.IsConnected && Barrel.Current.Exists(key))
    {
        //If no connectivity, we'll return the cached beers list.
        return Barrel.Current.Get<List<Beer>>(key);
    }

    //If the data isn't too old, we'll go ahead and return it rather than call the backend again.
    if (!Barrel.Current.IsExpired(key) && Barrel.Current.Exists(key))
    {
        return Barrel.Current.Get<List<Beer>>(key);
    }            

    //Create an instance of the Refit RestService for the beer interface.
    var apiInstance = RestService.For<IBeerServiceAPI>(Helpers.Constants.BaseUrl);
    var beers = await apiInstance.GetBeers();

    //Save beers into the cache
    Barrel.Current.Add(key: key, data: beers, expireIn: TimeSpan.FromHours(5));

    return beers;
}

Polly

Returning to Rob’s original post, we’ll want to add Polly. Polly helps us handle network requests sanely, allowing us to retry and to process failures robustly.

We’re going to use Polly to define retry logic that forces the service to retry five times, each time waiting twice as long as before.

public async Task<List<Beer>> GetBeersAsync()
{
    var key = "Beers";

    //Handle online/offline scenario
    if (!CrossConnectivity.Current.IsConnected && Barrel.Current.Exists(key))
    {
        //If no connectivity, we'll return the cached beers list.
        return Barrel.Current.Get<List<Beer>>(key);
    }

    //If the data isn't too old, we'll go ahead and return it rather than call the backend again.
    if (!Barrel.Current.IsExpired(key) && Barrel.Current.Exists(key))
    {
        return Barrel.Current.Get<List<Beer>>(key);
    }            

    //Create an instance of the Refit RestService for the beer interface.
    var apiInstance = RestService.For<IBeerServiceAPI>(Helpers.Constants.BaseUrl);

    //Use Polly to handle retrying (helps with bad connectivity) 
    var beers = await Policy
        .Handle<WebException>()
        .Or<HttpRequestException>()
        .Or<TimeoutException>()
        .WaitAndRetryAsync
        (
            retryCount: 5,
            sleepDurationProvider: retryAttempt => TimeSpan.FromSeconds(Math.Pow(2, retryAttempt))
        ).ExecuteAsync(async () => await apiInstance.GetBeers());


    //Save beers into the cache
    Barrel.Current.Add(key: key, data: beers, expireIn: TimeSpan.FromHours(5));

    return beers;
}

Wrapping Up

This is a great way to implement your networking layer within your apps as it can sit within a .NET Standard library and be used in all your client apps.

If you’d like to see a more real-world example of this approach, then check out the Mobile Cloud Workshop I created with Robin-Manuel. The Xamarin.Forms app uses this approach, and it’s been working very well for us!

Big thanks to Rob for the original post and for documenting such a simple solution to a complex problem!

How to fix the IPv4 loopback interface: port already in use error.

Super quick post here. Sometimes when debugging your .NET Core application on Mac, you’ll find the port won’t free up, and thus you can’t redeploy without getting the following fatal error:

Unable to start Kestrel. System.IO.IOException: Failed to bind to address http://localhost:5000 on the IPv4 loopback interface: port already in use.

To fix this, you’ll need to fire up Terminal and enter the following:

sudo lsof -i :5000

In my case, this outputted the following:

Screen Shot 2017-10-20 at 18.54.54.png

I know the error is referencing the IPv4 type, which allows me to quickly find the PID I’ll use to kill the offending process. I do this with the following command:

kill -9 18057

With that done, I can now get back to debugging my .NET Core web API on macOS.

App Services Custom Domain, SSL & DNS

We’ve all seen tutorials which demonstrate how to deploy a simple todo list backend to Azure, but how many have you read that go on to secure it? In this post, I’m going to cover how I’m securing the Bait News v2 backend infrastructure, as well as how to configure custom domains.

Why bother?

Apple announced in 2015 that Apps and their corresponding backend servers would need to support App Transport Security (ATS).

ATS was introduced with iOS 9 as a security enhancement to ensure all connections made by apps use HTTPS. Initially slated to go into effect for all new App Store submissions from January 2017, it has since been postponed with no update on when it’ll come into effect. Although the requirement has been delayed, it’s still something that all app developers should be implementing, as it provides our users with added security and makes it much harder for man-in-the-middle attacks to go unnoticed.

Historically, you’ll see most developers (including myself) opt to turn ATS off to make our lives easier. Some will take a lighter touch and only disable ATS for a single domain (their backend), which is not much more secure than turning ATS off altogether. Either approach opens up your users and data to attack and should be avoided.

So what do we need to do to secure our app? Let’s first register a domain for our backend.

Custom Domains

DNS

I’ve been using 123-Reg as my domain registrar for 10 years and continue to use them as I migrate my websites to Azure. Most domain registrars will also provide some basic DNS functionality, but you would normally want to use a 3rd party DNS service for more advanced situations. In my case, I’m using 123-Reg’s DNS service and have added a number of CNAMEs pointing to Azure.

Adding records

Below are the minimum records needed to enable my custom domain.

Permanent Records

Screen Shot 2017-09-02 at 15.20.53

Temporary Records

Screen Shot 2017-09-02 at 15.32.15

To get started, I added an A record pointing to the App Service instance using its IP address. You can find your App Service IP address by going into its Custom Domain blade within the Azure portal.

Once you’ve added the A record, you can then create the CNAME which will map www requests to your backend’s URL. You can find your destination in the Overview blade of the App Service.

Verify Domain Ownership

Azure needs to know I own the domain I’m trying to map. To prove this, I’ll add two records to my DNS settings which are the temporary records listed above.

Once I’ve added the verification CNAME records, I can save and sit tight. DNS records need to propagate across the globe, which can take up to 24 hours.

This is the end result of my DNS configuration. I also created some CNAMEs to redirect traffic from subdomains to other Azure services.

Screen Shot 2017-08-17 at 11.50.36

Portal Configuration

To finish off, I need to configure the App Service Custom Domain settings.

Screen Shot 2017-09-02 at 15.53.03.png

Hit the ‘Add Hostname’ button and enter the custom domain.

Screen Shot 2017-09-02 at 15.54.03

After hitting Validate, Azure will check the DNS records to confirm the domain exists and that you own it. You should see something like this.

Screen Shot 2017-09-02 at 15.55.53

Hitting ‘Add hostname’ will complete the process of configuring a custom domain for your App Service. If you’re deploying a mobile backend, you may want to create a CNAME record which maps api.domain.net to your mobile backend whilst keeping www.domain.net mapped to an ASP.NET website.

Adding Security

SSL Certificates

As mentioned at the start of this post, enabling HTTPS prevents MITM attacks and ensures your communication between server and client is secure. It’s pretty straightforward to enable within App Services but, much like DNS, it can take a while (this time due to human factors rather than waiting for computers to sync up).

First things first, you’ll need to purchase a certificate. I opted for 123-Reg as they provide a few options to meet most users’ requirements, and the integration with my domain management makes it a no-brainer to use.

I should admit that I did make a mistake when I first purchased a certificate, which caused a few days of delay, so it’s important to double check the type of certificate you’re purchasing. I had purchased a certificate which was only for www.baitnews.io. This meant that my mobile API at api.baitnews.io couldn’t use the certificate. 123-Reg refunded the first certificate and I tried again, this time making sure to purchase a certificate which supports unlimited subdomains. You can see below that the original certificate has been revoked and the new certificate supports wildcards.

When you apply for a certificate, you’ll be provided with a download which includes your certificate request (CSR) in PEM format. You also get the private key, which you’ll use later to create a new certificate.

Screen Shot 2017-09-02 at 16.13.32

Once you’ve been issued the certificate, you’re ready to create the new certificate that you’ll use in Azure for everything. This is a pretty easy process as we can use OpenSSL on almost any platform. I’m on a Mac, but this works the same on both Windows and Linux.

openssl pkcs12 -export -out baitnews.pfx \
    -inkey /Users/michaeljames/Downloads/SSL-CSR5/private-key.key \
    -in /Users/michaeljames/Desktop/wildcard.cert

cert

Variables

  • [Output file name] – What do you want to call the certificate?
  • [private-key.key path] – The location of the private key. This would have been provided when requesting the certificate.
  • [wildcard.cert path] – The location of the freshly issued certificate.

Once you press enter, you’ll need to type in a couple of passwords and then you’ll be set. It’ll look something like this:

Screen Shot 2017-09-02 at 16.38.43

You now have your certificate ready for uploading to Azure. The conversion of certificates isn’t the easiest of procedures to wrap your head around on the first few goes. If you’re worried about this step, then keep in mind you can purchase SSL certificates through the Azure Portal, which skips many of the above steps! It does, however, add a small premium to the cost of securing your backend, as you’ll find the certificate a little more expensive and you’re also required to store it in Key Vault.

Binding Domains with Certificates

Let’s upload our new certificate to our App Service. To do this, head over to the SSL Certificates blade and hit ‘Upload Certificate’. You’ll need to provide the password used to create the certificate.

Screen Shot 2017-09-07 at 12.26.17.png

If successful, you’ll see that your certificate has been imported and is ready to use with your custom domains.

Screen Shot 2017-09-07 at 12.29.33.png

Add Binding

The last step is to bind our SSL certificate to our custom domain. Clicking ‘Add Binding’ will allow you to select both the custom domain and the SSL certificate from a drop-down.

Screen Shot 2017-09-07 at 12.29.40

Hitting ‘Add Binding’ will finish the process. You now have a custom domain mapped to your App Service instance that supports HTTPS. Any users visiting your backend will be greeted with the familiar green padlock in the address bar.

Screen Shot 2017-09-07 at 12.31.59.png

Wrapping Up

Adding custom domains and enabling secure connectivity between your mobile app and backend is extremely simple, and there’s no good reason not to enable it (unless you’re hacking on a demo or POC).

In the next post, I’m going to cover how to expand our setup to route traffic to the nearest App Service instance.

Creating a simple Azure backend POC for your mobile app

Most mobile apps require some form of infrastructure to function correctly. In the case of something like Instagram, they’ll likely have some blob storage for storing images and then a SQL database for storing user information like comments and likes. They’ll have a REST API which the mobile app uses to interact with these services rather than having a direct connection to each service within the backend.

Within the context of Azure, we would usually opt to use Azure App Service as our middleware/orchestration layer. Our mobile apps will connect to this layer and we can deal with ensuring the users have the correct permissions to access the data in our storage services.

In this video, I show how I created a proof of concept backend for Bait News. I had an Excel spreadsheet which I wanted to host in the cloud and make available to all my mobile users. To do this, I used Azure App Service Easy Tables. Watch below to find out more:

Stretchy UITableView Headers with Xamarin

The Yahoo News Digest app includes a couple of interesting user interface elements that I wanted to use within my own apps. The feature that I was most keen to recreate was the stretching UITableView header. It’s an effect seen in lots of iOS apps (sometimes referred to as a parallax header). As Beer Drinkin is going to support multi-tasking on iOS, I needed to ensure my implementation continues to support Storyboards and Auto Layout. Fortunately, it proved very simple to get everything set up. In this blog post I’ll be sharing how I went about implementing it.

beerdrinkinStretchy

Setting up the Storyboard

Adding a header view

To get started, I opened my existing Storyboard and selected the UIViewController that requires the table view header. In my case, the scene (or view controller) isn’t a UITableViewController because I require a floating ‘Check in’ button to be visible at all times. It’s worth noting that all the steps in this tutorial work with both UITableViewControllers and UIViewControllers.

Screen Shot 2016-02-01 at 11.36.06

Once I had the storyboard loaded, I dragged a UIView from the toolbox and made sure to insert it above the UITableViewCells as the header view for the table. I then added a UIImageView to the new UIView and set its constraints to 0,0,0,0. This way, when the parent view (the UIView) resizes, the UIImageView will resize as well. I also made sure to set the UIImageView’s view mode property to Aspect Fit, which makes sure the image looks great no matter the size of the view.

Screen Shot 2016-02-01 at 11.39.13

Adding some C#

Adding the resize code

If I ran this now, the table header would be displayed but wouldn’t resize with scroll events. To add scaling, I needed to add some code to my ViewController to set up the stretchy goodness that I wanted.

Because I use the header view’s height in a number of locations throughout the beer description view controller, I went ahead and created a variable rather than scattering magic numbers over my class.

[sourcecode language="csharp"]
private nfloat headerViewHeight = 200;
[/sourcecode]

Managing the header view

To allow me to manage the table header, I needed to remove it from the UITableView and keep it as a variable for use later. To do this I created a variable in the beer description view controller.

[sourcecode language="csharp"]
private UIView headerView;
[/sourcecode]

When we load the view controller, we’ll want to set our headerView variable and then set the UITableView’s header property to null. This means the table view no longer has a header view to manage; instead, I’ve taken control of the view, which allows me to ensure it resizes correctly as the table view scrolls. Despite having just removed the header view from the UITableView, I actually want to go ahead and add it to the table view hierarchy (but not as the header view property of the UITableView).

[sourcecode language="csharp"]
headerView = tableView.TableHeaderView;
tableView.TableHeaderView = null;
tableView.AddSubview (headerView);
tableView.ContentInset = new UIEdgeInsets (headerViewHeight, 0, 0, 0);
tableView.BackgroundColor = UIColor.Clear;
[/sourcecode]

Listening to TableViewDidScroll

In order to successfully respond to the DidScroll event of the UITableViewSource, I’ll need to create an event in the table view’s delegate. This is because of an issue with the UITableView DidScroll event not firing when a delegate has been set.

[sourcecode language="csharp"]
public override void Scrolled (UIScrollView scrollView)
{
    DidScroll ();
}

public event DidScrollEventHandler DidScroll;
[/sourcecode]
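
For completeness, DidScrollEventHandler is nothing more than a simple delegate type declared alongside the custom delegate class. A minimal sketch (the name simply matches the event above):

[sourcecode language="csharp"]
//A parameterless delegate type used to surface the table view's scroll events
public delegate void DidScrollEventHandler ();
[/sourcecode]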

We can now hook up the table DidScroll event with a small piece of logic for resizing the view.

[sourcecode language="csharp"]
//Update Tableview
tableView.Source = new BeerDescriptionDataSource(ref cells);
var deleg = new DescriptionDelegate (ref cells);
deleg.DidScroll += UpdateHeaderView;
tableView.Delegate = deleg;

tableView.ReloadData ();
View.SetNeedsDisplay ();
//…

void UpdateHeaderView ()
{
    var headerRect = new CGRect (0, -headerViewHeight, tableView.Frame.Width, headerViewHeight);
    if (tableView.ContentOffset.Y < -headerViewHeight)
    {
        headerRect.Location = new CGPoint (headerRect.Location.X, tableView.ContentOffset.Y);
        headerRect.Size = new CGSize (headerRect.Size.Width, -tableView.ContentOffset.Y);
    }
    headerView.Frame = headerRect;
}
[/sourcecode]

Conclusion

It’s very easy to use this approach to add resizing animations to any number of controls within your UITableView. My favourite part of this solution is that it works perfectly across all iOS devices and doesn’t force me to drop support for Auto Layout.

Creating a 5 star search experience

Search is a feature that can make or break your mobile app, but it can be incredibly difficult to get right. In this blog post, I’m going to share how I’m solving search with Beer Drinkin.

There are many options for us developers looking to implement search solutions into our projects. Some of us may decide to use LINQ and Entity Framework to look through a table, and the more adventurous may opt to create an instance of Elasticsearch, which requires a lot of work to set up and maintain. For Beer Drinkin, I’m using Microsoft’s Azure Search service as it has proved to be easy to configure and requires zero maintenance.

The reason that Beer Drinkin uses Azure Search is simple: the BreweryDB search functionality is too limited for my needs. One example of this is that the end point often returns zero results if the user misspells a search term. If I searched for “Duval” rather than “Duvel,” BreweryDB’s search would return zero beers. Even if I were to spell the search term correctly, BreweryDB would return all beers from the Duvel Moortgat brewery. Although this is minor, I would prefer that Maredsous 6 and Vedett Extra White not be returned as these do not have “Duvel” in the name.

Screen Shot 2015-12-21 at 15.36.14

Spelling Mistakes

Another issue with using the default search functionality of BreweryDB is its inability to deal with spelling mistakes or offer suggestions. Simple off-by-one-letter spelling mistakes yield no results, something that should be easy to resolve.

Screen Shot 2015-12-21 at 15.40.40

I’ve had sleepless nights worrying that, on release, users will fail to find results due to simple spelling mistakes. One way to address spelling mistakes is to utilize a spell-checking service like WebSpellChecker.net.

The issue with a service such as WebSpellChecker is that it has no context in which to make corrections when it comes to names of products and it also doesn’t support multiple languages.

Another way to minimize spelling mistakes is to provide a list of suggestions as the user types in a search query. You’re probably familiar with this in search engines like Google and Bing. This approach to searching is intuitive to users and significantly reduces the number of spelling mistakes.

Enter Azure Search

Azure Search aims to remove the complexity of providing advanced search functionality by offering a service that does the heavy lifting of implementing a modern and feature-rich search solution. Microsoft handles all the infrastructure required to scale as it gains more users and indexes more data. Not to mention that Azure Search supports 50 languages, using technologies from multiple teams within Microsoft (such as Office and Bing). What this equates to is that Azure Search understands the languages and words of the search requests.

Some of my favorite features

Fuzzy search – Find strings that match a pattern approximately.

Proximity search – Geospatial queries. Find search targets within a certain distance of a particular point.

Term boosting – Boosting allows me to promote results based on rules I create. One example might be to boost old stock or discounted items.

Getting Started

The first step I took was to provision an Azure Search service within the Azure Portal. I had two options for setting up the service; I could have opted for a free tier or have paid for dedicated resources. The free tier offers up to 10,000 documents and 50MB storage, which is a little limited for what I need.

Because my index already contains over 50,000 beers, I had no option but to opt for the Standard S1 service, which comes at a cool $250 per month (for Europeans, that’s €211). With the fee comes a lot more power with the use of dedicated resources, and I’m able to store 25GB of data. When paying for Search, you’ll be able to scale out to 36 units, which provides plenty of room to grow.

Creating an index

Before I could take advantage of Azure Search, I needed to upload my data to be indexed. Fortunately, with the .NET SDK the Azure Search team provides, it’s exceptionally easy to interact with the service. Using the .NET library I wrote a few weeks ago, which calls BreweryDB, I was able to iterate quickly through each page of beer results and upload them in blocks to the search service.

Screen Shot 2016-01-04 at 10.18.02.png

Uploading documents

[sourcecode language="csharp"]
Parallel.For(1, totalPageCount, new ParallelOptions { MaxDegreeOfParallelism = 25 }, index =>
{
    var response = client.Beers.GetAll(index).Result;
    var beersToAdd = new List<IndexedBeer>();

    foreach (var beer in response.Data)
    {
        var indexedBeer = new IndexedBeer
        {
            Id = beer.Id,
            Name = beer.Name,
            Description = beer.Description,
            BreweryDbId = beer.Id,
            BreweryId = beer?.Breweries?.FirstOrDefault()?.Id,
            BreweryName = beer?.Breweries?.FirstOrDefault()?.Name,
            AvailableId = beer.AvailableId.ToString(),
            GlassId = beer.GlasswareId.ToString(),
            Abv = beer.Abv
        };

        if (beer.Labels != null)
        {
            indexedBeer.Images = new[] { beer.Labels.Icon, beer.Labels.Medium, beer.Labels.Large };
        }
        beersToAdd.Add(indexedBeer);
    }
    processedPageCount++;
    indexClient.Documents.Index(IndexBatch.Create(beersToAdd.ToArray().Select(IndexAction.Create)));

    Console.Write($"\rAdded {beersToAdd.Count} beers to Index | Page {processedPageCount} of {totalPageCount}");
});
[/sourcecode]

Other data import methods

Azure Search also supports the ability to index data stored in Azure SQL or DocumentDB, which enables you to point a crawler at your SQL table and ensures it is always up to date rather than requiring you to manage the document index manually. There are a few reasons you may not want to use a crawler. The best reason is that it introduces the possibility of a delay between your DB changing and your search index reflecting the changes. The crawler only crawls on a schedule, which results in an out-of-date index.

If you opt for the self-managed approach, you can add, remove, and edit your indexed documents yourself as the changes happen in your back end. This provides you with live search results as you know the data is always up to date. Using the crawler is an excellent way to get started with search and quickly get some data in place, but I wouldn’t consider it a good strategy for long-term use.
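
To give a flavour of what the self-managed approach can look like, here’s a rough sketch using the IndexBatch helpers found in newer versions of the Microsoft.Azure.Search SDK (the UpdateBeerAsync and RemoveBeerAsync wrappers are hypothetical names, and "Id" is assumed to be the index’s key field):

[sourcecode language="csharp"]
//Push a changed beer straight into the index so search results stay current
public async Task UpdateBeerAsync(IndexedBeer beer)
{
    var batch = IndexBatch.MergeOrUpload(new[] { beer });
    await indexClient.Documents.IndexAsync(batch);
}

//Remove a deleted beer from the index using its key field
public async Task RemoveBeerAsync(string beerId)
{
    var batch = IndexBatch.Delete("Id", new[] { beerId });
    await indexClient.Documents.IndexAsync(batch);
}
[/sourcecode]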

I mentioned earlier that the free tier is limited to 10,000 documents, which translates to 10,000 rows in a table. If your table has more than 10,000 rows, then you’ll need to purchase the Standard S1 tier.

Suggestions

Before we can use suggestions, we’ll need to ensure that we’ve created a suggester within Azure.

Screen Shot 2016-01-04 at 10.27.15.png

In the current service release, there is support for limited index schema updates. Any schema updates that would require re-indexing, such as changing field types, are not currently supported. Although existing fields cannot be modified or deleted, new fields can be added to an existing index at any time.

If you’ve not checked the suggester checkbox at the time of creating a field, then you’ll need to create a secondary field as Azure Search doesn’t currently support editing the fields. The Azure Search team recommends that you create new fields if you require a change in functionality.

The simplest way to get suggestions would be to use the following API.

[sourcecode language="csharp"]
var response = await indexClient.Documents.SuggestAsync(searchBar.Text, "nameSuggester");
foreach (var r in response)
{
    Console.WriteLine(r.Text);
}
[/sourcecode]

Having fun with the suggestion API

The suggestion API provides properties for enabling fuzzy matching and hit highlighting. Let’s see how we might enable that functionality within our app.

[sourcecode language="csharp"]
var suggestParameters = new SuggestParameters();
suggestParameters.UseFuzzyMatching = true;
suggestParameters.Top = 25;
suggestParameters.HighlightPreTag = "[";
suggestParameters.HighlightPostTag = "]";
suggestParameters.MinimumCoverage = 100;
[/sourcecode]

What do the properties do?

UseFuzzyMatching – The query will find suggestions even if there’s a substituted or missing character in the search text. While this provides a better search experience, it comes at the cost of slower operations and consumes more resources.

Top – The number of suggestions to retrieve. It must be a number between 1 and 100, with its default set to 5.

HighlightPreTag – Gets or sets the tag that is prepended to hit highlights. It MUST be set with a post tag.

HighlightPostTag – Gets or sets the tag that is appended to hit highlights. It MUST be set with a pre tag.

MinimumCoverage – Represents the percentage of the index that must be covered by a suggestion query in order for the query to be reported as a success. The default is 80%.
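
To apply these options, the SuggestParameters instance is passed into SuggestAsync alongside the suggester name (the same call as the earlier example, just with the extra argument):

[sourcecode language="csharp"]
//Request suggestions using the fuzzy matching and highlighting options configured above
var response = await indexClient.Documents.SuggestAsync(searchBar.Text, "nameSuggester", suggestParameters);
foreach (var r in response)
{
    Console.WriteLine(r.Text);
}
[/sourcecode]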

How do the results look?

Simulator Screen Shot 4 Jan 2016, 10.39.25.png

Search

The Search API itself is even easier (assuming we don’t use filtering, which is a topic for another day).

[sourcecode language="csharp"]
var searchParameters = new SearchParameters() { SearchMode = SearchMode.All };

var results = await indexClient.Documents.SearchAsync(searchBar.Text, searchParameters);
[/sourcecode]

Special Thanks

I’d like to take a moment to thank Janusz Lembicz for helping me get started with Azure Search suggestions by answering all my questions. I appreciate your support (especially given it was on a weekend!).

Updated BreweryDB .NET client

header.png


Over the weekend I decided to look at a project which hasn’t received much love since I originally wrote it earlier in the year. That project is a PCL for interacting with the awesome beer database that is BreweryDB.

My original implementation was very simplistic in its design: it only exposed a handful of the endpoints, lacked any error checking, and had too few unit tests. With BeerDrinkin’s development coming along nicely, I thought it time to take a look at some of its dependencies and see how I could improve them. First up was BreweryDB, so I set about improving the PCL to hopefully make it useful for other .NET developers.

Supported endpoints

This new version of the BreweryDB client includes many more supported endpoints. It’s now possible to query almost all of BreweryDB using .NET.

  • Adjuncts
  • Beers
  • Breweries
  • Categories
  • Events
  • Features
  • Fermentables
  • FluidSizes
  • Guilds
  • SocialSites
  • Yeasts

 

Getting Started

Beers

  • GET/beers
  • GET/beer/beerId
  • GET/search/

Get all

This returns a list of all beers but will be paginated with 50 beers per page.

[sourcecode language="csharp"]
//Will return the first page
var response = await client.Beers.GetAll();

//Will return the fourth page
var response = await client.Beers.GetAll(4);
[/sourcecode]

Get by id

This returns a single beer

[sourcecode language="csharp"]
var response = await client.Beers.Get("cBLTUw");
[/sourcecode]

Using request parameters

This returns a list of beers

[sourcecode language="csharp"]
var parameters = new NameValueCollection {{ BeerRequestParameters.Name, "duvel single" }};
var response = await client.Beers.Get(parameters);
[/sourcecode]

Search by name

This returns a list of beers

[sourcecode language="csharp"]
var response = await client.Beers.Search("duvel");
[/sourcecode]


Breweries

  • GET/breweries
  • GET/brewery/breweryId
  • GET/search/

Get all

This returns a list of all breweries but will be paginated with 50 breweries per page.

[sourcecode language="csharp"]
//Will return the first page
var response = await client.Breweries.GetAll();

//Will return the fourth page
var response = await client.Breweries.GetAll(4);
[/sourcecode]

Get by Id

This returns a single Brewery

[sourcecode language="csharp"]
var response = await client.Breweries.Get("YXDiJk");
[/sourcecode]

Search by name

This returns a list of breweries

[sourcecode language="csharp"]
var response = await client.Breweries.Search("duvel");
[/sourcecode]

Naturally, it’s open source

As always, I’ve made this available on GitHub and NuGet for you to use in your own apps.