Replacing lighting programmers with AI

The title is clickbait, but I can picture situations where we could replace lighting programmers with artificial intelligence, using the language understanding techniques that power Alexa, Google Assistant and Cortana. I’m going to discuss how I’m working on this using Microsoft’s language understanding technology.

Console Syntax

Most DMX lighting control systems have a command-line interface that supports a domain-specific language for programming lights for theatre shows, concerts, TV and everything in between. The syntax is usually pretty similar across manufacturers, but there are always subtle differences that can trip up experienced users when switching systems.

As I’ve been investigating how to create the language users will program their shows with on my control system, I’ve continually come back to the idea that the lighting industry has standardised what it does; it’s the tools that introduce the variations.

An extreme of this is the theatre world, where the lighting designer may call for specific fixtures and exact intensities. They may say something like “one and five at fifty percent”. The lighting programmer will then type something like 1 + 5 @ 50 Enter on one brand of lighting console and perhaps Fixture 1 + 5 @ 5 on another. The intent is the same, but the specific syntax changes.
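
If it helps to see that idea in code, here’s a rough sketch of one intent rendered as two different console syntaxes. The type and method names are mine for illustration, not any real console’s API:

```csharp
// The designer's intent is a single piece of data; the keystrokes needed to
// express it differ per console. Names here are hypothetical.
public record SetIntensityIntent(int[] Fixtures, int Percent);

public static class ConsoleSyntax
{
    // e.g. "1 + 5 @ 50 Enter"
    public static string ForConsoleA(SetIntensityIntent i) =>
        $"{string.Join(" + ", i.Fixtures)} @ {i.Percent} Enter";

    // e.g. "Fixture 1 + 5 @ 5" (some consoles take intensity in tens)
    public static string ForConsoleB(SetIntensityIntent i) =>
        $"Fixture {string.Join(" + ", i.Fixtures)} @ {i.Percent / 10}";
}
```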

It’s currently the role of the lighting programmer to understand the intent of the designer and execute it on the hardware in front of them. The best lighting programmers can translate abstract, complex requests into machine-understandable commands using the domain-specific language of the control system they’re using. They understand the lighting console’s every feature and are master translators. They’re a bridge between the creative and the technical, but they are still fundamentally translating intents.

Removing the need for human translation

Voice-to-text would go some way towards removing the lighting programmer: for simple commands like the one demonstrated earlier, it’s easy to convert the utterance to an action. But most designers don’t build scenes like this. For more complex commands, the console will likely get it wrong, and with no feedback loop it won’t have the opportunity to learn from its mistakes the way a human does.

This is where AI will significantly help. I’m currently adding the ability for my console to use machine learning, powered by the cloud, so that eventually even the most complex requests can be fulfilled by the console alone. While cloud connectivity in a lighting console probably seems strange, it is just a stepping stone to a fully offline-capable system.

Language Understanding with AI

Let me walk you through the training process as I teach the service to understand a range of syntax and utterances. For this blog post I’ve started from scratch, but in reality I have a fleshed-out version with many more commands that supports syntax from a variety of consoles.
The first step is to create a new ‘App’ within the Language Understanding Intelligent Service (LUIS).
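
I did this through the LUIS portal, but the same step can be scripted against the v2.0 authoring REST API. Here’s a rough sketch, with the region, key and app name as placeholders:

```csharp
using System;
using System.Net.Http;
using System.Text;

// Create a new LUIS app via the authoring API (placeholder key and region).
using var client = new HttpClient();
client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<authoring-key>");

var body = new StringContent(
    "{\"name\": \"LightingConsole\", \"culture\": \"en-us\"}",
    Encoding.UTF8, "application/json");

// The response body is the new app's ID as a JSON string.
var response = await client.PostAsync(
    "https://westus.api.cognitive.microsoft.com/luis/api/v2.0/apps/", body);
Console.WriteLine(await response.Content.ReadAsStringAsync());
```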


By default, our app has no intents because we’re building a solution to suit our own needs rather than using a pre-canned, prebuilt one. To get started, we’ll define an intent that affects the Intensity parameter of a fixture.


We need to provide some examples of what the user might say to trigger this intent, and to make this powerful we want a variety of them. The most natural is something like “1 @ 50”. This contains numeric symbols because that’s what our console’s interface gives us, but if we’re using a voice-to-text solution we’ll get “One at fifty” instead. To solve this, we need to create some entities so that our AI understands that “one” is also 1. Thankfully, to make developers’ lives easier, Microsoft provides a whole host of prebuilt entities, so we can use their number entity rather than build our own.
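
I labelled these examples in the portal, but here’s roughly what they look like if added through the authoring API’s batch examples endpoint instead. The app ID, version, key and even the intent name below are placeholders:

```csharp
using System.Net.Http;
using System.Text;
using System.Text.Json;

// Example utterances covering both the numeric form a console keypad produces
// and the spoken form voice-to-text produces. "SetIntensity" is a placeholder name.
var examples = new[]
{
    new { text = "1 @ 50",                       intentName = "SetIntensity" },
    new { text = "one at fifty",                 intentName = "SetIntensity" },
    new { text = "fixture 1 at 50 percent",      intentName = "SetIntensity" },
    new { text = "fixture one at fifty percent", intentName = "SetIntensity" },
};

using var client = new HttpClient();
client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<authoring-key>");
await client.PostAsync(
    "https://westus.api.cognitive.microsoft.com/luis/api/v2.0/apps/<app-id>/versions/0.1/examples",
    new StringContent(JsonSerializer.Serialize(examples), Encoding.UTF8, "application/json"));
```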


Matching numbers is helpful, but we also need to provide information about other lighting-specific entities. I’ve defined a few SourceTypes, as the offline syntax for my console follows the grammar rule of ‘Source, Command, Destination’.
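
To make that grammar concrete, here’s a sketch of how I model a command once LUIS has pulled the entities out of an utterance. These names are illustrative rather than the final API:

```csharp
// A parsed command in 'Source, Command, Destination' form.
public enum SourceType { Fixture, Group }
public enum PaletteType { Intensity, Colour, Position, Beam }

public record Source(SourceType Type, int Number);            // e.g. "Fixture 1", "Group 3"
public record Destination(PaletteType Palette, double Value); // e.g. "@ 50", "Colour 5"
public record ProgrammerCommand(Source Source, Destination Destination);
```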


I also provide synonyms, which means that if a lighting designer for some crazy reason calls all lighting fixtures “devices”, we can still accurately determine the intent. Synonyms are incredibly powerful when building out language understanding, as you’ll see below with the PaletteType entity. I’ve created synonyms for Intensity, which lets designers say things like “Fixture 1 brightness at 50 percent” rather than having to know that the console thinks of brightness as intensity. I’ve also made sure to misspell Colour for Americans…
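
The actual synonym lists live inside the LUIS list entities, but this is roughly the data they hold, sketched as plain dictionaries. The specific synonyms shown are examples, not the full set:

```csharp
using System.Collections.Generic;

// Canonical form -> synonyms a designer might actually say.
var paletteTypeSynonyms = new Dictionary<string, string[]>
{
    ["Intensity"] = new[] { "brightness", "dimmer", "level" },
    ["Colour"]    = new[] { "color" },   // the deliberate American misspelling
    ["Position"]  = new[] { "focus" },
};

var sourceTypeSynonyms = new Dictionary<string, string[]>
{
    ["Fixture"] = new[] { "device", "light", "lamp" },
};
```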


Even with just three entities, our intent is more useful than just setting intensities; we can now handle basic fixture manipulation. For example, “Group 3 @ Position 5” would work correctly with this configuration. For that reason, I renamed the intent to something more sensible: Programmer.SourceAtDestination.


Training and testing the AI

Before we can use our AI service, we must train it. The more data the better, but we can get some great results with what we already have.
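
Training is a single button in the portal, but it can also be kicked off from code against the authoring API. A minimal sketch, placeholders throughout:

```csharp
using System;
using System.Net.Http;

// POST queues a training run; GET on the same URL reports per-model status.
using var client = new HttpClient();
client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", "<authoring-key>");

var trainUrl = "https://westus.api.cognitive.microsoft.com/luis/api/v2.0/apps/<app-id>/versions/0.1/train";
await client.PostAsync(trainUrl, null);                    // start training
Console.WriteLine(await client.GetStringAsync(trainUrl));  // poll until every model reports success
```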


Below you can see the result of passing in the utterance “fixture 10 @ colour 5”.


The top scoring intent (we only have one, so it’s a little bit of a cheat) is Programmer.SourceAtDestination. The SourceType is Fixture and the number is 10.
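
For completeness, here’s a sketch of running the same test from C# against the published v2.0 prediction endpoint, with the region, app ID and key as placeholders for my own values:

```csharp
using System;
using System.Net.Http;
using System.Text.Json;

// Query the published LUIS app and read back the top scoring intent and entities.
using var client = new HttpClient();
var url = "https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/<app-id>"
        + "?subscription-key=<prediction-key>&q="
        + Uri.EscapeDataString("fixture 10 @ colour 5");

using var doc = JsonDocument.Parse(await client.GetStringAsync(url));
var top = doc.RootElement.GetProperty("topScoringIntent");
Console.WriteLine($"{top.GetProperty("intent").GetString()} ({top.GetProperty("score").GetDouble():P0})");

foreach (var entity in doc.RootElement.GetProperty("entities").EnumerateArray())
    Console.WriteLine($"{entity.GetProperty("type").GetString()}: {entity.GetProperty("entity").GetString()}");
```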

What’s next?

Language and conversation will find their way into many technologies where they may not seem obvious right now. I believe it won’t be long before a lighting control manufacturer releases some form of AI bot or language understanding within their consoles, and these will get better with every day they’re used. Maybe I’ll be first, but I can’t believe no one else has seen this technology and thought about how to build it into a control system, so perhaps we’ll start to see this become the norm in a few years.
Right now it’s early days, but I’d put money on virtual lighting programmers shipping with consoles. What personality these virtual programmers have will be down to each manufacturer. I hope they realise that their virtual programmer needs some personality, or it’ll be no more engaging than voice-to-text. I’ve given some thought to my bot’s personality, and it stems from real people. I’m hoping to provide both female and male assistants, and they’ll be named Nick and Rachel.

Takeaways

It’s never too early to start investigating how AI can disrupt your industry. This post focuses on the niche that is the lighting control industry, but this level of disruption will be felt across all industries. Get ahead of the curve and learn how you can shape the future of your industry with Microsoft AI services.

Reviving a dead pet (project)

I’ve been working on a problem on and off for more than a decade now, and last week I decided I was going to take it seriously. Seriously enough to set up an Office 365 subscription attached to the project’s domain, along with an Azure DevOps subscription to keep everything in one place. I’ll be blogging regularly about the project, so I thought it best to write this post to provide context for future ones.

History of the project

It’s my first week at university (did I mention I went to drama school?) and I’m being forced to use a lighting control desk called the ETC Express. It’s like an idiot’s version of the Strand 500 series, the system that I know and love. If you’re not experienced with lighting control, know that lighting programmers often define themselves by their tools, just as we programmers frequently do with our languages and frameworks of choice.

 

Programming a show on my parents’ dining table on a Strand 520i.

 

I was whining profusely about how limited the ETC Express was, and my tutor said: “Well, why don’t you build your own then?” In hindsight, she probably said this out of frustration, but I agreed that I could do a better job, and thus commenced my journey back into .NET development and, ultimately, a career at Microsoft.
Lighting control systems haven’t advanced past the innovations of the Hog 2, created by my good friend Nick Archdale. I wanted to create something unique, but most importantly it had to be as intuitive as the light switches we use at home every day. Initially I pictured a huge multi-touch screen, but the technology wasn’t available back then (the original iPhone hadn’t even been announced). I wanted to create a console that could be played like an instrument, as lighting can be just as expressive as any musical instrument, but frankly I lacked the skills required to deliver my vision.
Hog 2 Lighting Control Console
Nonetheless, I started building lots of proofs of concept using WPF to see how it might work. Eventually I had a pretty solid idea of what I wanted to build, but I couldn’t even match the features of existing systems with the knowledge I had at the time.
An old screenshot of a proof of concept.

Rebooting the project

Earlier this year I visited Nick’s business in West London to discuss licensing some of his technology for a mobile app. One of the chaps there asked if the app would control any lights. It wasn’t in my spec, as controlling lights is much more complex than you’d reasonably imagine, but this simple question derailed the app and reminded me of an itch I’ve been ignoring for years.

I went home and started creating some proofs of concept using the experience gained from a decade of .NET development. I think I’ve cracked the secret sauce for creating a workable, scalable control system: the system HAS to be modular in every aspect, from the C# projects to the physical hardware.
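
To give a flavour of what I mean by modular, here’s an illustrative sketch rather than the project’s actual interfaces: every output protocol and control surface sits behind a small contract so it can be swapped per project and per piece of hardware.

```csharp
using System;

// Hypothetical contracts sketching the modular shape, not the final API.
public interface IOutputModule
{
    string Name { get; }
    void SendFrame(int universe, ReadOnlySpan<byte> channels); // one 512-channel DMX frame
}

public interface IControlSurfaceModule
{
    event Action<string> CommandEntered; // raw syntax from a keypad, fader wing, or voice front-end
}
```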

The future

Right now I’ve got the beginnings of the important components of the control system working, and I’m tying them together to build a minimum viable product before I start on the multi-touch, instrument-like parts I’ve dreamt of for the last decade.

I’ve not yet decided how it’ll be released. I’m hoping to release parts of it as OSS, but I can’t promise anything just yet. If you’re interested in getting involved, ping me a message and we can chat!

Right now my UI is HEAVILY inspired by Nick’s Hog 3 console, but using Prism makes this insanely easy to change! Below is a video I recorded showing one of my small bugs in the windowing system, so you can get an idea of where I’m at (very early days).

 

Special Thanks

I feel a need to thank a few influential people who’ve helped me over the years to reach the point of being able to tackle this technical problem with some degree of competence.

Rachel Nicholson for the idea and belief that I could create a control system.

Nick Hunt for mentoring me through my dissertation as I investigated what an intuitive lighting control might look like.

Nick Archdale and Richard Mead for hiring me out of university, encouraging me to be a better developer, and licensing their fixture data to the project while I develop the control system.

You can reach me about anything DMX related at mike@litemic.com