The title is clickbait, but I can picture situations where we could replace lighting programmers with artificial intelligence, using the language understanding techniques that power Alexa, Google Assistant and Cortana. I’m going to discuss how I’m working on this using Microsoft’s language understanding and intelligence technology.
Most DMX lighting control systems have a command-line interface which supports a domain-specific language for programming lights for theatre shows, concerts, TV and everything in between. The syntax is usually pretty similar across manufacturers, but there are always subtle differences that can trip up experienced users when switching systems.
As I’ve been investigating how to create a language for users to program their shows on my control system, I’ve continually come back to the idea that the lighting industry has standardised what it does; it’s the tools that introduce the variation.
An extreme of this is the theatre world, where the lighting designer may call for specific fixtures and exact intensities. They may say something like “one and five at fifty percent”. The lighting programmer will then type in something like 1 + 5 @ 50 Enter on one brand of lighting console and perhaps Fixture 1 + 5 @ 50 on another. The intent is the same, but the specific syntax changes.
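To make this concrete, here is a minimal sketch of the idea of one intent rendering to multiple console syntaxes. The console names and output formats are hypothetical, not those of any real manufacturer:

```python
# Hypothetical sketch: the same "set fixtures to an intensity" intent
# rendered in two made-up console syntax styles.

def render_command(console, fixtures, intensity):
    """Render a 'fixtures at intensity' intent for a given console style."""
    fixture_list = " + ".join(str(f) for f in fixtures)
    if console == "console_a":   # bare-number style, e.g. "1 + 5 @ 50"
        return f"{fixture_list} @ {intensity}"
    if console == "console_b":   # keyword style, e.g. "Fixture 1 + 5 @ 50"
        return f"Fixture {fixture_list} @ {intensity}"
    raise ValueError(f"unknown console: {console}")

print(render_command("console_a", [1, 5], 50))  # 1 + 5 @ 50
print(render_command("console_b", [1, 5], 50))  # Fixture 1 + 5 @ 50
```

The point of the sketch is that the intent (fixtures, intensity) is the stable part; only the rendering differs per console.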
It’s currently the role of the lighting programmer to understand the intent of the designer and execute it on the hardware in front of them. The best lighting programmers can translate abstract and complex requests into machine-understandable commands using the domain-specific language of the control system they’re using. They understand every feature of their lighting console and are master translators. They’re a bridge between the creative and the technical, but they are still fundamentally just translating intents.
Removing the need for human translation
Voice-to-text would go some way towards removing the lighting programmer: for simple commands like the one demonstrated earlier, it’s easy to convert the utterance into an action. But most designers don’t build scenes like this. For more complex commands, the console will likely get it wrong, and with no feedback loop, it won’t have the opportunity to learn from its mistakes the way a human would.
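A naive sketch shows why the simple case is easy and where it breaks down. This is not my console’s actual parser, just an illustrative regex that handles only the trivial “N @ M” form:

```python
import re

# Illustrative only: a regex covering the trivial "fixture @ level" case.
# Anything it can't match would need real language understanding.
SIMPLE_CMD = re.compile(r"^\s*(\d+)\s*@\s*(\d+)\s*$")

def parse_simple(utterance):
    """Parse 'N @ M' into an action dict, or None if it's too complex."""
    m = SIMPLE_CMD.match(utterance)
    if not m:
        return None  # complex command: needs the AI service instead
    return {"intent": "set_intensity",
            "fixture": int(m.group(1)),
            "level": int(m.group(2))}

print(parse_simple("1 @ 50"))        # {'intent': 'set_intensity', 'fixture': 1, 'level': 50}
print(parse_simple("one at fifty"))  # None – spoken form defeats the naive parser
```

The spoken utterance “one at fifty” already falls outside the regex, which is exactly the gap language understanding has to fill.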
This is where utilising AI will significantly help. I’m currently working on giving my console the ability to use machine learning, powered by the cloud, so that eventually even the most complex of requests should be fulfilled by the console alone. While a cloud-connected lighting console probably seems strange, it’s just a stepping stone to a fully offline-supported system.
Language Understanding with AI
Let me walk you through the training process as I teach the service to understand a range of syntax and utterances. For this blog post I’ve started from scratch, but in reality I have a fleshed-out version with many more commands which supports syntax from a variety of consoles.
The first step is to create a new ‘App’ within the Language Understanding and Intelligence service (LUIS).
By default, our app has no intents, as we’re building a solution to suit our own needs rather than using a pre-canned or prebuilt one. To get started, we’ll define an intent to set the Intensity parameter of a fixture.
We need to provide some examples of what the user might say to trigger this intent. To make this powerful, we want to give a variety of examples. The most natural is something like “1 @ 50”. This contains numeric symbols because that’s what our console interfaces provide us with, but if we’re using a voice-to-text solution, we will instead get “One at fifty”. To solve this, we need to create some entities so that our AI understands that “one” is also 1. Thankfully, to make developers’ lives easier, Microsoft provides a whole host of prebuilt entities, so we can use their number entity rather than build our own.
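LUIS’s prebuilt number entity does this resolution for us; the sketch below only illustrates the underlying idea of normalising spoken numbers, using a deliberately tiny hand-rolled word table:

```python
# Illustrative only: LUIS's prebuilt number entity handles this properly.
# A tiny table mapping spoken-number tokens to digits.
NUMBER_WORDS = {
    "one": 1, "two": 2, "three": 3, "four": 4, "five": 5,
    "ten": 10, "twenty": 20, "fifty": 50, "hundred": 100,
}

def normalise(utterance):
    """Map spoken numbers to digits and 'at' to '@' in an utterance."""
    out = []
    for token in utterance.lower().split():
        if token in NUMBER_WORDS:
            out.append(str(NUMBER_WORDS[token]))
        elif token == "at":
            out.append("@")
        else:
            out.append(token)
    return " ".join(out)

print(normalise("One at fifty"))  # 1 @ 50
```

In practice a lookup table like this doesn’t scale to compound numbers (“one hundred and five”), which is precisely why a trained entity recogniser beats hand-written rules.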
Matching numbers is helpful, but we also need to provide information about other lighting-specific entity types. Below I define a few SourceTypes, as the offline syntax for my console follows the grammar rules of ‘Source, Command, Destination’.
I also provide synonyms, which means that if a lighting designer for some crazy reason calls all lighting fixtures “devices”, we can still accurately calculate the intent. Synonyms are incredibly powerful when we’re building out language understanding, as you’ll see below in the PaletteType entity. I’ve created synonyms for Intensity which allow designers to say things like “Fixture 1 brightness at 50 percent” rather than needing to know that the console thinks of brightness as intensity. I’ve also made sure to misspell Colour for the Americans…
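Conceptually, these list entities behave like closed lists of canonical values with synonyms. The sketch below emulates that resolution locally; the synonym sets are examples in the spirit of the post, not an export of my actual LUIS app:

```python
# Illustrative emulation of closed-list entities with synonyms,
# in the spirit of the SourceType and PaletteType entities described above.
PALETTE_TYPES = {
    "intensity": {"intensity", "brightness", "dimmer"},
    "colour": {"colour", "color"},
    "position": {"position", "focus"},
}
SOURCE_TYPES = {
    "fixture": {"fixture", "device"},
    "group": {"group"},
}

def resolve(token, entity):
    """Return the canonical value whose synonym set contains the token."""
    token = token.lower()
    for canonical, synonyms in entity.items():
        if token in synonyms:
            return canonical
    return None

print(resolve("brightness", PALETTE_TYPES))  # intensity
print(resolve("device", SOURCE_TYPES))       # fixture
```

In LUIS the same effect comes from adding synonyms to a list entity, so every match resolves to the canonical value the console understands.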
Even with just three entities, our intent is more useful than just setting intensities: we can now handle basic fixture manipulation. For example, “Group 3 @ Position 5” would work correctly with this configuration. For that reason, I renamed the intent to something more sensible (Programmer.SourceAtDestination).
Training and testing the AI
Before we can use our AI service, we must train it. The more data, the better, but we can get some great results with what we already have.
Below you can see I passed in the utterance of “fixture 10 @ colour 5”.
The top-scoring intent (we only have one, so it’s a little bit of a cheat) is Programmer.SourceAtDestination. The SourceType is Fixture, with the number 10.
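On the console side, the recognised intent and entities need turning back into an action. The sketch below parses a response whose shape approximates a LUIS v2 query result (`topScoringIntent`, `entities`, the prebuilt `builtin.number` type); exact field names may differ depending on your app’s configuration, and the mapping logic is my own hypothetical illustration:

```python
import json

# Approximation of a LUIS v2 response for "fixture 10 @ colour 5".
# Field names follow the documented v2 shape but are reproduced from memory.
sample = json.loads("""
{
  "query": "fixture 10 @ colour 5",
  "topScoringIntent": {"intent": "Programmer.SourceAtDestination", "score": 0.98},
  "entities": [
    {"type": "SourceType", "entity": "fixture"},
    {"type": "builtin.number", "entity": "10"},
    {"type": "PaletteType", "entity": "colour"},
    {"type": "builtin.number", "entity": "5"}
  ]
}
""")

def to_action(response):
    """Map a recognised SourceAtDestination intent to a console action dict."""
    if response["topScoringIntent"]["intent"] != "Programmer.SourceAtDestination":
        return None
    ents = response["entities"]
    numbers = [int(e["entity"]) for e in ents if e["type"] == "builtin.number"]
    source = next(e["entity"] for e in ents if e["type"] == "SourceType")
    dest = next(e["entity"] for e in ents if e["type"] == "PaletteType")
    return {"source": (source, numbers[0]), "destination": (dest, numbers[1])}

print(to_action(sample))  # {'source': ('fixture', 10), 'destination': ('colour', 5)}
```

Pairing each number with the entity it follows relies here on entity order; a production mapping would use the character offsets the service returns instead.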
Language and conversation will find their way into many technologies where it may not be obvious right now. I believe it won’t be long until a lighting control manufacturer releases some form of AI bot or language understanding within their consoles, and these will get better with every day they’re used. Maybe I’ll be first, but I can’t believe no one else has seen this technology and thought about building it into a control system, so perhaps we’ll start to see this become the norm in a few years.
Right now it’s early days, but I’d put money on virtual lighting programmers shipping with consoles. What type of personality these virtual programmers have will be down to each manufacturer. I hope they realise that their virtual programmer needs some personality, or it’ll be no more engaging than voice-to-text. I’ve given some thought to my bot’s personality, and it stems from real people. I’m hoping to provide both female and male assistants, and they’ll be named Nick and Rachel.
It’s never too early to start investigating how AI can disrupt your industry. This post focuses on the niche that is the lighting control industry, but this level of disruption will be felt across all industries. Get ahead of the curve and learn how you can shape the future of your industry with Microsoft AI services.