Ahead of season 2 of Channel 4’s Humans, the channel screened a special showing how a synthetic human could be produced. If you missed the show and are in the UK, you can watch again on 4OD.
Presented by Humans actress Gemma Chan, the show combined realistic prosthetics with AI to create a synth, but it also dug a little deeper into the technology, showing just how pervasive AI already is in the Western world.
There was a great scene with Prof Noel Sharkey and the self-driving car where they attempted a bend but human instinct took over: “It nearly took us off the road!” “Shit, yes!”. This reinforced the point about delegating what could be life-or-death decisions: how can a car make moral decisions, and should it even be allowed to?
There was a very accessible description of the natural language processing used to understand verbal commands in humanoid robots, and of the importance of understanding the sentence as a whole rather than its individual words. As ever, I love the analysis of when things go wrong, and watching the robot fail on a tongue-twister was great. What is also interesting is that all humanoid robots blink. They obviously don’t need to, but we are so freaked out by the dead-eyed stare of something humanoid that we just won’t accept anything else.
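To see why the sentence matters more than the words, here is a toy illustration of my own (not from the show): a bag-of-words comparison treats two commands with opposite meanings as identical, because it throws word order away.

```python
# A toy illustration (mine, not the show's) of why a robot must parse the
# whole sentence rather than individual words: a bag-of-words view cannot
# tell these two commands apart, while word order can.
from collections import Counter

a = "put the cup on the tray"
b = "put the tray on the cup"

print(Counter(a.split()) == Counter(b.split()))  # True: same words, meaning lost
print(a.split() == b.split())                    # False: order carries the meaning
```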
Response time was shown to be crucial as well: any AI needs to analyse inputs and respond at least as quickly as a human, if not faster, for us to accept it.
Another great description covered the deep learning behind IBM’s Watson’s win at Jeopardy!, with a clear graphic of different paths being tried and reinforced when they proved successful.
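As a rough sketch of that paths-and-reinforcement idea (generic Q-learning here, not IBM’s actual DeepQA pipeline), a tiny agent can try routes through a four-state graph and strengthen whichever choices lead to the goal:

```python
# Minimal Q-learning sketch: reinforce path choices that lead to a reward.
# This is a generic illustration, not IBM's method.
import random

transitions = {0: [1, 2], 1: [3, 0], 2: [0, 3]}   # state 3 is the goal
Q = {(s, a): 0.0 for s, acts in transitions.items() for a in acts}
alpha, gamma, epsilon = 0.5, 0.9, 0.2

for episode in range(500):
    state = 0
    while state != 3:
        actions = transitions[state]
        if random.random() < epsilon:                  # occasionally explore
            action = random.choice(actions)
        else:                                          # otherwise exploit the best known path
            action = max(actions, key=lambda a: Q[(state, a)])
        reward = 1.0 if action == 3 else 0.0           # reward only at the goal
        future = 0.0 if action == 3 else max(Q[(action, a)] for a in transitions[action])
        # reinforcement: nudge this step's value toward reward + discounted lookahead
        Q[(state, action)] += alpha * (reward + gamma * future - Q[(state, action)])
        state = action

print(Q)  # the choices on successful paths end up with the highest values
```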
They went on to use a chatbot trained on Gemma’s interviews and press releases to create a program that could respond as she would. This was fairly straightforward, although making it sound like her required over 1,400 recorded sentences to capture every variant pronunciation of the syllable combinations.
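To get a feel for why so many samples are needed, here is a hedged sketch of greedy script selection, a standard trick in concatenative speech synthesis (I have no idea whether the show’s team used it): keep picking the sentence that adds the most unseen sound pairs until coverage is good enough. Real systems work on phonemes; letters stand in for them here.

```python
# Hypothetical sketch: greedy sentence selection for a TTS recording script.
# The aim is to cover as many distinct "diphones" (adjacent sound pairs) as
# possible with as few recorded sentences as possible -- one reason the show
# needed 1,400+ samples of Gemma's voice. Letters approximate sounds here.

def diphones(sentence: str) -> set[str]:
    words = "".join(c for c in sentence.lower() if c.isalpha() or c == " ").split()
    pairs: set[str] = set()
    for w in words:
        padded = f"#{w}#"                              # '#' marks word boundaries
        pairs.update(padded[i:i + 2] for i in range(len(padded) - 1))
    return pairs

def select_script(candidates: list[str], target: float = 0.95) -> list[str]:
    universe = set().union(*(diphones(s) for s in candidates))
    covered: set[str] = set()
    chosen: list[str] = []
    while len(covered) / len(universe) < target:
        best = max(candidates, key=lambda s: len(diphones(s) - covered))
        gained = diphones(best) - covered
        if not gained:                                 # nothing new left to add
            break
        covered |= gained
        chosen.append(best)
    return chosen

sentences = [
    "the quick brown fox jumps over the lazy dog",
    "pack my box with five dozen liquor jugs",
    "how vexingly quick daft zebras jump",
]
print(select_script(sentences))
```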
While the chatbot was training, they took a diversion to discuss Google DeepMind and its systems that taught themselves to play both Breakout and Go. The point was that self-learning systems can spot solutions unlike anything a human would come up with.
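As a toy example of that (my own, not DeepMind’s method), consider a two-option game where the better choice usually pays nothing: a purely greedy player settles on the safe option, while an agent that keeps exploring eventually discovers the higher long-run payoff.

```python
# Toy two-armed bandit (my illustration, not DeepMind's method): exploration
# uncovers a strategy that looks bad at first glance but wins on average.
import random

def pull(arm: int) -> float:
    if arm == 0:
        return 1.0                                     # safe, steady payoff
    return 10.0 if random.random() < 0.2 else 0.0      # usually 0, mean 2.0

est, counts = [0.0, 0.0], [0, 0]
for t in range(10_000):
    # 10% of the time explore at random, otherwise play the best estimate
    arm = random.randrange(2) if random.random() < 0.1 else est.index(max(est))
    reward = pull(arm)
    counts[arm] += 1
    est[arm] += (reward - est[arm]) / counts[arm]      # running average payoff

print(est)  # exploration reveals arm 1's higher expected payoff (~2.0 vs 1.0)
```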
After a very well-placed Echo ad from Amazon in the commercial break, the program returned to the prosthetic robot face and the difficulty of synchronising the convincing voice simulation with the correct facial expressions.
Again, the consequences of self-learning AI were explored, including the impact on the job market. Even online articles on sports and business are being created automatically, and we haven’t noticed this creep. At what point will the AI decide what we need to see and hear?
Prof Nick Bostrom of Oxford thinks that by 2050 we will have general autonomous artificial intelligence, and he highlighted the problem of an AI’s goals not being aligned with ours: such an AI would be indifferent to us, and in pursuing its main goal it could simply sweep us aside.
It was a bit creepy seeing synth Gemma revealed from a body bag, and the engineers were a little cruel to her in that scene. While the synth responded in the right voice and with appropriate answers, there were a few facial tics at this point, as it was still learning how to produce facial expressions.
There was a time jump over the next advert break, and the final version of the synth was revealed in a face-to-face meeting between Gemma and her synth. A mock Turing test was set up, with a few journalists interviewing synth-Gemma via Skype. Did the journalists believe they were talking to Gemma rather than a robot? I’ll let you watch the program to find out.
All in all, a really interesting program and well worth watching, as is the main series of Humans, which restarts at 9pm on Sunday 30th October on Channel 4.