Last year I wrote a post on whether machines could ever think[1]. Recently, in addition to all the general chatbot competitions, a new type of test has emerged: one for deeper contextual understanding, rather than just the obvious, surface meanings of words. English[2] has a rich variety of meanings for its words, with the primary meaning being the most common and the secondary and tertiary meanings listed further down in the dictionary entry. It has probably been a while since you last sat down and read a dictionary, or even used an online one for anything other than finding a synonym or antonym or checking your spelling[3], but as humans we rely mostly on the vocabulary and sense of context that we have picked up from education and experience.
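To make that idea of ranked senses concrete, here is a minimal sketch in Python of how a dictionary entry might order its meanings from primary downwards. The word, its senses, and their ordering are all hand-picked for illustration, not taken from any real dictionary or lexical API:

```python
# Toy dictionary: each word maps to a list of senses, ordered from
# primary (most common) down to secondary, tertiary, and so on.
# These entries are illustrative only, not real dictionary data.
SENSES = {
    "spell": [
        "a sequence of words believed to have magical power",  # primary
        "to name the letters of a word in order",              # secondary
        "a short, indefinite period of time",                  # tertiary
    ],
}

def primary_sense(word: str) -> str:
    """Return the most common (first-listed) meaning of a word."""
    return SENSES[word][0]

print(primary_sense("spell"))
```

A human reader picks among these senses almost unconsciously from context; the point of the newer tests is that a machine reading "a dry spell" or "spell it out" has to do the same ranking-and-override, not just grab the first entry.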
Alexa, what’s the weather like?
1. Spoiler: it depends on how you define thinking 😉
2. As this is the language I know most intimately.
3. I know it was 12 years for me – I was looking for some really cool spell names for a MUD that I was involved in creating, and I believe I got as far as ‘D’, with each spell bearing some relevance to the meaning of the word…
4. But not always, and the ability does vary from person to person. The case of Derek Bentley, who was given a posthumous pardon after being hanged, is an interesting example. When his accomplice in the burglary was asked to hand over the gun, Bentley was alleged to have said “let him have it” – did he mean “let him have the gun” or, more colloquially, “shoot him”? His friend chose the latter, and despite Derek himself having learning difficulties, both boys were accused of murder. That is assuming these words were even spoken. See the Wikipedia entry on the case.
5. If you want an example of this in real time, the subtitles on live news are a good one to watch – these cannot be prepared in advance, and the speech recognition dumbly transcribes what it thinks was said rather than what was actually said.
6. Which has since been fixed and used successfully.