Cambridge Dictionary (CD) has named ‘hallucinate’ as its Word of the Year for 2023.
No, not to celebrate a surge in psychoactive substance use that causes people to see, hear, feel, or smell things that don’t actually exist.
Rather, because 2023 will go down in history as the breakthrough year for AI, it’s perhaps not surprising that the word has taken on a new definition.
CD has newly defined ‘hallucinate’ as:
“When an artificial intelligence (= a computer system that has some of the qualities that the human brain has, such as the ability to produce language in a way that seems human) hallucinates, it produces false information.”
A couple of examples are listed:
- LLMs are notorious for hallucinating – generating completely false answers, often supported by fictitious citations.
- The latest version of the chatbot is greatly improved but it will still hallucinate facts.
I must confess, I had not heard the word used in this context until I stumbled across the article, so I was a little surprised it won.
Nevertheless, it raises an important point: overreliance on AI can lead to catastrophic consequences, so we should treat it as something that assists us rather than replaces us.
At least for the foreseeable future. 😱