Consumer, consumer, consumer! That’s what should be at the heart of any business! Exploring opportunities created by specific needs, and then addressing them in an effective and efficient way. So, let’s talk about consumers and marketing?

Friday, 17 May 2024

Hallucinations

In the AI realm, a "hallucination" is a terrible thing. It is generally described as an output that lacks grounding - in other words, one that is inaccurate. The AI system tries to give you the right answer to your question / prompt, but, for some reason, it lacks the right reasoning or data and gives you its best guess instead.

An example (from an IBM lecture): the AI model has been trained on data up until mid-2022, and you ask about a planet discovered in January 2024. The system might generate a fictional, plausible-sounding and factually inaccurate answer based on its general understanding of astronomy and scientific discoveries.

There are some user-side techniques to mitigate hallucinations. One of them is to know what the AI system was trained on (in my example, you should have a look at the system's knowledge cutoff date). But probably the most powerful ones are prompt engineering techniques and tools, which work on the way you frame your questions / prompts and guide the AI system through them - as in the little sketch below.
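To make that last point concrete, here is a minimal sketch in Python (my own illustration, not something from the IBM lecture) of one common prompt engineering pattern: give the model some reference context and explicitly allow it to say "I don't know", so it is less tempted to fill the gap with a guess. The function name and exact wording are just assumptions for illustration; the resulting prompt can be sent to any chat or completion API.

# A minimal prompt-engineering sketch: ground the question in supplied
# context and explicitly permit an "I don't know" answer, so the model
# is less tempted to fill gaps with a plausible-sounding guess.

def build_grounded_prompt(question: str, context: str) -> str:
    """Wrap a user question in instructions that discourage guessing."""
    return (
        "Answer the question using ONLY the context below.\n"
        "If the context does not contain the answer, reply exactly: "
        "\"I don't know based on the provided context.\"\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\n"
        "Answer:"
    )

if __name__ == "__main__":
    context = "The model's training data ends in mid-2022."
    question = "Tell me about the planet discovered in January 2024."
    # Print the wrapped prompt; in practice you would send it to a model.
    print(build_grounded_prompt(question, context))

Nothing magic here: the technique is simply to state the rules of the game before asking the question, instead of hoping the system plays by them on its own.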
Thursday, 2 May 2024
Is it magic?
“Any sufficiently advanced technology is indistinguishable from magic” - Arthur C Clarke
And I think that is exactly what you feel when you use a generative AI tool for the first time. There is that slight thrill of excitement followed by an afterthought - “how far can this go?”
We don’t have an answer for that yet. We do know, though, that we are in the driving seat. We know that the success of companies and ventures over the next ten years will depend on how they drive the triple transformation now in motion, combining AI, Sustainability and People. So... Let’s go - let’s master magic ;) !