Artificial Intelligence is not all it’s cracked up to be. AI hallucination has become a well-known term for its outright fabrication of data. AI has also proven adept at deception and blackmail, while recent studies show its inability to handle complex problems.
Shownotes:
*****
Makia Freeman is the editor of alternative media / independent news site The Freedom Articles. He is the author of the book Break Your Chains and the book series Controversial Truths Revealed (Cancer: The Lies, the Truth and the Solutions and 40 Incredible Real Life Alien Abductee and Contactee Experiences), and is lead researcher at ToolsForFreedom.com. Makia is on Rumble, BitChute and Odysee.
1 Comment
Makia, the hallucination primarily comes about when data is analysed without context. So the key is to use ontologies and knowledge graphs to provide context for the raw data; research shows this improves the accuracy of the results given by the AI system. As well, don't be surprised that western AI systems such as OpenAI's have been noticed not only to have problems with hallucinations but also to frequently create phantom references for the results they provide. In my experience this doesn't happen with DeepSeek, which is a lot more rigorous in its reasoning (you can watch the reasoning being done, where it is completely hidden in OpenAI) and in the references it provides.
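The grounding idea in the comment above can be sketched in a few lines: pull facts about the relevant entities out of a knowledge graph and prepend them to the prompt, so the model answers against known data instead of free-associating. This is only an illustrative sketch, assuming a toy triple store; the graph contents, `facts_for`, and `build_grounded_prompt` are hypothetical names, not any real AI system's API.

```python
# Toy knowledge graph: subject -> list of (predicate, object) triples.
# The entities and triples here are illustrative assumptions only.
KNOWLEDGE_GRAPH = {
    "DeepSeek": [("is_a", "large language model"), ("exposes", "its reasoning trace")],
    "OpenAI": [("develops", "AI systems"), ("hides", "its reasoning trace")],
}

def facts_for(entity):
    """Render the triples stored for an entity as readable sentences."""
    return [f"{entity} {pred.replace('_', ' ')} {obj}."
            for pred, obj in KNOWLEDGE_GRAPH.get(entity, [])]

def build_grounded_prompt(question, entities):
    """Prepend known facts as context so the model is constrained to
    answer from them, rather than fabricating (hallucinating) data."""
    context = [fact for e in entities for fact in facts_for(e)]
    return ("Answer using ONLY these facts:\n"
            + "\n".join(context)
            + f"\n\nQuestion: {question}")

prompt = build_grounded_prompt("Which system exposes its reasoning?",
                               ["DeepSeek", "OpenAI"])
print(prompt)
```

The same pattern scales up by swapping the dictionary for a real graph store queried with SPARQL or Cypher; the key point is that the context travels with the question.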