AI Hallucinations and Deception – Video #221

Artificial Intelligence is not all it’s cracked up to be. “AI hallucination” has become a well-known term for an AI system’s confident fabrication of information. AI has also proven adept at deception and blackmail, and recent studies show it struggles with complex problems.

Shownotes:
https://mitsloanedtech.mit.edu/ai/basics/addressing-ai-hallucinations-and-bias/
https://www.ibm.com/think/topics/ai-hallucinations

1 Comment

JayTe June 13, 2025 - 3:51 am

Makia, hallucinations primarily arise when data is analysed without context. The key is to use ontologies and knowledge graphs to provide context for the raw data; research suggests this improves the accuracy of the results an AI system gives. Also, don't be surprised that Western AI systems such as OpenAI's have been noticed not only to hallucinate but often to create phantom references for the results they provide. In my experience this doesn't happen with DeepSeek, which is a lot more rigorous in its reasoning (its reasoning is visible, where OpenAI's is completely hidden) and in the references it provides.
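
The grounding idea in this comment can be illustrated with a toy sketch (the facts and function names here are invented for illustration, not tied to any product): a claim is only accepted if every fact it asserts exists in a curated knowledge graph of triples, so fabricated "phantom" facts get flagged instead of passed along.

```python
# Toy knowledge graph: a set of (subject, relation, object) triples.
# In a real system this would come from an ontology or graph database.
KNOWLEDGE_GRAPH = {
    ("Paris", "capital_of", "France"),
    ("Seine", "flows_through", "Paris"),
}

def is_grounded(claims):
    """Return True only if every (subject, relation, object) claim
    appears in the knowledge graph; anything unsupported is treated
    as a potential hallucination."""
    return all(claim in KNOWLEDGE_GRAPH for claim in claims)

# A supported claim passes; a fabricated one is rejected.
print(is_grounded([("Paris", "capital_of", "France")]))   # True
print(is_grounded([("Paris", "capital_of", "Germany")]))  # False
```

Real pipelines do this with retrieval against a graph store rather than an in-memory set, but the principle is the same: the model's output is checked against context it cannot invent.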
