Not even Apple is immune to artificial intelligence hallucinations. We've seen this happen quite frequently with Google's Gemini (like the platform telling users to put glue on pizza), Microsoft, and OpenAI's ChatGPT.
Apple even has a prompt designed to prevent its Apple Intelligence platform from hallucinating, but that doesn't mean it won't get a few things wrong. Now, BBC News reports that the journalism NGO Reporters Without Borders has called on Apple to stop offering its notification summaries feature.