Wikipedia has an ‘AI’ translation problem
Large language models, marketed as “AI,” sometimes just make things up. Anyone who uses one for even a little while runs into the problem, and it isn’t just basic facts they get wrong. Wikipedia is wrestling with one such case as its editors find “hallucinations” in LLM-translated articles.
The story starts out positive and altruistic: a third-party non-profit called the Open Knowledge Association (OKA) is paying a stipend to people who...