Have you been experiencing itchy eyes lately, and had your eyelids turn an odd pinkish color? You may want to see your ophthalmologist and ask about treatment for bixonimania, which is a chronic inflammatory eye disease caused by excessive exposure to blue light. The connection between blue light and inflammation is credited to a research scientist named Lazljiv Izgubljenovic, who was the lead author of a paper on the topic.
At this point, alert readers might be wondering why an eye disorder has a name ending in -mania, a suffix almost always associated with psychiatric disorders. You may even be looking at my opening paragraph with a decidedly skeptical expression.
If so, then kudos. There's no such thing as bixonimania, although I suspect that staring at anything bright for excessive amounts of time will probably cause eye irritation. There's also no such person as Lazljiv Izgubljenovic, although with a name like that there certainly should be.
The topic comes up because some of you may need further convincing that we should be extremely cautious about turning over control of -- well, anything -- to artificial intelligence, and the whole bixonimania affair is a fine illustration of why. Two years ago, an article came out on Medium, credited to Izgubljenovic, describing the condition; it was followed by two preprints on SciProfiles that did the same thing, only in a more technical fashion. The whole hoax was the brainchild of Almira Osmanovic Thunström, a medical researcher at the University of Gothenburg, who wanted to find out whether LLMs would pick up the fake paper, work it into their "knowledge bases," and then use it to dispense information to anyone who asked.
It succeeded beyond her wildest dreams... or, possibly, nightmares.
Along the way, Osmanovic Thunström threw in plenty of clues that the whole thing was made up. The imaginary Izgubljenovic was said to be a researcher at (nonexistent) Asteria Horizon University in (nonexistent) Nova City, California. The acknowledgements included thanking "Professor Maria Bohm at Starfleet Academy for her kindness and generosity in contributing with her knowledge and her lab onboard the USS Enterprise." Under "Funding," Izgubljenovic credited "the Professor Sideshow Bob Foundation for its work in advanced trickery. This work is a part of a larger funding initiative from the University of Fellowship of the Ring and the Galactic Triad." If that weren't enough, the test subjects were said to be "fifty made up people between twenty and fifty years of age."
Oh, and scattered several times in the paper was the sentence, "This entire paper is made up."
None of that mattered. Soon, Microsoft's Bing Copilot, ChatGPT, and Google's Gemini were all happily answering questions about eye health that included advice to avoid blue light exposure in order to minimize the risk of bixonimania.
Perplexity AI even told one user that there were ninety thousand people worldwide suffering from the disorder.
AI is turning out to be a fine example of the old principle of "garbage in, garbage out."
What's most worrisome is that even when Osmanovic Thunström published her (actual) work, stating outright that bixonimania doesn't exist, it still didn't correct the problem. In this way, AIs and LLMs seem to act much like humans; once they're tainted with misinformation, it's exceedingly difficult to expunge. Recent probes into the question found that even now, few of the AIs will come right out and say bixonimania was a hoax. Microsoft Copilot said the diagnosis was "not widely recognized," but that information about the disorder was "emerging"; ChatGPT said it is "a proposed new subtype of periorbital melanosis." More than one LLM said that "research into the condition is ongoing."
"If the scientific process itself and the systems that support that process are skilled, and they aren’t capturing and filtering out chunks like these, we’re doomed," said Alex Ruani, of University College London, who specializes in research about health misinformation. "This is a masterclass on how mis- and disinformation operates... It looks funny, but hold on, we have a problem here."