AI Hallucinations Fuel Surge in Fake Citations, Study Warns

A new Lancet study reveals a worrying trend in scientific publishing: researchers are increasingly inserting made-up references that point to papers that don't exist. These "fabricated" citations, experts say, are contaminating the scholarly record and making it harder to trace the true origins of ideas.

The problem appears to be linked to the rise of large language models, AI tools that can generate fluent text but sometimes hallucinate details, including bogus bibliography entries. Misha Teplitskiy, a sociologist at the University of Michigan who studies how scholars cite one another, warned that the flood of AI-generated content is turning scientific writing into a mix of genuine insight and sloppy filler. "Is AI making science more efficient, or is it just creating slop?" he asked.

The study, the first to quantify the quality impact of AI-driven writing, found a sharp uptick in false citations over the past few years. If unchecked, these phantom references could mislead future research, waste reviewers' time, and erode public trust in science. The authors call for stricter editorial checks, clearer AI-usage guidelines, and heightened awareness among scholars to keep the citation chain honest and reliable.