The Rise of AI-Generated 'Frankenstein' Citations in Scientific Research
By Peter Rubin
AI Summary
As AI tools like ChatGPT become more prevalent, their tendency to fabricate citations has become a significant problem for scientists and legal professionals. These 'hallucinated' citations often appear credible, with titles and authors that seem legitimate, but they are entirely fictional. A collaboration between Nature and Grounded AI found that these fabricated references are growing more sophisticated, often resembling real studies complete with plausible details such as co-authors and journal information. Kathryn Weber-Boer of Digital Science notes that even DOIs, which are supposed to be unique identifiers, are being fabricated by AI. Joe Shockman of Grounded AI describes these as 'Frankenstein' citations: pieced together from fragments of real publications, they form a convincing yet non-existent reference. This trend threatens the integrity of scientific research, because such false citations can easily slip into published articles and mislead readers and researchers alike.
Key Concepts
AI hallucination refers to the phenomenon where artificial intelligence systems generate false or misleading information that appears plausible but is not based on actual data or reality.
Frankenstein citations are references that are artificially constructed by combining elements from various real sources, resulting in a citation that appears real but does not correspond to any actual publication.
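The fabricated-DOI problem described above can be made concrete: a DOI can be syntactically well-formed without corresponding to any real publication, which is exactly why Frankenstein citations look convincing. The sketch below (not from the article; a minimal illustration, assuming Crossref's commonly published regex for modern DOIs) checks only whether a string has the shape of a DOI. Confirming that a DOI actually exists would additionally require resolving it, e.g. against https://doi.org/.

```python
import re

# Crossref's commonly published pattern for modern DOIs (assumption:
# this covers most DOIs registered since 2000). A match means the
# string is *syntactically* plausible -- fabricated DOIs frequently
# pass this check, which is the article's point.
DOI_PATTERN = re.compile(r"^10\.\d{4,9}/[-._;()/:a-z0-9]+$", re.IGNORECASE)

def looks_like_doi(s: str) -> bool:
    """Return True if the string has the shape of a DOI."""
    return DOI_PATTERN.match(s.strip()) is not None

# Both a real DOI and an invented-but-well-formed one pass:
print(looks_like_doi("10.1038/s41586-020-2649-2"))  # real DOI -> True
print(looks_like_doi("10.9999/fake.frankenstein.2024"))  # fabricated -> True
print(looks_like_doi("not-a-doi"))  # -> False
```

Because format checks cannot distinguish real from fabricated DOIs, verification tools like the one described must resolve each identifier against a registry rather than trust its appearance.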
Category
Technology
Summarized by Mente