- cross-posted to:
- technology@lemmy.world
It is now clear that generative artificial intelligence (AI) such as large language models (LLMs) is here to stay and will substantially change the ecosystem of online text and images. Here we consider what may happen to GPT-{n} once LLMs contribute much of the text found online. We find that indiscriminate use of model-generated content in training causes irreversible defects in the resulting models, in which tails of the original content distribution disappear. We refer to this effect as ‘model collapse’ and show that it can occur in LLMs as well as in variational autoencoders (VAEs) and Gaussian mixture models (GMMs). We build theoretical intuition behind the phenomenon and portray its ubiquity among all learned generative models. We demonstrate that it must be taken seriously if we are to sustain the benefits of training from large-scale data scraped from the web. Indeed, data collected about genuine human interactions with systems will be increasingly valuable in the presence of LLM-generated content in data crawled from the Internet.
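To make the mechanism in the abstract concrete, here is a minimal sketch (my own illustration, not the paper's code) of recursive training on model output in the simplest possible setting: a single Gaussian fitted by maximum likelihood to samples drawn from the previous generation's model. The sample size and number of generations are arbitrary choices; the point is that the estimated standard deviation drifts toward zero over generations, i.e. the tails of the original distribution disappear.

```python
# Hypothetical toy simulation of 'model collapse' on a single Gaussian.
# Each generation fits a Gaussian (sample mean and std) to data produced by the
# previous generation's model, then generates the next generation's training data.
import numpy as np

rng = np.random.default_rng(0)

n_samples = 100      # samples per generation (arbitrary; smaller n collapses faster)
n_generations = 500  # number of fit-and-resample rounds (arbitrary)

# Generation 0 trains on "real" human data: a standard normal.
data = rng.normal(0.0, 1.0, n_samples)

for gen in range(n_generations + 1):
    # Fit this generation's model by maximum likelihood.
    mu_hat, sigma_hat = data.mean(), data.std()
    if gen % 50 == 0:
        print(f"generation {gen:3d}: mean={mu_hat:+.3f}, std={sigma_hat:.3f}")
    # The next generation sees only data generated by the current model.
    data = rng.normal(mu_hat, sigma_hat, n_samples)
```

Running this, the reported std shrinks generation after generation while the mean wanders away from zero: the fitted model keeps underestimating the spread of its own training data, and with no fresh human data the error compounds instead of averaging out.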
Yes, this isn’t news. It’s called AI cannibalism, and it’s the high-tech version of making a tape of a tape of a tape. It’s part of the great enshittification.
Consider this: a lot of general knowledge is trained into AI using Wikipedia. Since AI bots have a friendly chat interface and natural language processing that makes a decent attempt at understanding context and language intent, asking ChatGPT to look something up results in an interestingly summarized, cross-referenced answer that might draw from 5 or 6 wiki articles that would otherwise have required a couple of hours of reading and diving to derive organically (with your meat computer). Since just asking ChatGPT is way easier than spending 2 hours clicking through Wikipedia, people start just using the bot instead of Wikipedia.

Fast forward 5-10 years. People don’t even go to the wiki anymore, because why would you? People stop contributing to the wiki because no one goes there anyway; it’s as useless as a serial port gender changer. So now 90% of the web is just the summarized output of AI bots. Wikipedia goes offline because no one donates, no one visits.

Now the latest-gen AI is trained on Russian troll bots, Instagram comment sections, and Reddit comments, which have all become 90% AI bot spam. The thing that made AI good was the quality of the training data, but now all the new data is absolute trash, just SEO ad garbage. The generation of AI models trained on that can’t help but produce total static, because who the fuck is taking the effort to put real quality on the net anymore?
I’m really sad about this future… (this present).