The News
Artificial intelligence models could “collapse” within just a few generations if trained on data they generate themselves. A new study in Nature modeled what happens to several kinds of AI, including large language models like those underpinning ChatGPT, if they run out of human-produced information and begin training on their own output instead.
Researchers found that models degenerate fast, losing their original abilities and triggering a “cascading effect” of compounding errors. As more of the internet becomes AI-generated material that is then fed back into the models, there could be a tipping point where online content becomes “poisoned,” the researchers warned, and it becomes harder or even impossible to access real, original information.
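The compounding mechanism can be illustrated with a toy simulation (a minimal sketch, not the study’s actual code or models): fit a simple statistical model to data, sample fresh “training data” from that model, refit, and repeat. Even when the model class is exactly right, each generation’s estimation error becomes the next generation’s ground truth, so errors accumulate instead of averaging out. The Gaussian setup, sample size, and generation count below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: "human-produced" data, drawn from a standard normal.
data = rng.normal(loc=0.0, scale=1.0, size=200)

for generation in range(1, 21):
    # "Training" here is just fitting a Gaussian to the current data.
    mu, sigma = data.mean(), data.std()
    if generation == 1 or generation % 5 == 0:
        print(f"gen {generation:2d}: mean={mu:+.3f}  std={sigma:.3f}")
    # The next generation trains only on samples the fitted model produced.
    data = rng.normal(loc=mu, scale=sigma, size=200)

# With finite samples, the fitted parameters drift generation over
# generation; the estimated spread tends to shrink toward zero, so rare
# "tail" values vanish first -- a toy analogue of the cascading,
# compounding errors the researchers describe.
```

Running the sketch shows the fitted standard deviation wandering away from the true value of 1 and, over enough rounds, collapsing, even though nothing about the data changed except the recursive training loop itself.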