AI Gone MAD



This was inspired by an article about the Rice University study comparing self-consuming AI to mad cow disease, a topic practically begging for a cartoon. To quote one researcher:

“The problems arise when this synthetic data training is, inevitably, repeated, forming a kind of a feedback loop — what we call an autophagous or ‘self-consuming’ loop,” said Richard Baraniuk, Rice’s C. Sidney Burrus Professor of Electrical and Computer Engineering. “Our group has worked extensively on such feedback loops, and the bad news is that even after a few generations of such training, the new models can become irreparably corrupted. This has been termed ‘model collapse’ by some — most recently by colleagues in the field in the context of large language models (LLMs). We, however, find the term ‘Model Autophagy Disorder’ (MAD) more apt, by analogy to mad cow disease.”

He added that “one doomsday scenario is that if left uncontrolled for many generations, MAD could poison the data quality and diversity of the entire internet.” 
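For readers who want to see the mechanism Baraniuk describes in miniature, here is a toy sketch. It is my own illustration, not the Rice team's code or experiment: a bare-bones "generative model" (nothing more than a fitted mean and spread) is retrained each generation on only its own synthetic samples, and the diversity of the data tends to drift and decay.

```python
import random
import statistics

def fit(samples):
    """'Train' a toy model: estimate the mean and spread of the data."""
    return statistics.fmean(samples), statistics.stdev(samples)

def generate(mean, stdev, n):
    """'Generate' synthetic data by sampling from the fitted model."""
    return [random.gauss(mean, stdev) for _ in range(n)]

# Generation 0: "real" data with a known amount of diversity (spread = 1.0).
n = 25
data = [random.gauss(0.0, 1.0) for _ in range(n)]

for generation in range(51):
    mean, stdev = fit(data)
    if generation % 10 == 0:
        print(f"generation {generation:2d}: spread = {stdev:.3f}")
    # The autophagous step: the next "model" is trained only on the
    # previous model's synthetic output. Estimation error compounds
    # generation after generation, and the measured spread (a stand-in
    # for data diversity) typically wanders away from 1.0 and shrinks.
    data = generate(mean, stdev, n)
```

Run it a few times: the exact numbers vary, but the spread rarely stays where it started, which is the "self-consuming loop" problem in its simplest form.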

Get my weekly newsletter and support these cartoons by joining the Sorensen Subscription Service! Also on Patreon.



Jen Sorensen is a cartoonist for Daily Kos, The Nation, In These Times, Politico and other publications throughout the US. She received the 2023 Berryman Award for Editorial Cartooning from the National Press Foundation, and is a recipient of the 2014 Herblock Prize and a 2013 Robert F. Kennedy Journalism Award. She is also a Pulitzer Finalist.

 
