AI Will Eat Itself

By Ed Gossage, Co-founder of brand and digital agency 7DOTS:

Are we witnessing the slow death of originality? It’s a theory that’s been rattling around in my head for a while, and the rise of generative AI could see the last rites being read.

During a keynote at a recent conference I attended, acclaimed author Alex Murrell delivered a powerful presentation, “The Age of Average”. Alex took delegates on a “journey of homogeneity”; the premise is that there is little variety or original thinking out there.

It starts with the tale of two Russian artists, Komar and Melamid, who hired a market research firm to survey the public on what they wanted in a work of art. They did this in 11 countries, hypothesising that each resulting work would be a unique expression of that country and its culture. What they found was that every painting looked much the same.

Alex then moved on through coffee shops, car designs, electric toothbrush advertising and video games – within each genre, it’s all the same. Whilst it was comically depressing, Alex finished on a positive note: there is an opportunity for businesses to stand out and shine.

Except, I think there is worse to come thanks to AI. Specifically, I am talking about AI tools like ChatGPT (not closed-loop AI tools).

AI is knowledgeable, not intelligent


AI is brilliant. It’s powerful. It’s genuinely useful. It may well be a game-changer. It’s here to stay. It’s not a fad (NFTs, anyone?).

Ironically, what AI is not (in its current guise) is “intelligent”; it’s simply very “knowledgeable”. Going by dictionary definitions, “knowledge” is the collection of skills and information acquired through experience. In AI’s case, that experience amounts to hoovering up information from the World Wide Web.

“Intelligence” is the ability to apply knowledge. Whilst it’s been announced that ChatGPT can now browse the web, its “knowledge” is largely still based on being “trained” on data up to September 2021.

Without the web, ChatGPT is nothing. Its entire being relies on there being voluminous quantities of articles and data on which it can be trained. I’m sure this next statement will draw criticism, but ChatGPT is really a very clever version of Google: rather than finding relevant pages, it finds relevant information scraped from web pages and pieces it together in natural language.

Stack Overflow, an early victim of ChatGPT

Let’s take a “feeder” website like Stack Overflow. If you are unfamiliar with it, this is one of the largest community forums on the web for (mostly) developers to share programming knowledge.

This is typically the first port of call for tech people seeking code snippets and help with programming. Likewise, the more knowledgeable amongst us post suggestions and answers to the questions raised. No doubt a sizeable chunk of ChatGPT’s training data is scraped from Stack Overflow.

So what happens when people start taking their questions to ChatGPT and stop going to Stack Overflow? Inevitably, Stack Overflow’s usage will fall – maybe so far that the visits no longer justify the cost of running the website.

Maybe Stack Overflow is no more. Which means no more original content from it. Which means ChatGPT can no longer feed itself. If you’ve ever played Gauntlet, it’s like shooting your food when your health is running low.

Extrapolate this to all such sources of knowledge on the web. The more we use ChatGPT, the more we shoot the food that feeds it.

Gen AI: ‘a drunk uncle at a gathering’

And even if this is extreme, there’s another side effect of using ChatGPT. People will start (and probably already are) pasting answers from ChatGPT directly into Stack Overflow. So when ChatGPT re-indexes the web, it will start consuming its own content and treating that as the new “gospel”. The best description I’ve heard of ChatGPT comes from Meredith Whittaker, president of the not-for-profit secure messaging app Signal: “a drunk uncle at a gathering”.
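This self-consumption loop can be sketched with a toy simulation (my own illustration, not anything from any real training pipeline): treat online knowledge as a bag of distinct ideas, and model each generation of AI output as sampling with replacement from the previous generation. Because nothing new ever enters the bag, variety can only shrink.

```python
import random

def regenerate(corpus):
    """Produce a same-sized 'next generation' drawn only from what already exists."""
    return [random.choice(corpus) for _ in corpus]

random.seed(0)
corpus = [f"idea-{i}" for i in range(100)]   # 100 distinct original ideas
diversity = [len(set(corpus))]               # distinct ideas per generation

for _ in range(50):
    corpus = regenerate(corpus)              # the AI re-consumes its own output
    diversity.append(len(set(corpus)))

print(f"distinct ideas: generation 0 = {diversity[0]}, generation 50 = {diversity[-1]}")
```

Run it and the count of distinct ideas only ever falls, generation after generation – a crude stand-in for an AI shooting the food that feeds it.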

Stack Overflow is even offering AI-powered search within its own website. That hurts my brain – it reminds me of Seamus Wray and the infinite self-portrait: he painted a portrait of himself painting his portrait, then another of himself painting a portrait of himself painting a portrait, and so it goes on.

So we may find ourselves in a world with no original content – everything regurgitated ad infinitum, each consecutive pass removing more of the source of truth. Let’s take it a step further.

Media manipulation, election rigging, Twitter (X) bots – these have all been hot topics over the last few years. It’s easy to live in an echo chamber: if you want to believe that man didn’t land on the Moon, you can quickly find plenty of “proof” online. Famously, Google told people for a (short) while that “throwing car batteries into the ocean is good for the environment, as they charge electric eels and power the Gulf Stream”. It didn’t make that up – it used a form of AI to piece together “facts” it found while scraping the web.

There’s a danger that AI not only fuels the lack of originality but also proliferates untruths. If we don’t keep feeding it original content, it will start to rewrite science, history and politics. We won’t just find ourselves in an “age of average” where nothing is original – much of it may also be plain wrong.

The lesson is don’t bite the hand that feeds you.