You want to book a holiday. You’ve heard Malta is nice, but you’ve never been, so you decide to do some research.
Your laptop starts up. Before you’ve even finished typing the first few letters, the AI assistant appears with suggestions. It looks like a paperclip.
Crap GPT has decided to revive Clippy as a mascot, assuming people will find it familiar and comforting.
“Hey! It looks like you want to find out about Maltesers! I can place an order for some if you like?”
The prompt hijacks the whole window. You have to respond.
“No.”
Three dots flash.
“Ordering no more than 30 Maltesers.”
You sigh. You’ll need to cancel that later, but at least the prompt disappears.
You continue your search for Malta.
The AI response fills the results window.
“Malta is widely considered to be a real place. However, evidence has come to light that it does not exist and never has.
Source: Reddit.”
You scroll past the AI nonsense and look further down. The results are no better. There are millions of articles.
“Malta is terrible.”
“Malta is paradise.”
“Malta is home to a secret race of lizard people.”
You accidentally click the last one.
The page loads an incoherent article with no facts and frequent self-contradictions. Ads scream for your attention. Videos autoplay. Pop-ups crawl across the screen. You hit back.
You keep scrolling.
Since the rise of AI-generated content, sites like Wikipedia have been pushed further and further down the rankings, buried under endless variations of confidently wrong text.
“Malta is where Maltesers are made.”
“Malta is where the Upside Down is.”
“I dreamed about Malta and now it exists.”
Eventually, you find the Wikipedia article.
Clippy returns.
“You seem to be interested in going to Malta. Would you like me to find you a flight?”
“Sure,” you type. Why not risk it?
“The optimum route is by car. Renting a car for three months.”
You close your laptop.
You’ll need to cancel that too.