Copilot generated image by The Tech Panda


Every day, I receive a pile of content that I immediately identify as AI generated. There are telltale signs, such as the use of words like ‘delve’, ‘underscore’, ‘revolutionize’, and ‘testament.’ The sentence constructions are also typical, reminiscent of lifeless school essays.

With more and more “writers” turning to Gen AI for quick articles and stories, the Internet is fast filling up with what the New York Times called “slop,” a term for sub-optimal writing generated by AI. The problem is that AI has made it too easy.


Recently, 404 Media ran an experiment, creating an autonomous news site powered by ChatGPT that lifts original news reports from other sites and republishes them with grammatical correctness. It cost only US$365.63.

AI is making fraud easy and cheap.

BNN Breaking, a news site that had reached an audience in the millions and claimed to have an international team of journalists and a publishing deal with Microsoft, turned out to be running AI generated content riddled with errors.


AI companies have been facing flak for scraping content from across the Internet to train their LLMs. There are reports of multiple AI companies acquiring content from online publishers in dodgy ways. According to Forbes, the AI search engine Perplexity has been directly extracting content from news outlets, after it republished parts of exclusive stories from multiple publications, including Forbes and Bloomberg, without attribution.

As a result, lawsuits are pouring in from many quarters. Renowned authors such as Douglas Preston, George R.R. Martin, Michael Connelly, Jodi Picoult, and John Grisham are suing AI companies.

A YouTube creator is suing OpenAI, alleging that the company trained its generative AI models on millions of transcripts from YouTube videos without notifying or compensating the video owners.


AI detection software exists to catch the ‘slop’, but it isn’t very good yet. Many of these tools fail to catch what they promise to, and they have caused unjust job losses for freelance writers. Universities across the US, such as Vanderbilt, Michigan State, Northwestern, and the University of Texas, are contemplating a ban on such software.

In the midst of this alarming situation, websites and companies that see an opportunity are selling their data to Gen AI companies. OpenAI has been striking deals with news publishers such as Politico, the Atlantic, Time, and the Financial Times. YouTube announced in late June that it will offer licensing deals to top record labels in exchange for music to train on. Melissa Heikkilä of The Algorithm calls this bargain with AI “Faustian.” And rightly so.


These stories are ominous signs of what could be coming for news media: more and more slop. No good can come of selling out human originality for easy news stories that can destroy the very news sites that sell out.

As smaller, more focused LLMs rise to replace the massive ones, the price of the data needed to train them is about to skyrocket. If news sites sell their data for this training, there will come a day when all it takes to produce a news story is feeding facts into an AI along with prompts for the desired style, and voila, a news report is served.

Gone will be the pleasure of delving into a news report by a favourite journalist, because every report will read the same. The human angles of a story, the ones that tug on our heartstrings, will cease to actually be human and will instead be AI generated.
