Don’t Count Out Human Writers in the Age of AI


In 2025, human writers will reassert their value. In recent years, more and more content has been produced to satisfy technical and market requirements, such as search engine optimization, that serve neither the creator nor the consumer. Human needs and desires have been sidelined in favor of the attention economy and the drive for clicks.

Hailed as a boon for freedom of expression, the Internet's early promise has failed us. Literature and journalism have been displaced by worthless "content", produced primarily to fill web pages rather than to inform or entertain. Meanwhile, writers' incomes have plummeted. According to the Authors' Licensing and Collecting Society, authors' earnings fell 60.2 percent from 2006 to 2022 when adjusted for inflation. To many, the emergence of widely available generative AI felt like the final nail in the writers' coffin.

But 2025 will be a turning point, not for AI replacing us but for a renewed appreciation of the emotional, spiritual, political, cultural, and ultimately financial value of high-quality human writing. Ironically, the advent of AI-generated search, which is stalling traffic to key websites, will kill the need for meaningless "content" designed to game the system and push readers to demand better.

Generative AI has sparked several lawsuits along with industry and regulatory action. EU and UK data protection regulators, prompted by complaints from the civil society organization NOYB, succeeded in pausing Meta's plans to train its AI on users' posts, photos, and interactions. Traditional publishers like The New York Times have stepped in to protect their own interests, and those of their contributors along with them. But some, notably the Financial Times and The Atlantic, have entered into deals with generative AI companies, perhaps believing that stemming the tide is impossible. In 2025, they will be proven wrong.

As copyright cases rumble through the courts, in 2025 we will also see liability decisions for the inevitable errors produced by generative AI. Defamation lawsuits against AI companies, and against publishers using AI content, will rise as defamatory falsehoods are disseminated online and amplified by unthinking bots and AI search engines. In 2024, the academic publisher Wiley closed 19 journals after facing a flood of bogus scientific papers. To err is human, but industrial-scale fraud is a technological problem. AI has no professional ethics, no soul, and nothing to lose—but those who use it, or ask others to use it for them, do.

In 2023, AI companies began recruiting poets from around the world to try to imbue their dead-eyed products with something approaching creativity. And in 2024, copywriters found their careers, seemingly destroyed by AI, revived as humanizers of artificial marketing content that doesn't pass an algorithmic, let alone a human, sniff test for quality. The value of human creators is beginning to dawn on the corporations that sought to crush them, now that even the machines aren't fooled by AI. But editing robot text is boring—will writers eventually say no? And will readers join them?

The London premiere of The Last Screenwriter, a film written by ChatGPT 4.0, was canceled in June 2024 after receiving over 200 complaints about the movie’s premise.

Publishers that rely on people will attract the best writers and, ultimately, the most profitable audiences. With many news outlets offering little or no compensation to freelance writers, those writers will be loath to sell their souls so cheaply to train the AI that would replace them. Publishers who sell out their authors will see their talent go elsewhere, and their readers with them.

In a world inundated by derivative automated drivel, human writers will give readers room to breathe, like a green park in a polluted city. Rather than being erased by AI, in 2025 we will see recognition of the inherent value of quality human writing—and, perhaps, human authors will be able to start charging what they are worth.
