Recent research by the BBC showed that AI chatbots from four major companies are unable to accurately summarize or answer questions about information from specific news sources, amid a backdrop of increasing legal action against AI companies.
What's happening?
Four AI chatbots — OpenAI's ChatGPT, Microsoft's Copilot, Google's Gemini, and Perplexity AI — were given content from the BBC's website and then tasked with answering questions about the articles.
Across 100 BBC articles, 51% of the AI-generated summaries had significant issues, and 19% introduced new, incorrect information. Many of the inaccuracies involved wrong dates, misidentified people, and even misquotes from the articles.
Why does this development matter?
Deborah Turness, the CEO of BBC News and Current Affairs, expressed concern over the inaccuracies the test showed.
"We live in troubled times, and how long will it be before an AI-distorted headline causes significant real-world harm?" she asked. "The companies developing [generative] AI tools are playing with fire."
With the increased use of AI, especially atop results from Google, it is important for companies to improve the tools so that misinformation is not spread to the general public.
There have also been adverse environmental effects from companies leveraging AI. Data centers consume massive amounts of water and other resources, and when AI services deliver a poor, unreliable experience, that consumption is wasted.
While Big Tech has claimed to be pursuing clean energy initiatives to meet these energy needs with less pollution, much of its current and planned power capacity relies on natural gas plants, which release heat-trapping gases that act like an unnaturally thick blanket around the atmosphere.
What's being done about this problem?
Outside of its test, the BBC has blocked its articles from being used in AI results in Google searches.
The BBC's programme director for generative AI, Pete Archer, said companies "should have control over whether and how their content is used, and AI companies should show how assistants process news along with the scale and scope of errors and inaccuracies they produce."
Other companies have followed suit, including Chegg, the New York Times, Forbes, and News Corp. Chegg, an educational tech company, filed a lawsuit against Google in federal court, primarily over copyright issues and revenue lost when AI-generated results negate, or appear to negate, a searcher's need to open a site and see the information in its proper context.
The New York Times, Forbes, and News Corp, meanwhile, have sued or threatened to sue Perplexity to stop it from using their content.
Everyday consumers should pay attention to how large companies leverage AI while touting clean energy initiatives, watching out for corporate greenwashing and prioritizing companies that put environmental concerns at the forefront.
While a smarter future is possible through AI, users should still consider how a greener future might live alongside it. On a personal level, users can limit their use of AI tools to cases where the benefits seem to outweigh the costs. It's also worth trying the Google Chrome extension Ecosia, which replaces Google as your default search tool with a modified version of Bing that features zero AI results and even plants trees with the ad revenue it generates.
AI has its place in the world, as Dr. Chris Mattman told The Cool Down in a recent interview, but that doesn't mean people shouldn't be mindful of their usage. While one individual may not be enough to reverse the damage, making a collaborative effort is the first step toward a cleaner future for all.
Join our free newsletter for good news and useful tips, and don't miss this cool list of easy ways to help yourself while helping the planet.