For many apps, the end of the year is a time for sharing personalized user summaries — custom reports recounting how people listened, watched, shopped, exercised, and generally logged activity over the past year.
However, as companies increasingly use artificial intelligence to generate these reports — often criticized as devoid of personality — many customers have come to see them as pointless.
And if Spotify's annual Wrapped — generally highly anticipated — was widely regarded as an AI-powered disappointment, book-tracking app Fable took things to the next level.
Fable released a series of annual reading summaries intended to celebrate each user's unique reading style and book selections.
Unfortunately, the AI model it used to generate these summaries fell vastly short of the mark, Mint reported. The summaries contained what many users deemed to be "offensive language related to race, gender, sexuality, and disability."
For example, one summary said: "Your journey dives deep into the heart of Black narratives and transformative tales, leaving mainstream stories gasping for air. Don't forget to surface for the occasional white author, okay?"
Other commenters shared similarly off-putting summaries, such as a description of disability-focused narratives as ones that would "earn an eye-roll from a sloth" or a reference to rom-com reads as "cringe-worthy."
One writer, Danny B. Groves, was told by the app that his bookshelf was "a vibrant kaleidoscope of voices and experiences, making me wonder if you're ever in the mood for a straight, cis white man's perspective!"
Fable's head of product, Chris Gallello, posted an apology video on the company's Instagram page and later said the app would be removing three of its key AI-powered features.
Groves criticized the app for not vetting the AI output before it harmed its community. "There shouldn't be an AI algorithm that's immediately pushing out content or generating an output that can create harm," he told Mint.
Beyond pushing out confusing, inaccurate, or outright offensive content, heavy reliance on AI also frustrates many people because of its environmental impact. The megawarehouses that house AI computer banks consume a staggering amount of energy and water to keep the hardware running and climate-controlled; similar crypto-mining warehouses have likewise been criticized for producing substantial noise and air pollution.