CNET reported on Thursday, January 26, 2023, that an AI tool it used to write news articles had been publishing wildly inaccurate information.
According to CNET, the tool was an internally built engine, not the newly released ChatGPT, which lets users interact with AI through natural-language queries and responses.
In its statement, CNET defended the experiment:
"The case for AI-drafted stories and next-generation storytelling tools is compelling, especially as the tech evolves with new tools like ChatGPT. These tools can help media companies like ours create useful stories that offer readers the expert advice they need, deliver more personalized content and give writers and editors more time to test, evaluate, research and report in their areas of expertise."
CNET's key takeaways from the experiment gone wrong are:
- AI engines, like humans, make mistakes.
- Bylines and disclosures should be as visible as possible.
- New citations will help us – and the industry.
Publishing AI-generated content can run afoul of Google's webmaster guidelines on automatically generated content, so it will be interesting to see whether this affects CNET's search engine rankings.
Technomancer is a science and tech enthusiast who enjoys writing about software, AI, and other tech topics.