OpenAI’s Whisper tool may add fake text to medical transcripts, investigation finds.
In health care settings, it’s important to be precise. That’s why the widespread use of OpenAI’s Whisper transcription tool among medical workers has experts alarmed.
Researchers have found that OpenAI's Whisper audio transcriber is prone to hallucination, and the model underpins transcription tools already deployed in medical settings.
OpenAI has touted its artificial intelligence-powered transcription tool Whisper as having near "human level robustness and accuracy."
The transcription tool is being deployed in hospitals across America, but some commentators feel it may pose dangers for patients.
OpenAI's Whisper, an artificial intelligence (AI) speech recognition and transcription tool launched in 2022, has been found to hallucinate, or make things up, so often that experts are worried about its use in high-stakes settings like health care.
OpenAI's AI-powered transcription tool, Whisper, has come under fire for a significant flaw: its tendency to generate fabricated text that was never spoken.
One transcription product that relies on an AI model deletes the original audio, leaving doctors no way to check the generated transcript against what was actually said.
Doctors have a reputation for having illegible handwriting, with implied risks for medical misunderstandings later. In the AI era, the risk shifts from notes that are hard to read to notes that read cleanly but were never said.
While there has been no shortage of discussion around generative AI's tendency to hallucinate (basically, to make stuff up), the stakes rise sharply when those fabrications end up in a patient's medical record.