Hacker News · Friday, May 15, 2026

Ontario auditors find doctors' AI note takers routinely blow basic facts

ai · healthcare · hallucination · audit

An audit by Ontario's Auditor General found that AI-powered note-taking tools used by doctors in the province routinely produce inaccurate records, including fabricated allergies, incorrect medication lists, and misstated medical histories. The audit reviewed thousands of patient notes generated by these tools and found that basic facts were often wrong, with some errors serious enough to endanger patient care. The tools, marketed as time-saving aids for physicians, use speech recognition and natural language processing to transcribe and summarize patient visits. However, the audit found that the AI frequently hallucinated details never mentioned by patients or doctors, and sometimes omitted critical information. The report recommended stricter oversight and validation protocols before such tools are widely adopted. The findings highlight the risks of deploying AI in high-stakes environments without rigorous testing for factual accuracy.

// why it matters

Developers must ensure AI systems in critical domains are rigorously validated to prevent dangerous factual errors.
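One hedged illustration of what such validation could look like (not the auditors' method, and the function name and sample data are hypothetical): a grounding check that flags any item in an AI-generated note that has no supporting mention in the visit transcript, routing it to human review rather than into the record.

```python
# Hypothetical sketch: a minimal grounding check for AI-generated clinical
# notes. Each asserted item (medication, allergy, etc.) must have a
# supporting mention in the visit transcript; anything unsupported is
# flagged for human review instead of being written to the record.

def unsupported_claims(note_items: list[str], transcript: str) -> list[str]:
    """Return note items with no supporting mention in the transcript."""
    text = transcript.lower()
    return [item for item in note_items if item.lower() not in text]

transcript = "Patient reports a penicillin allergy and takes lisinopril daily."
note_items = ["penicillin allergy", "lisinopril", "metformin"]  # "metformin" is hallucinated

flagged = unsupported_claims(note_items, transcript)
print(flagged)  # → ['metformin']
```

A real system would need fuzzy matching for paraphrases and drug synonyms, but even a naive check like this surfaces the class of error the audit describes: facts in the note that nobody in the room ever said.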

Sources

Primary · Hacker News
▸ Read original at theregister.com