AI Deepfake Medicine: Is There a Real Doctor in the House?
We’ve all heard warnings about fake news, but we are now facing something far more dangerous: fake doctors. It’s not just anonymous avatars giving dodgy advice online, but AI-generated videos using the real faces and real voices of respected medical professionals.
This is not a future threat. It is happening right now.
Read on as Mr Inder Birdi explores the dangers of deepfake medicine.
When Medical Authority is Hijacked
The independent UK fact-checking organisation Full Fact uncovered various deepfake clips circulating on TikTok and other platforms in a 2025 investigation.
These videos show well-known doctors apparently endorsing supplements, “miracle cures,” or made-up medical conditions. None of it is real. None of it was said. But the technology is so convincing that even trained clinicians could be fooled at a glance.
Deepfakes allow anyone to take footage of a doctor speaking at a conference, in Parliament, or even in an educational video, and manipulate it into something entirely fabricated.
One example:
Professor David Taylor-Robinson, a professor of children’s health, appeared without his knowledge in a deepfake video promoting a menopause supplement. The so-called treatment contains everything from turmeric to Himalayan shilajit. AI rewrote his words, altered his mouth movements, and presented him as an enthusiastic endorser of a product he’d never heard of.
His real expertise didn’t matter.
His reputation did.
Fraudsters know exactly what they’re doing: patients trust doctors. Borrowing that trust, even artificially, sells products.
As a practising heart surgeon with online educational content, I can tell you: this could happen to any medical professional.
Why Deepfakes Are Dangerous for Patients
Along with being misleading, deepfake medical advice is harmful.
- Scammers specifically target vulnerable groups, such as women experiencing menopause, as demonstrated by the Taylor-Robinson deepfake.
- Fictional “symptoms” are invented to create demand — one video referenced a condition called “thermometer leg,” which does not exist.
- Supplements are marketed as “science-backed” without any proper evidence.
Patients arrive in clinics quoting advice from videos their doctor never made.
When people cannot distinguish genuine medical guidance from AI-generated manipulation, the entire foundation of evidence-based medicine becomes shaky.
If you believe your doctor said one thing online and tells you something different in person, who do you trust?
The Platforms Aren’t Ready
TikTok initially did not remove the Taylor-Robinson deepfake. The platform merely restricted the video’s visibility and its eligibility for recommendation by the all-powerful algorithm. Only after Full Fact approached TikTok, by which point the video had amassed 365,000 views, was it removed.
Platforms have no reliable system for verifying who appears in health-related videos. There is no fast-track for doctors to request takedowns.
And supplement companies can simply say the videos were made by “unaffiliated creators,” absolving themselves of responsibility while continuing to benefit from the sales.
Offline, impersonating a doctor is a criminal offence.
Online, there is effectively no consequence.
What Must Change NOW
If we want to protect patients and preserve trust in medical expertise, three things must happen:
1. Mandatory Watermarking for AI-Generated Content
Every platform should require visible labelling for synthetic media, backed by systems that actually enforce the policy. While this principle should apply to all content, from brunch snaps to mountaintop selfies, it is vital when health claims are involved.
2. Accountability From Platforms
Platforms can’t use the “we can’t control our affiliates” defence when a company that benefits financially from fraudulent impersonation advertises with them. Protocols should be put in place, and enforced, to monitor and shut down scam accounts.
3. Verify Who’s Giving Medical Advice Online
If you appear in a video giving medical advice, platforms should verify your identity just as they verify advertisers.
What Doctors Can Do
Medical professionals cannot solve this alone, but we can take proactive steps:
- Monitor your digital footprint: Know what content exists about you by searching for yourself online and using tools like Google Alerts.
- Educate patients: Remind them to check your official verified channels.
- Create trusted spaces online — a verified hub where your genuine advice lives. For example, maintain an official website, a verified social media presence, or a professional blog that patients can trust as their go-to source.
If deepfakes are going to steal our voices, those of us in the medical profession must make sure our real ones are easy to recognise.
Where We Go From Here
If AI can convincingly impersonate any doctor to endorse any treatment, how do we preserve public trust?
Do we withdraw from public communication, making ourselves less accessible and leaving a vacuum for misinformation? This might feel safer in the short term, but it ultimately means handing over our platform to those who don’t care about medical accuracy or patient safety.
Or do we step forward, speak clearly, and help shape the safeguards before the technology outpaces our ability to control it?
For the sake of patient safety and the future of evidence-based medicine, I believe we must step forward. Ignoring medical deepfakes won’t protect us. Staying silent just makes room for misinformation. When authentic medical voices disappear, fraudulent ones take their place.
Watch the REAL Mr Birdi in his series of informative videos on heart health, cardiac diseases and keyhole heart surgery.
About Mr Inder Birdi – Keyhole Heart Surgeon & Cardiac Consultant
As Director of the Keyhole Heart Clinic, Mr Birdi uses his expertise in minimally invasive cardiac surgery to provide patients with access to advanced procedures, faster recovery times and great outcomes.
He is one of only a handful of surgeons worldwide able to perform keyhole coronary bypass surgery and keyhole aortic valve repair. In 2022, he made UK cardiac history by carrying out the first quadruple heart bypass using keyhole surgery.
Mr Birdi is a multiple-time TopDoctor Award winner and has received over 100 five-star patient reviews. In 2025, he was also named Entrepreneur of the Year by the Entrepreneurs Circle, recognising his leadership and innovation in heart care.
Click through to our Keyhole Heart Clinic blog for expert insights on medical topics, like this one on deepfake AI medicine.