{"id":18910,"date":"2024-07-18T01:00:15","date_gmt":"2024-07-18T01:00:15","guid":{"rendered":"https:\/\/interhospi.com\/?p=18910"},"modified":"2024-07-16T13:21:55","modified_gmt":"2024-07-16T13:21:55","slug":"deepfakes-of-popular-tv-doctors-are-being-used-to-sell-health-scams-on-social-media","status":"publish","type":"post","link":"https:\/\/interhospi.com\/deepfakes-of-popular-tv-doctors-are-being-used-to-sell-health-scams-on-social-media\/","title":{"rendered":"Deepfakes of popular TV doctors are being used to sell health scams on social media"},"content":{"rendered":"
Deepfakes of popular TV doctors are being used to sell health scams on social media<\/h1>AI<\/a><\/span>, deepfake<\/a><\/span>, fraudulent products<\/a><\/span>, scam<\/a><\/span>, The BMJ<\/a><\/span>, AI<\/a>, E-News<\/a> <\/span><\/span><\/header>\n<\/div><\/section>
\n

UK television doctors are being impersonated in AI-generated videos to promote fraudulent health products on social media, according to an investigation by The BMJ<\/em>.<\/strong><\/p>\n


\"fake

A fictitious image of a doctor created with Midjourney AI image generator<\/p><\/div>\n

Artificial intelligence revolution fuels rise in deepfake scams<\/strong><\/h3>\n

The rapid advancement of artificial intelligence (AI) technology has led to a surge in deepfake videos featuring well-known UK doctors promoting fraudulent health products on social media platforms. An investigation by The BMJ<\/em> [1] has revealed that trusted medical professionals, including Dr Hilary Jones, Dr Michael Mosley, and Dr Rangan Chatterjee, are being impersonated in AI-generated videos to sell scam products claiming to cure high blood pressure, diabetes, and other health conditions.<\/p>\n

These deepfake videos, which use AI to map a digital likeness of a real person onto another body, are becoming increasingly sophisticated and difficult to detect. According to a recent study, up to 50% of people shown deepfakes discussing scientific subjects cannot distinguish them from authentic videos.<\/p>\n

Exploiting trust in medical professionals<\/strong><\/h3>\n

The investigation, led by journalist Chris Stokel-Walker, found that scammers are exploiting the trusted status of popular TV doctors to lend credibility to their fraudulent products. Dr Hilary Jones, a well-known figure in UK medical broadcasting, reported that his likeness has been used to promote various products, including those claiming to cure high blood pressure and diabetes, as well as hemp gummies.<\/p>\n

Dr John Cormack, a retired Essex-based doctor who assisted with the investigation, explained the appeal of this approach for scammers: \u201cIt\u2019s much cheaper to spend your cash on making videos than it is on doing research and coming up with new products and getting them to market in the conventional way.\u201d<\/p>\n

The emotional connection that viewers have with familiar medical professionals makes these deepfake videos particularly effective. People are more likely to trust and believe information presented by someone they recognise from television or social media, increasing the potential for fraud.<\/p>\n

Challenges in combating deepfake scams<\/strong><\/h3>\n

The investigation highlights the difficulties in addressing this growing problem. Dr Jones employs a social media specialist to search for and report deepfake videos misrepresenting his views, but the process is often futile. \u201cEven if they\u2019re taken down, they just pop up the next day under a different name,\u201d he said.<\/p>\n

Henry Ajder, an expert on deepfake technology, attributes the rise in these scams to the increased accessibility of AI tools for voice cloning and avatar generation. He noted that while some tools require identity checks or biometric authorisation, many lack robust safety measures to prevent misuse.<\/p>\n

Social media platforms\u2019 response<\/strong><\/h3>\n

Meta, the parent company of Facebook and Instagram, where many of these deepfake videos have been found, stated that it would investigate the examples highlighted by The BMJ<\/em>. A spokesperson said: \u201cWe don\u2019t permit content that intentionally deceives or seeks to defraud others, and we\u2019re constantly working to improve detection and enforcement.\u201d<\/p>\n

However, the rapid proliferation of these videos suggests that current measures are insufficient to stem the tide of deepfake scams.<\/p>\n

Identifying and reporting deepfakes<\/strong><\/h3>\n

As the technology behind deepfakes continues to improve, spotting them becomes increasingly challenging. Ajder noted that telltale signs, such as poorly rendered earlobes or mismatched lip movements, are becoming less common as the underlying models advance.<\/p>\n

The BMJ<\/em> article offers advice for those who encounter suspected deepfakes:<\/p>\n

    \n
  1. Carefully examine the content to confirm suspicions.<\/li>\n
  2. Attempt to contact the person being impersonated through official channels.<\/li>\n
  3. Leave a comment questioning the video\u2019s authenticity to alert others.<\/li>\n
  4. Use the platform\u2019s built-in reporting tools to flag the content.<\/li>\n
  5. Report the account that shared the post, not just the individual video.<\/li>\n<\/ol>\n

Implications for medical professionals and patients<\/strong><\/h3>\n

The rise of deepfake scams poses significant challenges for both medical professionals and patients. Doctors may find themselves dealing with patients who have been misled by fraudulent videos and are seeking inappropriate treatments. It is crucial for healthcare providers to be aware of this trend and prepared to educate patients about the existence of deepfakes and the importance of following standard medical advice.<\/p>\n

For patients, the proliferation of these scams underscores the need for critical thinking when encountering health information online. Dr Jones advises: \u201cThe onus falls on people using social media not to buy anything from it, because it\u2019s so unreliable that you simply don\u2019t know what you\u2019re buying.\u201d<\/p>\n

As AI technology continues to advance, the medical community and social media platforms must work together to develop more effective strategies for detecting and combating deepfake scams. Until then, vigilance and scepticism remain the best defences against these increasingly sophisticated forms of medical misinformation.<\/p>\n

Reference:<\/strong><\/p>\n

1. Stokel-Walker, C. (2024). Deepfakes and doctors: How people are being fooled by social media scams. BMJ<\/em>, 386, q1319. https:\/\/doi.org\/10.1136\/bmj.q1319<\/a><\/p>\n<\/div><\/section>