KAMPALA — Ugandan businessman and property mogul Dr. Sudhir Ruparelia has issued a strong public statement distancing himself from a widely circulated deepfake video that falsely depicts him endorsing a fraudulent online financial platform.
The fabricated video, which appears to have been generated using advanced artificial intelligence (AI) techniques, falsely shows Dr. Ruparelia claiming that the Government of Uganda is withholding information about a purportedly profitable investment scheme.
The AI-generated content alleges that an initial investment of UGX 915,000 could yield monthly returns of up to UGX 10 million, and insinuates government censorship and suppression of related information.
“This platform is completely legal, licensed, and working in the country,” the manipulated video falsely states in a voice eerily similar to Dr. Ruparelia’s. It further suggests that government officials deleted promotional materials and ignored appeals to make the platform public.
An investigation by this publication has confirmed that the video is a digital forgery. Experts believe the creators took authentic footage from a legitimate interview Dr. Ruparelia granted to the Uganda Broadcasting Corporation (UBC) and subsequently altered it to create the misleading message.
Both the visuals and audio were manipulated using deepfake and voice cloning technologies to imitate the business mogul’s likeness with startling accuracy.
In a public rebuttal, Dr. Ruparelia condemned the video and warned the public against falling victim to such sophisticated scams.
“The video is fake. It’s a malicious attempt to misuse my name, image, and voice to deceive the public,” he said. “I have not endorsed any online investment platform. These are fraudulent schemes designed to prey on people’s trust and desperation.”
Dr. Ruparelia, whose business interests span banking, real estate, education, and hospitality, emphasised the importance of vigilance in an era where technological misuse is on the rise.
He urged the public to always verify financial opportunities through official channels and to be wary of online platforms that promise unrealistic returns.
“If it sounds too good to be true, it probably is,” he cautioned.
Cybersecurity experts are raising the alarm over the growing use of AI tools such as deepfakes and voice cloning in fraudulent activities.
Dr. Angela Ndaka, a Kampala-based technology analyst, warned that these technologies are becoming increasingly accessible to cybercriminals, posing a serious threat to both public figures and ordinary citizens.
“Fraudsters are now using AI to generate fake endorsements that look and sound real. This is not just about celebrities or business leaders—it’s about anyone whose identity can be exploited to build false credibility,” she said.
As authorities investigate the origins of the manipulated video, Dr. Ruparelia has pledged to work closely with relevant agencies to identify the perpetrators and protect the public from similar schemes in the future.
