Imagine scrolling through your Facebook feed, and suddenly, a familiar face pops up – Indonesia’s Minister of Religious Affairs, Nasaruddin Umar. He’s wearing his traditional peci and a smart black suit, the red and white flag proudly draped behind him. The video is short, just eight seconds. But what he says, oh, what he says! He’s announcing an incredible “Al Bayti grant program,” a special fund from Saudi Arabia and the United Arab Emirates. “Grants are officially open,” he declares, his voice clear and authoritative, “ranging from 250 million to 1 billion. Register now!” And just like that, a link appears, directing you to a Messenger account. It sounds too good to be true, doesn’t it? Well, unfortunately, in this case, that’s exactly what it was – a highly convincing, but ultimately deceptive, piece of digital wizardry. This seemingly generous offer, shared by various accounts and even featuring different visuals, was nothing more than a sophisticated hoax designed to trick people.
The team at Tempo, a reputable Indonesian news organization, smelled a rat almost immediately. They decided to dig deeper, to verify whether Minister Nasaruddin Umar had truly announced such a mind-blowing grant program. Their investigation was no quick glance: it involved reaching out to official sources, consulting ministerial authorities, and employing artificial intelligence detection tools. What they uncovered was a stark reminder of how easily technology can be manipulated to spread misinformation. One word echoed through their findings: AI. The entire video, from Nasaruddin’s image to his voice, had been meticulously crafted with artificial intelligence. This wasn’t the real minister speaking; it was a digital ghost, a cleverly programmed illusion designed to look and sound authentic.
“It’s a hoax, clearly created using AI,” confirmed Thobib Al-Asyhar, the Head of the Public Relations and Communications Bureau at the Ministry of Religious Affairs. He expressed genuine concern, emphasizing that such a “Saudi Arabian state grant program, Baitul Mal,” simply does not exist. This wasn’t just a simple mistake or miscommunication; it was a deliberate fabrication, and Thobib urged everyone to be incredibly cautious of similar, too-good-to-be-true offers. Think of it like this: someone painstakingly crafted a digital puppet of the minister, made it move its lips, and even generated a voice that sounded just like his, all to lure unsuspecting individuals. It’s a chilling thought, highlighting the dark side of advanced technology when wielded by those with malicious intent.
Tempo’s forensic analysis of the videos was thorough. They didn’t just take Thobib’s word for it; they ran the clips through AI detection tools – Hive Moderation and the Hiya Deepfake Voice Detector – putting them under a digital microscope. The first video, the one with Nasaruddin holding court, came back from Hive Moderation with a staggering 99.99 percent probability of being AI-generated, with the tool attributing it to Google’s Veo 3 video model. Imagine that – almost a perfect score for being fake! And the voice? Hiya’s detector rated the audio’s authenticity at a measly eight out of a hundred. Further digging traced the visuals to an iNews YouTube video from November 23, 2024, showing the real Nasaruddin Umar preparing for a trip to Saudi Arabia. The deepfakers had simply hijacked that genuine footage to build their digital lie.
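To see how such numbers get turned into a verdict, here is a purely illustrative sketch. The function below is hypothetical – it is not Hive Moderation’s or Hiya’s actual API – but it mirrors the two kinds of scores the article reports: a visual deepfake probability and a voice-authenticity score out of one hundred.

```python
# Illustrative sketch only: the thresholds and the classify_clip function are
# assumptions for this example, not a real Hive Moderation or Hiya API call.

def classify_clip(deepfake_probability: float, voice_authenticity: int) -> str:
    """Combine a visual deepfake probability (0.0-1.0) with a voice
    authenticity score (0-100) into a rough verdict."""
    visual_fake = deepfake_probability >= 0.90   # e.g. 0.9999 for the first video
    audio_fake = voice_authenticity <= 20        # e.g. 8/100 for the first video
    if visual_fake and audio_fake:
        return "likely AI-generated"
    if visual_fake or audio_fake:
        return "suspicious - needs manual review"
    return "no automated red flags"

# Scores reported for the first video in the article:
print(classify_clip(0.9999, 8))   # likely AI-generated
```

A real verification workflow would, of course, pair any automated score with the kind of source-tracing Tempo performed; the numbers alone are a starting point, not proof.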
The deception didn’t stop there. A second video pushing the same claims was also put under scrutiny, and it, too, proved to be AI-generated. The image at its core was traced back to a Kompas photographer, Totok Wijayanto, and an article dated January 28, 2013 – a story entirely unrelated to grants, discussing allegations against a Deputy Minister of Religious Affairs concerning a Quran tender. Again, Hive Moderation confidently flagged it as an AI-produced video, with a 96.2 percent deepfake score, and the Hiya Deepfake Voice Detector gave its audio an abysmal authenticity score of just one out of a hundred. It’s as if the scammers were churning out these pieces of misinformation one after another, each more elaborate than the last.
Finally, a third video carried an even more explicit clue: a watermark from Google’s Gemini AI prominently displayed in the bottom right corner – the scammers’ digital calling card, openly betraying that AI was the architect of their fabrication. A Google reverse image search confirmed the visual’s true origin: the handover ceremony for the Minister of Religious Affairs for the 2024-2029 period on October 21, 2024, where the real Nasaruddin wore a black suit and a light blue tie. It’s a stark reminder that even the most elaborate lies can be unraveled by simple details. In conclusion, Tempo’s thorough investigation definitively debunked the claim: the “Al Bayti grant program” announced by a deepfake Nasaruddin Umar is entirely false. It’s a powerful lesson in critically evaluating everything we see and hear online, especially when it sounds too good to be true, and in recognizing the growing sophistication of AI-generated content.
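The reverse image search that traced these visuals back to their originals typically relies on perceptual hashing: an image is reduced to a compact fingerprint that survives re-encoding and resizing, so near-duplicates can be matched at scale. The following is a minimal, self-contained sketch of one such technique (average hashing) using made-up pixel grids – the data and function names are assumptions for illustration, not the internals of Google’s search.

```python
# Conceptual sketch of perceptual (average) hashing, the kind of technique
# behind reverse image search. The pixel grids below are invented stand-ins.

def average_hash(pixels):
    """Hash a grayscale image (list of rows of 0-255 ints): each bit records
    whether a pixel is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(p > mean for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits between two hashes; small distance = near-duplicate."""
    return sum(a != b for a, b in zip(h1, h2))

original = [[10, 200], [30, 220]]        # stand-in for an original news frame
recompressed = [[12, 198], [28, 225]]    # the same frame after re-encoding
unrelated = [[200, 10], [220, 30]]       # a different image entirely

# Near-duplicates hash identically; an unrelated image lands far away.
print(hamming_distance(average_hash(original), average_hash(recompressed)))  # 0
print(hamming_distance(average_hash(original), average_hash(unrelated)))     # 4
```

This is why hijacked footage is so traceable: even after cropping, watermarking, and re-uploading, the fingerprint of the source frame usually survives well enough to match.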

