
Investigation Uncovers AI-Generated Political Endorsements


Many short videos circulating on social media ahead of Bangladesh's 13th national parliamentary election feature individuals endorsing political parties and pledging their votes. However, investigators have discovered that some of the individuals in these videos are not real people but synthetic characters produced or altered with generative AI.

An investigation conducted in January 2026 by the fact-checking and media research organization Dismislab revealed that a Facebook page named Uttorbongo Television had uploaded 35 videos that were determined to be AI-generated or AI-edited. These videos showcased lifelike characters advocating for the “balancing scales” symbol associated with Bangladesh Jamaat-e-Islami, a key contender in the upcoming election.

As Bangladesh approaches polling day on February 12, concerns are mounting globally that generative AI can be misused to fabricate testimonies, mimic speech, and amplify persuasive messaging at little cost.

Dismislab's report traced the history of the Uttorbongo Television page, noting that it had been renamed several times: from Human Help to Help Mission, then Hum Bolo, before becoming Uttorbongo Television. The page, which listed seven administrators based in Bangladesh, had gained more than 90,000 followers by the time of Dismislab's investigation.

According to Dismislab, the page began uploading AI-style political content on December 15, 2025, shortly after the election schedule was announced. Subsequently, Prothom Alo reported a surge in Uttorbongo Television’s follower count and the proliferation of AI-labeled political videos across various platforms like Facebook, TikTok, and YouTube.

The investigation outlined a common format observed in the videos, featuring brief “street interviews” with different personas such as elderly women, fruit sellers, individuals portrayed as disabled, and purported Hindu voters. These characters would address the camera with minimal context, consistently advocating for supporting Jamaat-e-Islami and endorsing the scales symbol.

One notable video posted on December 12 featured an elderly woman expressing her support for Jamaat-e-Islami and rejecting the “boat” symbol associated with the Awami League. Dismislab noted the video garnered significant engagement, with comments indicating that many viewers believed it to be authentic.

Dismislab also highlighted content that extended beyond endorsements to include criticisms and accusations against political adversaries. The most-watched video on the page was a 28-second clip of a fruit seller criticizing the Bangladesh Nationalist Party (BNP), amassing over 8 million views, hundreds of thousands of reactions, and tens of thousands of shares.

The organization documented videos targeting specific political figures, such as a clip condemning Tarique Rahman, the acting chair of BNP. This video generated millions of views, eliciting mixed reactions from viewers who debated its authenticity, with some suspecting it was AI-generated.

Prior to this, various fact-checkers, including Dismislab, identified another category of AI-generated content circulating online. One widely shared video featured synthetic footage purportedly showing Zaima Rahman, daughter of BNP chair Tarique Rahman, promising to send money to viewers via BKash and urging them to share their mobile numbers in the comments section.

Dismislab combined visual scrutiny with automated detection tools to analyze the videos, identifying telltale signs of synthetic media: unnaturally smooth facial textures, inconsistent skin folds, and incoherent Bengali lettering in the backgrounds. The examination also flagged anomalies such as unnatural eye movements, apparent lip-sync alterations, distortions in hands and objects, and irregularities in the video backgrounds.

Furthermore, Dismislab utilized Google’s SynthID detection capability, which identified a digital watermark on the content, suggesting the use of AI-generated or edited audio and video components. The organization also referenced results from DeepFake-o-meter, an open platform designed to evaluate the authenticity of digital content.

One particularly notable video scrutinized in Dismislab’s report featured a woman claiming that BNP leaders extorted money from her in exchange for facilitating a disability allowance card, subsequently threatening her when she demanded a refund. The video sparked significant engagement among viewers.

The Dissent, a digital investigative outlet, independently fact-checked the video, concluding that it was AI-generated and likely featured the likeness of Rikta, a garment worker injured in the Rana Plaza disaster of 2013. Prothom Alo corroborated this assessment, noting similarities between the woman in the video and the Rana Plaza survivor.

Dismislab emphasized that the content aimed not only to generate enthusiasm but also to project support for Jamaat-e-Islami among demographic groups whose political leanings are considered influential. For instance, a video posted on January 8 showcased a
