As disinformation rises, Taiwan’s pioneering strategies for monitoring and countering electoral interference provide valuable insights into maintaining transparency and public trust.
When a district judge ordered the unsealing of almost 1,900 pages of documents in a legal case against former U.S. President Donald Trump in October, the issue of electoral interference made headlines again ahead of the U.S. presidential election on November 5, 2024. Though the pages are heavily redacted, prosecutors say the collection includes compelling, though not yet conclusive, evidence that Trump tried to subvert the 2020 election results.
The news followed ongoing revelations of attempted electoral interference by foreign actors, including Russia, which in September was accused by the U.S. Department of Justice of operating dozens of fake news websites, the domains of which were seized by federal authorities. It was just the latest example of campaigns to influence U.S. elections, with Microsoft issuing a report on Iranian efforts in August and cyber security experts noting a shift in China’s strategy from targeting the presidential race to focusing on state and local polls.
Several Taiwanese organizations are well-versed in the various tactics in Beijing’s playbook. Following Taiwan’s presidential and legislative elections in January, organizations such as Doublethink Lab and Taiwan Cofacts shared their experiences in monitoring attempted Chinese interference at a panel discussion hosted by the Washington, D.C.-based Center for Strategic and International Studies (CSIS), a bipartisan research organization and think tank, in April.
One company that hopes to leverage Taiwan’s experience to support transparency and fairness in elections worldwide is Numbers Protocol. The Taipei-based firm utilizes blockchain and other technology to ensure the legitimacy and traceability of online digital assets.
Photographers can register and preserve their shots on the Numbers blockchain using the company’s Capture Cam application, which facilitates on-device authentication of critical details such as date, time, GPS location, and pixel data. This approach provides a verifiable source of visual information for election coverage, making it harder for malicious actors to spread disinformation by manipulating or misrepresenting images, or by passing off AI-generated deepfakes as authentic.
Calling the process “immutable and transparent,” Numbers Protocol Chief Growth Officer and founder Sofia Yan notes that edits are also recorded, making the history of the digital asset trackable. “It’s like a digital watermark,” she says. “Every time the watermark is changed, you have a record of the change.”
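The underlying idea is straightforward to sketch: the capture record pairs a cryptographic hash of the image with its on-device metadata, and every subsequent edit is logged as a new record pointing back to the previous one. The Python sketch below illustrates that chain-of-custody concept only; it is not Numbers Protocol’s actual code, and the record fields, function names, and placeholder values are hypothetical.

```python
# Illustrative sketch of provenance registration, not Numbers Protocol's
# actual implementation. All field names and values here are hypothetical.
import hashlib
import time


def register_capture(pixels: bytes, metadata: dict) -> dict:
    """First record in an asset's provenance chain: a hash of the pixels
    plus on-device capture details such as date, time, and GPS location."""
    return {
        "asset_hash": hashlib.sha256(pixels).hexdigest(),
        "metadata": metadata,      # e.g. {"time": "...", "gps": "..."}
        "parent": None,            # the original capture has no predecessor
        "registered_at": time.time(),
    }


def register_edit(previous: dict, new_pixels: bytes, note: str) -> dict:
    """Log an edit as a new record linked to the previous one, so the full
    history of changes to the asset remains traceable."""
    return {
        "asset_hash": hashlib.sha256(new_pixels).hexdigest(),
        "edit_note": note,
        "parent": previous["asset_hash"],
        "registered_at": time.time(),
    }


# Example: an original capture, followed by a crop recorded as a linked edit.
original = register_capture(b"<raw pixel bytes>",
                            {"time": "2024-01-13T08:00Z", "gps": "25.03,121.56"})
cropped = register_edit(original, b"<cropped pixel bytes>", "crop for publication")
```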
In the build-up to Taiwan’s January presidential election, the firm worked with local news media outlets, freelance photojournalists, and civil society contributors to guarantee image provenance.
“We began cooperating with photojournalists in November [2023], so the schedule was quite tight,” says Yan. “The aim was to record the 66 days from the registration of the election candidates until the winners were announced as president and vice president.”
During this period, Numbers Protocol registered around 2,000 images from participants working with independent media, such as The Reporter, a nonprofit online media outlet, and individual freelancers with local and international media.
Venturing abroad
Numbers began its first major collaboration outside Taiwan with the Starling Lab for Data Integrity, a research unit affiliated with Stanford and USC, to guarantee the authenticity of photos during the 2020 U.S. presidential election. The project, called “78 Days,” saw 102 Reuters photojournalists upload and register images on each of the 78 days between the election on November 3 and Joe Biden’s inauguration on January 20.
“The idea was to do a proof of concept with Starling,” says Yan. “By providing our technology, we were also helping them to build their framework.”
While Numbers supported the first stage, other companies and technologies were involved in the subsequent storage and verification phases. Storage providers included Filecoin, a decentralized storage network with its own cryptocurrency, developed by San Francisco-based Protocol Labs, which also created the InterPlanetary File System (IPFS), a peer-to-peer file-sharing protocol similar to BitTorrent. The final layer of the process, verification, came courtesy of the Content Authenticity Initiative (CAI), an association that promotes industry standards for provenance metadata, through its open-source Verify tool built on the C2PA standard.
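Conceptually, the verification layer closes the loop on the registration step sketched earlier: given a file and the records registered for it, a checker recomputes the file’s hash and confirms that it appears somewhere in an internally consistent chain. The snippet below is only an illustration of that idea; it does not reproduce the CAI’s Verify tool or the C2PA specification, and it reuses the hypothetical record format from the earlier sketch.

```python
# Illustrative verification check, continuing the hypothetical record format
# from the earlier sketch; this is not the CAI Verify tool or the C2PA spec.
import hashlib


def verify_asset(pixels: bytes, records: list) -> bool:
    """Return True if the file matches a registered record and every edit
    record points back to a record that actually exists in the chain."""
    current_hash = hashlib.sha256(pixels).hexdigest()
    known_hashes = {r["asset_hash"] for r in records}
    if current_hash not in known_hashes:
        return False  # the file matches nothing that was ever registered
    return all(r["parent"] is None or r["parent"] in known_hashes
               for r in records)
```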
By bringing together companies with different specialties, Starling was able “to complete the process in a very comprehensive way,” says Yan.
Further cooperation came in 2022 on an outstanding piece of investigative journalism for Rolling Stone magazine. The story covered a 1992 massacre by Arkan’s Tigers, a Serbian ethnonationalist paramilitary group, in the town of Bijeljina in Bosnia-Herzegovina.
Compellingly interactive, polished in both design and function, and built on a painstakingly documented trail of evidence that leads in one direction, the piece begins with the original photographs taken on April 1 and 2, 1992, by American photojournalist Ron Haviv. With a career spanning 25 conflicts around the world, Haviv is best known for his sweeping coverage of the Balkans, particularly his infamous image of paramilitary leader Arkan, a baby tiger dangling from one hand and a gun in the other, posing in front of his balaclava-clad men.
“As a former journalist, being part of that project was great because of who we worked with,” says Starling COO Adam Rose. “The team at Rolling Stone, the writer Sophia Jones [with co-reporters Nidžara Ahmetašević and Milivoje Pantović], and a range of people on all these different technologies, including Numbers.”
With a threefold focus on journalism, history, and law, Starling harnesses technology such as cryptography, decentralized systems, blockchains, and distributed ledgers to “try to help with trust in those fields,” says Rose.
Disillusionment with high-profile blockchain-based developments such as cryptocurrencies and NFTs has caused people to view the technology as “a solution in search of a problem,” says Rose. “The problem we were predicting all along was generative AI and the idea that people would increasingly not trust the media they consume.”
Most efforts to debunk disinformation – particularly AI image manipulation – thus far have involved downstream technologies, such as improved search engine functions, which Rose says are undoubtedly important. One example is Content Credentials, a new technical standard developed by the CAI and its coalition partners and implemented by Google.
“But there’s also the question of how we authenticate closer to the source before things get released into information flows,” says Rose.
In a year with at least 64 national elections worldwide, Numbers cooperated with photojournalists on presidential elections in Indonesia in February and general elections in India in April.
Both election photo archives have their own project websites, where users can toggle between sections for photojournalists and civic participants. The Indonesia site, titled “2024: Rebuilding Digital Trust,” aptly reflects the findings of a post-election analysis by the Canada-based Centre for International Governance Innovation.
The research revealed that, rather than the wide-scale disinformation campaigns from outside sources such as China or Russia that had plagued previous Indonesian polls, the threat this time was internal, as outgoing President Joko Widodo (widely known as Jokowi) mobilized his army of social media “buzzers” (paid influencers) in support of his preferred candidate, Prabowo Subianto.
During a campaign that critics say stifled dissent, Prabowo, who had lost the previous two presidential races, was recast from an ultranationalist ex-general implicated in war crimes under the regime of his father-in-law, the late dictator Suharto, into a more benign, grandfatherly figure.
Protests over Jokowi’s maneuvering to secure his elder son’s vice presidency and attempts to alter the constitution so his younger son could run for governor reveal disillusionment with Indonesia’s political system and media environment. On its website for the Indonesian project, Numbers cites a survey by CSIS, which found that 50% of citizens reported being misled by disinformation during February’s vote.
In India, Numbers partnered with a local photojournalism agency and the livestreaming news website TV9 Kannada on the parliamentary election, which ran in seven phases between April and June. A civic participation campaign was incentivized by rewards in Numbers’ native utility token ($NUM). Unlike nonprofits such as Starling, which is funded by multiple donors, Numbers “needs to survive,” Yan emphasizes. “But we’re using the technology for the best purposes,” she adds.
In an article that mentions an AI-manipulated video that showed a presidential candidate endorsing a rival, the English- and Hindi-language news website The Quint highlighted Numbers’ election project. Speaking to The Quint, Yan said deepfakes and AI were “wreaking havoc with the democratic process in India.” In response, Numbers gave local media organizations “the tools to fight back and help restore audience trust in the news they are receiving.”
Taiwan can help
Although Numbers did not directly participate in a project for the 2024 U.S. presidential election, it cohosted an event on the role of AI in ensuring image integrity at the National Press Club in Washington, D.C., in June. Yan believes that Taiwan’s experience and perspective have much to offer the United States regarding its elections.
“As the U.S. approaches its elections, it can learn from this approach to protect the integrity of its electoral process,” she says. “Ensuring access to verifiable, trustworthy information is key to maintaining public trust.”
Rose agrees. He refers to the liar’s dividend, whereby those who spread false information benefit from a muddied media environment where people are increasingly unable to distinguish fact from fiction. “Having enough room for doubt undermines genuine content,” he says, noting that Team Trump has grasped this.
“That’s why what Numbers is doing is so important,” says Rose. Emphasizing that the company’s work on Taiwan’s presidential election came “at the right place and the right time,” Rose says it provided the perfect foundation for upcoming developments involving the pairing of images with authenticated attributes.
For example, Rose notes that while time, date, and location are obvious attributes to authenticate, others, such as shutter speed, are less so. Keeping the shutter open longer on a digital camera can produce images the naked eye would never see, such as a barrage of missiles apparently in the air at the same time. “These are edge cases, but these are the types of things we need to think about,” he says.
Other analysts point to Taiwan’s multi-faceted, multi-stakeholder approach to monitoring, comparing, and assessing different sources of disinformation.
“Various government agencies, non-governmental expert organizations, and the private sector, including major messaging platforms, collaborate to monitor trends in the information space, debunk false information, and promote media literacy education,” says Marcin Jerzewski, head of the Taiwan Office of the European Values Center for Security Policy (EVC).
As the only European think tank with a permanent office in Taiwan, the EVC has hosted events on disinformation and electoral interference, including a digital security training program for Taiwanese civil society organizations in late October. Meanwhile, revisions to media literacy education guidelines and the publication of a Digital Era Media Literacy Education White Paper last year “also introduced the concept of intersectionality into programs aimed at cultivating more responsible behaviors in the digital sphere across age groups,” says Jerzewski.
Particularly noteworthy, says Jerzewski, is the inclusion of Taiwan’s elderly in such programs. He also emphasizes the domestic element in Taiwan’s battle against disinformation. Taiwan’s highly polarized media environment, which “creates fertile ground for the potential erosion of constructive idea exchange,” leaves societal and political cleavages ripe for exploitation, he says.
“Consequently, Taiwan’s example demonstrates that it is not only crucial to monitor disinformation as a tactic of foreign malign influence, but also a tool of domestic political warfare.”