Affected users have recently taken to social media to express their anguish and distress over the discovery of Facebook clones. Dado Ruvic, Reuters
Culture Spotlight

Attack of the clones: What really led to Facebook becoming so scary political

Or, since when did Facebook become so political? By DOMINIC LIGOT
ANCX | Jun 10 2020

The sudden prevalence of clone or doppelganger accounts on Facebook has captured the public interest over the past few days. Speculation of political targeting and harassment, particularly directed at critics of the impending anti-terror bill, has circulated widely among school publications but remains unproven.

Jonathan Ong, who led research into networked disinformation, shared in a tweet that this current modus operandi represented a “break” in strategy. In his 2018 research, he documented that a few advertising and PR professionals are the chief architects of disinformation. Ong posited that the clone phenomenon is possibly a demonstration orchestrated by such parties looking to pitch their services to clients for the next election season.

In a CNN interview, Senator Panfilo Lacson hypothesized that the creation of the dummy FB accounts was part of a scare tactic to alarm the public against the anti-terror bill.

Whoever the perpetrators are, what is clear is the anguish affected users are voicing upon discovering their clones, and the collective distress expressed on social media about the phenomenon.

Since when did Facebook become so political?

In 2007, Microsoft made a splash in the news with a $240 million investment in Facebook, then a fast-growing social networking site poised to challenge the incumbents MySpace and Friendster. The Microsoft investment came just as Facebook tipped its hand about launching advertising as a core service of the platform.

Shortly after, in 2008, Mark Zuckerberg announced at the annual F8 conference the launch of the Facebook Developer Platform, which allowed third-party providers to create software that utilized Facebook user data in various applications.

Screenshot: Facebook User Data Permissions Accessible by Developers (Source: Facebook Developer Platform)

In 2009, Facebook launched Facebook groups and pages. Groups allowed users with common interests to gather their conversations in one space, while pages allowed individuals, companies, and brands to create a special portal that users could like and follow (instead of befriend) to get content and updates.

These developments transformed Facebook from a social networking site for people looking to make friends, share photos, and poke each other for fun, into a bona fide media content and advertising platform.

The decade that followed would see the rise of social media marketing. Up until that point, digital marketing was confined to email blasts (spam) and website ads, but with Facebook, the digital playbook was reformatted. Here was an interface that allowed marketers to insinuate their brands into user conversations, backed by a limitless database to classify and target people at a granularity never before seen in advertising, and at an infinitesimal cost compared to traditional media.

At the same time, internet users had never seen such an integrated social experience, allowing you to continue networking with friends while interacting with your favorite brands and interests. 

From this primordial soup arose the “influencer”: internet celebrities no longer limited to the usual actors and musicians, but including everyday individuals who attracted a following based on common interests on Facebook.

A 2018 report stated that Filipinos spend more time on Facebook than any other nationality in the world. Facebook penetration in the Philippines (accounts per population) exceeded 100 percent, with some users holding multiple accounts. Political discourse has also effectively migrated to Facebook, with political parties, government agencies, and personalities maintaining a Facebook page or group to engage with their audiences. Political activism is likewise conducted on the platform, with meetups and events scheduled online and broadcast via Facebook Live.

In 2018, data analytics firm Cambridge Analytica (CA) shot to notoriety for mining Facebook user data to influence elections in the US, the Brexit vote, and various other political contests. CA was not the only firm to do this, but its exposure, culminating in Mark Zuckerberg’s testimony before the US Congress, laid bare for the world just how entrenched companies had become in users’ data on Facebook, and how the user experience could be manipulated to sway political sentiment and public decision-making.

Jonathan Ong’s 2018 paper described the ability of networked operators to engineer and terminate trends on social media, as well as the existence of troll farms: vast armies of paid individuals who create and manage fake Facebook accounts used to execute digital campaigns that build or destroy reputations on the platform. These services are sold to brands and politicians alike.

Nowadays, events of public consequence play out as social media “trends”. The “trend” has supplanted the daily newspaper or nightly TV as the barometer of newsworthiness. Even traditional TV and radio programs actively reference social media hashtags and accounts to promote their reach. 

In just ten years, the platform we all once used to share cat and food photos and poke each other has become the venue for political mass action and engineered trolling. It is like waking up every day to a kindergarten playground now filled with bloody placards hanging by the monkey bars.

How healthy is it to be this close to politics all the time?

A 2017 paper examining political opinions expressed on Facebook found that individuals’ openness about their political views depended highly on whether they were in a supportive circle. Users concerned about being perceived negatively by their friends tended to keep their views to themselves; communications practitioners know this as the Spiral of Silence. On the other hand, users within a highly supportive environment, such as a political Facebook group, would feed on that support and express themselves freely.

Facebook engagement can also breed highly toxic, opinionated environments. In a 2019 study of political disinformation on Facebook, researchers found that disinformation provoked stronger emotional responses than true news: while true news generated greater anxiety, disinformation generated more anger and incivility. More interestingly, the response to disinformation was similar regardless of the political affiliation of the source.

Coming back to the present, as innocent users struggle to retain ownership of their Facebook identities against the onslaught of the doppelgangers, maybe it is time to ask: is maintaining a Facebook account worth all this trouble?


Dominic Ligot is a data analyst, researcher, software developer, entrepreneur, and technologist. He is the founder of CirroLytix, a research company focusing on machine learning, data ethics, and social impact. He also co-founded Data Ethics PH, a monthly online meetup discussing trends in the misuse of data and algorithms that can harm society. In 2019, Dominic led the Philippines team that won the Grand Prize in Break The Fake, an international hackathon competition against fake news and disinformation.



  1. Architects of Networked Disinformation: Behind the Scenes of Troll Accounts and Fake News Production in the Philippines
  2. Senate Bill 1083, Anti-Terrorism Act of 2019
  3. Lacson: Anti-Terror Bill dissenters may be behind ‘Facebook dummies’
  4. Microsoft invests $240M in Facebook, as Facebook develops ad product
  5. Facebook Expands Power of Platform Across the Web and Around the World
  6. Facebook Pages vs Facebook Groups: What's the Difference?
  7. Digital Marketing Strategy and The Rise of The Micro-Influencer
  8. Are people willing to share their political opinions on Facebook? Exploring roles of self-presentational concern in spiral of silence
  9. Cognitive and affective responses to political disinformation in Facebook