Ezekiel Dixon-Román on the Facebook whistleblower
Authored by: Kristina García / Penn Today
10/28/21
In a Senate hearing earlier this month, former Facebook product manager Frances Haugen testified against the tech giant, saying that the company, which also owns Instagram and WhatsApp, “put profits before people.” Haugen came armed with thousands of pages of confidential company documents and shared them with reporters and lawmakers. She claimed that Facebook knew that organizers of the Jan. 6 insurrection used the online platform to disseminate information, that its algorithms fuel insecurities in teenage girls, and that it has become a megaphone for hate speech.
On Oct. 25, Haugen appeared before the U.K. Parliament to give evidence. Her testimony, along with statements from representatives of Facebook, Google, YouTube, Twitter, and TikTok, will inform Parliament’s Joint Committee on the draft Online Safety Bill, which proposes regulatory oversight for tech companies.
European Union lawmakers have also invited Haugen to appear at a Nov. 8 hearing on whistleblowers in tech. EU officials are looking to draft antitrust regulations to encourage more competition, along with laws requiring more transparency regarding the algorithms that Facebook and other internet platforms use to determine which content gets promoted on their users’ feeds.
Penn Today spoke with data analytics expert Ezekiel Dixon-Román, associate professor in the School of Social Policy & Practice, about the significance of Haugen’s evidence and what it could mean for the future.
There was a lot of chatter when Facebook, Instagram, and WhatsApp were offline. What does that response say about our society?
We already knew that Facebook and Instagram, in particular, are designed based on a behavioral psychology that seeks to incite and increase engagement and interaction. Even when we try to leave them alone, they still send us emails to draw our attention back in. Frances Haugen’s whistleblower interview and Congressional testimony merely provided internal documentation for much of what was already known.
But I’m not sure the question is simply about their addictive design. All three of these apps are used around the world by businesses and organizations, small and large, to conduct various forms of communication, information dissemination, ad campaigns, financial transactions, product delivery, and even client and customer support. In the Global South, these apps may be less well understood simply as applications and better understood as utilities. In some places, they are the main medium for communication, news, commerce, etc. I think the temporary small technical issue that caused a hugely calamitous digital butterfly effect around the world demonstrates how deeply threaded and entangled these technosocial networks are in everyday life and their asymptotic economic effects.
Haugen asserts that Instagram knowingly prioritizes profit over people, impacting young girls’ self-esteem and body image. What regulatory measures might result?
Again, this is also known information. There are existing studies on the effect of Instagram on girls and gay men. Why haven’t there already been more strongly targeted regulatory measures designed and implemented? And how much are the existing regulatory measures being enforced? The fact that one small technical issue can have such a huge butterfly effect around the world says so much about Facebook’s monopoly. There are already laws against this kind of monopolization. Yet where was the enforcement when these buyouts were happening? The main difference now is that the harm is, ironically, corroborated by Facebook’s own internal research. While Facebook and Instagram are not the only ones guilty of this, the question is: Why have they not only done nothing about it but also made the unethical and unconscionable decision to pursue profit over the well-being of their users?
This is not a new or even surprising business practice. As Haugen pointed out, this is what Philip Morris did for many years until legislation regulated its practices. I do think Ms. Haugen’s testimony indicates that there needs to be regulation on making internal documents public. While we know this will be gamed, it will at least create the conditions for greater oversight of what tech corporations are doing and of what they claim to know and not know. This may also lead to age-related policy practices, such as tailoring what types of ads are presented to users depending on their age, and requiring secondary questions and approvals for sharing and accessing certain information. I would also hope that this leads to a focus on providing training in critical digital literacies for K-12 schooling and out-of-school programming. This is a form of literacy that educates users in relevant and important digital practices and skills, including a tool kit for discerning the veracity of information in the deluge of social media data.
Haugen also accused Facebook of ineffectively managing misinformation and being equipped only to review and take down 10-20% of posts that disseminated untruths. Is policing misinformation part of Facebook’s corporate responsibility?
Absolutely! Facebook should be held responsible for any information conveyed on its platform, maybe not as the culprit but definitely as a co-conspirator. If I were found to be complicit in sharing information for the Jan. 6 insurrection, I most certainly would be held accountable as a co-conspirator.
Read the full article in Penn Today.