People or profit? Facebook papers show deep conflict within
PTI, Oct 25, 2021, 5:45 PM IST
San Francisco: Facebook the company is losing control of Facebook the product — not to mention the last shreds of its carefully crafted, decade-old image as a benevolent company just wanting to connect the world.
Thousands of pages of internal documents provided to Congress by a former employee depict an internally conflicted company where data on the harms it causes is abundant, but solutions, much less the will to act on them, are halting at best.
The crisis exposed by the documents shows how Facebook, despite its regularly avowed good intentions, appears to have slow-walked or sidelined efforts to address real harms the social network has magnified and sometimes created. They reveal numerous instances where researchers and rank-and-file workers uncovered deep-seated problems that the company then overlooked or ignored.
Final responsibility for this state of affairs rests with CEO Mark Zuckerberg, who holds what one former employee described as dictatorial power over a corporation that collects data on and provides free services to roughly 3 billion people around the world.
“Ultimately, it rests with Mark and whatever his prerogative is — and it has always been to grow, to increase his power and his reach,” said Jennifer Grygiel, a Syracuse University communications professor who’s followed Facebook closely for years.
Zuckerberg has an ironclad hold on Facebook Inc. He holds the majority of the company’s voting shares, controls its board of directors and has increasingly surrounded himself with executives who don’t appear to question his vision.
But he has so far been unable to address stagnating user growth and shrinking engagement for Facebook the product in key areas such as the United States and Europe. Worse, the company is losing the attention of its most important demographic — teenagers and young people — with no clear path to gaining it back, its own documents reveal.
Young adults engage with Facebook far less than their older cohorts, seeing it as an “outdated network” with “irrelevant content” that provides limited value for them, according to a November 2020 internal document. It is “boring, misleading and negative,” they say.
In other words, the young see Facebook as a place for old people.
Facebook’s user base has been aging faster, on average, than the general population, the company’s researchers found. Unless Facebook can find a way to turn this around, its population will continue to get older and young people will find even fewer reasons to sign on, threatening the monthly user figures that are essential to selling ads. Facebook says its products are still widely used by teens, although it acknowledges there’s “tough competition” from TikTok, Snapchat and the like.
To keep expanding its reach and power, Facebook has pushed for high user growth outside the U.S. and Western Europe. But as it expanded into less familiar parts of the world, the company systematically failed to address or even anticipate the unintended consequences of signing up millions of new users without also providing staff and systems to identify and limit the spread of hate speech, misinformation and calls to violence.
In Afghanistan and Myanmar, for instance, extremist language has flourished due to a systemic lack of language support for content moderation, whether that’s human or artificial intelligence-driven. In Myanmar, it has been linked to atrocities committed against the country’s minority Rohingya Muslim population.
But Facebook appears unable to acknowledge, much less prevent, the real-world collateral damage accompanying its untrammeled growth. Those harms include shadowy algorithms that radicalize users, pervasive misinformation and extremism, facilitation of human trafficking, teen suicide and more.
Internal efforts to mitigate such problems have often been pushed aside or abandoned when solutions conflict with growth — and, by extension, profit.
Backed into a corner by hard evidence from the leaked documents, the company has doubled down on defending its choices rather than trying to fix its problems.
“We do not and we have not prioritized engagement over safety,” Monika Bickert, Facebook’s head of global policy management, told The Associated Press this month following congressional testimony from whistleblower and former Facebook employee Frances Haugen. In the days since Haugen’s testimony and appearance on “60 Minutes” — during which Zuckerberg posted a video of himself sailing with his wife Priscilla Chan — Facebook has tried to discredit Haugen by repeatedly pointing out that she didn’t directly work on many of the problems she revealed.
“A curated selection out of millions of documents at Facebook can in no way be used to draw fair conclusions about us,” Facebook tweeted from its public relations “newsroom” account earlier this month, following the company’s discovery that a group of news organizations was working on stories about the internal documents.
“At the heart of these stories is a premise which is false. Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie,” Facebook said in a prepared statement Friday. “The truth is we’ve invested $13 billion and have over 40,000 people to do one job: keep people safe on Facebook.”

Statements like these are the latest sign that Facebook has slipped into what Sophie Zhang, a former Facebook data scientist, described as a “siege mentality” at the company. Zhang last year accused the social network of ignoring fake accounts used to undermine foreign elections. With more whistleblowers — notably Haugen — coming forward, it’s only gotten worse.
“Facebook has been going through a bit of an authoritarian narrative spiral, where it becomes less responsive to employee criticism, to internal dissent and in some cases cracks down upon it,” said Zhang, who was fired from Facebook in the fall of 2020. “And this leads to more internal dissent.”

“I have seen many colleagues that are extremely frustrated and angry, while at the same time, feeling powerless and (disheartened) about the current situation,” one employee, whose name was redacted, wrote on an internal message board after Facebook decided last year to leave up incendiary posts by former President Donald Trump that suggested Minneapolis protesters could be shot. “My view is, if you want to fix Facebook, do it within.”

This story is based in part on disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Haugen’s legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.
They detail painstakingly collected data on problems as wide-ranging as the trafficking of domestic workers in the Middle East, an over-correction in crackdowns on Arabic content that critics say muzzles free speech while hate speech and abuse flourish, and rampant anti-vaccine misinformation that researchers found could have been easily tamped down with subtle changes in how users view posts on their feed.
The company insists it “does not conduct research and then systematically and willfully ignore it if the findings are inconvenient for the company.” This claim, Facebook said in a statement, can “only be made by cherry-picking selective quotes from individual pieces of leaked material in a way that presents complex and nuanced issues as if there is only ever one right answer.”

Haugen, who testified before the Senate this month that Facebook’s products “harm children, stoke division and weaken our democracy,” said the company should declare “moral bankruptcy” if it is to move forward from all this.
At this stage, that seems unlikely. There is a deep-seated conflict between profit and people within Facebook — and the company does not appear to be ready to give up on its narrative that it’s good for the world even as it regularly makes decisions intended to maximize growth.
“Facebook did regular surveys of its employees — what percentage of employees believe that Facebook is making the world a better place,” Zhang recalled.
“It was around 70 percent when I joined. It was around 50 percent when I left,” said Zhang, who was at the company for more than two years before she was fired in the fall of 2020.
Facebook has not said where the number stands today.