MENLO PARK (CBS SF) — Facebook founder Mark Zuckerberg took to his social media platform Tuesday night, in an attempt to quell a firestorm of criticism ignited by whistleblower Frances Haugen’s ‘60 Minutes’ interview and hours of compelling testimony before a Senate subcommittee.
Haugen calmly told the subcommittee Tuesday that Facebook chooses to allow harmful content on its platform that is “disastrous” for society and for children in particular because of the astronomical profits such content generates.
After Haugen’s alarming revelations in her 60 Minutes interview and a worldwide Facebook outage on Monday, technology analysts anticipate more fallout following her testimony.
“The choices being made inside of Facebook are disastrous for our children, for our public safety, for our privacy and for our democracy,” she told the subcommittee.
“Facebook consistently resolved these conflicts in favor of its own profits. The result has been more division, more harm, more lies, more threats and more combat,” she continued. “In some cases this dangerous online talk has led to actual violence that harms and even kills people.”
Video: Haugen Testifies That ‘Facebook Knows’ It Leads Girls To Anorexia Content
“This is not a matter of certain social media users being angry or unstable. Or about one side being radicalized against the other. It is about Facebook deciding to grow at all costs.”
Video: Haugen Explains Mark Zuckerberg’s Role In Content Selection
Zuckerberg posted a lengthy response to Facebook users on Tuesday night.
“I wanted to share a note I wrote to everyone at our company,” he began his post. “It’s been quite a week, and I wanted to share some thoughts with all of you.”
He apologized for the lengthy service outage on the Facebook, Instagram and WhatsApp platforms.
“The deeper concern with an outage like this isn’t how many people switch to competitive services or how much money we lose, but what it means for the people who rely on our services to communicate with loved ones, run their businesses, or support their communities,” he posted.
But he quickly turned to Haugen’s alarming revelations.
“Now that today’s testimony is over, I wanted to reflect on the public debate we’re in,” he posted. “I’m sure many of you have found the recent coverage hard to read because it just doesn’t reflect the company we know. We care deeply about issues like safety, well-being and mental health. It’s difficult to see coverage that misrepresents our work and our motives.”
“At the most basic level, I think most of us just don’t recognize the false picture of the company that is being painted.”
Zuckerberg fired back at Haugen’s claims that the company ignored its own research.
“If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place?” he posted. “If we didn’t care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space — even ones larger than us? If we wanted to hide our results, why would we have established an industry-leading standard for transparency and reporting on what we’re doing?”
He was especially rankled by claims that his company prioritizes profit over the safety and well-being of its users.
“At the heart of these accusations is this idea that we prioritize profit over safety and well-being,” he posted. “That’s just not true.”
When it comes to the use of the platform by children, Zuckerberg pointed to his own children and his desire to keep them safe.
“The reality is that young people use technology,” he posted. “Think about how many school-age kids have phones. Rather than ignoring this, technology companies should build experiences that meet their needs while also keeping them safe. We’re deeply committed to doing industry-leading work in this area.”
Zuckerberg did post that he was in favor of stronger regulation.
“We’re committed to doing the best work we can, but at some level the right body to assess tradeoffs between social equities is our democratically elected Congress,” he said. “For example, what is the right age for teens to be able to use internet services? How should internet services verify people’s ages? And how should companies balance teens’ privacy while giving parents visibility into their activity?”