When the sociologist Diane Vaughan coined the term "the normalization of deviance," she was referring to NASA administrators' disregard of the flaw that caused the 1986 Challenger space shuttle explosion. Her idea was that people in an organization can become so accustomed to a problem that they no longer consider it problematic. (In the Challenger's case, NASA had been warned that the shuttle's O-rings were likely to fail in cold temperatures.) Consider Facebook: for years, its leadership has known that the social network promotes political polarization, social unrest, and even ethnic cleansing. More recently, its algorithms were known to have promoted misinformation and disinformation campaigns about COVID-19 and vaccines. Over the past year, the company made incremental attempts to remove false information about the pandemic, culminating in its most comprehensive ban in February. An analysis last month by the nonprofit group First Draft, however, found that at least 3,200 posts making unfounded claims about COVID-19 vaccines had been published after the February ban. Two weeks ago, the top post on Facebook about the vaccines was a Tucker Carlson segment on Fox News suggesting that they don't work.
Over the years, Facebook's CEO, Mark Zuckerberg, has issued a cascade of apologies for, among other things, privacy violations, algorithmic bias, and the promotion of hate speech. Too often, the company appears not to change course until after such problems become public, even though, in many cases, Facebook employees, injured parties, or objective evidence had alerted it to them long before. It took the company months to acknowledge that political ads on its platform were being used to manipulate voters, and then to create a way for users to find out who paid for them. Last December, after years of criticism from Black groups that its algorithm disproportionately removed posts by Black users discussing racial discrimination, the company finally reconfigured its hate-speech algorithm. "I think it's more useful to make things happen and then apologize later," Zuckerberg said early in his career. We have been watching the consequences ever since.
Here is what the normalization of deviance at Facebook looked like in the first months of 2021: In February, internal emails obtained by ProPublica revealed that, in 2018, the Turkish government demanded that Facebook block posts in Turkey from a mainly Kurdish militia group that was using the platform to alert Syrian Kurdish civilians to impending Turkish attacks against them. Facebook was given to understand that refusing would have resulted in its services in the country being shut down entirely. Sheryl Sandberg, Facebook's COO, told her team, "I am fine with this." (Reuters reported that the Turkish government had arrested nearly six hundred people in Turkey "for social media posts and protests criticizing its military offensive in Syria.")
On April 3rd, Alon Gal, the chief technology officer of the cybercrime-intelligence firm Hudson Rock, reported that the personal information of more than half a billion Facebook users had been "scraped" sometime before September 2019 and posted to a public website frequented by hackers, where it is still available. The stolen data included names, locations, phone numbers, email addresses, and other identifying information. But according to Mike Clark, Facebook's product-management director, scraping data is not the same as hacking data, a distinction likely lost on most people. By that logic, it appears, the company felt no obligation to notify users that their personal information had been stolen. "I have yet to see Facebook acknowledging this absolute negligence," Gal wrote. An internal memo about the breach was inadvertently shared with a Dutch journalist, who posted it online. It reads: "Assuming press volume continues to decline, we're not planning additional statements on this issue. Longer term, though, we expect more scraping incidents and think it's important to . . . normalize the fact that this activity happens regularly." On April 16th, it was reported that the group Digital Rights Ireland is planning to sue Facebook over the breach in what it calls a "mass action," and Ireland's Data Protection Commission has launched an inquiry to determine whether the company violated EU data-protection rules. (Facebook's European headquarters are in Dublin.)
On April 12th, the Guardian revealed new details about the experience of Sophie Zhang, a data scientist who wrote an angry, cautionary farewell memo to her co-workers before leaving the company last August. According to the newspaper, Zhang was fired for "spending too much time uprooting civic fake engagement and not enough time on management's priorities." "In the three years I've spent at Facebook, I've found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry," Zhang wrote in the memo, which, according to the Guardian, Facebook tried to suppress. "We simply didn't care enough to stop them." A known loophole in one of Facebook's products allowed corrupt governments to create fake followers and fake "likes," which then triggered Facebook's algorithms to amplify their propaganda and confer legitimacy on it. When Zhang alerted senior executives to how the government of Honduras was exploiting this, according to the Guardian, one executive told her, "I don't think Honduras is big on people's minds here." (Facebook told the newspaper: "We fundamentally disagree with Ms. Zhang's characterization of our priorities and efforts to root out abuse on our platform.")
On April 13th, The Markup, a nonprofit public-interest investigative site, reported that Facebook's advertising business was monetizing and amplifying political polarization in the United States by giving companies the ability to target users based on their political beliefs. ExxonMobil, for example, served liberals ads about its clean-energy initiatives, while conservatives were shown ads declaring that "the oil and gas industry is THE engine that powers America's economy. Help us make sure unnecessary regulations don't slow energy growth." How did ExxonMobil know whom, exactly, to target? According to the report, Facebook constantly monitors users' activities and behaviors, both on and off the platform, and delivers those "custom audiences" to anyone willing to pay for ads.
On April 19th, Monika Bickert, Facebook's vice president of content policy, announced that, in anticipation of a verdict in the Derek Chauvin trial, the company would remove hate speech, calls to violence, and misinformation related to the trial. The announcement was a tacit acknowledgment of the power users of the platform have to incite violence and spread dangerous information, and it recalled the company's decision, after the November election, to tweak its newsfeed algorithm to demote hyperpartisan outlets such as Breitbart. The original algorithm was reinstated by mid-December, prompting several employees to tell the Times' Kevin Roose that Facebook executives had curtailed or vetoed past efforts to combat misinformation and hate speech on the platform, "either because they hurt Facebook's usage numbers or because executives feared they would disproportionately harm right-wing publishers." According to the Tech Transparency Project, right-wing extremists spent months on Facebook organizing their storming of the Capitol on January 6th. Last week, an internal Facebook report obtained by BuzzFeed News confirmed that the company had failed to curb coordinated "Stop the Steal" efforts on the platform. Soon afterward, Facebook removed the report from its employee message board.
Facebook has nearly three billion users. It is common to compare the company's "population" with that of countries, and to marvel that it is larger than the two biggest of them, China and India, combined. Facebook's policy decisions often carry outsized geopolitical and social consequences, even though no one elected or appointed Zuckerberg and his staff to run the world. The Guardian article on Zhang's experience observes, for example, that "some of Facebook's policy staffers act as a kind of legislature," bringing Facebook closer to a global government.
It is possible to see Facebook's Oversight Board, an advisory body of twenty esteemed international jurists and academics that the company established in 2018 to rule on contentious content decisions, as another branch of its self-appointed parallel government. Indeed, when Zuckerberg announced the board's creation, he called it "almost like a Supreme Court." In the near future, the board will hand down what is likely to be its most contentious ruling yet: whether to uphold the ban on Donald Trump that Facebook imposed after the January 6th insurrection, on the grounds that, as Zuckerberg put it at the time, "we believe the risks of allowing the President to continue to use our service during this period are simply too great." The decision will not be a referendum on Trump's disastrous presidency or his promotion of Stop the Steal. Rather, it will answer a single, discrete question: Did Trump violate Facebook's guidelines about what is allowed on its platform? That narrow mandate is written into the Oversight Board's charter, which states: "The board will review content enforcement decisions and determine whether they were consistent with Facebook's content policies and values."
As the events of the past few months have shown, Facebook's policies and values have normalized the kind of deviance that permits the disregard of regions and populations that are not "big on people's minds." Those values are not democratic or humanistic; they are corporate. Whichever way the Trump decision goes, or any Oversight Board decision for that matter, that will still be true.