Facebook (FB) was always going to bring us here.
For years, there have been disturbing public disclosures about FB’s practices and content, including, but not limited to, uncontrolled and unauthorized experiments on the emotional impact of user feeds, disputed ownership of user data and photos, the publication of salacious, violent, and divisive content, the mental health of its human content moderators, and the health of the platform itself. Then came the acceleration of AI, both Facebook’s own and third-party bots, designed to shape content and human interaction regardless of its impact on user behavior beyond simple engagement, leading to disaffection and unease among FB users and employees.
Analyzing FB’s pathology, which illustrates what troubles the public about social media, can identify both the disease and the industry’s potential cures.
This disease can be diagnosed through its signs, five of which are predominant:
- Concentration of power, both in the platform and its CEO
- The nature of the platform
- The nature of the content
- Anti-competitive behavior
- Data confidentiality and ownership
Several former FB staff and advisers, as well as recent whistleblower Frances Haugen, speak of FB’s dominance, not only over the industry but as a means of Internet access, communication, and commerce. This inordinate power and importance was punctuated on Monday, October 4, 2021, when FB went offline for more than six hours and more than 3 billion users were denied access to what for many is an essential service. FB’s footprint covers almost half of the world’s population.
The second locus of power is FB chairman, CEO, and majority shareholder Mark Zuckerberg, and his unprecedented control over his company. He holds 58% of FB’s voting shares and cannot be removed. His is the last word at FB. As recently as May 26, 2021, FB’s board rejected proposals to reduce his powers. As a former senior FB executive put it, he doesn’t respond well to being told ‘no’.
Keep in mind that before he became a social media mogul, he grew up in a village in Westchester County, NY, dropped out of Harvard in his sophomore year, and had never held a job. With that limited exposure and socialization, within two years of entering Harvard he launched FB from his dorm room and found himself embroiled in litigation before moving to Palo Alto (2004) and being catapulted to global relevance and power.
His interviews and contemporaneous public statements reveal the mindset of a “man-child”, a psychological profile so well established that it is a stock entertainment trope. With such a slim CV and so little experience, no one would have placed him at the head of an empire of such social significance, even at FB’s 2004 scale, but fate is without feeling.
Is FB just a platform, like a utility or a common carrier, as it has argued, or is it a publisher? FB has argued that as a platform it is neither responsible nor accountable for the content it hosts, despite a consensus to the contrary.
And yet the company’s claim of simply being a platform is contradicted by the whistleblower hearing, which highlighted FB’s confused, inconsistent, and arguably ineffective policing of its platform’s content. If FB is merely a disinterested content platform, why attempt moderation at all?
Additionally, the hearing explored how FB’s own AI-based algorithms select “engaging” content (which tends to be controversial, divisive, and angry), content more likely to be shared and commented on. Facebook’s AI then pushes that content to users likely to respond, and monetizes their engagement.
In short, FB creates “bestsellers” from its content, then promotes them and earns money from them. That is what publishers do. FB is a publisher. Full stop. This should no longer be debatable.
We have described the divisive nature of the content above, a chronic complaint that was highlighted during the Senate hearing. There had been public defections before the Cambridge Analytica scandal, which then accelerated them.
Ms. Haugen also described how FB’s AI-based algorithms push certain content to specific users, progressing to increasingly extreme material. Sociologists have discussed why we are drawn to controversy, division, and extremism: it is a survival mechanism, an attempt to defend the tribe on which our survival depends. FB uses AI to accelerate the intensity, speed, and scale of this engagement. In short, FB’s AI exploits a hard-wired survival trait, one against which we have not developed a natural defense and to which we are extremely vulnerable. It is like taking a highly contagious virus for which we have no preventive or therapeutic interventions and weaponizing it with cutting-edge technology. Facebook’s policies and business model have had a disproportionate effect on the most vulnerable members of the global community: minority populations such as the Rohingya and Dalits (especially women and girls), and, beyond them, the populations of declining liberal democracies around the world.
Another pathological sign highlighted during the hearing was FB’s anti-competitive behavior, including the acquisition of platforms that could either compete with FB (WhatsApp) or supplant it for a younger generation (Instagram), assimilating their “biological and technological distinctiveness” into the FB continuum and thus eliminating their threat. It is important to note that, upon purchase, these companies’ business principles and practices were subsumed and superseded by FB’s, especially with regard to data ownership and privacy (see below). Other aspects of FB’s anti-competitive behavior have been discussed elsewhere. This complaint has been accompanied by calls for the breakup of FB and other companies.
The final pathological sign is FB’s policy on privacy and data ownership, which is one-sided and abstruse. Most users do not understand the terms, which defeats the meaning of informed consent. How FB then monetizes this user data with its real customers, that is, advertisers and other clients (users and their data are not the customers but the product), is beyond the user’s control and is poorly understood by legislators worldwide. Despite debacles like Cambridge Analytica, FB continues to advocate self-regulation of user data and privacy.
Haugen’s testimony, along with FB’s growing earnings and market capitalization, demonstrates a preference for profit over social good, characterized by expansive, unregulated growth at the expense of the host’s health. That is, indeed, the definition of cancer. The host is global society.
Let’s be clear: neither social media nor FB is the cancer. Like vital organs, they provide benefits upon which modern society depends. Rather, they harbor a malignant transformation of business practices, one that seeks profit and growth at the expense of society’s well-being. It is the malignancy that requires action.
If cancer is the diagnosis, then how to treat it?
If we aim to contain it, it will be felt. It will not be fast. It will hurt.
Mark Zuckerberg has repeatedly demonstrated that he is neither prepared for nor capable of unilaterally exercising the global power he has amassed. That power must be reduced to a manageable size. This is the heart of FB’s malignant potential and the first order of business. His lack of personal and social boundaries and of accountability characterizes both his leadership and the platform’s modus operandi, with that defining characteristic of immaturity: irresponsibility. Recall that FB’s original slogan was “Move fast and break things”. Despite dropping the slogan, FB’s behavior remains unchanged. Indeed, the corollary of irresponsibility is the denial of guilt, which has been FB’s consistent stance whenever the platform is accused of causing harm.
In addition, Mr. Zuckerberg demonstrates a glaring lack of self-awareness. Compare his behavior with that of Google founders Larry Page and Sergey Brin, who hired Eric Schmidt as CEO when they realized they were not yet seasoned enough to run the company responsibly, and one understands how dangerous and reckless Mr. Zuckerberg’s chosen path is. Messrs. Page and Brin knew such a step was necessary; Mr. Zuckerberg never has. This highlights a present danger.
Giving any CEO the final say on policies and decisions, including corrective actions for documented problems, is a recipe for disaster, especially when that CEO’s exposure and global social intelligence are as obviously limited as Mr. Zuckerberg’s, and not to mention when his decisions can negatively affect the lives of billions of people. About 90% of FB users reside outside North America, and former employees admit that FB’s moderators are as ill-equipped to police such a diverse population as Mr. Zuckerberg seemingly is to appreciate such complexity. His tenure as CEO is a case study in why this is an untenable arrangement, underscored by the thousands of lives lost and the millions displaced among the Rohingya alone.
The remedy is to reduce that power to align with industry norms, under which a CEO is constrained by a board of directors. Unless and until Mark Zuckerberg’s power is subordinated, and he is accountable to collective intelligence, experience, and wisdom, no further corrective action can have lasting effect. He will undo any change he doesn’t like, as he always has. This is the first and essential step.
How? If the goal is to bring about a change in power structure and policy, divestment on its own is unlikely to succeed, given market dynamics and human nature, although divestment can create unwanted stigma around FB. A boycott of the platform, with a drop in users, clicks, and revenue, is more likely to succeed. Those who wish to see FB change, especially in resource-rich environments with competing social media options, should boycott FB until it does. A sufficient drop in revenue could precipitate the kind of investor activism cited in a Booz Allen Hamilton study as a major driver of CEO and company change.
Lawmakers should create and agree on common regulatory standards. Whether they have any role in internal corporate power structures is a debatable but timely discussion.
Next: Systemic Therapy For The Problem With Facebook