A New Despotism in the Era of Surveillance Capitalism

22/02/2019

There is a fascinating chapter toward the end of Alexis de Tocqueville’s Democracy in America titled “What Kind of Despotism Do Democratic Nations Have to Fear?” in which the author attempted something truly extraordinary – to describe a social condition which humankind had never before encountered. We find him trying to put his finger on something which did not yet exist, but which – with his extraordinary political imagination – he was able to foresee with startling clarity.


I maintain that we have good reason to fear that the business model of commercial surveillance – pioneered by Google and adopted by Facebook, among others – is serving to undermine the foundations of our democracy. Shoshana Zuboff explains in her new book, The Age of Surveillance Capitalism (PublicAffairs, 2019), that the system works by treating human experience as “free raw material for translation into behavioral data. Although some of these data are applied to service improvements, the rest are declared as proprietary behavioral surplus, fed into advanced manufacturing processes known as ‘machine intelligence,’ and fabricated into prediction products that anticipate what you will do now, soon and later. Finally, these prediction products are traded in a new kind of marketplace that I call behavioral futures markets. Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are willing to lay bets on our future behavior.”


In effect, we are becoming the subjects of a new, insidious, subtle, and almost invisible form of subjugation that was foreseen with uncanny prescience by Tocqueville in 1840. Over a hundred and seventy-five years ago, Tocqueville wrote: “The kind of oppression with which democratic peoples are threatened will resemble nothing that has preceded it in the world.” He goes on to describe the elevation of “an immense tutelary power … which alone takes charge of assuring their enjoyments and watching over their fate. It is absolute, detailed, regular, far-seeing, and mild. It would resemble paternal power, if, like that, it had for its object to prepare men for manhood; but on the contrary, it seeks only to keep them fixed in childhood; it likes citizens to enjoy themselves, provided that they think only of enjoying themselves. It willingly works for their happiness; but it wants to be the unique agent and sole arbiter of that.”


In Time magazine’s January 17 article, “I Mentored Mark Zuckerberg, But I Can’t Stay Silent,” Roger McNamee observes, “One of the best ways to manipulate attention is to appeal to outrage and fear, emotions that increase engagement. Facebook’s algorithms give users what they want, so each person’s News Feed becomes a unique and personal reality, a filter bubble that creates the illusion that most people the user knows believe the same things.”


The notion of a bubble here is a useful one: central to the work of Jakob von Uexküll, an Estonian-born biologist and one of the fathers of biosemiotics, is the concept of the umwelt – or ‘surrounding-world’ – the ‘soap-bubble’ that each creature creates for itself and which constitutes its experiential world. The umwelt is composed of signs as bearers of meaning, and for each organism the umwelt is the whole of its reality. What distinguishes us as human beings is that our umwelt is not fixed, immobile, rigid, or static. One way to understand the effect of Facebook’s algorithms on its users is that the umwelt each user inhabits is in danger of effectively shrinking: growing smaller and ever more calcified. In “How Facebook’s Algorithm Suppresses Content Diversity and How the Newsfeed Rules Your Clicks,” Zeynep Tufekci notes that researchers found that Facebook’s newsfeed algorithm measurably reduces a user’s exposure to “ideologically diverse, cross-cutting content.” By ensuring that we are exposed only to that which we are likely to approve of and assent to, our umwelt – our social reality – becomes that much more diminished and homogenized.


Facebook’s business model has far-reaching implications, especially for our ability to empathize with others – others who may not be like us, or think like us. This had devastating results in Myanmar, where Facebook became a tool for ethnic cleansing against the Rohingya. While this may not have been its intention, Facebook has become a “forum for tribalism” promoting a “simplistic version of ‘community’” while arguably “harming democracy, science and public health” – as Siva Vaidhyanathan suggests in Antisocial Media: How Facebook Disconnects Us and Undermines Democracy (Oxford University Press, 2018).


Much of my research has shown that there is a close relationship between empathy and our ability to creatively reconstruct the umwelt of the other. While one cannot share another’s umwelt – each of us remains in our own soap-bubble, as it were – we can participate in a common umwelt, which in many ways is the stated goal of social media. It is ironic that Facebook, which claims to prize connectivity above all, has in fact contributed to producing the opposite result – leaving each of us fixed in a vapid and hardened bubble of isolation.


In the face of an American government that is increasingly retreating from its responsibilities, we must recognize that Facebook, Google, and Amazon are the new leviathans. In serving users only those posts with which they will agree, Facebook resembles Tocqueville’s ‘tutelary’ power, which “every day … renders the employment of free will less useful, and more rare; it confines the action of the will in a smaller space, and little by little steals the very use of free will from each citizen.” These companies do not simply want to automate information: as Zuboff observes, “the goal now is to automate us… to produce ignorance by circumventing individual awareness and thus eliminate any possibility of self-determination.”


Facebook’s business model represents a new, insidious form of subjugation that does not tyrannize, but, as Vaidhyanathan observes, echoing Tocqueville, “it hinders, compromises, enervates, extinguishes, dazes, and finally reduces each nation to being nothing more than timid and industrious animals of which the government is the shepherd.”


Facebook has contributed its share to the deterioration of epistemic norms and has helped to usher in the era of so-called post-truth. The motivation behind this disdain for truth as such has always been the same – namely, it serves the bottom line. As McNamee puts it: “on Facebook, information and disinformation look the same; the only difference is that disinformation generates more revenue, so it gets better treatment.”


Over the two-year period preceding the 2016 election, one hundred and twenty-six million Americans saw Russian-backed content on Facebook. Facebook was at best reckless in permitting the rampant and deliberate spread of disinformation through fake Russian accounts; by allowing the proliferation of fake news, Facebook incontrovertibly helped Donald Trump become President of the United States. Facebook has provided fertile ground for the spread of grossly irresponsible conspiracy theories and “hopelessly inaccurate viral posts.”


Like many others, McNamee suggests that users should have control over their own data and metadata – as if data ownership were the solution to the scourge of surveillance capitalism. The problem with this kind of thinking is that it fails to ask the more elementary question of whether such data should exist at all. As Zuboff observes, “It’s like negotiating how many hours a day a seven-year-old should be allowed to work, rather than contesting the fundamental legitimacy of child labor.” Surveillance capitalism represents a new form of despotism, one that erodes our capacity for individual autonomy so that behavioral data can continue to be generated unimpeded, supplying the markets and advertisers that are Google’s and Facebook’s real customers.


We are becoming the kind of solipsistic and atomistic society that Tocqueville foresaw: “an innumerable crowd of like and equal men who revolve on themselves without repose … each of them, withdrawn and apart, is like a stranger to the destiny of all the others… As for dwelling with his fellow citizens, he is beside them, but he does not see them; he touches them and does not feel them.” Tocqueville warned us that oppression may take forms which are gentle, quiet, and calm, but nonetheless inimical to genuine freedom. To respond adequately to the problem will require more than demanding greater privacy or data ownership – it will involve a radical questioning of our basic assumptions, and a new understanding of what democracy means and entails in the age of surveillance capitalism.

February 18, 2019

- Dr. Sam Ben-Meir is a professor of philosophy and world religions at Mercy College in New York City.

sam@alonben-meir.com
