Data Black Magic
Our digital shadows are not something we have, but something that to some extent we are. “We are made up of layers, cells, constellations”... of bytes, data and metadata.
‘Data is the new oil’: you have probably read this before. Since the English mathematician Clive Humby coined the phrase back in 2006, it has become a cliché, routinely dropped into conversations about the digital economy.
For starters, the comparison suggests that it is possible to make a lot of money out of exploiting data. This seems quite obvious now: internet giants like Facebook and Google have been accumulating enormous fortunes thanks to a business model that relies on the collection and processing of personal data, monetized through targeted advertising. The analogy also implies a strong belief that data will hold a vital place in the 21st-century economy, on par with the one oil still occupies today.
A third imaginary evokes the most catastrophic consequence: a leak, as is the case with oil spills.
Data and oil don’t actually have much in common. Oil is a natural resource, the product of physical and chemical processes that take millions of years. If humanity were to disappear from the planet next week, those processes would carry on without human interference.
But data…that is a different story.
The problem with understanding data as the new oil is that it responds to an extractivist logic, reducing the value of data to solely monetary terms. Arguments suggesting that data subjects - the users of big tech platforms like Google and Facebook - should be economically compensated for the exploitation of their data reflect this view: “if the data we’re generating has proven to be so valuable, then why are we the ones paying the price, rather than being paid?”
Of course, this is an open invitation to disaster. For example, let’s take a look at the business model of a company like Facebook. In elementary terms, the company develops and maintains a series of services - Facebook’s social network, Messenger, Instagram, WhatsApp - and puts them at the users’ disposal. In exchange, the platform collects information about their identity and habits: what they like, which websites they visit, where they go, who they talk to, what they share and at what time. This information is processed and sold to other companies or data brokers interested in utilizing it. Given this context, how should consumers be compensated for their data? Who decides how much the data is worth, and who has a say in how it will be used?
For those who acquire this data, its value lies in profiling and targeting the person the data belongs to. Data-based profiling has many applications, ranging from targeted advertising to credit-scoring algorithms and employability screening. What matters here is that data analysis is shaping decision-making processes in ways that can encourage discriminatory practices, with potentially devastating impacts on human rights.
It is here that the analogy of data-as-oil falls short. Even if we accept that one must be pathologically naïve to imagine a scenario in which individuals can fairly negotiate the value of their data with transnational companies like Google or Facebook, the problem is not simply that these companies are making massive profits at your expense. It is the control these companies exercise over people and communities, and how they shape access to human rights, based on the exploitation of personal data. Introducing payments for data only legitimizes these abusive conditions, particularly affecting those trapped at the margins.
In the face of this reductionist economic logic, it is necessary to remember that data has no independent existence. It belongs to the person who generates it and cannot be conceived of as an external, distinct element in and of itself for anyone to mine, exploit, or turn into profit. Ultimately, the control over personal data is directly related to questions of autonomy, human dignity, and fundamental human rights.
We need alternative imaginaries and metaphors to better understand the relationship between our selves (our bodies, personalities and communities) and the technologies we inhabit.
Cyborgs and the Data Body
In The Medium is the Massage, Marshall McLuhan examines how all media are extensions of a human faculty, physical or psychical: the wheel is an extension of the foot, the book is an extension of the eye, clothes are an extension of the skin, the circuit board is an extension of the central nervous system.
The extension of a sensory system also implies a transformation of the ways in which we perceive the world and our relationship with it: “media, by altering the environment, evoke in us unique ratios of self-perceptions (…) When these ratios change, people change”.
The underlying idea in McLuhan’s approach is that technologies are not simple tools that work at our command. We not only relate through technology, but we also relate to technology; we are affected by technology.
Technologies transform us, but how?
In her famous 1984 essay, A Cyborg Manifesto, Donna Haraway radicalized McLuhan’s approach, positing that our relationship with technology alters not only who we are but what we are: “hybrid[s] of machine and organism, a creature of social reality as well as a creature of fiction.”
And so we turned into cyborgs.
Our imagination of cyborgs usually evokes the image of a RoboCop: a body intertwined with a machine, where iron and muscle coexist harmoniously, inseparable. But the conceptualization that Haraway portrays suggests something far less literal: the possibility of embodying a subjectivity constructed at the fusion of organism and machine, rejecting both natural and essentialist approaches to the concept of identity.
The cyborg is built on its material and historical possibilities. In this sense, being a cyborg is not a choice but rather an imposition of the techno-military logic that underpins late capitalism. It is precisely this rejection of the naturalist logic that allows us to raise questions of human identity in the data economy.
What does a body look like in the social network era? How can we conceptualize the relationship between “data” and the “self” under the cyborg logic?
When we set up a Facebook profile or a Twitter account, we are providing subjectivity to a technology. Facebook is not just a simple way to communicate with others; it is another deployment of the self, contained within the possibilities allowed by the platform. I am my Facebook profile, and my relationship with the platform alters the way I perceive the world around me, inside and outside the internet.
Yet the equivalence between my organic-self and my Zuckerbergian-Doppelgänger is not absolute: the former cannot be reduced to the latter, while the latter cannot exist without the former, as it lacks autonomy. It is a compressed, simplified, everyday self whose existence is a product of the tension between my desires and the possibilities Facebook’s engineers chose to give me. It is the expression of my self as data in a particular corner of the internet.
Myself and my other selves
The notion of the digital “doppelgänger” has been used for years to address the formless being constituted by the accumulation of our data. As Reppel, Szmiginy and Funk put it: “it is the digital data trail that an individual leaves which can develop a ‘life of its own’, a life that is generally understood as an unintentional product of the person who provides the personal data. While this information is fragmented into isolated data points, it is later ‘reconstituted’ to produce a different self, i.e. an ‘other’ to our real self.”
Nonetheless, this notion seems to have become insufficient a long time ago. The “doppelgänger”, which we admitted had an independent life of its own, only vaguely affected the person, beyond the privacy risks arising from a potential leak of personal information.
But the ‘digital selves’ that evolve from our internet activity are not independent of us; they are only different expressions of who we are.
Our digital selves most certainly affect our everyday lives. They bear a close relation to our bodies and to the activities our bodies are allowed to perform; they affect and interact with our reputation and our identity. They can even replace aspects of our real identity, as when a person comes into conflict with society over notions of ‘legal identity’, which is increasingly intertwined with bits of information such as biometric data.
The problem, ultimately, lies in the arbitrary conceptual division between the ‘real’ and the ‘unreal’: the tangible realm, to which elements like the person’s material body belong, and the realms considered ‘unreal’ because of their intangibility, to which the data of those bodies (and the data of the bodies’ activities in the world) belong.
As Irma Van der Ploeg would say, “as long as the body is considered a material reality, the information that derives from it will be regarded as socio-cultural matter”.
The transformation of the self into data is not a mere pragmatic operation facilitating the relationship between humans and machines. Rather, it reflects the idea that the self is constituted through this process of datafication and through the connection with the platform that gathers the data. My data is not different from me.
Within the logic of the data economy this idea becomes evident, as this information is used to isolate us, classify us, and target us. We are (re)constructed from these bits of information, retrieved from various sources, with conjecture filling the gaps. Thus an other-self is born, endowed with a data body.
In their 1998 book, Flesh Machine, the Critical Art Ensemble defines the data body as “the total collection of files connected to an individual: From the moment we are born, and our birth certificate goes online, until the day we die and our death certificate goes online, the trajectory of our individual lives is recorded in scrupulous detail. Education files, insurance files, tax files, communication files, consumption files, medical files, travel files, criminal files, investment files, files into infinity....”.
In the era of data capitalism, the data body works as a perverse substitute for the actual body, on the basis of which decisions are made, but whose existence is completely opaque. We will never know who assembled it/us, which pieces make up our virtual selves, nor how much weight each of them carries. It is like a voodoo doll that lacks autonomy and dignity, and that is secretly used to manipulate us. Data capitalism is like black magic.
Those who defend the personal data approach from a proprietary perspective, arguing that people should be able to use their data as a commodity for exchange, lose sight of the critical imbalance of power at play in this negotiation. The same goes for those who argue that, in the face of data exploitation, the solution is to leave the digital realm altogether. Both approaches proceed from a logic of privilege that disregards a wide array of human realities.
The conceptual leap that seems necessary consists in changing the mindset from ‘having a body’ to ‘being a body’. These doppelgängers or digital shadows are not something we have, but something that to some extent we are. There is not a single ‘self’ linked to material reality. Each one of these shadows is a fragment of our self that adds up to an expansion towards the digital, a new extension, a new layer of what Daniel Stern has called the ‘layered self’: “We are made up of layers, cells, constellations”... of bytes, data and metadata.
And so, if we admit as a starting point the fundamental right of the individual to have a certain degree of control over their body and their identity, the opacity of the data body becomes immediately unsustainable.
Our digital identity, reflected back to us in the advertisements we come across while browsing the internet, becomes a mirror that doesn’t show our face but our doppelgänger’s. It provokes fear or loathing to find, as Ruyq points out, that “our data double doesn't resemble our understanding of ourselves”. Mined, stored, transmitted and traded by others, used to make decisions that affect us but in which we cannot participate, the data body is transformed into a voodoo doll in the hands of someone else. Its impact on our lives is unpredictable and beyond our control.
- Marianne Diaz is a Venezuelan lawyer, digital rights activist and fiction writer, currently based in Santiago, Chile. Her work focuses mainly on issues regarding online freedom of speech, web filtering, internet infrastructure and digital security. She founded the digital rights NGO Acceso Libre, a volunteer-based organization that documents threats to human rights in the online environment in Venezuela.
- Vladimir Garay is Advocacy Director at Derechos Digitales, a Latin American non-profit digital rights organization. Vladimir is a journalist and holds a bachelor’s degree in Social Communication from Universidad de Chile.
Source: https://botpopuli.net/data-black-magic