The dilemma around digital rights in education
It is necessary to advocate for retrieving the notion of public data as a common good
The Covid-19 pandemic imposed the fastest and widest adoption ever seen of technologies for distance learning and communication between students and teachers. These adoptions often lacked due transparency about the donations or public procurement involved. This situation left the right to education facing a real "dilemma" of rights: transparency versus privacy. While it is necessary to reaffirm the right to privacy and to prevent setbacks, it is also necessary to advocate for retrieving the notion of public data as a common good. The fulfillment of this idea depends on public governance of the digital infrastructure, with the State placed as the guarantor of these digital rights.
There is a complex personal data market in the world today, in which we are all active participants — whether we are aware of our role or not. Our "digital footprints" are collected at every moment and place, on every device, fixed or mobile, on the sites we browse. Such a constant flow of data is stored and shared among different institutional and corporate actors, formally and informally, in a complex value chain of a lucrative and invisible market.
Data obtained and monetized through technological surveillance has become so central to the functioning of this "new economy" based on the Internet that it has inspired a new guise for capitalism: surveillance capitalism (Zuboff, 2015). One of this system's main inputs is the flow of individual attention, and the data collected are raw material for analyzing and forecasting tastes, interests and desires (Silveira, 2017).
Education, especially public education, is a coveted territory. First, because the state collects a large amount of citizen data to operationalize policies and services. Second, because this drive toward the economic exploitation (or "commoditization") of personal data finds and amplifies another: the privatization of education (Adrião & Domiciano, 2018). The possibility of collecting personal data and attention becomes worth more than the software itself, and these products come to be "donated" by interested companies.
Such mechanisms of misappropriating the educational community's personal data, and diverting it from its intended purpose, can potentially deepen inequality gaps and make groups historically exposed to exclusion and discrimination even more vulnerable.
The right to data protection
The significance of the debate on privacy is situated unequivocally in the field of fundamental rights and freedoms. Since the 1948 Universal Declaration of Human Rights, human rights texts have sought to protect the right to privacy and to safeguard all forms of telephonic, telegraphic or telematic communication, such as the Internet. This protection applies not only to public authorities, but also – and increasingly – to private actors (Comparato, 2010).
This principle is maintained, but today the idea is gaining ground that privacy also includes the individual's right to control what is collected about him or her — data protection. Personal data are those that refer to identified or identifiable living persons, including data that, in aggregate, may lead to their identification1. The issue has gained new momentum with a wave of approval of specific laws over the last decade.
Among all the personal data to be protected, two dimensions deserve special attention: so-called sensitive data and data belonging to children and adolescents. For such data, the legislation reserves specific processing conditions and extra layers of protection.
The idea that "privacy is dead" — propagated by the technology industry — is one of the major obstacles to claiming the right to data protection. To the general public, this bargain may sound fair: what is the problem in providing personal information in exchange for free services such as games and communication apps that are useful to me? After all — people often say — "I have nothing to hide". The problem is that they are often unaware of the extent of the data collected about them, of the uses to which it can be put, and of the immediate and future consequences of this concession.
Risks of improper processing
In digital markets, this data is often used for data profiling — automated processes to build detailed individual profiles aimed at "predicting" and inducing behavior. This occurs through the collection and analysis of "digital footprints" during internet browsing and application use.
Digital profiling classifies people individually into categories, according to "scores" for education, employment, political views, health interests, religion and ethnicity, media usage, consumption, income, economic stability and personality. These profiles also include analysis of online behavior, including the types of sites and content visited, and interests. Acxiom, one of the companies with the world's largest consumer databases, claimed as early as 2013 to hold up to 3,000 attributes on 700 million people. Oracle, a technology giant, claims to supply more than 30,000 items on 2 billion profiles (Christl, Kopp & Riechert, 2017).
This market and its "profiling" techniques are not just for advertising purposes. The world of work and employment, the real estate sector, insurance and credit companies, even the dynamics of democracy and electoral debate, the justice system and the social welfare state — all these fields are beginning to be affected by the use of technologies, algorithms and automated decisions that feed on this data. And that is why, when targeted at traditionally marginalized groups, these technologies can accentuate inequality, creating a real "feedback loop of injustice" (Gangadharan, 2017).
That is why data protection is not only about violations such as leaks, system invasions or improper data exchanges. The legal debate on privacy has often been framed by the notions of consent and data processing. Through consent, citizens who submit their data declare that they agree to the intended collection and processing. It is in privacy policies — the "fine print" contracts of online applications and services — that this consent is usually collected. Educational institutions are considered responsible for the processing insofar as they determine the purpose and means of any of these operations involving the educational community's personal data, even if they opt to hire third parties.
These third-party actors – edtechs – are increasingly present on the scene. The term, a contraction of "education" and "technology", broadly refers to companies that produce products for the educational sector — hardware (equipment) and software (applications, programs and systems). The fad of attaching the suffix "tech" to an industry prefix (agro, gov, ad, fin, legal, health) is generally intended to highlight the innovative aspect of the segment's technologies, whether the companies are startups (nascent companies) or not. In Chile, there were 100 such companies; in Brazil, 449 (CIEB, 2020; Omidyar, 2019). To thrive, this market depends on infrastructure: telecommunications, electricity and internet access. It is therefore usually accompanied by strong lobbying for the expansion of such public-private programs and for the use of technologies inside and outside schools.
There is still no general, systematic published mapping of data collection and surveillance practices in the field of education, especially in Latin America and the Caribbean. Much is discussed about the role of big tech in the violation of digital privacy, but smaller edtech companies may also have business models linked to the collection and transfer of personal data to third parties, or to the targeting of advertisements and personalized content at users. This objective is not always explicitly stated. The tools may be purchased directly by schools or adopted by education departments. Often, adoption is offered free of charge for use in public school networks, through terms of cooperation between networks and institutions such as foundations or private institutes. Precisely because these practices are not yet on the radar of studies on surveillance in education, they demand attention.
The lack of adequate materials in schools and the underfunding of education faced by the countries of the region make schools more susceptible to the uncritical adoption of "free" technological tools that collect data from the school community. This also makes it difficult to advocate for the development of in-house software or customized solutions, since their costs will be seen as higher than those of "donated" tools.
Payment for such services is usually made in the form of students' and teachers' personal data, a price not uncommonly considered fair by those responsible for educational institutions in exchange for an expensive service. Not only governments and public education secretariats or departments should be held accountable for personal data protection policies: every unit that operationalizes public education policy and deals with technologies, whether administrative or pedagogical, is also responsible.
The educational community must know exactly what is being done with their data. Hence the need for processing to always be accompanied by transparency and accountability policies. As long as they do not expose individuals, data must be treated with maximum transparency, as a common good — including that of the source code of the technologies adopted for educational activities. The right of access to information and the right to privacy are not discordant rights, but complementary. Especially in times of pandemic, they will be fundamental for the full guarantee of the right to education in the digital environment.
Fernanda Campagnucci is executive director of Open Knowledge Brazil. A journalism graduate with a master's degree in Education, she is studying for a PhD in Public Administration from FGV-SP. https://twitter.com/fecampa
Adrião, T. & Domiciano, C. A. (2018). “A Educação Pública e as Corporações: Avanços e Contradições Em Uma Década de Ampliação de Investimento No Brasil.” FINEDUCA - Revista de Financiamento Da Educação 8.
CIEB. (2020). Mapeamento Edtech: Investigação sobre as tecnologias educacionais no Brasil 2019.
Christl, W., Kopp, K. & Riechert, P. U. (2017). Corporate surveillance in everyday life. Cracked Labs.
Comparato, F. K. (2010). A Afirmação Histórica Dos Direitos Humanos. 7. ed. São Paulo: Saraiva.
Gangadharan, S. P. (2017). The downside of digital inclusion: Expectations and experiences of privacy and surveillance among marginal Internet users. New Media and Society.
Omidyar (2019). Scaling Access & Impact: Realizing the Power of EdTech. Chile Country Report.
Silveira, S. (2017). Tudo Sobre Tod@s: Redes Digitais, Privacidade e Venda de Dados. São Paulo: Edições Sesc SP.
Zuboff, S. (2015). “Big Other: Surveillance Capitalism and the Prospects of an Information Civilization.” Journal of Information Technology 30(1): 75–89.
1 European regulations are a reference for other legislation in this area. Available at: https://ec.europa.eu/info/law/law-topic/data-protection/reform/what-personal-data_pt#referncias. Accessed 29/10/2020.