Three to five dollars: that’s the answer. As simple as that. I am talking about the behind-the-curtain market for personal data that sustains machine learning technologies, specifically the development of face recognition algorithms. To train their models, tech companies routinely buy selfies, as well as pictures or videos of ID documents, from low-paid micro-workers, mostly from lower-income countries such as Venezuela and the Philippines.
Josephine Lulamae of Algorithm Watch interviewed me for a comprehensive report on the matter. She shows how, in this globalized market, the rights of workers are hardly respected – both in terms of labour rights and of data protection.
I saw many such cases in my research over the last two years, as I interviewed people in Venezuela who do micro-tasks on international digital platforms for a living. Their country is affected by a terrible economic and political crisis, with skyrocketing inflation, scarcity of even basic goods, and high emigration. Under these conditions, international platforms – which pay little, but in hard currency – have seen a massive inflow of Venezuelan workers since about 2017-18.
Some of the people I interviewed simply could not afford to refuse a task that paid five dollars – at a time when Venezuela’s monthly minimum wage had plummeted to as little as three dollars. They do tasks that workers in richer countries such as Germany and the USA refuse to do, according to Lulamae’s report. Still, even the Venezuelans did not always feel comfortable doing tasks that involved providing personal data such as photos of themselves. One man told me that before the crisis, in better conditions, he would not have done such a task. Another interviewee told me that in an online forum, there were discussions about someone who had agreed to upload some selfies, later found his face in an advertisement on some website, and had to fight hard to get it removed. I had no means to fact-check whether this story was true, but the very fact that it circulated among workers is a clear sign that they worry about these matters.
On these platforms that operate globally, personal data protection does not work very well. This does not mean that clients openly violate the law: for example, workers told me they had to sign consent forms, as prescribed by the European General Data Protection Regulation (GDPR). However, people who live outside Europe are less familiar with this legislation (and sometimes with data protection principles more generally), and some of my interviewees did not fully understand the consent forms. More importantly, they have few means to contact clients, who typically avoid revealing their full identity on micro-working platforms – and therefore, workers can hardly exercise their rights under the GDPR (the rights to access, rectification, erasure, etc.).
The rights granted by the GDPR are comprehensive, but they do not include property rights. The European legislator did not create a framework in which personal data can be sold and bought, opting instead to guarantee inalienable rights to each and every citizen. Yet this market exists and is flourishing, to the extent that it is serving the development of state-of-the-art technologies. Its existence is problematic, like the ‘repugnant’ markets for, say, human organs or babies for adoption, where moral arguments effectively counter economic interest. It is a market that thrives on global inequalities, and it reminds us of the high price to pay for today’s technical progress.
See the full report here.