
A Conversation about the Delegability of Data Rights

Nils Gilman, Matt Prewitt, E. Glen Weyl, Jennifer Lyn Morone, Neil Lawrence, Sylvie Delacroix, Jef Ausloos, Divya Siddarth

March 19, 2021

The question of whether individuals can (partly or wholly, temporarily or permanently) delegate data rights to intermediaries such as trusts, cooperatives, agents, or collective bargaining entities is at the center of numerous global policy discussions. The following exchanges – edited lightly from a longer conversation hosted by RadicalxChange Foundation – shed some light on this important question.

Speakers

Nils Gilman is the Vice President of Programs at the Berggruen Institute, in which capacity he leads the Institute’s research program, directs its resident fellowship program, and is also Deputy Editor of Noema Magazine. He has previously worked as Associate Chancellor at the University of California, Berkeley, as Research Director and scenario planning consultant at the Monitor Group and Global Business Network, and at various enterprise software companies including Salesforce.com and BEA Systems. Gilman has won the Sidney Award (for long-form journalism) from the New York Times and an Albie Award (for international political economy) from The Washington Post. He is the author of Mandarins of the Future: Modernization Theory in Cold War America (2004) and Deviant Globalization: Black Market Economy in the 21st Century (2011), as well as numerous articles on intellectual history and political economy. He holds a B.A., M.A., and Ph.D. in History from U.C. Berkeley.

Matthew Prewitt is RadicalxChange Foundation’s president, a writer and blockchain industry advisor, and a former plaintiff’s side antitrust and consumer class action litigator and federal law clerk.

E. Glen Weyl is a political economist and social technologist whose work focuses on harnessing computers and markets to create a radically equal and cooperative society. He is the Founder and Chairperson of the RadicalxChange Foundation, a Principal Researcher at Microsoft Research, and a lecturer at Princeton University. Glen was recently honored as a Bloomberg Top 50, one of Wired Magazine’s 25 leaders shaping the next 25 years of technology, and one of Coindesk’s most influential people in blockchain for 2018.

Jennifer Lyn Morone is RadicalxChange Foundation’s CEO and a multidisciplinary visual artist, activist, and filmmaker. Her work focuses on the human experience with technology, economics, politics, and identity, and the moral and ethical issues that arise from such systems. Her interests lie in exploring ways of creating social justice and an equal distribution of the future. Morone is a trained sculptor with a BFA from SUNY Purchase and earned her MA in Design Interactions at the Royal College of Art in London with Dunne and Raby. Her work has been presented at institutions, festivals, museums, and galleries around the world, including ZKM, Kunsthalle Düsseldorf, Ars Electronica, HEK, the Martin Gropius Bau, the Science Gallery, Transmediale, SMBA, Carroll/Fletcher Gallery, panke.gallery, Aksioma, and Drugo more, and has been featured extensively in international media outlets such as The Economist, WIRED, WMMNA, Vice, the Guardian, BBC World News, Tagesspiegel, Netzpolitik, and the Observer.

Neil Lawrence is the inaugural DeepMind Professor of Machine Learning at the University of Cambridge. He has been working on machine learning models for over 20 years. He recently returned to academia after three years as Director of Machine Learning at Amazon. His main interest is the interaction of machine learning with the physical world. This interest was triggered by deploying machine learning in the African context, where ‘end-to-end’ solutions are normally required. This has inspired new research directions at the interface of machine learning and systems research; this work is funded by a Senior AI Fellowship from the Alan Turing Institute. Neil is also a visiting Professor at the University of Sheffield and the co-host of Talking Machines.

Sylvie Delacroix is Professor in Law and Ethics at the University of Birmingham, which she joined in January 2018, coming from UCL where she was a reader in Legal Theory and Ethics, with a fractional appointment in UCL Computer Science. Prior to that, Sylvie was the Evelyn Green Davis Fellow at the Radcliffe Institute for Advanced Study (Harvard University, 2004–05), a lecturer in Law at Kent University, and a post-doctoral scholar at Trinity College, Cambridge University. While at UCL, Sylvie Delacroix was the founding Director of the UCL Centre for Ethics and Law, as well as of the UCL Virtual Environments and the Professions Group. Professor Delacroix’s work has notably been funded by the Wellcome Trust, the NHS, and the Leverhulme Trust, from whom she received the Leverhulme Prize. Professor Delacroix has recently been appointed to the Public Policy Commission on the use of algorithms in the justice system (Law Society of England and Wales).

Jef Ausloos is a postdoctoral researcher at the University of Amsterdam’s Institute for Information Law (IViR), where he is part of the personalised communications project team. His research centers around concepts such as transparency, empowerment, and autonomy. Jef is an affiliated researcher at the University of Leuven’s Center for IT & IP Law (CiTiP), where he obtained his PhD, ‘The right to erasure: safeguard for informational self-determination in a digital society?’, in 2018. He has also worked in academia and civil society in the US and Hong Kong.

Divya Siddarth works on building, testing, and studying impactful technology. Her work covers a broad range of applications at the intersection of technology and society, including digital work, political communication, digital security and privacy, and tech-augmented cooperation and collectivization. She currently holds the position of Associate Political Economist and Social Technologist at Microsoft, and has done extensive fieldwork in urban and rural contexts, studying and implementing large-scale technology interventions for societal good. Her work has been published in the ACM Conference on Human Factors in Computing Systems, the ACM Conference on Computing and Sustainable Societies, the ACM Conference on Information and Communication Technologies for Development, and Information, Communication, and Society. She has previously taught classes at Stanford University in both the computer science and political science departments, in collaboration with the Digital Civil Society Lab.


Nils: The question we’re getting at is, exactly who should have decision rights over data that arises at the intersection of different people and institutions. Right?

Matt: Right. In order for ordinary people / consumers / users to have bargaining power over their data, they need to be able to group their interests. And that might entail mechanisms that allow them to bind each other in common decisions. This is important because a number of policymakers seem to think of rights to data as attaching to individual persons inalienably, in such a way that they might not be able to be assigned to collective bargaining agents or intermediaries that would help groups make decisions. To explore this, perhaps we need to start by exploring the way that data always touches the interests of many people at the same time.

Glen: I would start with a philosophical point from Wittgenstein. Wittgenstein says it’s impossible to think about language or information in a completely individual, separated context. He says there’s no such thing as a private language. There are stories that are versions of this, such as: does a falling tree really make a noise if no one hears it? This is because information is not really about individual processing, but an act of communication between people. Almost everything online is like that. The whole concept behind the internet is the conveyance of a packet from one person to another person. There is always a sender and a receiver, and often much more than that, for example an email or blog where the audience is larger than one. All these things are ultimately part of a relationship between people. So there is almost no category of information – perhaps none at all – that is actually separate or individual. In fact, even things we want to keep private, we actually want to keep private to a group. For example, even if I were having an illicit affair, that is actually not an individual piece of information, because it pertains to me and to the person I’m having the affair with. So that’s just an example.

Jen: What about physiological information, such as my heart rate when I go for a run? Is that separate, individual information?

Glen: Even there I would argue that what’s going on in the interaction between your body and the tracking mechanism is effectively that a doctor is listening to your heart. The doctor is just implemented through some digital tool.

Nils: I like this provocation that the data we care about is essentially intersectional. But there’s a huge amount of data that’s not produced by or about individuals. You could imagine a huge sensor network about things growing in a field. Is it not data at all until a farmer or a corporation wants to do something with it?

Glen: I think what you’re describing goes back to foundational discussions about property. Are things that are out there in the world part of me, or are they out there in the world? The traditional Lockean approach is, as I mix my work or use or observation with it, it becomes part of me. In a similar way, data in the world put there by someone is partially part of them. But I also take exception to the Lockean account in some ways, because it ignores the ways that society gave you the context in which you were able to appropriate that thing. The sensor network wasn’t just put there by you; it was put there by you and some context around you that makes sense of it.

Nils: I agree with that. I think this is the key point. There is an infinite amount of theoretically possible data, but only a tiny fraction of that is being collected. And some human being, ultimately, always made a decision to collect it. It’s the organization of this data that makes it valuable. Now the claim of the big platform companies is that because they are organizing the data, they are adding this value. Now, I would push back against this. Our challenge is to explain why we are pushing back against it.


Neil: It’s interesting in this context to consider the meaning of the rights we have to data and what we can do with them. Paragraph 24 of the Data Governance Act says that the rights in the GDPR cannot be collectively mandated. This is a really strong statement that surprised a lot of people. But it’s not entirely clear what it means. If I can’t mandate a right, does that just mean it’s not a one-way door – that I cannot be contractually bound and never recover the right? Because in that case, I really like it. But I’m not sure that’s what it means. If you say something’s inalienable, does that mean I cannot give it away ever, similar to my basic right to liberty? At the same time there are other cases where making a right inalienable – such as the right to contract with whomever you want – would make it impossible to collectivize the right.

Sylvie: Paragraph 24 mentions delegatability. When I recently chatted with Paul Nemitz, who has played an active role as advisor to the European Commission, he pointed out that even the authors of that provision agreed that the wording was too broad, in at least one respect: it doesn’t discriminate between different data rights. This is an interesting starting point: Why should we discriminate between different data rights? Perhaps we need to think a bit harder about the difference between, say, the right to erasure on the one hand, and the notion of having one’s consent sought for X or Y on the other. Paul Nemitz pointed out that the worry underlying this provision was consent. I initially thought this was very strange, because surely if you can mandate your rights you don’t lose your right to consent; rather, you consent better by setting the terms according to which your right to consent can be used.

Pragmatically, we are still at the beginning of a long story in terms of thinking about which data rights are closer to the economic dimension of data, and hence should be more easily delegatable, and which data rights could be deemed to encapsulate in a stronger way the human rights motives behind the GDPR. That seems to be the thinking at the moment behind the political clash. There are people who see any sort of mandatability as compromising hard earned rights. Then there are other people that say you cannot stand in the way of emerging forms of data sharing that are badly needed. What fascinates me from a theoretical perspective is that data rights are a clash between the property dimension and the human rights dimension, and we’re never going to be able to isolate one from the other.

Pragmatically, however, I don’t understand the fascination with mandatability. It seems to me that things like data coops can do their jobs no matter what. At least for some rights, like the right to erasure, it would be better if the rights could be mandated to coops. But do data coops absolutely need delegatable rights or strong mandates?

Matt: I think there’s an informative parallel between the question of mandatability of data rights and “right to work” laws in the United States. In the United States, anti-union activists have pushed through state laws that purport to defend individuals’ right to contract with purchasers of their labor. They make it more difficult for individuals to delegate to unions their authority to bargain with employers. There’s really no mystery about what’s going on here – these are efforts to keep wages down, to keep bargaining power in the hands of large employers, and to prevent unionization.

In the context of data, restraints on delegability could prevent a shift in bargaining power that would result from people being able to combine their rights. The reason the rights need to be to some extent alienated is that binding group decisions about data use by coops are needed to prevent data subjects from responding to incentives from platforms to “cross the picket line”, so to speak.

Glen: Consider the case of social network based information. We saw in the Cambridge Analytica scandal that a crucial element of data rights violation comes through social relationships because I have access to information about my friends, so someone who has access to information about me has access to information about them. Now suppose we try to do collective bargaining, but all maintain our individual rights to portability. And suppose somebody tries to obtain some of the information that I am allowed to port, and therefore pertains to someone else. Any unscrupulous actor could set up a bidding process where they try to assemble all the information for the lowest possible cost by finding the people who are willing to sell it for the least. This could be a race to the bottom.


Matt: I can see people organically banding together to do things like actively providing data to entities they support. But without some kind of temporary delegation of authority to intermediaries, it’s hard for me to imagine sufficient coordination to do things like switching platforms en masse. For example, a group of friends of mine recently suggested moving a long-running text thread from WhatsApp to Signal. An overwhelming majority wanted to move, but we all stayed with WhatsApp because there were a few holdouts, important to the social fabric of our little group, that we did not want to leave behind. We needed a democratic decision process.

Nils: There are lots of examples of things like people having a temporary period where they delegate their authority. Think about athletes, for example. If you get traded from the Yankees to the Dodgers, you don’t have a choice about whether you do that, and that’s under a contract that you earlier agreed to.

And obviously no one here is advocating that you sign over your data rights permanently. Because we do want some competition. You want different intermediaries to put together different value propositions that people can opt in or out of. Different levels of privacy, monetization, social good that people are enabling these organizations to pursue. I do think the idea of competition is something that we want to affirm.

Matt: I agree. I think time granularity is the key lever there. If you can’t delegate your authority for any period of time, then people can’t coordinate, and the well-capitalized parties on the other side of the market exploit them. On the other hand if people can delegate their rights irrevocably, they’ll be exploited by whomever they delegate them to.

Nils: Right, let’s go back to the example of sports. Sports teams demand contracts – athletes can’t go play for a rival team whenever they want – because the team needs to be put together with a reasonable amount of stability, in order to be effective. At the same time, we want free agency, so that at the end of a period of time, athletes can take their talents to a rival.


Jef: Mandatability for periods of time isn’t always important. For example, there was a recent case where individuals dealing with a German credit rating agency accused of discrimination obtained their individual data via the GDPR, and assembled it to prove discrimination. So there, they didn’t need the rights to be held for any period of time. Just getting the data for one second allowed this to happen. We need to make distinctions between different ways of getting value out of data; there is a danger of putting everything in one pile.

Sylvie: Jef I think you’re completely right. There is also a distinction between cases where you have centralized data collection, and cases where you don’t. In the latter case, the mandatability and the time framework we were talking about is more important.

Divya: To me a lot of this relates to advances in privacy preserving machine learning, where less data would be centralized through things like federated learning or multiparty computation. This would allow us to train on more sensitive data without compromising privacy. And I think that would require some level of mandatability, because there’s no way to access the data later.

Neil: Even with federated learning though, something of the problem Jef identified is still present. Because if I gain a single insight, then it doesn’t matter if everyone then leaves.

Matt: This is important, and it’s integral to my view of what this collective bargaining should look like: collective bargaining entities themselves have to be precluded from fully alienating the data to further third parties. So whenever we’re talking about delegation, we’re talking about delegating the right to exercise an interest in data that does not include the ability to disclaim all interests in it. This may be the core of what makes this so confusing. Because we need to be able to delegate something, but we can’t be able to alienate everything.