
In this podcast series, Bruce interviews people from across different communities and industries who, in their own way, are fighting for a better web.
In this episode, Bruce talks with Jan Penfrat from European Digital Rights (EDRi) about the importance of defending digital rights as human rights in an increasingly digital world. We chat about EDRi’s work advocating for these rights within the EU, the challenges they face against Big Tech lobbying, and key battles surrounding privacy, surveillance, and the power of tech companies. Learn about how EDRi is working to ensure a better, more rights-respecting web for everyone.
Transcript
[Bruce:] Hello everybody and welcome to another edition of the For a Better Web podcast in which I, technical communications officer at Vivaldi browser, talk to somebody from an organization or on their own who is working in their own way, in their own industry to make the web better for everybody. And it is my delight and honor today to welcome Mr. Jan Penfrat from EDRi.org. I guess I say “EDRi”. Would that be correct, Jan?
[Jan:] That would be correct, yes.
[Bruce:] And it is “European Digital Rights”. What’s the I stand for?
[Jan:] I guess it’s part of rights and EDR was maybe already taken. You’d have to ask the people from 20 years ago.
[Bruce:] Okay. It’s that old, 20 years old?
[Jan:] Yep.
[Bruce:] And it is, you represent an umbrella group of different digital rights groups across the European Union, I suppose.
[Jan:] Yes. EDRi represents 50 non-governmental civil society organizations all working for the defense of digital rights, which is human rights in a digital environment, in our digitalized societies. Our office is based in Brussels and we represent these organizations mostly towards the institutions of the European Union to positively influence policy making and law making on the European level.
[Bruce:] Given that it’s 50 different civil society groups and, you know, I’ve experience of campaigning for various things in previous lives, how difficult is it to represent or express the interests of 50 diverse groups? I imagine that whereas there’s a lot of overlap, there also must be little bits of disagreement here and there.
[Jan:] Yeah, it does happen, of course, that opinions vary, but I think that’s a strength because it allows us to look at problems from a lot of different angles and it helps us to not overlook aspects that might otherwise be overlooked. I think the diversity is actually a strong point and it also, I think, in reality, many organizations focus on very specific issues. Not everybody can cover the whole range of digital rights issues that are on the table. That would just be too much.
And even here in the Brussels office, we sometimes struggle to cover everything that we should be working on, that we would like to be working on, but we try as best we can with the resources that we have at our disposal to make sure that new European legislation and partly also member states’ legislation takes digital rights into consideration, respects human rights to the extent that is necessary.
[Bruce:] Excellent. And how refreshing it is in 2025 to hear somebody say that diversity is a strength. We feel that very much in Vivaldi, but we seem to hear those words said too infrequently these days.
[Jan:] Absolutely. Yeah.
[Bruce:] Out of interest, how many people, you say you struggle as EDRi in Brussels to cover everything, how big an organization is EDRi?
[Jan:] The Brussels office has 18 staff members at the moment. Seven of them are working on policy and advocacy. And we’ve got five or six colleagues in the communications campaigns department. And then there is of course the typical leadership team, operations, fundraising, all of these aspects that are really important. And we do try to pool, of course, the resources that we have in our membership to bring all the expertise that we represent across the continent into the discussions and into the policy debates that happen in Brussels. And so I think we have quite a good standing in the debate. Some people would say probably punching above our weight, but there’s always room to grow.
And, I think, a necessity to grow, especially if we compare the resources that we have with the number of, let’s say, big tech lobbyists that exist in this city. To give you a sense of scale: every single one of the big tech corporations has a large lobbying office of their own here in Brussels. The Microsoft office is across three stories, three levels. And I think they have a dozen lobbyists just for the lobbying part themselves.
And that’s at least true for each of the big tech companies, Apple and Amazon and Meta and X and so on and so forth. X, I guess, might be less well represented by now, but I don’t have the latest numbers. And then on top of that, obviously each of these companies is a member of half a dozen industry associations, and funds or co-funds think tanks and research institutes and all kinds of things that give them an additional, sometimes quite untransparent, voice. And that obviously gives them quite some weight in the debate as well.
[Bruce:] I would imagine that the combined might of Microsoft, Google, Apple, Amazon, X, ByteDance, et cetera, has a budget that perhaps even exceeds that of EDRi.
[Jan:] Oh, by far, absolutely. Every single one of those companies by far exceeds our budget. I mean, our budget is very transparent. We’re in the range of 2 million and a bit at the moment of annual budget for the Brussels office. And so every single one of these companies has many more millions at their disposal. And then obviously if you combine that, you end up with really big numbers.
Nobody knows exactly how much lobbying money big tech spends, because even though there is a formal, what is called a “transparency register”, in which each lobbyist has to register towards the EU institutions, the information there is rarely verified. So it’s just self-reporting in a way. And then some of the spending that is happening is also concealed. It’s tucked away as, oh, we’re supporting a little bit of journalism here. Google supporting journalism, or Meta supporting fact checkers (or that’s in the past now) or Microsoft supporting research into how AI can make everything better. And these kinds of things aren’t technically lobbying budget, but in practice, they do influence the public debate.
[Bruce:] Absolutely, absolutely. So I’m very interested that you don’t seem to draw a distinction between digital rights and human rights. When my kids were little, I used to say, oh, you know, that’s just the internet. That’s not real life. But I find increasingly now that real life and digital life is pretty much interchangeable. Is that what you think? Is that how you regard it?
[Jan:] Yes, absolutely. I mean, there is ample evidence of digital harms. So online harms, let’s say, things that are happening online, spilling over into the offline world, be it in the form of verbal violence becoming real-world violence, up to things like genocide. But also just the continuous and increasing digitization of every aspect of our lives that was previously thought to be offline. As a digital rights organization, we have to deal with things like health policy now. We have to deal with things like the automotive industry. We need to deal with financial services.
All of these things, as they are being digitized, become part of the discussion in a way, whenever the regulation of online or digital items become somehow interlinked with or touch upon human rights, then that is a field that we will be interested in. Because there is no difference between what digital rights do and what human rights do. When you talk about digital rights, the traditional ones would be the right to privacy, for example, the right to free expression, freedom of access to information, the freedom of assembly. All of these things obviously have enormous importance for offline lives, just as much as they have enormous importance as soon as we go online.
[Bruce:] Well, I mean, arguably, privacy rights online are sometimes even more important because so much activism, so many people work together online where they couldn’t meet together in the real world because they might be geographically dispersed or they might not live in societies that are particularly anxious to see large amounts of people getting together. I know that you, EDRi, and you, as Jan Penfrat, are concerned about people’s right to chat without government surveillance at the moment. Could you talk us through what the issues are and what EDRi is attempting to do, please?
[Jan:] So EDRi has a long history of working for the protection of private communications. The privacy of private communications basically is for us one of the cornerstones that is basically an enabler right for other human rights. So if you think of the right of freedom of assembly, for example, if you can’t have private conversations about your future plans to assemble, then the right of assembly might not even be as worthwhile for you because you can’t even discuss future assemblies at all before having an intervention by the police, for example.
So the privacy of communications is something that people require to organize among themselves in order to make use of some of the other human rights, which is why we’ve been working on this for so long. Now, in the past, infringing your communications privacy required catching your letters, opening them, reading them, putting them back into the envelope and sending them on without you noticing. That was really cumbersome, and you couldn’t really do it at scale, with maybe the historical exception of the German Democratic Republic and the Stasi, who tried to scale this kind of stuff.
But today, doing this is incredibly cheap. And because it’s gotten so cheap and it’s so ubiquitous, it’s also become one of the biggest threats to digital rights that we’re seeing in the current debate. And what worries us a lot is that in the past years, it has gotten more and more commonplace for some political leaders to question, without hesitation, the fundamental right to the privacy of our communications, in particular online.
And that becomes more and more normalized in the debate. And we’re very, very concerned about that and trying to push back here in Brussels as much as we can. So one example is the discussion around the surveillance of communications in the search for child sexual abuse material. The argument from some politicians in Brussels goes that we should scan everybody’s communications with an AI in order to detect potential wrongdoing, so that as soon as someone sends anything illegal, they could immediately be caught by police. So it’s putting everybody, all of us, under constant surveillance and suspicion, just in case someone might be doing something wrong in the future. And that is something that obviously undermines a whole range of different human rights, with a very questionable positive impact, if any, on the protection of people against harms that might happen over online communication.
And that is something that we spend a lot of effort fighting back against. And I dare say, with some success in the past year or two, but it’s definitely not a fight that is won yet.
[Bruce:] By coincidence, I had a meeting with my member of parliament here in the UK on Friday to do something similar to what EDRi’s open letter to von der Leyen was asking: to make sure they actually implement the Digital Markets Act. I was asking my MP, who’s now a minister in our new government, to make sure that the UK Competition and Markets Authority lived up to its promise to regulate Big Tech. And whereas she was very receptive to that argument, she works as a minister for safeguarding in the UK Home Office, which I guess is like the Ministry of the Interior in many countries. And she was making exactly this argument that we might need AIs to protect against, well, it’s always child abuse that’s used as the example. But nobody wants to be the person who stands up and says, you shouldn’t infringe people’s basic liberties to prevent this crime, because this crime is so abhorrent. But I’ve seen no evidence that it would actually even prevent those crimes, because it seems to me, and I’m no expert, that the vast majority of these things are done in the home by people who are known to the victim, etc. But good luck with that fight.
I think that probably the fight between liberty and security is not one that’s ever going to go away. The tension between the two seems to be several hundred years old already and will continue after we’ve gone. What other things is EDRi fighting for in the EU? What are your other current battles?
[Jan:] There’s a number of battles that we’re still fighting. There’s a battle around AI, of course, and the spread of generative AI and the harms that that can do. There’s a big workstream around the power of tech companies and the reduction of that power.
So we’re putting a lot of effort into trying to reduce, well, first of all, educate policymakers about the problems with the power concentration that we were seeing with the big tech companies.
And then the next step is to reduce that power and point people to alternatives. And that is also something that we’ve been doing for a while, and it’s not going to go away anytime soon. And we’re looking at this not only in terms of market power. So when we say they have too much power, we don’t only mean, oh, they’re monopolies or something, though that as well.
But we’re looking at them in the context of societal power, political power that they can use, leverage, to gain other advantages, political advantages. They have social and societal influence. And these are questions that we’re looking at. And for a long time, this was a real threat that we’d identified, but it was somewhat theoretical in a way for people here in Europe.

Now, with Elon Musk buying Twitter and turning it into a really problematic place, and the behavior of big tech CEOs after the Trump inauguration in January, these are basically the real-world incarnations of what we’ve been warning about for several years. We always argued: if you put our entire public debate and our political conversations, our campaigning, our online campaigning, political campaigning, election campaigning into the hands of a small caste of billionaires, that gives them enormous power. And they might not have abused it quite yet, but they could at any time. Isn’t that a problematic position for countries to be in? And now we see this unfolding in a terrifyingly fast and impactful way. And I think that is really an important moment for us to amplify that argument and say, look, people, this is the moment we need to seize to make ourselves less dependent on these tech corporations who control so much of our digital lives.
And so we continue to point people to the alternatives that exist. We continue also to try to support calls for more public funding for alternatives. We’re obviously talking a lot about the use of open standards, free software, and all of that, with the goal of giving power and control over digital lives back to us, the people, rather than keeping it centralized with large tech corporations. And I can tell you, it’s not an easy conversation to have sometimes with policymakers, because when some of the policymakers hear, oh, we don’t like big tech, they say, sure, I understand, because they’re foreign corporations, we need to…
And then they come up with ideas like, oh, we need to create a European Google or a European TikTok or something to be safe. And that is where we need to jump in as a digital rights community and say, look, guys, this isn’t about whether it’s US-owned or Chinese-owned or European-owned. This is about structural problems. It’s about structural power that can be abused against the people. And that’s why, if we talk about digital sovereignty, we need to talk about the sovereignty of people and not the sovereignty of nations.
[Bruce:] I was very impressed by the letter that EDRi coordinated and sent to the EU, because you made the point that this isn’t Europe versus American tech. This is an assault by big tech on us all, just as much on the citizens of the USA as the citizens of the EU, I think. But whereas it’s relatively easy to quantify economic monopolist behavior. So, for example, this morning I’ve been reading a consultation from the UK Competition and Markets Authority saying, you know, Apple owns 80% of the market and seems to have a 100% profit margin on X or Y, which is indicative of a structural failure of competition.
Is it possible to quantify an over-influence in society in a way that you can demonstrate?
[Jan:] I think quantification, in particular in economic terms, is really hard in that area, which is why traditional antitrust tools have failed us in this regard. Because this isn’t the first power grab by large corporations in human history: we’ve had big oil, we have big tobacco, we had big pharma. We had telcos who were encroaching on the market and controlling the market.
And so these were areas where previous policies might’ve been able to solve some of these issues, including antitrust action. And there are a couple of reasons why this doesn’t work that well in the digital area, and why these companies were so immune against the tools that regulators had at their disposal. And the first reason that I would like to mention is that these markets, the tech markets work, they run with a much, much higher speed. So product development is much faster. The rolling out of products is much faster onto people’s devices.
People have much less control over what kind of products they actually have on their devices, because they don’t have to buy them. They’re just installed with the next security update, and there’s nothing you can do about it. Now everybody’s got AI in their pockets. Did we ever say we want this? No, we didn’t, but there we are. So I think that’s the second issue. And then the third issue, which I think is important to mention, is that in the past, the way antitrust law in particular defined whether there had been harm to consumers or to society was by prices.
So antitrust law would look at: does the monopoly position of a company in a given market lead to prices that are higher than they would otherwise have been under functioning competition? But if you live in a situation where the product is nominally free, let’s say Gmail, for example, or the browser that you use, you will never be able to quantify the harm in those economic terms, because the monetary price is already zero.
But the societal costs, they’re there. And that is not something that antitrust law was traditionally looking at. So we need new kinds of laws and new kinds of rules to look at these problems. And we need to politically acknowledge that size matters because size leads to influence. First size in terms of user numbers leads to influence because you can now control users’ behavior. Once you’ve got 3 billion people on your platform, you can control their behavior. You can feed them certain information and withhold other information. And that gives you political power, gives you opinion-making power, incredibly dangerous.
But then there’s also obviously the power that comes through money. So the bigger your company is, the bigger your margins, the more cash you can accumulate. And that’s what we’re seeing with big tech lobbying in Brussels. And I’m sure in Washington it’s the same thing, and in London and other places. If you have virtually unlimited resources for your lobbying spending, there is very little you cannot do. And we’ve seen this in the past years when we’ve been fighting, for example, for limits on surveillance advertising.
So we wanted to have European regulation that puts limits on what surveillance advertising can do, what kind of data can be collected and used to target ads. And the backlash from big tech lobbying was so massive. And they were in a position to easily outspend us. They put advertising literally into all major newspapers across the continent. Full-page advertising on major newspapers is very expensive. The whole Brussels airport, metro stations, the main train stations, all the advertising spaces there were covered in big tech lobbying messages, basically, because they know that in Brussels, politicians come and leave by airplane, by train. So they would see these spaces. So they spent, in our estimates, tens of millions of euros within a few weeks on having this advertising campaign. And that is obviously nothing civil society could ever achieve in terms of impact.
And so size matters, the cash availability matters, and user numbers matter. So I think that gives a good picture for why traditional antitrust tools weren’t able to limit these companies’ power and why we need new tools to do that.
[Bruce:] And of course, a recurring theme of mine, it seems to be, is again, this is not a new problem. There’s a reason why 100 plus years ago, if there was an uprising, they seized the post office where the telegraph was. There’s a reason why the Russians would always seize the TV station if they were rolling into Czechoslovakia or something like that.
I remember when I was doing my dissertation 100 years ago, and I was writing about why science fiction in English turned very dark at the beginning of the 20th century. And I remember quoting Albert Speer in his testimony at the Nuremberg war trials. He was saying the reason Hitler was so successful was because he controlled the radio and could broadcast instantly to millions of people. He didn’t depend upon layers and layers of hierarchy all to be loyal and to relay his message. He could get out there. And I think…
[Jan:] Yeah. And today what you’ve got to do is seize social media. And once you control the algorithm that decides what people see in their news feeds, and people can’t control that by themselves, then you have an incredible amount of power over where their thinking goes, where the public debate goes. And to be honest, I don’t want to over-dramatize, but what we see with X, and with Meta as well, is that that’s one of the first things happening. So we first see, okay, if you’re on X, basically you see Elon Musk’s content and his friends’ content all the time, whether you follow them or not. There were rumors that after Trump’s election, people were set to auto-follow the new president and the vice president, even though they had never chosen to follow these people. All of this is very hard to prove, because there is no insight into the algorithms. We can’t see in real time how this is happening. And that makes it so dangerous.
Now, some of the laws that were passed in the European Union were designed to increase the transparency and have some more control over these things, namely the Digital Services Act, DSA, and the Digital Markets Act, DMA. But it’s unclear as of now, because these are relatively new laws, how much impact they will eventually have. We hope that they will have a good impact. We were quite happy with what the DSA in particular delivers, the DMA as well in terms of structural market power.
But it’s too early to tell whether it will actually yield the results that we hope it will, and whether the European Commission, which is the main enforcement authority under these laws, will be in a position to actually enforce them properly. Will they have enough staff to go into companies and investigate and interview employees, and so on, and seize documents? In theory, they have that power now. But do they have the skills? Do they have the teams that can actually do this quickly enough? It’s an open question. We very much hope so.
[Bruce:] So your worry about the enforcement of the Digital Services Act and the Digital Markets Act isn’t that the political will is waning or wilting, but that these simply aren’t resourced or staffed enough? Full disclosure to our listeners: I worry because, as well as technical communications officer of Vivaldi, I’m also the entirety of its lobbying team. I worry, particularly in the United Kingdom where I live, which of course sadly is not part of Europe, that the political will to actually enable these organizations to enforce might be disappearing. We’ve seen in the UK that the chair of the Competition and Markets Authority was forced out, invited to leave, whatever, and replaced by somebody who was running Amazon UK and Amazon China, which seems to me indicative that the political masters’ enthusiasm to fight big tech is waning. Is that your take? But your take in the EU is that the political will is still there, it’s just a resourcing problem, or am I…
[Jan:] At the moment that’s our impression. I fully agree with you that what happened in the UK with the Competition and Markets Authority is wild. How can anybody in the UK government think that it might be a good idea to put the fox in charge of the hen-house, as it really is in this case? And I think it would be a tremendous mistake to repeat things like that on the European level. In the past, we’ve had the impression that the European Commission wanted to make these laws a success.
At the time it was the European Commissioner Thierry Breton from France who was still in charge, and he was always a person of big words and big gestures and big actions. And now he’s gone and has been replaced with a new colleague, Henna Virkkunen, and she is now responsible for enforcing those laws. When those laws were debated, we argued that enforcement should be under the responsibility of an independent enforcement agency, in order not to require political will to make those laws work, to make it independent from political bargaining.
That didn’t happen. Interestingly, the main argument at the time to not give enforcement powers to an independent agency was, “Oh, if we have to set up an independent agency first, it will take too much time and cost too much. So let’s just put this into the European Commission’s hands.” And so we were very worried about that from the start. And now it looks like this has been set up quite nicely, but I agree that it requires continued political support from commissioner levels, from the highest political level in order for the staff to do their work properly. So we look at the officials working for the commission. They’re all hardworking people, very well informed about these topics. They have been working on these issues for a long time.
And I think there is a lot of desire to make this work, but they can’t do this without their bosses up high agreeing with these actions, in particular when it comes to imposing very high fines. So we’re talking 6%, up to 10%, of the annual global turnover of these companies. So we’re talking billions of dollars in fines. And technically these fines should come very soon. And we’re all crossing our fingers that that actually happens and that there isn’t some political bargaining going on in the background.
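For a rough sense of what those percentages mean in practice, here is a back-of-the-envelope sketch; the turnover figure is a hypothetical round number for illustration, not any specific company’s accounts.

```python
# Fine ceilings Jan mentions: up to 6% of annual global turnover (the DSA
# level) and up to 10% (the DMA level). Turnover figure is hypothetical.

turnover_usd = 200_000_000_000  # assumed $200B annual global turnover

fine_6_pct = turnover_usd * 6 // 100    # 6% ceiling
fine_10_pct = turnover_usd * 10 // 100  # 10% ceiling

print(fine_6_pct, fine_10_pct)  # 12000000000 20000000000
```

Even the lower ceiling lands in the billions, which is why enforcement decisions at this scale are so politically charged.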
Now, I don’t know what I would do if I was commissioner and then a President Donald Trump comes around the corner and says, “Ah, if you fine my Elon Musk buddy’s precious X, we will leave NATO,” or something like that. That would be a really, really tricky decision for the European Commission to take. I hope it doesn’t come to that, but we live in wild times.
[Bruce:] We live in wild times. I note that EDRi has ceased posting on X. Is that just a complete coincidence?
[Jan:] It is not a complete coincidence. It’s something we’ve been planning for a while. We’ve been debating it for at least two years and then planning concretely for quite a while already. And it was high time for us to leave. There is no excuse, I believe, for anyone to continue to feed into this machinery that is X today. Sometimes I hear the argument, “Oh, but we’ve got to reach our audiences where they are,” da, da, da. I think there is a big misunderstanding about what kind of audiences actually remain on X. Even though, of course, it does always hurt to leave followers behind. We left over 35,000 followers behind, and that hurts. It does.
But I think if we want to build a better web for the future, including the social web, if we want to build an open social web that works for everybody, it can’t be a centralized commercial platform such as X.
[Bruce:] I completely agree. Just after Christmas, I finally left X, and again, I left 30-odd thousand followers. But it was when Mr. Musk personally accused my MP, of whom I’ve spoken earlier, of being a “rape genocide apologist”, for which she’s getting death threats. And given that before she was a member of parliament she worked for a charity that helps women escape domestic violence, it was a very bad thing. And I just said, “I can’t personally support this man anymore,” for exactly the reasons you said.
And I should mention, by the way, talking about my MP, because she was worried about the government not being allowed to scan messages about child abuse, et cetera. She is on the side of good. When I spoke to her on Friday, and I told her about the Competition and Markets Authority, she was shocked because she didn’t know that because it’s a different government department. And she’s promised to write to the government to find out exactly what’s going on. So, Jess, if you’re listening, thanks for doing that. Hurry up and write.
I know tempus fugit, Jan, and you’re a busy chap, but we were talking about algorithms. We were talking about social media stuff. But also, with free products, and Vivaldi is a browser that costs zero to its users, we depend upon search revenue. And search revenue is derived, ultimately, from advertising. Every time a little Microsoft Clippy pops up and says, “It looks like you want to fix capitalism,” I acknowledge that I possibly can’t all by myself, and therefore advertising’s a part of life. And I notice that one of the members of EDRi is the Chaos Computer Club, whom I believe are German, and that one of your personal interests at EDRi is surveillance advertising and surveillance capitalism. What’s your take, and by all means it can be Jan’s take rather than EDRi’s take: Can we wind back from this kind of utterly profiling everybody to the nth degree to micro-target them? Should we? Could we? What is your take?
[Jan:] Yeah, I think we should and we could. The technology is there, right? It’s not witchcraft. If anything, surveillance advertising is the witchcraft, where stuff is happening in the background that people can’t understand and can’t see. If you go into the streets and ask people, “Do you know what real-time bidding is?”, I bet that most people won’t be able to explain it, or will never even have heard of it. And yet, it happens every single time they open a website or open an app that they’ve installed, or most of them at least.
And every time they suffer a massive privacy breach by their advertising profiles being shared with dozens, if not hundreds of advertising partners who then bid on their attention, and the highest bidder gets access to your eyeballs. And so this is an incredibly intrusive system and very, very well-hidden from the user and their daily experiences. Now, is advertising required to fund free services, such as the Vivaldi browser? Yes, probably yes. But can advertising function without that intrusiveness? It absolutely can. And again, the technology is already there.
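The auction Jan describes, a profile shared with many bidders and the highest bid winning the ad slot, can be sketched in a few lines. This is a deliberately simplified illustration: the bidder names, prices, and profile fields are invented, and real real-time bidding runs over network protocols such as OpenRTB rather than in-process function calls.

```python
# Minimal sketch of a real-time bidding (RTB) auction. Every bidder sees
# the user's profile before bidding -- that sharing is the privacy breach
# discussed above. All names and numbers here are hypothetical.

from dataclasses import dataclass


@dataclass
class BidRequest:
    # In real RTB this can include location, device, browsing history,
    # and inferred interests.
    user_profile: dict


def run_auction(request: BidRequest, bidders) -> tuple[str, float]:
    """Share the profile with every bidder; the highest bid wins the slot."""
    bids = [(name, bid_fn(request.user_profile)) for name, bid_fn in bidders]
    return max(bids, key=lambda b: b[1])


# Hypothetical bidders: each inspects the full profile to price its bid.
bidders = [
    ("car_ads", lambda p: 1.20 if "cars" in p.get("interests", []) else 0.10),
    ("shoe_ads", lambda p: 0.50),
]

winner, price = run_auction(BidRequest({"interests": ["cars"]}), bidders)
print(winner, price)  # the car advertiser outbids the generic one
```

The point of the sketch is structural: the profile is disclosed to every participant whether or not they win, which is why each page load amounts to a broadcast of personal data.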
Now, what we see is that the advertising industry, and Google and Meta at the forefront, but many others as well, try to tell everybody that you have two options. You’ve got either surveillance advertising as we know it today, as intrusive as it is, or you’ll have to pay for all your services and all the free good stuff will go away. And I think that’s just a lie. It’s a lie.
What really is possible is that you fund services through advertising without targeting it based on all these immensely intrusive types of personal data. You can still target it, of course, just as any newspaper advertising is targeted by the advertiser picking which newspaper they want to advertise with. So I say, if I want to reach an audience that loves cars, I’m going to pick Car Magazine to put my ad in. If I want to have a more left-leaning audience, I’m going to pick The Guardian to advertise in. So that’s already targeting. And then the newspapers themselves can of course target based on which page they put the ad on, in order to make sure that it reaches the readers that are potentially most interested in that advertising. None of this requires a single bit of personal data. So there is a whole targeting ecosystem out there that lives with zero personal data. And that is the future that we envision, even though, personally, I’m not a big fan of advertising. I’d rather not look at it if I can, because I don’t find it useful for myself. But I see why it’s important to fund some of the things that are out there. I see why companies would want to put advertising out there. I can understand that.
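Jan’s newspaper analogy, targeting on context rather than on people, can be sketched in a few lines. The publications and topic sets below are hypothetical examples; the point is that no reader data appears anywhere.

```python
# Sketch of contextual ad targeting: the advertiser picks placements
# by the content's audience, never by reader data. Publications and
# topics are hypothetical examples.

PLACEMENTS = {
    "Car Magazine":   {"cars", "motorsport"},
    "The Guardian":   {"news", "politics", "culture"},
    "Cooking Weekly": {"food", "recipes"},
}

def pick_placements(ad_topics):
    """Return every placement whose editorial topics overlap the ad's
    topics. Note: no reader profile is consulted anywhere."""
    return [name for name, topics in PLACEMENTS.items() if topics & ad_topics]

print(pick_placements({"cars"}))      # ['Car Magazine']
print(pick_placements({"politics"}))  # ['The Guardian']
```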
And the question is, can we help them? Can we make them do their advertising in a privacy-preserving manner? And I think with good regulation, they can. It’s a pity that regulation is necessary. I’d rather have the industry do that by themselves. Unfortunately, for the past decade or two they have been very, very unwilling to take any step in that direction. And so I think regulation it is.
[Bruce:] I agree, personally. And we at Vivaldi agree. When you said “we”, are you talking EDRi or are you talking about…?
[Jan:] Yeah, there’s a whole work stream at EDRi about surveillance advertising. We’ve been trying to put stricter regulation into the Digital Services Act. We obviously have been strong supporters of the European Data Protection Law, the GDPR, which with all its flaws is one of the best things that has happened in the past decade or two for digital rights.
And I think there is more we can build on here. At the moment what we’re battling is the battle of the cookie banners. Because GDPR and the ePrivacy Directive require advertisers to obtain consent before processing data for targeting ads, they put up these terrible, terrible cookie banners, which is a real pity, because there are myriad ways to obtain consent in very user-friendly manners.
But they have chosen the worst of all cases with the explicit aim of having as many people as possible click “yes, I accept all cookies”, even though they don’t want to. Just because it’s the most frictionless way of going about your daily business. If you want to just access a website, clicking accept all is the easiest way to do that. And that’s on purpose, right? They make it deliberately hard for you to exercise your right to privacy, in order to get your data against your will. And that has to stop.
And that’s what we’ve been working on ever since, and still are, because it’s still not solved.
[Bruce:] To be clear, when you use the pronoun “they”, you’re talking about site owners and advertisers rather than this being a requirement of GDPR.
[Jan:] Exactly. And I deliberately say the vague “they” rather than naming names, because the industry is so big. So you’ve got everybody from Google and Facebook, the big advertising networks, then you’ve got kind of middlemen, you know, ad exchange companies, you’ve got data brokers like Criteo and others that are based in Europe.
And then there are hundreds of them. And then there are those who funnel the data into the ad ecosystem. And that’s mostly app developers who want to provide a free app. And then they build data tracking into their app, because that helps them to make money. And that data tracking then siphons up all of your personal data from your phones, for example, and sells it off to whoever pays most. And that is a huge ecosystem that makes billions and billions of euros every year. And that most people have never heard of. And that is a scandal in itself.
So we need to put a stop to that, and bring in regulation that gives people actual control over their data. And that isn’t through cookie banners. The one way that we envisage at the moment, in the political context that we have, is that at the very, very minimum, the bare minimum, there should be a mandatory consent signal that is automated, so that your response to those consent cookie banners is basically automated. Your browser would send this signal automatically. Your operating system can send it automatically in the background. So you only have to say once, “do I want to be tracked for ads, yes or no?” And then you say no once, and nobody will ever bother you again. But for that to work, you’d need regulation to make it mandatory.
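The automated signal Jan describes already exists as a proposal: Global Privacy Control, where the browser sends a `Sec-GPC: 1` HTTP header with every request, so the user states their choice once. A minimal sketch of a site honoring such a signal might look like this; the function and parameter names are my own, and the binding effect of the signal is exactly what would need regulation to enforce.

```python
# Sketch of honoring an automated consent signal. Global Privacy
# Control (GPC) is one real proposal in this spirit: the browser
# sends the HTTP header "Sec-GPC: 1" with every request, so the
# user says "no" once and never sees a banner again.

def tracking_allowed(request_headers, explicit_consent=False):
    """A site honoring the signal: an automated opt-out is binding
    and overrides everything else; without a signal, tracking is
    only allowed with explicit consent."""
    if request_headers.get("Sec-GPC") == "1":
        return False  # user opted out once, in the browser
    return explicit_consent

# The opt-out signal wins even if a dark-pattern banner got a "yes":
print(tracking_allowed({"Sec-GPC": "1"}, explicit_consent=True))  # False
print(tracking_allowed({}))                                       # False
```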
[Bruce:] It’s interesting you said a lot of people are unaware of the tracking/advertising industry, which in itself is a scandal. Obviously, you are not a consumer organization. You’re not like BEUC, or something like that. But do you get a sense, or do your members get a sense, that consumers are becoming more aware of the problem? Or are they aware that people are talking about it, but with a kind of “well, that’s the modern world” shrug? What do you think European consumers think about this, if they do at all?
[Jan:] So it’s our feeling, and that’s based on context knowledge. It’s based on interviews that were done, representative, sorry, what’s the right word in English? Enquête. Like when you do questionnaires.
[Bruce:] Like a survey?
[Jan:] Yes. So basically, based on surveys that we have seen and that partner organizations have done, we get a pretty good feeling for what people would actually prefer. And so what we take from those data points is that most people in Europe are aware that there is a problem, but they feel very disempowered about their ability to do anything about it. And so in those surveys, the large majority of people, 60, 70, 80%, depending on the data category, say, “I do not want my personal data to be spread across the internet with parties I don’t know, in order to target ads at me”.
And I think that’s pretty common across all European member states, except that people have no knowledge about how exactly it works and what they could do technically and what would have to happen in order for this to stop. So there’s only a minority of people who know what an ad blocker is or know what a tracking blocker is for their browser. And that’s a big issue. And that shows how disempowering that entire system is.
And that’s also why we think putting the burden on individuals to protect themselves is a bit unfair, because the system is so opaque and so impossible to understand for anyone who isn’t studying this. And that’s also why we see strong legislation in that area as the only thing that would be fair to people.
[Bruce:] I mean, when you were saying that, that’s entirely my gut feeling: most people, when asked, don’t want to be snooped on, but they have no feeling that they have any mechanism to control that.
[Jan:] And the snooping is very hidden, right? So it’s happening, but the parties snooping on you are mostly anonymous or far away. If you say, “oh, your neighbor is reading your private emails”, everybody would be outraged. If you say “Google is reading your private emails”, it’s like, yeah, whatever, I don’t care. Right? Because it sounds more theoretical, even though the end result is pretty similar: your private life is being exposed and potentially abused against you.
[Bruce:] Yeah. And also a lot of people say, well, I’ve got nothing to hide. And, well, you don’t think you do, but when they know everything you’ve bought, they can use that data, not necessarily to target you, but to target somebody else, et cetera.
[Jan:] Yeah. I challenge everybody who says “I have nothing to hide” to please hand me over their phone and all their passwords, and then see how they feel about me going through their digital lives. I’m pretty sure that would make them feel uncomfortable.
[Bruce:] That’s a very good point. Jan, I’ve taken up a great deal of your time. Thank you so much. One final question. What’s your favorite bike or bicycle ride you’ve ever done?
[Jan:] I’ve done a lot of bicycle riding; it’s my second passion after digital rights. I journeyed from Brussels to China a couple of years ago. And I was astonished by just how much I loved cycling through Turkey, and Georgia as well, countries I hadn’t visited before, and I was astonished by the hospitality of the people. Obviously the politics is different, but just the hospitality of average people in every village I came through was a wonderful experience.
[Bruce:] It’s extraordinary, isn’t it? I’ve been in the tiniest, most impoverished villages in Eastern Turkey and people have like killed their only chicken for me and you feel so bad, but to refuse would be terrible.
[Jan:] Yeah. The one thing that I took away from that is that the one thing I can do is try to give back the hospitality that I learned there. And whenever I can pick someone up from the street and host them at my place, I will.
[Bruce:] What a delightful and optimistic way to end the podcast. Thank you so much, and thank you to EDRi. I’m already signed up to all the newsletters, et cetera, but listeners and viewers, I will put all the links in the show notes. I encourage you, if you are a European citizen, to sign up and engage. Because even if you disagree with what they’re doing, tell them; otherwise they don’t know. Once again, Jan, thank you so much for your time. Thank you so much for all you’re doing. Thank you to everybody who tuned in, and we’ll see you again in a month for the next episode of “For A Better Web”.
Show notes
- EDRi site
- Jan Penfrat on Mastodon: @ilumium
- EDRi’s open letter to EU (PDF)
- Why EDRi is leaving X and where to find us
- Changes at UK Competition and Markets Authority
- Chaos Computer Club: Ban tracking and personalised advertising
- Albert Speer’s Nuremberg testimony
- Jan’s bike blog