This interview accompanies an analysis of the fears over privacy engulfing Facebook
Should Cambridge Analytica have used encrypted and self-destructing email in its work? Andy Yen said he can’t prevent his free, encrypted email service from falling into the wrong hands, because doing so would undermine the privacy and security of everyone. He also said it’s not up to him or his company to judge whether that use was unlawful.
Yen is chief executive of ProtonMail, the world’s largest free encrypted email service. It happened to be used by Alexander Nix, CEO of Cambridge Analytica. This was the sort of publicity ProtonMail didn’t want.
Nix was captured on camera in a Channel 4 documentary talking about setting up a ProtonMail account with self-destructing emails. “There’s no evidence, there’s no paper trail, there’s nothing,” Nix said of the feature.
“Technology is blind,” Yen told WikiTribune, particularly when it comes to encryption. At the end of the day, he says, it’s all mathematics, and “mathematics always serves you the same exact answer, no matter the situation.” In other words, encryption does not weigh the ethics of its users’ choices.
WikiTribune spoke to Yen to better understand the importance of encryption. We discussed what privacy means today and the changes that may be needed to protect data and privacy as the world becomes ever-more digitized.
The interview has been edited for clarity.
WikiTribune: Your company came under some scrutiny because Alexander Nix, CEO of Cambridge Analytica, said he used ProtonMail for arguably unethical reasons or to cover tracks. Would you say his use of your service was improper, and is there a responsibility for you guys to do something about it?
Yen: It’s not really appropriate for us as a company to comment [on] whether or not this is legal because, obviously, we do not understand the facts of the case. There would need to be a thorough investigation before that’s determined.
But stepping back to the real question you’re asking here: Is it appropriate for technologies like ours to be used in this way? And to answer that: Technology is blind. Especially when it comes to something like encryption, at the end of the day it’s mathematics. So you cannot devise encryption in such a way that it gives one answer for the good guys and a different answer for the bad guys. Mathematics always serves you the same exact answer, no matter the situation. It’s not possible to say, ‘I only want to have security that the good guys can benefit from, and the bad guys, I don’t want to protect them.’ It’s not possible. When it comes to data security, it’s all or nothing.
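Yen’s point that “mathematics always serves you the same exact answer” can be sketched in code. The toy XOR cipher below is a hypothetical stand-in for real ciphers such as AES, chosen only for brevity; what it illustrates is that an encryption function is pure math, with no input for the caller’s identity or intent:

```python
# A toy XOR stream cipher -- a stand-in for real encryption like AES.
# Illustrative only: the function takes data and a key, nothing else.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Encrypt (or decrypt) by XOR-ing each byte with the repeating key."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"secret-key"
message = b"meet at noon"

# The same inputs yield the same output no matter who calls the function;
# there is no parameter for "good guy" or "bad guy".
ciphertext_alice = xor_cipher(message, key)
ciphertext_bob = xor_cipher(message, key)
assert ciphertext_alice == ciphertext_bob

# Applying the cipher again with the same key recovers the plaintext.
assert xor_cipher(ciphertext_alice, key) == message
```

This is why a cipher cannot be designed to protect only lawful users: the mathematics has no way to distinguish them.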
WikiTribune: But are there certain safeguards that you could put in place as a company to prevent bad guys from using it, or is that just impossible when it comes to encryption?
Yen: When it comes to encryption, you could say that the genie is already out of the bottle. Encryption had existed for a long time before ProtonMail came along. What we’ve done is to take this technology that previously was very difficult to use and make it more democratic, in the sense that anybody who wants security, especially for their communication, can now have that.
Encryption is like a bullet-proof vest in the sense that it’s a form of protection. And of course bullet-proof vests are used both in law enforcement and also by criminals. But I would not say that because criminals use bullet-proof vests we should ban bullet-proof vests.
So, yes, you have some things you can do here and there to try to avoid the wrong people using encryption. But when it’s something like software, you can’t put the genie back in the bottle. The code is available online for anybody to use. You really cannot limit it in any way. And I don’t think you would want to, either, because of the overall benefit that society gains from having strong encryption and good online security. This far outweighs some of the downsides to it.
WikiTribune: Twenty years ago, Scott McNealy infamously said that we have zero privacy so we should just get over it. What do you make of that today?
Yen: Zuckerberg had a similar quote back in the day, saying the definitions of privacy are changing and that privacy doesn’t really exist anymore. I think to a large extent this quote is accurate. But debating the accuracy of the quote is not really helpful. Yes, privacy is different today: There’s less privacy. But the more important question for society is: Are we okay with that, and do we understand the implications and downsides of that? And I would say until very recently this was not widely understood.
Nowadays, when you look at the Facebook situation, I think people are starting to realise why privacy actually does matter. And the issue with companies like Facebook and, to a much larger extent, Google is that they have a business model that’s fundamentally built on the idea of creating a massive apparatus that captures as much data as possible. If your business model is collecting a tonne of data and analysing it, that data is going to be misused eventually. That’s almost inevitable.
People talk about regulation, control, security, but I don’t think it matters whether the data was sold, hacked or breached. That’s not the key point. The fact that this data exists causes a clear, present danger. And you don’t solve the fundamental problem unless you begin to change the way business is done online. The whole business model of mining data to serve ads to make money is fundamentally a flawed model. There are no security measures or safeguards you can add to fix a broken business model.
WikiTribune: Considering that the internet is driven by the advertising model, would you say that privacy is incompatible with the internet today?
Yen: No, because it’s not the internet that is the issue. It’s the business model of the internet, and those are two very different things. It’s definitely possible to do security online. If you look at ProtonMail and our technology, there’s really nothing special we do that a company like Google cannot do. The reason Google doesn’t do what we do is because it’s not in their financial interest. So it’s not a technical limitation of the internet. It’s entirely a business limitation.
WikiTribune: Will GDPR [Europe’s new data protection law] alter the way the internet functions in regards to data and privacy?
Yen: I think GDPR is a step in the right direction. But the internet, unlike a lot of other industries, is something that is completely private-sector driven. If you look at airlines, railroads, all these big industries, they are totally under government regulation. The internet is nothing like that. It was something that was always dominated by the private sector. I don’t think you can regulate the internet in the same way you can regulate, say, the airline industry. GDPR isn’t going to force Google or Facebook to alter the business model. That’s not going to be something that governments can enforce or have the power to enforce. The only way that will change is if people’s attitudes about privacy and data begin to change.
WikiTribune: What are your thoughts on the whole ‘Delete Facebook’ movement? Is it a solution, or is Facebook too big to fail?
Yen: When consumers change their habits, based on their preferences and their impressions of companies, that’s the world’s most powerful market force. The issue with the tech sector right now is that in many ways it’s a monopoly. And it’s a monopoly way worse than the monopolies we’ve seen in past centuries. The reason Delete Facebook is probably not going to succeed is that there isn’t another Facebook waiting in the wings, ready to replace Facebook. It’s also because Facebook has either copied or purchased or somehow acquired all the other potential platforms. And that, in fact, is a core part of the business model.
But, on a positive note, people do see now that there’s a problem with Facebook’s business model. This creates a massive opportunity for other entrepreneurs coming in with a different way of thinking about data and privacy to really make a difference. To delete Facebook today is not going to be effective, but because Facebook has been shown to be vulnerable, I think in the long term, this encourages others to enter the space with a different business model. And that would be very promising in the future.
WikiTribune: What do you think are some of the biggest threats to privacy today?
Yen: The whole business model of the internet. At the end of the day, the market more or less controls what is built and what is designed. Governments can try to rein in and control that market, but the market is still the driving force. The business model of the internet, such as it is today, is a huge risk not only to privacy but also to democracy as a whole. The challenge for people like me and companies like ProtonMail is that in order to protect privacy, we need to do more than build technology. We need to focus on education. And we need to create an environment where the market forces begin to require privacy.
The recent Facebook controversy has created a very strong push in that direction. Obviously, the Edward Snowden revelations were another turning point. And we’ll see many more turning points like this because the data’s out there already. It’s going to be misused. It’s just a matter of time. And in the end the market will always, generally, trend toward the correct solution. Right now, we’ve swung way too far on the business model of exploiting data, data mining and advertising, and things will come into balance in the next five, 10 years.
WikiTribune: Can you give me an example of when the market corrected itself in a similar context?
Yen: Well, in the ‘90s you had Microsoft dominating, which led to a backlash from consumers, other companies and eventually regulators over antitrust violations. That opened the door for competitors like Google to come through. So it’s always kind of a natural evolution of the system to go away from extremes because you cannot maintain an extreme position indefinitely. While Facebook may still be a very potent force in 10 years, there really aren’t a lot of strong precedents to support the idea that Facebook will still be as dominant as it is now.
WikiTribune: Do you think that ultimately the onus is on the individual to protect their own privacy?
Yen: Today, you can’t expect privacy to be done for you. You have to really be proactive about choosing the companies and choosing the right solutions. We’re not at a point where privacy is a default online; in fact, the default is actually no privacy. Until that changes, you have to be very cautious.
But the key is that I think consumers need to have the choice. Today, if you want an email account, you don’t have that many options. So, unless you start with ProtonMail, you’d probably go to Google, and if you go to Google, you’re not given an option for privacy. Likewise, if you want a popular social media platform that isn’t based on ads and selling your data, that option doesn’t exist. So we’re not saying we need to abolish all services that don’t provide privacy – that’s kind of an extreme position. Rather, we want to provide consumers with a choice so that you can decide what information you want to be private and not private, instead of having everything not private by default.
WikiTribune related articles
- Interview: The ethics of big data, Facebook & Cambridge Analytica
- GDPR: European Union forces pace on protection of individual data on the internet
- WikiProject: World business prepares for EU data protection rule
- Europe’s new General Data Protection Regulation explained
- WikiProject: Facebook, data collection, and you
- Google says it had 2.4 million ‘right to be forgotten’ requests
- Facebook says ‘outraged’ by investigation into misuse of data