Whitfield Diffie (EN, 2018-09-26)
The Digital Transformation is fundamentally changing all aspects of our lives. In this series I talk with people about their personal experiences with this Digital Transformation.
This episode: Whitfield Diffie, Turing Award winner (the “Nobel Prize” of informatics), honored in 2015 for fundamental contributions to modern cryptography.
More about him: https://en.wikipedia.org/wiki/Whitfield_Diffie
The interview took place at the occasion of the Heidelberg Laureate Forum in September 2018. You can watch it online here: https://youtu.be/xjcbtna3eaA
Marc-Oliver Pahl: “Being Human with Algorithms” today with Whitfield Diffie, the first half of the Diffie-Hellman key exchange that all of you use daily when you surf the web and do secure web access.
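For readers unfamiliar with it, the Diffie-Hellman key exchange lets two parties agree on a shared secret over a public channel. A minimal sketch with toy numbers (the prime, generator, and secret exponents here are arbitrary illustrative values; real deployments use primes of 2048 bits or more):

```python
# Toy Diffie-Hellman key exchange. The prime p and generator g are public;
# only the exponents a and b are secret. Numbers are tiny for illustration.
p = 23            # public prime modulus (real systems: >= 2048 bits)
g = 5             # public generator

a = 6             # Alice's secret exponent
b = 15            # Bob's secret exponent

A = pow(g, a, p)  # Alice sends A = g^a mod p over the open channel
B = pow(g, b, p)  # Bob sends B = g^b mod p over the open channel

# Each side combines its own secret with the other's public value.
shared_alice = pow(B, a, p)   # (g^b)^a mod p
shared_bob = pow(A, b, p)     # (g^a)^b mod p
assert shared_alice == shared_bob
print(shared_alice)           # both sides arrive at the same shared secret: 2
```

An eavesdropper sees p, g, A, and B, but recovering a or b from them is the discrete-logarithm problem, which is believed to be hard for large, well-chosen primes.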
Marc: Whit, maybe some part of the audience doesn’t know you yet. Could you briefly describe yourself?
Whitfield Diffie: Well, probably not. I’m a very lucky sort of an academic. I was born in Washington the day before D-Day and moved to New York when I was three months old. With the exception of a year spent in Portugal when I was four, I lived there until I was 17. I moved to Boston to go to college, and then in 1969 I moved to the west coast, and I have lived there ever since. Is that what you want to know?
Marc: Yes, because this is what people don’t know yet, the rest they can also read on the Internet. “Being Human with Algorithms”, we chose this matter because we wanted to express that the digital transformation is changing the way people are living today. When you think about the digital transformation, what would you say is the most apparent thing where it strikes you that we have such a thing as digital transformation?
Whitfield: What strikes me most is just how ready our access to information is. Things that would have taken a trip to the library and an afternoon’s work years ago, you get in a few seconds by looking at the web. And also how much less impact that has had on the quality of conversation than I would have assumed it would have, had it been described to me in advance.
Marc: So in the positive sense, the conversations are still good even though you have the access?
Whitfield: No, on the contrary. It seems to me that with all, or 90 percent, of information readily available, we ought to have smarter conversations. I don’t feel that they are smarter.
Marc: You are definitely one of the shapers of this digital transformation. How would you say you’re shaping the digital transformation today? What is the effect you have on bringing it forward or backward or sideward?
Whitfield: I think the effect cryptography had was that it was the only security measure that could be used to persuade people that they had some degree of safety in online communications. That, I think, was very important in the expansion of the web in the early 90s and the development of digital commerce.
Marc: Security, a very important topic and also one of my favourite topics, is still nothing that people buy a product for in many cases. My research is about the Internet of Things and people rather take the cheaper sensor than the secure one. How do you think could people be convinced to invest in security and that secure products are needed?
Whitfield: I’m not the person to ask, because I failed at this for thirty years. I’ve been a crypto marketing man for most of my career. Talking about sensors for the Internet of Things, a friend put it even better. She said that she’d been talking to these people and had discovered that the devices have no power or low power. No power to do cryptography, because if they could find the power, they would use it to implement another feature.
Marc: That’s right.
Whitfield: I don’t know what the prospects for security are.
Marc: What I’m also thinking about is this “Mirai” botnet case, you’ve probably also heard about this.
Whitfield: No, I don’t know it.
Marc: This was just a bunch of so-called IoT devices, like webcams. Someone was running through the Internet looking for the standard password, logging into them and then using these resources for doing a distributed denial-of-service attack. This is also like a huge thing, that you have this computational power outside, you have the standard internet protocols to access it and at the same time, you have no security.
Whitfield: It sounds as though the lack of security there is due to something that might be changed with simple pieces of behaviour.
Marc: Also, this is on top of the kind of security that people don’t normally talk about, because these were just misconfigured devices that weren’t rolled out properly. Another dimension of security points toward privacy. Especially in this digital world, you have more and more of an online personality. So each of us has a digital persona at some point. What is your opinion about that?
Whitfield: What got me into cryptography was this feeling. I did have exactly that vision forty years ago: that we would have a world in which people would have intimate relations with people they never met in person. The question I asked myself is, in fact, what do you do to enable private political conversation, which is required for our democracy? We have to be able to get together and discuss what our party’s position is going to be without everybody being able to spy on it. That was the thing I saw cryptography was good for, and to what degree it is functioning that way is not clear. Real end-to-end cryptographic security turns out to be logistically fairly difficult to implement. The big failure is PGP: not that PGP isn’t popular, but that it doesn’t work at all. Running PGP required a whole lot of work to move keys around. And it had the problem that, since it’s basically running on platforms that are rotten from the point of view of security, you really can’t trust it to store keys for long periods of time. Any system that’s going to do that is going to have that problem.
Marc: Going into that direction, is the complexity of computer systems increasing and how do you think it could become manageable again?
Whitfield: The complexity of computer systems is not the only thing. This comes back to your saying that people don’t value security, in fact they won’t pay for it, but I think the critical thing is to devote more hardware to security. Flexibility and security are, loosely speaking, opposed to each other. For flexible things it’s often more difficult to say what security means. If gates were devoted to putting specialized security functions into computers, you could do better. People are doing some of that, but I’m not at all sure they’re doing it the way I believe in.
Marc: Related to security is the term trust. Trust has multiple dimensions. One dimension is trusting values that are provided by computers, and also trusting information.
Whitfield: Well, I’m trusting the people who make the computers. Of course, the interests of corporations are never the same as the interests of individuals. The basic interest of corporations is making money off selling a computer. That’s different from your interest, which is buying a computer that has certain characteristics.
Marc: What would you say is the most negative thing that you see with the digital transformation? What would you say is a critical thing that is currently happening with people today due to this digital transformation?
Whitfield: I’ve thought for a long time that Nazism may have lost the Second World War, I should say Germany may have lost it, but fascism won it. The technologies needed to support people’s control over other people had come out. I think that computers are a big thing in that category. It bothers me very much that, you know, thirty years ago truck drivers were fairly independent people. Your boss said: get this load to Mannheim by next week. He never knew exactly how you got there, where you stopped and picked up a little extra cargo, and all those things. Now they track every minute with GPS. What bothers me is a society in which everybody is being watched. I think there’s an interesting difference between covert watching and overt watching, because covert watching is limited if they don’t want to admit that they’re doing it. But now, when we get to the point where they can just say openly “you know, we’re watching you, we’re entitled to watch you”, then they can jerk you around.
Marc: In that direction: mass surveillance was never as easy as it is today. Would you say it changes people? One also thinks of the panopticon, for instance, where people are constantly watched. We are somehow developing toward that, I mean not to the literal panopticon, but to something like it.
Whitfield: No, I think we have something very much like that. I don’t know Germany, but if you look at London, you have cameras all over the place. As facial recognition software and databases can keep track of how you move around the city and so forth, I suspect it’s entirely within computational feasibility to know where 90 percent of the people are 90 percent of the time.
Marc: In that direction: we have had large-scale data collection for a longer time, at least five years or more. Your personal data is stored at Google, for instance, at Facebook, at the NSA. What’s interesting to me is that now, especially with the progress in machine learning, this data finally becomes accessible, analysable, and usable.
Whitfield: Yes, I think that’s very likely. Whose benefit that would serve remains to be seen, but it’s not at all clear that it will be the benefit of the individuals whose data it, in some sense, is.
Marc: Let’s imagine we would be both ten years old and we would get our first smartphone. Would you say it’s better to be cautious, to stay away from the social networks and not to share too much? Or, would you say it’s okay to share it, because everybody’s sharing, and if everybody’s doing it then there’s not too much harm about it?
Whitfield: I think that confuses two things. People who are social network users, and I am not, share all sorts of things with their friends. That kind of sharing may sometimes hurt them, but that isn’t what bothers me. The thing that bothers me is that they’re sharing it with Facebook, which has tremendous power to keep secret the basic architecture of what it’s keeping and what it’s doing with the data. I think we misuse the word privacy. We confuse two things and call them both privacy. One is what happens in a small community, in a small town. Like a naval vessel, whose crew of typically a couple of hundred people rotates rather slowly, on assignments of several months to a year. That’s one kind of thing. The other kind of thing is these entities that are watching millions of people and have tremendous power to keep secret what they’re watching for and what they are doing with the information. You could be denied a job or denied insurance because of something these people know about you, that other people can buy from them behind your back. Curiously, I’m associated with secrecy, because that’s one of the major uses of cryptography. But I think if you’re trying to point to really serious problems in the modern world, it is the degree of respect for secrecy. Because people believe in it, there are tools to make it fairly effective. Look at the tiny quantity of the leaks. The US goes ballistic over somebody leaking a thousand or a million documents, when there are billions of them. Their success in keeping things secret is amazing.
Marc: About sharing information with someone: There, I see the big problem that people are not sharing information with Facebook because they want to share information with Facebook, but because they want to share information with their friends.
Whitfield: That’s right.
Marc: It appears as if I’m just talking to you, but I’m using Facebook as a medium. Talking about algorithms, do you think that computer scientists should have more education in ethics when they develop computer programs? That they should take it into consideration from design time?
Whitfield: The trouble with ethics is that it usually seems to me to be a rationalization of what somebody believes. People say things like: you shouldn’t do things just because they might be legal, because maybe they’re unethical. Almost nobody ever says to you: well, maybe it’s illegal, but it’s ethical, so you should do it. I think ethics is largely used as a way of jawboning people into not doing things that are within their rights to do and to their advantage to do. It is trying to persuade people that they shouldn’t do things because those things are not to the advantage of the person who’s doing the persuading.
Marc: Interesting. Let us now come from the negative or critical points to the positive sides. What would you say is the most positive effect of this digital transformation?
Whitfield: The availability of information is the one that strikes me, but there may be others. Mobility, abstraction, I like lots of things about it. This one doesn’t affect me directly, but the fact that over my lifetime it has gone from being a felony for two guys to have sex to its being okay for them to get married. That may be a result of more information being available to people. No matter what your interest is, what you practice, you can find five hundred people on the web who like the same things, and you can get together with them. Maybe it promotes tolerance in some ways.
Marc: In that direction: filter bubbles.
Whitfield: I love that phrase, but what does it mean? A filter bubble?
Marc: Filter bubble means that you have a lot of information, the information gets filtered and you only see filtered information. So, you’re within your bubble.
Whitfield: I see.
Marc: Also, you only get information that is customized to you. Do you see this as a problem or a necessity because we just have that much information?
Whitfield: It sounds like an inevitability. My immediate reaction was to think it’s a problem, but I probably do it myself. With all sorts of information, I look at what interests me, not necessarily at what’s important.
Marc: The difference is that you decide. In the other case, in the case of Facebook for instance, when you look at your timeline it’s only the Facebook algorithm that decides what you see. You don’t see anything else. This is a little bit problematic, because it’s not like before, when you had a newspaper and then…
Whitfield: I only read the sports page, I only read the business page, …
Marc: but you still saw the other pages. In the case of Facebook, you don’t even see them. “Being Human with Algorithms”, what does the slogan tell you?
Whitfield: Well I assumed that it was a bad translation from some meaningful German phrase.
Marc: Indeed, it is, a little bit, because it refers to the philosophical notion of “being a mensch”: being a human, a humane individual, in a time where everything is getting more and more dominated or controlled by algorithms.
Whitfield: I think the greatest issue of this time is the interaction, I would say the confrontation, between people and machines. This is just one manifestation of that.
Marc: Anything else you want to say about that? Or is everything said?
Whitfield: Everything is said.
Marc: It was a great pleasure, thank you very much.