Being Human with Algorithms: Christian Thiel and Dominik Golle (zd.b) (EN, 2019-11-19)
In this episode we discuss privacy, responsibility, law enforcement, and many more highly interesting aspects of the digital transformation. Enjoy!
Christian Thiel (https://www.christianthiel.com) and Dominik Golle are responsible for the topic platform “Verbraucherbelange in der Digitalisierung” (https://zentrum-digitalisierung.bayern/verbraucherbelange) at the zd.b (Center for Digitisation Bavaria, https://zentrum-digitalisierung.bayern/).
In his PhD, Christian Thiel taught machine learning algorithms to deal with uncertain data and teacher signals. Having experienced the pitfalls of algorithms, today he is thinking about the consequences that a digitisation of all aspects of daily life will have on consumers – and what needs to be done to make digitisation good for humans.
Dominik Golle holds a Master of Public Policy degree from the Hertie School of Governance. His deep interest in digital transformation got a boost while working in Nigeria – so one of the things he is thinking about today is how to improve public policy and corporate responsibility mechanisms to make the digital life of consumers more agreeable.
Marc: Christian and Dominik, could you please introduce yourself to our viewers?
Christian Thiel: Of course. My name is Christian Thiel. I am a scientist in machine learning. I did my PhD on emotion recognition in faces and on recognizing cricket songs. You can tell the biodiversity of a habitat from the richness of its songs.
Marc: Oh, that is very interesting! Maybe we come back to that later.
Dominik Golle: I am Dominik. I am also one of the coordinators of the unit for consumer affairs here at Center Digitisation.Bavaria. I hold a degree in public policy. So naturally, my interest in technology is always about its impact on society at large. In addition, I am the data protection officer at Center Digitisation.Bavaria.
Marc: Very good! I am really looking forward to the discussion because the interview series is always about people – and it is a very nice constellation of people here! Both of you are really into the topic of the digital transformation. Therefore the first question is, “Which effects of the ongoing digital transformation are most prominent to you?”
Christian: When we started, we took a very broad look and found very many things that are going on today. However, we thought the most important topic really is networked products and the internet of everything. Nowadays everything gets connected. All my things are connected. All services that are provided rely on a large number of networked companies and sensors working together. This means we have a decentralization of technology – because all those small sensors are everywhere. But, at the same time, we have a centralization of information, which is also a centralization of knowledge and power. Along with the networked products, we will have digital marketplaces where information is exchanged, where I can buy services, where perhaps the algorithms of my fridge can buy services. We have algorithms that will be running the show and that will really make the majority of the decisions. For us, this is the central topic of the digital transformation today concerning consumers.
Marc: You already raised many interesting points that we will come to. The next question always is, “What opportunities do you see coming with this digital transformation?” You already raised some possible concerns, so what could the opportunities be?
Dominik: We are always looking at this from a consumer perspective. Looking at smart home speech assistants, which is what we are focusing on right now, we see that it is going to be much easier to communicate with all the things connected to the internet, because you will not need clumsy text interfaces in the future. Already now, we have speech assistants and various ways in which the devices just talk to each other and really make our lives easier. They even recommend things to you, how you could live a healthier or happier life – which always comes with risks, of course.
Marc: Risks are exactly the next cue. What are the risks that you see there?
Christian: The dimensions. Networked products are highly complex since they need to work together. This is a challenge because for the consumer there is no transparency anymore. The consumer does not know which part of the product does which job and which data needs to go where. This also means that as a consumer, when I buy a service or a product, I do not know with whom exactly I have the contract. Is the service or product provided by the thing I am talking to, or by the back-end, or by the seven services that have to work together? And if something goes wrong, whom do I have to contact in order to get my money back? How do I get compensation if something is broken? Apart from that, the networked nature means that we have more attack surface for cybersecurity threats. The high complexity is the main risk. Many products only work connected and have to be constantly online in order to transmit data into the cloud and to the other services. As a consumer, I do not know which data is flowing. I just know that there has to be a lot of data flowing around me for things to work. The flowing data will then lead to automated decisions. These decisions can be good, and over time people come to trust them – perhaps even to the extent that they do not question the decisions anymore, although the data or the algorithm could be flawed.
Marc: So, this is very interesting, because you were going in the direction of assistants, and you were going into “having this data” and then the risk of “where is the data”, as well as decisions, trust and so on. When you consider this as a timeline, starting sometime before we had the digital transformation – I mean whenever that was, because basically the digital transformation, if we look at the first interview that I did in this series, started a hundred years ago. But when we think about what we now see as the digital transformation in the narrower sense: What would you say, how far are we? Speech assistants are something that popped up maybe one year ago, and the products from the bigger companies are aimed at the private sector. How is it in the industrial sectors?
Dominik: That is really interesting! Last week I went to a pitch night by the Danish consulate here in Munich that was all about robotics. In the industrial sector, we see a lot of robotic assistants that really work autonomously. They can use elevators, they move over the entire area of the plant, and so on. Now we have them in industry, and it is only a matter of a couple of years, I think, until we have these systems also in our homes and then in almost all private spaces.
Marc: So, you would say the industry is ahead of the private sector – probably because of cost factors. Or do you see other factors that play a role? Is it that the industry is adopting faster because the people there are forced to do it, or is it really that you think it is first because the technology is expensive, and when it becomes cheaper it also comes to the homes? Or is it maybe something totally different?
Dominik: From what I saw there – however, I am not an expert on robotics – it was due to the fact that the workplace environment is much more structured than any home and you can easily have visual cues for the robot and so on. Such an environment makes it much easier for robots to interact safely. It is similar to autonomous driving which works very well on a highway but not so well in a city.
Marc: That means there is just more infrastructure that is given. And it is also more homogeneous in the sense that they have equipment from one manufacturer, maybe, or from a few that follow some standards, and therefore it is easier to have interoperability and to really implement that. So, coming back to the timeline: Would you say – I mean this is a totally philosophical question – that we have already seen a lot, or that in the upcoming years we will have even more fundamental and bigger changes happening in this direction? I am also thinking of the fact that, when you talk about the digital transformation, people often say that it is a very disruptive process and that this disruption might be something that we only see in some years when we look back. Personally, I perceive it not so much as disruptive but more as an evolution. Now I have an assistant and the assistant is partly doing what I did. Maybe one year later I have gotten used to it and I think asking the assistant for a recipe is totally normal, even though one year ago I did not have this device. How is the perception for you? Is it disruptive? Do you think it will become disruptive? Or do you see the trend at the moment as more evolutionary, with some processes changing?
Christian: When we look at the numbers, two years ago there were no speech assistants in households. Now, even in Germany, there is around 20 % household penetration of these assistants. It is coming very, very fast, similar to mobile phones. From the roll-out perspective it is really disruptive. Focusing on the assistants, we can say it will be even more disruptive in the future, because they will become the new tool to access information. You will not have a screen anymore, you will not have a choice of 10 results plus the ads, you will only have one answer. If you take it to the extreme, this will really pull the floor out from under the ad-revenue model of the internet as it stands. If it stays this way, it can be totally disruptive. And we have not yet looked at the other uses you can have – the robots in your home, or the regulation of the environment.
Marc: Who are the main players in this game? We definitely have Google and Amazon. Whom else do we have? Is there diversification at the moment? Apple, of course. These are the three major companies from the US. So, if I were to buy a home assistant device, it would probably be from these three companies. Does Facebook have something in the pipeline? I am not aware of it.
Christian: No, not at the moment, I think. I mean, there are all those established players, like the local Bosch company. They also have home automation products. They have all the technology for home automation, from sensors to actuators. But so far they have not invested in the voice interface.
Marc: And probably they are more for specialized markets, like in the car. I would expect to find some voice control from Bosch in a Daimler car or a BMW car. But at least I am not aware of a home device.
Dominik: It is not a consumer brand.
Christian: The technology for the voice in cars often comes from the Nuance company. There are not so many companies that do voice recognition, and the home automation companies still prefer to do everything in-house and not give data to the outside.
Marc: This is related to the question of having just a few players doing voice recognition. At least with these three companies, parts of the voice recognition and especially of the service are offered in the cloud, where the data is sent, as you said before. Recently there were these scandals that there are some people sitting and listening to what you are saying, and that the devices are not only recording when you say the “magic word”. Of course, because they want to provide a good service, they have to pre-record and so on. It is clear that they are surveilling people, and still people seem not to be too concerned. This reminds me a little bit of the Edward Snowden revelations, where at least I as a computer scientist was totally shocked, thinking “Oh my god, now the world has some serious trouble”, and then three weeks later it was out of the press. Do you think with the home assistants it is already out of the press?
Dominik: Usually, it is in the press for some time, then it gets forgotten and people get used to it. We take a little bit of a different approach. We try to find the companies who really want to provide technology in a more consumer-friendly way, and we try to help them. Because often it is not an easy problem – doing speech recognition completely offline, for example. There needs to be more research, and we are the ones who take that research, try to test it, prototype it, and ask, “What are the problems and what can we learn from that?”
Marc: So you are advocating a little bit for the people, to protect them from the companies of the wild wild West.
Christian: It is part of the approach. But it is also about making the wild wild West friendlier by teaching the companies how to achieve the same results by incorporating more privacy-friendly technologies.
Marc: How do you, here at the Center Digitisation.Bavaria, shape this digital transformation? I mean you started already with some contributions.
Dominik: We wanted to draw a little bit of a bigger picture, because we are part of a larger institution. The Center Digitisation.Bavaria is an institution of the Bavarian government, and we are tasked to really push forward digitalization in the entire state. We do this by transferring knowledge from academia into practice and vice versa. It is a very practical approach where you see that there is new knowledge and then ask, “How can it be tested? What can we learn from it? And how do we get companies on the ground to use the new knowledge – to basically make innovations out of inventions from science?” This is a really interesting task. The Center Digitisation.Bavaria is structured into different topic areas. We have units on consumer protection, like ours, but also on digitalization in energy or medicine, on the future of work, or on the digitalization of cultural institutions, for example. It is a very broad field, and we bring together many of the most relevant ministries in Bavaria that deal with digitalization. This is also really exciting. On the science side, the Center Digitisation.Bavaria funds 20 new professorships here in Bavaria, 10 junior research groups, and every year 10 fellows who do their dissertations on digitalization topics.
Christian: When we look at our unit for consumer affairs, we see two people and the back-office staff. In addition, we have a voluntary advisory board that consists of Christin Eisenschmid, the head of Intel Germany, Prof. Dirk Heckmann, who recently became an IT security expert at the Technical University of Munich, and Prof. Christian Thorun from the ConPolicy Institute in Berlin. They also provide their view of industry and academia. We use our network to bring together experts from academia and from companies in order to help the companies incorporate technology that is more consumer friendly. On the other side, we talk to politics. If we see that anything is wrong or cannot be changed by teaching companies to be friendlier, perhaps there has to be more regulation on it. We also talk to NGOs and consumer advocacy groups to see what the daily needs are that they really get complaints about. We ask for their ideas and talk with them about our ideas. That is how we work on a daily basis, and from that we have two lines of action that we use to address things.
Dominik: I can chip in there. The first one is what we call “corporate digital responsibility”, which is a very new and kind of buzzy term. What we are currently working on is, “What does ‘corporate digital responsibility’ mean, and how do companies really act on it?” The dialogue has been going on in Germany for maybe one and a half or two years. Recently, we did the first Germany-wide study on what companies actually understand when we talk about “corporate digital responsibility” and on how far they are implementing strategies for becoming more ethical or more consumer friendly. But, as we said before, we also fund projects to actually get tools on the ground. Here we funded a data process modeller, which is now in the making, that helps companies model and then visualize how they process personal data for their business goals, and link the two. As a consumer or a consumer advocacy organization, I can then go to the website of the company and see not only the data protection information but clearly how the personal data the company is processing is linked to its business goals, and decide whether I am fine with that.
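To make the data-process-modeller idea concrete, here is a minimal sketch of what “linking processed personal data to business goals” could look like. This is a hypothetical illustration, not the actual tool; all names, categories, and purposes are invented for the example.

```python
# Hypothetical sketch: a register of processing activities that links
# each personal-data category to the business purpose it serves.

from dataclasses import dataclass

@dataclass(frozen=True)
class Processing:
    data_category: str   # e.g. "voice recordings"
    purpose: str         # the business goal the data serves
    legal_basis: str     # e.g. "consent", "contract"

# A company would declare its processing activities in such a register:
REGISTER = [
    Processing("voice recordings", "improve speech recognition", "consent"),
    Processing("device identifiers", "provide the assistant service", "contract"),
    Processing("purchase history", "personalised recommendations", "consent"),
]

def purposes_for(category: str) -> list[str]:
    """What a consumer could look up: why is this data being processed?"""
    return [p.purpose for p in REGISTER if p.data_category == category]

print(purposes_for("voice recordings"))  # ['improve speech recognition']
```

A visualization layer on top of such a register is what would let a consumer or advocacy group see, per data category, which business goal justifies the processing.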
Marc: That is very nice! It is a great thing that you are doing, because it is such an important thing – and indeed it is also a very complex thing, as you said at the beginning. Therefore, it is very important to have an institution like yours and people like you who advocate for the people and do something for the good of the people regarding this complex topic. When I talk with people about these privacy and security issues, I have the impression that this is a particularly German topic. Is that the case? Or would you say – as you probably have more experience and see the bigger picture – that people at least in Europe care, and maybe the people in the US not so much? What would be your view? Is it something of particular interest in Germany, or could you also imagine the ZD.B existing in America, a “ZD.A”?
Christian: Well, a “ZD.A” would be a good idea, or a “ZD.E” – E for Europe. Because the main job of the center is to bring together experts from different disciplines and from academia, politics, and the economy. So, that would be a good idea. But the center itself is concerned with the economy and with people. Europe managed to get the GDPR. It seems that in Europe generally there is more respect for privacy and a feeling that it is important. This is of course a stark contrast to the United States, and there are obviously also differences in importance among the European countries. But the extent of this is, I think, not easy to describe.
Dominik: But it is a fact that Germany was not really helpful in the negotiations around the GDPR and always wanted to lower standards. I think there is a strong European voice, it is not that Germany is the privacy champion.
Christian: When you look at the privacy start-ups, the most active ones are in Paris at the moment. Many are just going into GDPR compliance, but there are also more interesting ideas.
Marc: That probably comes from the cultural background of the society. They may be more into defending the rights of the people and also getting active at the same time, while the Germans, in my impression, are often like “We want to defend them, but we do not want to get too active; we want to stay on the tracks we have been on…”.
Christian: There is another thing that is rather German; it is our second approach, which is “We are going to do it with technology!” We help companies develop the technology to become more consumer friendly. The official name for that is “privacy by design”, which means developing a product from beginning to end with privacy in mind – with respect for user privacy and total transparency. This is rather German.
Marc: Do you think we would need a special seal for that? A label so that the consumers can actually see it, like the traffic light approach for food where you say it is green, orange, or red? Would we need an indicator for privacy? Because when you talk about privacy, it is something in the background; people know they have to take care, but they are not aware of what is happening. I mean, you talked about the website already, somewhere people can look it up, which is good – but they still have to look it up. If companies were forced to have something on a product, do you think it would change the interaction? Or would it be another label that people just do not care about because everybody has it or does not have it?
Dominik: I think we both have strong opinions on that.
Christian: When you talk to consumer advocacy organizations, they are not very hot on creating a label. They do not see the impact. Then you talk to companies, even ones that have a strong standing in privacy. They say, “We would do it, but for us it is also really hard to make a public marketing claim about it.” So, the companies are not hot on labels either. In reality, it is not going to happen. Still the consumer has the problem of “I see a product and I cannot tell what its privacy implications are”.
Dominik: I would argue that it should not be the individual consumer’s responsibility to recognize good privacy, because privacy is a human right. It is also written in the German constitution. I think we have a massive lack of law enforcement – we have the law, we just do not enforce it. I think this is really something we need to work on, and then I as the consumer can maybe decide between different products that give me better degrees of privacy. But the foundation – that has to be ensured by law enforcement.
Marc: Another topic that I always love to hear people’s opinions about is jurisdiction. The law enforcement, the judges – I often have the impression that we would also need some training programs in digitization for them. Because they have the law, the written law, and they sometimes seem to apply it with too little technological background. Do you have an opinion there?
Christian: I have the impression that it is getting much better, that the judges are proficient with the technology and can then make good decisions. But it will get worse in the sense that the services are becoming more international. If I have a problem, if I want to sue somebody, and this somebody is a corporation or a network sitting in three different countries, not in Europe, I will have a big problem with enforcement.
Marc: Let us think about the dimension of speed. We are in a softwarized world, and software can be changed very fast. I deploy an update tomorrow, and then my picture book has a different functionality and I am doing something totally different. The German or the European approach is to say “We change the laws”, and this takes time. Then Mark Zuckerberg is in hearings and he is talking about it. Do you have ideas there?
Dominik: The Data Ethics Commission just published its report in Germany. It has an opinion on that, which is that for a critical automated system the state needs to have live access to the system. And this is something that is not, let us say, beyond critique. It is a very scary proposition. I see that there needs to be some kind of automated checks and balances, because it is all working too fast. I just do not really see that it should be the state. It should be a third party that has no interest in controlling, for example, what is going on in the picture book and which opinions are spread there and which are not. We always propose that these kinds of functions should be performed by some third party that has no self-interest in how things are resolved but that has a mandate to resolve them.
Marc: That is interesting. I recently attended a film screening of “La Bataille du libre”, which is about open and free software. This could also be something for privacy. Because when people can look at the code, those who are able to – and we have many computer scientists in Germany, or people who are interested in that – could do some checks, and with these checks they could install some balances. Do you consider open source important for security and privacy?
Christian: I am a little bit reserved. You mentioned there are many computer scientists who could look at the code, but the proportion is quite low. What is good for privacy is that with free and open-source software you can build the tools yourself. You know how they work, and you have more transparency. This is of course helpful. When we do a project, we also include clauses requiring the software developed there to be open source.
Dominik: But you somehow have to make sure that people actually do the code review – that we do not end up with the principle of many eyes but the practice of few eyes.
Christian: The big challenge you see when you look at the networked world and algorithms making decisions is that you have to have layers in order to look into the system. The lowest layer is for engineers – really the nuts and bolts and the data flowing. And then you have more and more abstraction layers. Usually you can pick an abstraction layer and see and understand what is going on. Also, you have to have switches in these abstraction layers. Because if you only have the lowest layer and everything runs on top, then one day you will be at a point where you cannot switch the system off or change it, because you do not understand how it works anymore. Then you have given the power to the machines. So, you have to have wires and switches in there, which is part of the privacy philosophy: to still have control over the machine and the data.
Marc: This human-in-the-loop, having someone there. This spawns another interesting aspect, which again brings us back to your starting topic, namely complexity. People are then able to make informed decisions. But at the same time, do they want to make the informed decisions? You said that in the ultimate case they want to do that, but do you rather see that we need automation in these decision processes? There was this web approach where you could define some policies and then the web browser would negotiate automatically. It did not really catch on in the end; I think it vanished. But do you think it will go more in that direction, so that we have semi-automated or fully automated decisions based on standard profiles? And then Germany will define a standard safe user, and this is what happens, and you will only be asked if you deviate from that. Do you have ideas in that direction?
Christian: I think it is not a bad idea, drafting standard profiles. You can have different organizations providing them – one by Facebook, one by a consumer advocacy NGO – and then you can select an organization and trust its expertise. If you are really crafty, you can go in there and say, “I don’t want cookies at all. I always switch cookies off.” That would be a good idea, and then you still have the automatic negotiation based on your preferences. This is a system that could actually work as a future system: putting my preferences into these automatics.
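The automatic negotiation Christian describes could be sketched roughly as follows. This is a toy illustration under invented assumptions: the profile names, data categories, and the idea of a service sending a flat list of requests are all hypothetical.

```python
# Toy sketch of automated consent negotiation against a standard profile,
# with personal overrides for "crafty" users. Profiles and categories are
# invented for illustration.

STANDARD_PROFILES = {
    "privacy-first": {"cookies": False, "location": False, "analytics": False},
    "convenience":   {"cookies": True,  "location": True,  "analytics": True},
}

def negotiate(profile_name, requests, overrides=None):
    """Answer a service's data requests from a chosen profile."""
    prefs = dict(STANDARD_PROFILES[profile_name])
    prefs.update(overrides or {})  # personal tweaks win over the profile
    # Deny anything the profile does not explicitly allow.
    return {req: prefs.get(req, False) for req in requests}

# A service asks for three things; the user trusts a convenience-oriented
# profile but has switched cookies off entirely:
answer = negotiate("convenience", ["cookies", "location", "tracking"],
                   overrides={"cookies": False})
print(answer)  # {'cookies': False, 'location': True, 'tracking': False}
```

The design choice here mirrors the conversation: the profile provider does the heavy lifting, the user only expresses deviations, and unknown requests default to denial.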
Dominik: I really think we need those shortcuts. Because if you look around as a consumer, if you want to make responsible and sustainable consumption decisions regarding your privacy, maybe the environment, your finances, and so on, you get overloaded. It is impossible to take everything into consideration. A day has 24 hours. So, we need those shortcuts to really help us be more responsible and more sustainable.
Marc: User data is being collected. Do we need more open data? It could be that the Alexas and Google Homes – and whatever we have there – share their training data, for instance. Then we would not have to collect so much data, and people could look at it. Everything would be more open. Would that be something that you would, for instance, recommend to a company when you talk to them? Would you tell them that transparency, and maybe also sharing with others, could be a good idea?
Dominik: This is quite a difficult question for me to answer, even though I am a Data Protection Officer. Because from a privacy standpoint, open data comes with some hooks. Especially from a legal perspective it is very difficult. But from, let’s say, a humanitarian, egalitarian perspective, you need a lot of data to see, for example, where discrimination happens. There you need personal data to help people get their rights. Here we need to realign our understanding of data protection as my right to my data. It has a larger benefit as well. It is really difficult to align those two, but I think this is the way we need to go.
Christian: You can also think about different types of data. The one you mentioned for improving speech recognition – there you need a speech recognition corpus. This is a large corpus and it does not change much. It is just there. In comparison, we have the sensor stream, which is just a stream of data going from here to there. That is the daily data that keeps coming. For the latter, it is more difficult to have this sharing character.
Marc: In the end it is about profiles; it is about having a digital me. Could you imagine that in the future we will have a trusted third party that holds my profile and then shares something with the companies? Let me just elaborate a little bit. As you said at the beginning, what we want to have is comfort services. We want to have more comfort. Therefore, it is of course beneficial when the system knows all the details it needs to know to bring me more comfort. At the same time, we want to have privacy. We do not want to share more information than necessary. We want to share a minimum of this information and so on. One idea could be to say, “We share a lot of information, but then it stays somewhere trusted or local or decentralized. Only as much as is needed is then automatically revealed.” Is that a direction you think it could go?
Christian: There are some efforts in this direction. The utilities, the ones providing gas and electricity in Germany, are thinking of repositioning themselves as data brokers for this. For example, in Munich they have a central smart city platform with the data. They have started to integrate their services and have fine-grained control over which data goes where. They are thinking about positioning themselves as a data broker: all the data about the citizens comes in, and they release it to the services as they see fit. You can easily imagine them really using all of this data, which is currently also used by third parties. It is a closed system. It is an interesting approach, but it is of course very, very complicated. You need a lot of IT savvy to pull this off, and there is also a big commercial interest in not having this happen at the public utilities in the cities.
Dominik: I think here we really have to be careful that, for the sake of controlling our data better, we do not just give it all to this one central third party that we all trust – and then it does not work out at all. I am a very big proponent of keeping this data decentralized. The third party is just the one who knows where the data can be taken from and where it is allowed to go.
Christian: Tim Berners-Lee has this approach with the decentralized kind of anti-Facebook, where everything is in your pod, locally or at your own hoster. If somebody wants to see it, they have to go to your pod and read it from there.
Marc: Are you on Facebook?
Christian: I am on Facebook with a part of my identity.
Dominik: I am also on Facebook but I am probably going to delete it by the end of the year.
Marc: That is interesting. In one of the previous interviews, I was discussing with my friend Hassan, and we were talking about how we also joined Facebook when it came out, and that in the beginning we were both very careful about what we put online. Two years ago, I would also have answered that I am aware of what I have there and that it is only a really selective part of me. Then both of us came to the point where at some point in time we were like, “It does not make too much sense anymore to do this!” Our online life and offline life converged so much that we stopped this behaviour. Not because we wanted to stop it, but more because, from a practical perspective, it was just not viable anymore to keep this separation. We both felt a little bit relieved afterwards, because apparently nothing bad happened, and now we are more united into one persona again. Do you have similar considerations? As you said, Christian, you are very selective there. Do you think it makes sense? Because I always tell people, “You cannot protect yourself, because in the end they are more powerful than you. It is good when you are informed, so that when something happens with your data you are not like ‘Oh my god, how do they know that about me?’ but more like ‘I probably looked at this and that, and therefore they have the information’.” Any thoughts in that direction?
Christian: I am still a long way from posting everything online.
Marc: I am also not doing that. Before, I was really selective, in the sense of "I only want to give this and that information". I mean, I am still a quite sparse poster – but now, if I think, "This is a nice article I want to share with my friends!", then I just do it, without considering how it would change my profile.
Dominik: The problem is often not what you post or like on Facebook but that, if your friends like certain things and post certain things, they can just calculate what you probably like. This makes it a lot more complicated. Even if you are not on the network, they can make the connections and know things about you that you never actively gave into the internet.
Marc: This is also a very interesting point you just mentioned. I think it was 1998 when the movie "Enemy of the State" came out, when we still had tapes and all these things – and what it showed is total reality today. I mean, today we would not be surprised by having all this. The interesting thing is that recording data is cheap; data storage costs practically nothing. When I talk to friends who are not so tech-savvy, I tell them, "The problem is not that at the moment no one cares about your data. Imagine you run for American president in 20 years and they want to dig up some scandal – they will for sure have the data, and then they can use it against you!" That is rather the problem. With machine learning you now have a powerful tool that can automatically mine even this fuzzy data, and with which you can find out things about people that nobody would ever have looked at, because normally they are not interesting. But now that the algorithms do it basically for free, and very fast, you can do that. Do you also see this data storage, this "data kraken" aspect of these data lakes, as a problem? Or do you see even more, or other, problems?
Christian: This data and power could be abused by any government in the future. So yes, from a citizen perspective this is a problem.
Dominik: You do not have to look far anymore. Even in Europe, there are two governments that totally would abuse, or already do abuse, this data to enforce their policies.
Christian: Looking at Egypt, there was a WhatsApp group – or a Facebook group, I do not remember – on gay pride with five hundred members. The next government simply imprisoned all five hundred people in the group, because it knew they were interested in the topic or were gay. That was suddenly not acceptable anymore, so they went to jail. That is an extreme example.
Dominik: Or think about ethnic cleansing. You can so easily define a group and then know where to go and who really is part of that group. It is also a question of controlling digital weapons that are being exported to other countries.
Marc: Then the companies that process this data, like WhatsApp and Facebook, come into play again. Very interesting. Now, coming back a little bit more to the Bavarian perspective: the first question that I wanted to ask but did not is, "Is there a German equivalent to the Bavarian Center for Digitization?"
Christian: No, not really. It is quite unique in Germany.
Dominik: They are funding a similar center to this one in Brandenburg, in Potsdam. But on the German level it would have to be an institution that coordinates between all the different ministries, and that is not something we have right now.
Marc: Related to that then the question, “Would you say that privacy and security by design is at the moment an advantage for companies or a disadvantage?” Obviously, it costs effort because they have to do that. I mean you are supporting them, which is a great thing, but would you say it already pays off? It is also related to the label we talked about before. Is it visible? Are the customers then in the end, if we talk about companies, seeing it and valuing it in the sense that they prefer those companies?
Christian: Let’s look at concrete examples: We ran “privacy by design” workshops for students, as the future developers of systems, all over Bavaria. We had local companies there that had already implemented “privacy by design”. They told the students they are using it because they see a competitive edge in it. So when you look at concrete examples, there are some companies that say, “I do this for business reasons”. And when you look at it from a broader perspective – Dominik, when you did your interviews, you had similar results, I think.
Dominik: Yeah, it was really interesting! When we first started the study on “corporate digital responsibility”, we naturally aimed at business-to-consumer companies, because of privacy, personal data, and so on. Then we found that business-to-business companies are even more into these kinds of topics, because the consumer can only see so much – we had that before, it needs shortcuts! But companies have large compliance departments that look very closely at where they buy software and whether it fulfils all the privacy requirements, especially now that the GDPR is in place. There we have a lot of professionals who really check, “Do they do what they say in their data protection information?” and so on.
Marc: That is very interesting. I was not aware of this aspect, but of course it is logical that in the B2B market it is even stronger. They have to comply with the rules that you help shape and make. That brings us already a little bit towards the end, namely: what do you perceive when you think about “Being Human with Algorithms”? What does it mean to you, living in the world of the digital transformation? Is it good? Is it bad? Is it threatening? Is it inspiring? We actually covered all the dimensions, so it is simply, “What are your personal thoughts about that?”
Dominik: I think it is totally all of that, of course! I think we need to learn much more about it – everybody should. We need much more data literacy and computational literacy, but also, as I said before, it cannot be the responsibility of individuals alone. We need better, stronger collective actors who really act in our interests as consumers or as citizens. Then we can really reap all the benefits that digitalization can offer – be it using data to act more sustainably, for example by decreasing energy consumption, or increasing happiness, and all the things we could do with this awesome technology once we get this step further.
Marc: Christian, do you want to add something?
Christian: I look at the algorithms from a personal perspective. What is happening is still very interesting to me. I got into computer science because I wanted to know how the algorithms work. Each time I see a decision, I want to know, “How did this come about? What is the mechanism? What is the data behind it?” When I look at the trends in the algorithms and the data, one term I like is “peak information access” – similar to “peak oil”. What is the maximum amount of information that I can access? I think we are past that point, because the internet is getting more crowded. It is more difficult to find relevant information, and to find non-marketing information. The companies that had open APIs to pull data from are closing this data down; they want to keep it for themselves. I can no longer freely go to Google and say, “Give me the search terms of last year!” in order to find trends. As a researcher, I could apply through a portal to do so. The data is getting scarcer, even though there is more data. That is why it is called “peak information access”.
Marc: Very interesting, and very interesting topics that we covered! If you, our viewers, have questions or comments, we would love to discuss them with you in the comment section below. For now, thank you a lot, Dominik and Christian. It was a very, very interesting interview. Thank you very much!