TechTalk ep4: The One with the Privacy Professor

Paramita [00:00:03] Hello and a warm welcome to PwC Luxembourg TechTalk. Today's guest is Rebecca Herold, a US-based entrepreneur and the CEO of the Privacy Professor, an information security company. She was a keynote speaker at the PwC Luxembourg Cybersecurity Day 2019. We talked with her about privacy issues and cybersecurity in general.

Paramita [00:00:31] This is our first podcast after our cybersecurity day this year. And we are with one of our speakers, Rebecca Herold. We are so excited... we attended your speech yesterday. We were there. It was great.

Rebecca [00:00:49] Thank you.

Paramita [00:00:50] We have a short amount of time. So we'll just get going with the questions.

[00:01:00] The first question I wanted to ask you, because there was another speaker at our conference yesterday who said that today we are actually moving from a scarcity model of business to an abundant business model, because, you know, we have everything nowadays. Everything is connected, and there's the IoT and so on.

[00:01:29] So how can... because your talk focussed a lot on data privacy... how can both data privacy and an abundant business model exist together? Because they're contradictory.

Rebecca [00:01:48] Well, they seem contradictory, and it is very challenging. So I think awareness of what your business environment is, is the key to being able to address privacy. When we're talking about privacy, we're talking about privacy not only of your customer information and maybe your patient information; also think about your employees' data. You have to think about employee privacy as well. So when you have this very complex environment where so many items and devices are all connected, like you are saying, you have IoT devices, everybody is wearing or carrying some sort of computing device, how do you address privacy? It's challenging, but that challenge can be met if you raise awareness: what types of devices do you have within your business environment? What are you allowing into your business environment? Where is the data being collected from when you're doing business, and what type of data is being collected? Because really, too many organisations know devices are in their business environment, but they haven't really thought about all of the different data, which includes a vast amount of personal data, which is what impacts privacy. And they don't think about where all of that data is going. So, for example, when you have your workers and you want them to have their mobile phones... you have to let them have their mobile phones. I mean, people depend upon them. But you need to think beyond letting them use their own mobile phones. Think about, well, what do they have on their mobile phones that is going to pose a risk to our business? Too many businesses don't think beyond just letting them have their phones. Because think about all of the apps that you have on your device. Oftentimes when I ask people how many apps they have on their mobile phone, they'll think and they'll say maybe 15, 20. And then I'll say, well, look at your phone and tell me how many apps you have. And always they say, oh my gosh, I have 80, I have 100. They aren't using most of those apps. But guess what? Most of those apps are still collecting data from off of their phone, or listening, and so on. So here's what businesses need to do, as one of many things, to help meet the challenge of privacy in this abundant business model. They need to establish policies that say: you are allowed to use your personal mobile phone, but you cannot have these specific apps on it. Or, easier to manage: you can only have these particular mobile apps on your phone, to help us protect the data that is within the business environment and could be collected by these other apps that we don't want you to have.

[00:05:08] So it's about having your documented policies and procedures, making sure you enforce them, and making sure you give training to your employees. Otherwise, you're going to end up with a lot more apps on there, especially if you don't have any tools that automatically scan to see what people have on their phones.
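[Editor's note: as a rough illustration of the app-allowlist policy Rebecca describes, here is a minimal Python sketch of the enforcement side. It assumes you can export an inventory of installed app identifiers per device, for example from a mobile device management (MDM) tool; the app names and device inventory below are hypothetical.]

    # Hypothetical company allowlist of permitted app identifiers.
    ALLOWED_APPS = {"com.example.mail", "com.example.calendar", "com.example.vpn"}

    def flag_violations(installed_apps: set[str]) -> set[str]:
        """Return the installed apps that are not on the company allowlist."""
        return installed_apps - ALLOWED_APPS

    # Example inventory exported for one device (hypothetical identifiers).
    inventory = {"com.example.mail", "com.freegame.collector", "com.example.calendar"}
    print(flag_violations(inventory))  # -> {'com.freegame.collector'}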

Luis [00:05:25] I have a question, a doubt about that. I understand that when it comes to a company phone. But if I bring my personal phone, and, for example, in this company we have a network and we can connect our personal phones to it, the situation changes. Because then looking at my apps touches my privacy, if you're really going to see what I have, even though that can be dangerous for the company because of the network... that's a bit tricky.

Rebecca [00:06:01] It's very tricky. That's why you need to think about each business environment and what kind of information is available there. There might be some businesses where it's OK to say, you know, if you are actually connected to the network, you cannot ever use these types of applications on your mobile phone. But if you choose to use your own mobile phone and use all of your apps too, we're not going to allow your mobile phone to access our network. So you have to think through what the options are. And I think too many organisations that look at these challenges make the mistake of saying it's either everything or nothing. You can't really think about it that way. You have to think about, well, what choices can we give? It might not be ideal for everybody, but if you can make a compromise so that you're still protecting the personal data of your customers and employees through your decisions, while still allowing your employees to use what they have on their phones without making big changes, then that will help to protect privacy better.

Paramita [00:07:19] We are talking about this in a business environment, but I have this question all the time: what about us, individually speaking? For example, during your talk you mentioned the data privacy law that came in California after an actress was stalked and murdered, and very recently there was this news about a Japanese actress, where someone located her from the reflection in her eyes. And I mean, I know there are privacy laws and everything, but how do you... Because let's face it, it's partly because of our own naivety that the whole Brexit data harvesting took place. It's one of the reasons. So how do we deal with that?

Rebecca [00:08:24] Well, it really is important. It makes it so important for your information security and privacy programme to provide not only regular training, but also to send out regular reminders and newsletters. So, for instance, one of the things that I've been doing since 2007 is creating a monthly Privacy Professor tips message. And within it I talk about these types of recent events and what people can do to avoid the same type of negative impact. Over the years I've got over 5,000 subscribers, and around 80 percent of those subscribers are businesses who forward the tips on to their employees. And actually, in my next tips I'm going to talk about that particular situation, because that's not something a company can really prevent. I mean, somebody takes your photo and your eyes reflect your surroundings. So what they need to do is say: before you actually post a photo to Instagram, look at it and see if maybe there's something in that photo that could cause you harm. And then use that real example: someone had an image that showed their location in the reflection of their eyes. If you're going to post that to Instagram, then just go to, you know, Photoshop and remove the reflection, or at least part of it. And then you don't have to worry. I mean, you have to empower people to take some responsibility for their own privacy as well, because there's nothing a company can do to prevent all privacy issues. That's why it's so important to have training and awareness, and why you send out these frequent messages: hey, here's something that happened, keep this in mind, because we as a company don't want bad things to happen to you, our workers. And when you start doing that for your employees, they're going to understand better why the company has security and privacy policies, because they'll see, oh, well, this is how it can impact me personally. And the company is concerned about that, and they're concerned about our customer information too. So that's why we have to have policies and procedures to protect that information as well.

Luis [00:11:09] Now that you mentioned California and its citizens, I recently read that California has banned street cameras. Am I right?

Rebecca [00:11:19] Well, facial recognition.

Paramita [00:11:21] For the police, I think...

Luis [00:11:24] It's maybe to avoid abuses. But there are two sides of the coin. Some say that this is naive, that it's silly, that it will not help; that on the contrary, what facial recognition is doing is more positive than negative... So what is your opinion?

Rebecca [00:11:44] Well, you know, here's another big part of it, and this might not be something that has been heard a lot outside the U.S. A big part of it is that facial recognition is a type of artificial intelligence. The problem is they found that the A.I. facial recognition algorithms had bias built into them. When they were created, they were created with a very small set of test subjects. So for certain types of people, with certain colours of skin and so on, the facial recognition would work fairly reliably. But you get outside of that, and all of a sudden it was giving false positives, assigning someone else's identity to a different person based upon facial recognition. So besides being a privacy issue, they were also concerned that somebody might, let's say, commit a crime, and it might be someone who falls into that outside range, where all of a sudden you're going to have an innocent person being arrested for someone else's crime.

Paramita [00:13:02] Absolutely. And since we're talking about the wider public, you know, outside the business environment, I think Luis had a very interesting question. We were just having a discussion and we just started talking about it. And it was about the construct that you were talking about...

Luis [00:13:19] Well, you know, I read a very, very good article saying that we like privacy as humans, but historically, over the last three thousand years, we have given up this right to privacy for convenience. Either because keeping the king happy gave us access to certain resources, or, even now, because we get free access to the Internet and to information, we don't really mind. I think we are a bit unconscious of it, or we don't really know what the impact is, but we just, you know, go and say yes, because we get information quickly and we want it now.

[00:13:59] And I feel like, in the end, the story of the Internet, for example, is a trade. The Internet gives us information for free, and what we are giving to the system is our personal data. And I thought, well, that makes sense. So is the future of the Internet maybe a paid Internet, where we can control these things? What is your take on that?

Rebecca [00:14:34] Oh, there have been a lot of discussions over the years about having the Internet free or not free. But here's the problem. People who give information to get a specific service think, OK, I'll do this because I want to be able to get a coupon from this restaurant when I come in the door. What they don't realise is that when they give up that information, they're not giving it just to that one enterprise. Because the service is free to them, that data is being monetised out to potentially dozens or hundreds of other third parties. And the negative impact comes from how all of those other third parties are using that data. So I think that's an important point for people who say, you know, I'm going to try this app, so I'll just agree to everything. They don't realise that all of that data might be going out to so many others. So that's one point.

[00:15:32] But with regard to the Internet and, you know, whether or not we're going to charge people or get paid, it's kind of the same issue. People are saying, well, why doesn't Facebook pay us to be part of their system? You know, that's been talked about since Facebook started in, what, 2006, 2007, whenever. And the problem is people get so used to talking with their friends and so on that even the ones who say, I'm fed up, I'm quitting, and then drop off of Facebook, even making a big announcement, I'm quitting Facebook, after one week they come back. They're like, sorry. I see that happen a lot: I see somebody I was friends with, and all of a sudden I'm getting an invitation, so I think it's somebody posing as them, and they're like, no, no, it's really me, I thought I could stay away. So that's where I think the genius of Mark Zuckerberg and his team lies: they've been providing free things. It's not free; it's costing you the personal data that they're taking from you. But are they ever going to pay you to be a part of that? They won't, because they know that you've kind of got addicted, if you will, to the Facebook environment.

Paramita [00:17:06] You know, related to this, I think part of Luis's question was also: is data the new currency?

Rebecca [00:17:13] Oh, yeah. It is in many ways. In fact, let me give you a quick example. In the past month, in the United States, three parent-teacher associations from public school systems in completely different areas of the country have contacted me. They get my free Privacy Professor tips, and I get questions from my readers all the time. In their schools, and these were three big school systems, there are a lot of sports teams, and I don't know if you've heard, but there's a lot of concern now, especially in the U.S., about head injuries and how they impact your brain. Most public schools don't have the budget to prevent this or to know when athletes have brain injuries. Well, a research company had this brilliant idea: we're going to offer free saliva testing, DNA and RNA testing, to public schools, to their athletes. And we won't charge the school system anything. We'll do the tests before the event and then after the event, and we'll let them know whether or not there were brain injuries. But all of the data they're collecting is going into this big pot of research, and then it's going out to all these other research companies that are buying it from them. So talk about monetising your data. The parents came to me because they were concerned: at the request of this agency that was collecting the student saliva for its data, the schools required the students to give consent before they could play on the team. So all of a sudden there were consent forms they had to sign, or they would be kicked off the team. And, you know, it's like, well, is it really consent if you're told you have to give a part of yourself away?

[00:19:29] So I guess, long story short, there are ways that people are giving value, in this case a free health monitoring tool to a school system that couldn't afford it otherwise, in order to get children's DNA and RNA. And you've got to think, well, how's that going to be used? I mean, when these kids apply to colleges, are the colleges going to see that and decide, maybe we don't want to accept this person, because it looks like they've had brain damage and they might not be, you know, good students? Think about employers: we don't want to hire someone if they've had 25 concussions in high school. So, you know, there are so many implications that the typical person doesn't think about in the moment they think, hey, this is a cool idea, yes, I want to know if I had a brain injury during this game. They are thinking in the moment. Whereas when people are collecting your personal data, it's not for that moment. It's for the rest of your life.

Luis [00:20:37] Yeah, well, I think, you know, the United States is living a 1984 reality. The President of the United States wants to ban encryption. And this is another huge issue.

Rebecca [00:20:51] It's a huge issue. And, you know, it bothers me because technology is where my core is. And building ways to break encryption does not help anyone. And I've written about this many, many times.

[00:21:08] But here's an important point, and I don't know if your listeners have heard the phrase "déjà vu all over again" from one of our U.S. baseball coaches long ago. But anyway, in the 1990s, under the Bill Clinton administration, there was a thing called the Clipper Chip, and they were trying to push it through.

[00:21:28] What was the Clipper Chip? It was encryption that they were going to force U.S. companies to use, encryption that would give the government access to their data. And they were pushing this for years. It ultimately failed because, thankfully, researchers, those of us in the technology business, showed that not only could the government get into Clipper Chip encrypted data, but it was weak enough that others could break it too. Hackers could break it. And after that was revealed, they lost their argument. So now here we are, over 20 years later, and they're trying to do the same exact thing, even though the issue is still the same: the technology weaknesses that would be built in. So I am very much against that.

[00:22:26] And here's another point related to that. There's strong encryption available worldwide. Criminals who don't want people to get into their data are going to go somewhere else and use that strong encryption. So the only people who will be using weak encryption are those of us who are not criminals, and our encryption is then going to be more susceptible to being hacked, because it's weak.
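[Editor's note: to make concrete the point that strong encryption is freely available to anyone, here is a minimal sketch using the open-source Python cryptography package (pip install cryptography). It is our own illustration, not a tool mentioned in the conversation.]

    from cryptography.fernet import Fernet

    # Generate a random key; Fernet provides authenticated encryption
    # (AES in CBC mode with an HMAC-SHA256 integrity check).
    key = Fernet.generate_key()
    f = Fernet(key)

    token = f.encrypt(b"confidential message")
    print(token)             # ciphertext: safe to store or transmit
    print(f.decrypt(token))  # plaintext is recoverable only with the key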

Luis [00:22:54] That's exactly what Koen Maris, our Cybersecurity leader, said when we had a talk about this... He said, well, if normal people on the street don't have access to encryption anymore, don't have this right, let's say, the bad guys will. They will be the only ones, you know, with very sophisticated encryption, and that is even worse.

Rebecca [00:23:16] Well, in fact, one of my tech colleagues, Phil Zimmermann, he's just an acquaintance of mine, I don't know him really well, but I admire him, created Pretty Good Privacy, or PGP, a strong encryption, back in the 1990s. He actually had to leave the U.S. because they were going to arrest him and put him in jail over PGP. But anyway, he had a great quote. It was something like: if you outlaw strong encryption, only outlaws will use strong encryption. So I thought that was a great quote.

Paramita [00:23:57] We're running out of time, unfortunately. But before we end, coming back to the business environment: yesterday your talk focussed on digital blinders and how they're crucial for... First of all, can you explain what you mean when you say digital blinders? And why are they crucial?

Rebecca [00:24:23] Yes. So here is what I've found. I've worked with hundreds of companies, from the largest to the smallest, and oftentimes the security officers and those responsible for security focus on the devices that are within their business, that they've purchased and have control over, and the applications and systems that they have control over. Which they have to do anyway; I mean, that's a given, they have to think about those, and that's what most regulations are specific to. What I'm talking about with getting rid of the blinders is the fact that today every person who comes into work often has their own mobile phone, and they might be wearing other types of IoT devices. And how many people do all of their work within the business facilities? People are working in airports or working in hotels. They're attaching to free Wi-Fi systems. So you have to go beyond the blinders of just looking at your own devices and realise that you no longer have a brick wall around your network. You're out in the open now. You take off the blinders to understand that your data is located, perhaps, throughout the world. And so you have to go beyond protecting devices and applications and think about: how am I going to protect data that is business data but sits outside the devices I can control? That's where you have to start thinking about policies and procedures, doing risk assessments to find out where those locations are, and setting up, you know, training to let people know when they put data at risk.

Paramita [00:26:23] And you gave a great example. I really, you know, identified with it: talking about hospitals and how you went in with your smartphone and...

Rebecca [00:26:34] Yes. When I was at one of my musculoskeletal surgeons, I was in there to get a shot in my hip. And while waiting, I thought, I'm going to try out my cool new BlackBerry. This was in early 2016. And I thought, look at all these medical devices that are around me. So before the surgeon came in, I used a free app, and I could identify over a dozen open access points on all these medical devices, devices that patients depended upon for their health, attached to the hospital network. If I had been malicious, I could have gotten in there and changed the settings on the medical devices. I could have taken all of the data. I could have done so many bad things, and I wouldn't even have left a trace, because of how I was connected. And just think if you had, you know, some sort of terrorist attack or something. I've always thought that would be the most insidious type of attack, because doctors wouldn't know what was going on. What doctor would think, oh, the medical device was hacked, when their patient starts getting worse? None of them would think that. So, of course, it would take a very long time before that particular root of the problem was actually identified or even thought about.
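[Editor's note: for a sense of how little a "free app" needs to do to find devices answering on a network, here is a minimal Python sketch of a TCP probe across a local subnet. It is our own illustration, not the app Rebecca used; the subnet and port are hypothetical placeholders, and probing like this should only ever be run against networks you own or are authorised to audit.]

    import socket

    SUBNET = "192.168.1."   # hypothetical local /24 network
    PORT = 80               # e.g. an embedded web console on a device

    for host in range(1, 255):
        addr = f"{SUBNET}{host}"
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(0.2)   # fail fast on hosts that don't answer
        try:
            if s.connect_ex((addr, PORT)) == 0:  # 0 means the port is open
                print(f"responding: {addr}:{PORT}")
        finally:
            s.close()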

Luis [00:28:00] I'm obsessed with the dark web. So the question is this: can we shut it down? And if we cannot, what is the benefit of keeping the dark web alive?

Rebecca [00:28:12] Well, the dark web is just kind of a result of how the Internet has evolved, because of the way the Internet is created: there are servers throughout the world that connect other servers together. So there's not one authority that has oversight of the Internet, and you can't control what's going on in different parts of it. As for the dark web, there are people who are trying to monitor it. And I can tell you, in the United States there's the National Security Agency, the NSA; they have folks who try to infiltrate and become part of the groups there so they can start catching, for example, pedophiles.

[00:29:10] And they do that by actually going into those groups. Really, that's what is most effective: becoming part of the group so you can see what they're doing. Not breaking encryption, that's not going to help anything; becoming part of the bad people's group is how you're going to be able to effectively identify and stop the criminals. But, unfortunately or not, the beauty of the Internet is that there's not one single point of failure, because it's spread all over the world. There's no owner of it. So you don't have one ruler or lawmaker.

Luis [00:29:50] Probably that's also good, I mean, not having a single ruler. Maybe there should be a committee of people, but not one institution ruling it. But it's a very interesting thought. I've never actually seen the dark web, but I would love to see what it looks like. Have you visited the dark web?

Rebecca [00:30:07] Yes, I've been there. I have purposely not gone to some of the... there are just disturbing things.

Paramita [00:30:14] Yeah. Apparently you need a different operating system or something.

Luis [00:30:19] It's called Tor.

Rebecca [00:30:19] Well, Tor is definitely a browser that allows you to get in and not be tracked and all. Yeah, it's very interesting.

Paramita [00:30:40] So thank you so much, Rebecca.

Rebecca [00:30:42] Yeah, well, thank you. I enjoyed speaking with you.

Paramita [00:30:56] So Luis, how was it?

Luis [00:30:58] I think it was really great. It was really nice. And what I like about these conversations is, you know, we have some questions already defined and in the end we ask things that come to our mind.

Luis [00:31:15] I just wanted to share an idea with you. Recently, when I was travelling, and that's when I make the most of the time I have and read a lot of things I haven't got to yet, there was this article on privacy and the evolution of the privacy concept over time.

[00:31:32] And I read a quote saying that privacy may actually be an anomaly. That really shocked me, because what history shows over time is humans giving up privacy. It made me think.

Paramita [00:31:51] So should we leave our listeners with this question? Is privacy an anomaly? If you're listening to this, is privacy an anomaly? Please comment. We'd like to hear from you. So that's it. Thanks a lot.
