InTechnology Podcast

#25 – What That Means with Camille: Privacy and Its Policy

On this episode of What That Means, Camille talks cyber security, privacy, and policymaking with Claire Vishik, an Intel Fellow and Chief Technology Officer of the Government Markets and Trade division at Intel. Her work focuses on artificial intelligence, hardware and network security, trusted computing, privacy-enhancing technologies, some aspects of cryptography, and related global policy and trade issues.

Claire is also on the board of directors of the Trusted Computing Group and TDL, otherwise known as Trust in Digital Life. She’s co-chair of the IEEE effort on blockchain and advisor on numerous international research and policy initiatives.

 

This episode covers:

•  Various definitions of “privacy,” shared by Claire to fuel your thinking on privacy and security

•  How different international standards bodies establish different frameworks and guidelines to protect individuals’ privacy

•  US vs. Europe: how differently the two communities approach privacy

Some interesting quotes from this episode:

“[Each standards body] is using its own definition of the privacy space that is necessary for it to work in this area. There is no disagreement and really no multiple views. What we lack is some kind of high-level definition that will define privacy in all these very different aspects.”

“In Europe the foundation is in the principle that privacy is a fundamental human right… In the US, we do not have a federal privacy law. We have a hodgepodge of different privacy regulations in states that aren’t harmonized and that approach different areas of privacy.”


Camille M: [00:00:37] Hi, thanks for joining me for What That Means: Privacy and Its Policy. Today, we are going to talk with Claire Vishik, who is an Intel Fellow and Chief Technology Officer of the division called Government Markets and Trade at Intel. Her work focuses on artificial intelligence, hardware and network security, trusted computing, privacy-enhancing technologies, some aspects of cryptography, and related global policy and trade issues.
Claire’s on the board of directors of the Trusted Computing Group and TDL, otherwise known as Trust in Digital Life. She’s co-chair of the IEEE effort on blockchain and advisor on numerous international research and policy initiatives.
She has served on high-level expert and advisory boards, like ENISA, which is the European Network and Information Security Agency (footnote: this is kind of equivalent to NIST in the United States, if you’re familiar with that). She’s got a PhD from the University of Texas at Austin, and she’s worked at the Schlumberger Laboratory for Computer Science and also at AT&T Laboratories, both in leadership roles. She’s also authored numerous peer-reviewed papers and book chapters, edited several books, serves as associate editor of two scientific journals, and is an inventor on over 30 pending and granted U.S. patents.
So hopefully that does the trick, Claire, welcome to the program.

Claire V: [00:02:05] Thank you very much for having me. I’m delighted to be here.

Camille M: [00:02:09] It’s really good to have you on. The first thing I want to do is just get your definition, in under three minutes, of what privacy and its policy are.

Claire V: [00:02:21] Well, your first question is probably the most difficult of them all. Privacy does not have an accepted definition. If you look at the Oxford dictionary definition, for instance, it will say the following: “the state or condition of being free from being observed or disturbed by other people.” That is pretty general and does not apply to all the situations that we are talking about today when we say privacy. And that is why one of the top privacy researchers, Alessandro Acquisti of Carnegie Mellon University, had a famous slide entitled “What is Privacy Anyway?”
Scorpio in 1994 said it’s freedom to develop. Bloustein in 1964 said it’s an aspect of human dignity. Warren and Brandeis in 1890 thought that privacy is the right to be let alone, and many people will repeat this. The ability to control access to one’s information–that comes from Noem in 1996. There are many definitions, many approaches to privacy, as you can see. So when people talk about privacy in a specific context, for instance data privacy in a certain area, privacy of personally identifiable information, privacy that prevents tracking, or specific standards associated with privacy, they are using contextual definitions: the definitions that define the part of the very wide privacy space that applies to their case.
So, uh, you know, do we know what privacy is? Sort of. But please keep in mind that it is many things.

Camille M: [00:04:52] Well, thank you. That’s a pretty fascinating introduction to it. I have about 25 questions burning right now immediately. So let’s dive a little deeper.
The first thing I want to know is: you said that we don’t really know what it is, or we don’t necessarily agree on it. So does that mean we don’t have international standards on it?

Claire V: [00:05:18] No, it doesn’t mean that at all. Because privacy means so many things, there are areas of privacy that have standards applying to them, with definitions that apply to their part of the space. Let me mention some standards that have appeared in different areas. In ISO/IEC, the international privacy framework standard describes the activities that an organization takes in connection with privacy. It is based on actions, actors, and processes. On the other hand, if we are talking about technical standards, we have standards such as the ISO/IEC standards on anonymous signatures and anonymous authentication.
These are highly technical areas, though. The anonymous signature standard is based on group signature approaches, and it defines a number of protocols, such as the Direct Anonymous Attestation protocol that is commonly used. There is also a cryptographic standard for identity-based encryption that covers another part of the privacy-related issues. And there are several very technical standards under development, such as a standard for homomorphic encryption.
On the other end of the spectrum, we have standards that are really guidelines or frameworks, such as the new NIST Privacy Framework that was put into practice in 2018 and is available to provide guidance in various privacy situations.
So each of these standards uses its own definition of the privacy space that is necessary for it to work in this area. There is no disagreement and really no multiple views. What we lack is some kind of high-level definition that will define privacy in all these very different aspects.
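
To make the homomorphic encryption idea concrete, here is a minimal Python sketch of a Paillier-style additively homomorphic scheme. It is an illustration with toy primes, not the standardized schemes Claire refers to, and certainly not production cryptography.

```python
# A minimal sketch of additive homomorphic encryption (Paillier-style).
# Toy primes for illustration only; real deployments use large random
# primes, and standardized schemes differ in detail.
import random
from math import gcd

p, q = 293, 433                                # toy primes (never use in practice)
n, n2 = p * q, (p * q) ** 2
g = n + 1                                      # common Paillier simplification
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p - 1, q - 1)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)    # inverse of L(g^lam mod n^2)

def encrypt(m: int) -> int:
    r = random.randrange(2, n)                 # fresh randomness per ciphertext
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# The homomorphic property: multiplying ciphertexts adds the plaintexts,
# so a server can total values it never sees in the clear.
a, b = 17, 25
total = decrypt(encrypt(a) * encrypt(b) % n2)
assert total == a + b
print("sum recovered from ciphertexts:", total)
```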

Camille M: [00:07:43] So are there differences in privacy models as you change geographies or countries?

Claire V: [00:07:49] Oh, definitely. The easiest one, which we talk about all the time–we, I mean, the privacy technology community–is the difference between the European and the U.S. approach to privacy. In Europe the foundation is in the principle that privacy is a fundamental human right. From this follows legislation that is principles-based: it is not driven by specific areas and does not define specific technologies. It defines the desired outcomes and the general approaches, for instance, reasonableness with regard to protecting personal data from re-identification. And it also defines penalties if the responsible approach to privacy has been violated.
Let’s move to the United States now. We do not have a federal privacy law. We have a hodgepodge of different privacy regulations in states that aren’t harmonized and that approach different areas of privacy. Some of them, like the new regulation in California, are fairly comprehensive. We also have market segment-based privacy requirements that are enshrined in laws and regulations.
Several examples here: the first one in this area is something called HIPAA. This is what controls privacy in healthcare environments. It sets up a disclosure process, but it also has some technical requirements, such as protecting data in transit. Another example is something like GLBA, which sets up privacy requirements for banking data. And there are a lot more areas that we could discuss here.
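
As a concrete illustration of one such technical control, protecting data in transit, here is a brief sketch using Python’s standard-library TLS support. The hostname is just a placeholder endpoint; this is a generic example, not a HIPAA-mandated implementation.

```python
# A sketch of protecting data in transit with TLS, using only the
# standard library. example.com is a placeholder, not a health system.
import socket
import ssl

host = "example.com"
context = ssl.create_default_context()   # verifies server certificates by default

with socket.create_connection((host, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        # Any sensitive data written to `tls` is encrypted on the wire.
        print("negotiated protocol:", tls.version())
        print("server certificate subject:", tls.getpeercert()["subject"])
```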

Camille M: [00:10:00] You mentioned that individual countries or populations just view privacy differently. So you said, in Norway, what kind of data is public that we wouldn’t have public in the United States, for example?

Claire V: [00:10:16] Yes. Even though Norway follows the GDPR and the privacy guidelines of the European Union, there are local requirements that are different from those in other countries. In Norway, you can find tax information–which means income information–for all Norwegian citizens.
In other places, such as Estonia, which is also a European Union country, data protection takes a different approach. Their view is that a breach is likely to happen anyway, so they segment the data so that the fallout from any particular breach is minimal. It’s minimal because only a small segment of data is going to be available; the rest of the data is going to be linked from other databases and not as easily available if a breach occurs.
So every time a new data set is required by the government, a special committee decides what kind of data parameters it is going to directly contain and what parameters are going to be linked from other data sources. Moreover, those data sources are placed in a variety of locations associated with embassies of Estonia. This is an interesting way to control data privacy.
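
To illustrate the segmentation idea in code, here is a hypothetical Python sketch: each store directly contains only one slice of a record, linked by an opaque key, so breaching a single database exposes only that segment. All names and fields are invented for the example, not taken from Estonia’s actual systems.

```python
# Hypothetical sketch of record segmentation across separate stores,
# joined only through an opaque linking key.
import uuid

registry_db: dict = {}   # civil registry segment
tax_db: dict = {}        # tax segment
health_db: dict = {}     # health segment

def store_citizen(name: str, income: int, diagnosis: str) -> str:
    link_id = str(uuid.uuid4())          # opaque key, meaningless on its own
    registry_db[link_id] = {"name": name}
    tax_db[link_id] = {"income": income}
    health_db[link_id] = {"diagnosis": diagnosis}
    return link_id

def full_record(link_id: str) -> dict:
    # Reassembly needs access to every store; a breach of one database
    # alone exposes only that segment, which limits the fallout.
    return {**registry_db[link_id], **tax_db[link_id], **health_db[link_id]}

lid = store_citizen("Mari", 52_000, "none")
print(full_record(lid))
```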

Camille M: [00:12:23] You mean storing data literally overseas on Estonian soil by virtue of the fact that it’s at an embassy?

Claire V: [00:12:31] There are more complexities to it than that, but they try to segment the data as much as possible to minimize the downside of any breaches.
On the other hand, if you look at countries like Russia, the situation is completely reversed. All the information about Russian citizens has to be contained in Russia. That created a lot of issues when the systems were first set up following the data sovereignty law in Russia, the data localization law. It had to do with, for instance, visa applications, where information about Russian citizens also had to be replicated in foreign countries, since they were applying for visas there. So there are definitely significant differences in how data privacy, data localization, data sovereignty, and data movement, all the components of the big picture, are addressed and approached in different countries.

Camille M: [00:13:40] So let me ask. You know, I have to say, I often thought of security and privacy as kind of synonymous. And it’s recently occurred to me that they may actually be conflicting sometimes. So I guess I just wanted to ask: do privacy and cyber security requirements clash sometimes?

Claire V: [00:14:02] Unfortunately they do, which means that we really need to put a lot of forethought into how we build systems that require both privacy and security. In many cases, this is an easy task, because the potential difference between the security and privacy requirements is minimal. In other cases, it is much more complex. This is why we have come up with the concept of “trustworthiness.” By “we” I mean, again, the technology community. Trustworthiness is an integrated parameter that allows you to evaluate privacy, security, and safety in a joint fashion, so that requirements can be created in all three that don’t contradict each other.

Camille M: [00:15:00] Can you give an example of a contradiction that’s come up?

Claire V: [00:15:04] Well, it will probably be an artificial example, but it will illustrate how those contradictions come up all the time. Let’s think about privacy and security in connected cars, for instance. Input data needs to be encrypted for privacy when we are talking about personally identifiable information. But some of this data, for instance localization data, is also instrumental in operating the car and in dealing with the safety aspects of systems.
So if you encrypt the data, it may be that the time to decrypt the data is too lengthy, and then you will fall outside of the safety requirements. We have described many of those situations in papers with our industrial and academic partners. There is really an abundance of them, so there is a lot of work for technologists like you, Camille, to do.
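
To make that clash tangible, here is a toy Python sketch that measures decryption latency against a hard safety deadline. The 10 ms budget and the XOR “cipher” are invented stand-ins for demonstration, not real automotive parameters.

```python
# Toy illustration of the privacy-vs-safety clash: decryption latency
# measured against a hypothetical hard real-time deadline.
import time

SAFETY_DEADLINE_MS = 10.0                     # invented control-loop budget

def toy_decrypt(ciphertext: bytes, key: int) -> bytes:
    # Stand-in for a real cipher; the point is only that cost grows with size.
    return bytes(b ^ key for b in ciphertext)

packet = bytes(range(256)) * 4096             # roughly 1 MB of "telemetry"
start = time.perf_counter()
plaintext = toy_decrypt(packet, 0x5A)
elapsed_ms = (time.perf_counter() - start) * 1000

if elapsed_ms > SAFETY_DEADLINE_MS:
    print(f"decryption took {elapsed_ms:.1f} ms, missing the {SAFETY_DEADLINE_MS} ms deadline")
else:
    print(f"decryption took {elapsed_ms:.1f} ms, within budget")
```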

Camille M: [00:16:06] I sometimes wonder, are we just, like, old, ivory-tower people debating this topic? I asked my ten-year-old daughter the other day what her opinion on technology and privacy is, and I wasn’t even sure she would know what I was asking. She looked at me like I’d asked her what her favorite CD was, and she was just like, “Mom, all my devices are already listening to me all the time. There is no privacy.” And I was like, “Oh, is that so?”
I’m just wondering, like, do you have a response to that, coming from a ten-year-old?

Claire V: [00:16:44] I think you are talking about a very important issue. The first parameter that will make privacy more important, not only for societies but for every individual user, is awareness of privacy issues: what happens to the data, and how you want to protect your data. Once this awareness happens, customers begin to ask for privacy. You see evidence of this in many ways. For instance, recently one of the messenger companies announced a change in their privacy policy, and immediately millions of users left the application for other applications that are more privacy-preserving.
So when there is an understanding, there is demand for privacy, and this is going to change society. One other thing we need to be aware of, though: privacy is an issue that is most acute in the developed world. There are many countries where it is inclusion, rather than privacy, that is more important. There are countries where many, and sometimes the majority of, citizens do not have a birth certificate and are not represented in the state records. Not much is known about them; not much outreach is done to them. So this is sort of a pre-privacy stage. You need to include people in all the processes that operate in a given country; only then does privacy begin to play out the way it plays in the developed, highly connected world.
So I think, in answering your daughter, there is no simple way to present this issue. If my daughter were capable of questions like that at age ten, I would tell her, “Well, why don’t we sit down and learn more about privacy and security, how the devices work, how they collect and share data, and how data can be protected.” And maybe after that, the next stage will be a deep understanding and a conscious selection of the modes that your daughter is going to use with her electronic devices.

Camille M: [00:19:34] Well, it’s truly fascinating. I have one last question for you, which is just: we’re all working from home right now. And personally, I like to have a notebook–I mean, not a device, one of these suckers, actual paper–around me to take some notes as I’m doing stuff. At work, you know, I lock it up in my desk at the end of the day, and when I’ve filled it up, I take it to the burn bin and it’s gone. Right?

But everybody’s working from home now. So how is the world dealing with privacy, as the work environment collides with the home and you potentially have confidential or private information from other people, from your work, that’s now in your home setting?

Claire V: [00:20:25] Right. That’s an interesting question, and everyone is working on this today. Multiple guidelines have been provided; IT departments in many companies and universities are very careful to provide guidance and adjust their systems to minimize the fallout from potential issues. But I think the solutions that will make those things easier to handle and easier to understand are still in the future.
We really, really need standard solutions that will ensure privacy and security when working from home. In the past, since work from home and work during travel were necessary and allowed, a lot was done already. But this standard environment, where you work from home all the time, and where it is the same for your collaborators outside of your organization, really needs to be defined.
Some of the areas that might help are ones where many companies are active today: virtualization, and very strong protective containers for the different types of information and different types of activities that you are taking on your platform. And there are other promising approaches as well, in hardware, in software, and even in how we use the networks. So, excellent question. An excellent last question, because this is one where we need to put our heads together and come up with standards and shared guidelines really, really quickly.

Camille M: [00:22:13] Very cool. Thank you so much for joining me today, Claire. I really appreciate the conversation.

Claire V: [00:22:19] Thank you very much for the invitation. It was a delightful discussion.

Camille M: [00:22:23] It really was. And I did want to point out, to anybody who’s listening: you brought up terms like homomorphic encryption and blockchain and artificial intelligence. If you look at Cyber Security Inside, we have a few different episodes: two on blockchain, one on homomorphic encryption, and at least one on artificial intelligence. So if you want a little bit more depth on some of those specifically, you can check those out, too.
