InTechnology Podcast

#111 – Cyber Responsibility: What Do You Need to Know?

In this episode of Cyber Security Inside, Camille and Tom get into cyber responsibility with Dr. Magda Chelly, Cybersecurity Leader, Author, and Entrepreneur. The conversation covers:

  • Who holds responsibility for cyber security, and what that responsibility is.
  • What assumptions there are about cyber understanding and why we should simplify training and explanations.
  • What the security defaults in technology should be and why.
  • Why diversity in the field of cyber security is so important.

And more. Don’t miss it!

 

To find more episodes of Cyber Security Inside, visit our homepage at https://intechnology.intel.com. To read more about cybersecurity topics, visit our blog at https://intechnology.intel.com/blog/

 

The views and opinions expressed are those of the guests and author and do not necessarily reflect the official policy or position of Intel Corporation.

 

Learn more about Intel Cybersecurity:

https://www.intel.com/content/www/us/en/security/overview.html 

Intel Compute Lifecycle Assurance (CLA):

https://www.intel.com/content/www/us/en/security/compute-lifecycle-assurance.html

Here are some key takeaways:

  • Cyber security, as a topic, is constantly in the news and has been for a long time. However, the current challenge is really the general public’s understanding of the risks that come with technology. Everyone, including you, holds some responsibility when it comes to cyber.
  • An example of responsible cyber security is social media platforms. These platforms have settings that let users configure security and privacy controls, but they are not preconfigured; we have to turn them on ourselves as users.
  • In the general public, there is a huge diversity of abilities and understanding when it comes to technology. So how do we educate everyone, from the basic user to an expert? Magda says this is the responsibility of the cyber security industry professionals, especially when it comes to awareness.
  • To start this communication and understanding, cyber security professionals have to simplify, and can’t assume any knowledge. For example, if you are talking about phishing to an audience, define what phishing is. Don’t assume everyone knows the term when you begin. Simplify more.
  • This applies to vendors and providers as well. A customer cannot be expected to know all of the risks associated with a technology they are using. The customer should be able to trust that the provider will give them the support they need to protect themselves.
  • If you think about directions for building furniture, they have changed quite a bit over the years. They used to be complicated and assumed a lot of knowledge. Now, often, they are just pictures and simplified instructions. Can the security industry learn from this?
  • Right now, security is not often the default in technology, and is often something you have to pay extra for. Why isn’t security the default rather than an option you have to configure? From there, users could choose additional security depending on their needs.
  • Frameworks and regulations might be the first step to achieving this, especially because some tech providers might not be motivated to provide high security to their users by default. This might be true of a company trying to collect and monetize data. To enforce ethical behavior, there might need to be laws and regulations.
  • Doing this is very complicated, though: it takes a lot of time, and the question of how those laws would be enforced has to be answered. For companies to entirely change how they operate takes time, and they might not be motivated to do so. We would also need to make sure the laws can actually be enforced.
  • Magda gave some examples of clauses or protections that you might want to include in security agreements. One was to enforce the training of the developers so that you know the software is built to a security standard. Another is what an incident response looks like in the case of a cyber attack or data breach.
  • Another example she gave was to include a clause about managing security vulnerabilities, including security updates as a free service in the contract. Be sure to include these things in the initial scope, and have a conversation with your provider.
  • Diversity is incredibly important in the cyber security field, not only to encourage the next generation of young women and people of color to go into the field, but also to have more innovation and inclusion in the development of products and services.
  • Security needs to be embedded in any field that uses technology to increase knowledge and close the disconnect between cyber and business. That way we can maximize the benefits of technology and minimize the risks.
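The “security by default” takeaway above can be illustrated with a small sketch. The class and setting names here are hypothetical, not drawn from any real platform: the point is simply that protective options default to on and data collection defaults to off, so users only ever act to add security, never to obtain a safe baseline.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    """Hypothetical per-user settings for a social media platform."""
    # Secure by default: protections start enabled, sharing starts off.
    require_strong_password: bool = True
    two_factor_auth: bool = True
    login_alerts: bool = False      # optional extra layer, off by default
    share_usage_data: bool = False  # data collection is opt-in, not opt-out

    def enable_extra_security(self) -> None:
        # The "stronger lock" from the house analogy: users who need more
        # protection opt in, but the baseline is already safe without this.
        self.login_alerts = True

settings = AccountSettings()          # a brand-new account...
assert settings.two_factor_auth       # ...is protected with no user action
assert not settings.share_usage_data  # ...and shares nothing by default
settings.enable_extra_security()      # extra protection layered on top
```

This inverts the pattern Magda criticizes in the episode, where users must understand and configure controls themselves just to reach a reasonable level of protection.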

 

Some interesting quotes from today’s episode:

“We still are facing a very big challenge with the common public or anyone understanding really the impact of using technology and the risk associated. Responsible cyber is all about understanding the responsibility of everyone while using technology. It’s not a matter of someone protecting you. You need to protect yourself first, and then have the support of others to ensure online safety.” – Dr. Magda Chelly

“If we would like to bring that awareness to the general public and ensure that users who are just using technology for daily activities understand the risk associated, take the right steps, we need to change how we communicate around cyber security.” – Dr. Magda Chelly

“We need actually to simplify, and always take a step back and assume that the other person doesn’t know our industry.” – Dr. Magda Chelly

“We are working and living in an ecosystem. We’re not working in silos. It is the responsibility of every stakeholder to help the other stakeholders within that ecosystem to protect not only the users, but as well as themselves.” – Dr. Magda Chelly

“Imagine building a house without a door or without a lock. Is that even possible? No, it is not imaginable. However, we still do that when providing, for example, IoT devices without the possibility to enforce a password… Make security by default and not as an option afterwards.” – Dr. Magda Chelly

“We need to have additional, not only regulatory frameworks, but laws and regulations in general that put a stop to that, and ensure that there is actually protection of individuals, protection of users in general.” – Dr. Magda Chelly

“If we put in place a law, we need to make sure that the companies and the users are able to implement it. I’ll give you a simple example. This is a personal opinion. If we force companies to have chief information security officers, we need to ensure that there is a supply. If we don’t have the supply, we cannot have a law and enforce that.” – Dr. Magda Chelly

“Diversity and inclusion is not only very important to achieve much more, again, capabilities, opportunities, drive more innovation, but as well we need that in order to drive the younger generation into our field. If not, it just continues to be very non-inclusive and non-diversified.” – Dr. Magda Chelly

“What we have seen during the last years is that there is a clear disconnect between cyber security and business. Whenever we are talking about cyber security, we’re talking very often about vulnerability, technology concepts, or reports to boards on things even they cannot understand. This disconnect needs to be closed.” – Dr. Magda Chelly


[00:00:35] Tom Garrison: Hi. Welcome to the Cyber Security Inside podcast. I’m your host, Tom Garrison. With me, as always, is my co-host, Camille Morhardt. How you doing, Camille?

[00:00:43] Camille Morhardt: Hi, Tom. I’m doing well.

[00:00:45] Tom Garrison: Boy, in store for today is one of those big brain topics. That is responsible cyber and what that means. It’s a pretty interesting concept when you try to tackle it piece by piece.

[00:01:01] Camille Morhardt: Yeah. We’re used to hearing about responsible AI. That’s the talk of the day. But really, AI is accessing a bunch of other information, which falls into a broader category of cyber security or cyber privacy or trustworthiness.

That’s what she gets into is really sussing that out, and figuring out does everybody now need an advanced degree in cyber security, or is there some kind of onus on providers to actually bring people up to speed in a moment, or make a reasonable decision on their behalf, and can they do that, or are motivations aligned or misaligned.

[00:01:40] Tom Garrison: Yeah, exactly. There’s so many different aspects here, including if you want to even try to educate people about being able to make reasonable choices regarding cyber privacy or their overall security. That, in and of itself, is a massive undertaking when you think about an entire culture or across the various societies around the world.

How can we take small but measured steps that are going to have a real impact? That’s what we’re going to dive into. What do you say? Let’s get into it.

[00:02:13] Camille Morhardt: Yeah. Let’s give a listen.

[00:02:20] Tom Garrison: Our guest today is Dr. Magda Chelly. She is a world-renowned cyber security leader, author, public speaker, and serial entrepreneur. She is a certified information systems security professional, and a certified information security officer. Welcome to the podcast, Magda.

[00:02:39] Magda Chelly: Hello, Tom. Thank you very much. I’m very glad to be here today.

[00:02:43] Tom Garrison: Yeah. We wanted to spend our time today and talk about responsible cyber security. Let’s just start off with what does that mean for you.

[00:02:56] Magda Chelly: I think the topic of cyber security has been in the news for many years. However, we still are facing a very big challenge with the common public or anyone understanding really the impact of using technology and the risk associated.  Responsible cyber is all about understanding the responsibility of everyone while using technology. It’s not a matter of someone protecting you. You need to protect yourself first, and then have the support of others, to ensure safety online.

[00:03:35] Tom Garrison: Can you give examples of what you mean by responsible cyber security?

[00:03:42] Magda Chelly: One example that I really like to use is actually a very simple one. We are all using social media. On social media platforms, we have settings. Those settings can allow us to configure certain controls, either for security or privacy. They are not preconfigured for us. We need, as individuals, to choose them and activate them.  That’s a very good example of how the responsibility comes down to the individual, to the user. They need to take that step in order to protect their accounts.

[00:04:19] Tom Garrison: How do you see this playing out? I’m thinking about the huge gap in skillsets between various users. There’s the people that really don’t understand cyber security, for sure, but technology more generically. They just use it, but they don’t really understand it or how it works. Then you have all the way up to experts.

The latter group there is pretty obvious that they can be trained or be made aware of what it takes for them, personally, to protect themselves. But what about that first group that doesn’t really understand how technology works, and may not really understand even the risk? How do you tackle that?

[00:05:04] Magda Chelly: Well, I would say it’s our responsibility. It’s the cyber security industry’s professionals who need to ensure that they align a little bit more the explanations, and especially the awareness around those technologies, in a better and more accessible way for anyone, any user. We have seen, commonly, technical topics or even articles in the news talking about concepts that even certain technical people will not understand, because they’re specific to a particular domain. We’re talking about acronyms that are also not easy to understand.

If we would like to bring that awareness to the general public, and ensure that users who are just using technology for daily activities understand the risk associated, take the right steps, we need to change the way we communicate around cyber security. We cannot use the same approach of showing off a little bit how much we know by using those concepts. We need actually to simplify, and always take a step back and assume that the other person doesn’t know our industry, our domain, at all.

For example, clearly, we are sometimes talking, of course, about phishing. Why do we assume that the person in the audience understands the term phishing? We cannot assume that. We need to completely forget about those concepts from our perspective, and put ourselves in the perspective, or in the shoes, of someone who has absolutely no idea about our specific domain.

[00:06:48] Camille Morhardt: It’s not so much to have a goal of educating or making everybody aware, and using appropriate or accessible terminology to do that. You’re saying it’s a step past that. The onus really needs to fall on the service provider, or the goods provider, and the information provider to either do a real time awareness or real time training, where it’s sort of like, “If you check this box, this is the implication,” or to make some set of assumptions that fall under what a reasonable person would want, and then apply that with an ability to change. But not just throw their hands up and say, “Well, here’s a bunch of really complicated information. We’re going to default to collecting all of it unless you know what to check.” Would that be a fair assessment?

[00:07:38] Magda Chelly: Yeah, absolutely, Camille. I think we are working and living in an ecosystem. We’re not working in silos. It is the responsibility of every stakeholder to help the other stakeholder within that ecosystem to protect not only the users, but as well themselves. What does it mean? If I am a customer and I’m using a product, I should trust that vendor or provider to give me the right support and the right awareness to make sure that I can protect myself online. I cannot just ignore the fact that every technology comes with associated risk and, for example, share complicated documentation and expect that user to be capable of understanding that and implementing it.

It’s a shared responsibility. It’s a social responsibility that needs to be taken a little bit more at heart or from the heart of everyone providing technology, not just trying to bring or delegate that responsibility to the users. Because as we said, and Tom mentioned, they might not know.

[00:08:48] Tom Garrison: It seems to me like if we were to look at maybe a different industry, like … I’m thinking about furniture. When you buy furniture, it used to be, I’m remembering back when I was much younger, that the instructions … You needed a PhD to assemble the furniture. It was ridiculously complicated. They’d have it in six different languages and da, da, da, da.

But over time, the instructions, they changed them. Now, the instructions are a lot better. That’s not to say they’re perfect. I’m sure there’s people out there groaning, saying, “Oh, the instructions are still terrible,” but they’re designed with the idea that the person on the other end really doesn’t know what they’re doing. They’ve limited the choices. In a lot of cases, they just use pictures.

I wonder if there is an analogy here on the security side. Part of what the industry has done is try to make the whole installation process so easy that they’ve taken the choice away. They’ve just gone to a one click install. Embedded in that is a whole bunch of security selections and whatnot that need to go on.  We’ve vacillated, in the industry, back and forth between hyper complicated and super, super simple, but then shielding the security element.

[00:10:12] Magda Chelly: Tom, I love the analogy. I do think that there is a lot of advancement that we can do in security, and in general in the technology industries, to achieve that. Make it, literally, first of all, by default. Security has been optional for so many tools that we are using, and some providers even require an additional payment. We need to ensure that the service providers, the technology providers, actually build software, build tools, with security by default. Imagine building a house without a door or without a lock. Is that even possible? No. It is not imaginable. However, we still do that when providing, for example, IoT devices without the possibility to enforce a password. The first point, again: make security by default and not as an option afterwards. The second, of course: there might be several levels of security.

Certainly, for example, if you need a very strong door, I’m coming back to the analogy with the house, you might have several locks, and you might have special keys, or an electronic lock that is very much more advanced. Now, if you don’t need that, because you consider that you just need a simple lock with a key, because you’re living in a safe country, because you don’t have valuables in the house, you might choose that option, but you understand very clearly the difference.

If the software or the tool is built with security by default, you have then the second step, which would allow the users to choose the level of additional security required, depending on their environment, context, requirements as well.

[00:12:01] Camille Morhardt: In your opinion, is it enough to say this is a social responsibility, and the tech providers need to make this clear and help people make decisions? Or is there just an inherent conflict that arises, in some senses?  I can think of collecting data, for example, where there’s too much motivation on the part, maybe, of the tech provider to encourage the person to allow the collecting of the information, and so you think that we need more stringent standards or regulations to actually make sure it’s happening. Do you have an opinion on that?

[00:12:37] Magda Chelly: Yeah, absolutely, Camille. I think whenever we want to change the overall ecosystem that has been there for many years already, functioning in a certain way, and accelerate the maturity, and perhaps enforce the social responsibility, we need to have a certain framework, or at least regulatory requirements to force that.

What I have been working on with a lot of my clients, for example, is that I bring awareness around why you need security clauses in the contract. It’s not only about trust with your service provider. It is also ensuring that things are done and aligned with your expectations, both when everything goes well and when something happens. We are in a commercial world where, of course, if we have a company, we try to bring in additional paid services, or, like you said, monetize data, or do any other related activities that help us to increase that revenue. Therefore, even if a company has a certain, I would say, ethical behavior, there is still a balance to reach.

But I do believe that, in order to enforce that, we need to have additional, not only regulatory frameworks, but laws and regulations in general, that put a stop to that, and ensure that there is actually protection of the individuals, protection of the users in general.

[00:14:06] Camille Morhardt: Okay. Then a quick follow up question would just be, what are those thresholds that a society should look at in order to know that it’s time to add that?

[00:14:17] Magda Chelly: It is a debate, Camille. I think it will take hours and hours to answer that. I don’t think it’s a black or white answer. I think it very much depends. Like any law and regulation, it is a result of a lot of research and many lawyers coming together in order to understand what is the best way to address certain challenges.

Especially when it comes to data privacy or data security in general, we have seen that privacy laws took many years, and in some countries, they are still not enforced, because of the challenges that it leads to. Companies cannot just, in a matter of days, implement certain new aspects for their businesses.

The same would apply if we just enforce something else around general security. If we put in place a law, we need to make sure that the companies and the users are able to implement it. I’ll give you a simple example. This is a personal opinion. If we force companies to have chief information security officers, we need to ensure that there is a supply. If we don’t have the supply, we cannot have a law that enforces that.

So again, it is a very hard question to answer. I do not think that it’s either simple or trivial, but it is, I would say, a long collaboration and work together with the right people, legal, privacy professionals that would help to achieve this particular balance.

[00:15:50] Tom Garrison: You mentioned before that when you meet with your customers, you coach them about what sort of clauses to put into their contracts and whatnot. Can you give examples of the kind of clauses or the kind of protections that you suggest be put into agreements?

[00:16:09] Magda Chelly: Absolutely. For example, very often, I have clients using outsourcing software development companies. Those software development companies might not have the right secure coding practices. So I very clearly advise and recommend those clients to actually add clauses that enforce that and enforce, for example, the training of the developers. So at least you know that the software is built with a specific standard.

On another topic as well, there are two other aspects that I really like to recommend. The first is incident response. What happens in case of an incident, a cyber attack or a data breach? Not only from a notification perspective, but if the supplier or the provider has a breach, what are the next steps, and what are the requirements from the client?

The last one is around actually managing security vulnerabilities. This depends, of course, on the particular service provider or product. But I do encourage my clients to include all the security updates as a complimentary service, a free service, as part of the contract, rather than ending up in a situation where the provider asks for additional fees because it’s not part of the initial scope.

[00:17:33] Tom Garrison: Is there pushback on this, or is this generally accepted by the providers?

[00:17:41] Magda Chelly: Well, I would say it’s generally accepted. But I would say it also relates to the fact that when you are a client, you basically have a little bit more power over them, because you’re buying. So you have the capability to say, “Those are my expectations. I want that to be achieved.” But again, it depends on the context as well. If you already signed a contract and you are renegotiating, that might require a different communication, and perhaps it might not be as easy as when you have that expectation from the initial negotiation and initial scoping and contractual discussions.

So I would say, yeah, it’s mostly implemented if it’s a new contract. If it’s a renewed contract where those clauses were not included, then it might be a little bit more challenging. But it’s all about how to bring that communication and discussion with your vendor or supplier, who is supposed to be your partner. So don’t come up with expectations only, but explain why it’s important, and perhaps try to find a way to make sure that it actually brings that visibility that is for the good of both. Because when something happens, the consequences are not only on the client, but on the service provider as well.

[00:19:02] Camille Morhardt: Magda, I know you focus a lot on diversity and inclusion as it relates to cyber security. I’m wondering if you can explain to us why you think that’s important in this field.

[00:19:14] Magda Chelly: In the latest years, we have seen statistics about really very low numbers of female professionals in the cyber security industry. We’re talking about 11% and then 20%. Of course, those statistics are not only low in general, but they also perhaps discourage the younger generations from getting into the field.

I do believe that cyber security is extremely interesting. I’m passionate about it, learning every day and discovering about different things. So I would really like to see more diversity in the field, and not only from the gender perspective, but just in general. Why? Because it’s very, very interesting, and as I mentioned, allows to learn continuously.

Now, in order to achieve that diversity, we need to have role models. Those role models encourage the younger generations, like I mentioned, but as well for example, young girls in schools that did not see previously professionals in cyber security leading, or having exceptional careers, or providing really the services that we provide today in general.

So diversity and inclusion not only is very important to achieve much more, again, capabilities, opportunities, drive more innovation, but as well we need that in order to drive the younger generation into our field. If not, it just continues to be very non-inclusive and non-diversified.

[00:20:53] Tom Garrison: Yeah, no. In a strange way, it ties back to what we discussed earlier, which is that awareness on cyber security and making those intelligent choices as you have whatever the product or service that you’re installing, that you’re making the right choices. There’s a level of awareness that you need. We could also not only educate for that purpose, but educate earlier in schools, and get females and minorities that are underrepresented today, get them excited about cyber security.

We actually kill two birds with one stone in a sense. So I definitely think there’s an opportunity there, for sure.

[00:21:36] Camille Morhardt: I’m wondering if, from a perspective on approach, is it more important to highlight cyber security, and create courses and programs that everybody can access, or that are even required in some sense, or is it more important to merge cyber security with other fields. Where if you’re studying artificial intelligence, then this is a core part of it. If you’re studying manufacturing, or you’re studying supply chain, or elements in critical infrastructure, or systems thinking, or sustainability, then this automatically is one of the prerequisites to continue in that field.

Has it not gotten as much attention in the past because it’s a standalone thing sitting in its own area, like computer science, versus saying, “I’m not interested in computer science, but I’m doing aeronautics and astronautics. Now, I’m very interested in cyber security, because it relates directly to my field.”

[00:22:37] Magda Chelly: Absolutely. I think both are needed. But what we have seen during the last years is that there is a clear disconnect between cyber security and business. Whenever we are talking about cyber security, we’re talking very often about vulnerabilities, technical concepts, or reports to boards even on things that they cannot understand. This disconnect needs to be closed. If not, we’ll never achieve the outcome that we want to achieve, which is reducing the risk associated with the technology that we are using from cyber attacks and data breaches.

If we are, for example, learning artificial intelligence, we should be able to learn as well about the risk associated, and cyber is one of those risks, in order to, of course, efficiently implement that technology afterwards and maximize its benefits and its usage.

[00:23:38] Tom Garrison: We do have one segment that we always like to close on. That is called fun facts. So I wonder if you have a fun fact that you would like to share with our listeners.

[00:23:49] Magda Chelly: Yes, Tom. I have actually a fun fact. I would say one of the very interesting activities of myself and my husband is actually to ride camels on the beach in Tunisia.

[00:24:02] Tom Garrison: Is that comfortable?

[00:24:04] Magda Chelly: Well, yeah, not bad, not bad. It’s entertaining.

[00:24:10] Tom Garrison: Oh my gosh.

[00:24:11] Magda Chelly: Definitely entertaining.

[00:24:12] Tom Garrison: Nice. Very cool. Very cool. How about you, Camille?

[00:24:16] Camille Morhardt: Since summer has officially started, I have a very fun fact. Here it is. Are you ready?

[00:24:23] Tom Garrison: I’m ready.

[00:24:24] Camille Morhardt: Why did the cantaloupe jump in the swimming pool?

[00:24:27] Tom Garrison: I have no idea.

[00:24:29] Camille Morhardt: It wanted to be a watermelon.

[00:24:32] Tom Garrison: God. Oh, Camille. I knew it was going to be something like that. It had to be.  All right. My fun fact is short and sweet and, I thought, very interesting. Did you know that in badminton, the top speed ever recorded for the shuttle is over 300 miles per hour?

[00:24:59] Camille Morhardt: Wow. I thought it was like 55 miles an hour.

[00:25:03] Tom Garrison: No. Anyway, cool stuff. All right. Well, hey, Magda, thank you so much for joining us today on this topic of responsible cyber. It’s a great topic. I think we can all take a little bit away from this conversation.

[00:25:16] Magda Chelly: Thank you very much, Tom. Thank you very much, Camille. It was my pleasure to be here today with you.
