InTechnology Podcast

#38 – Conversations We Should be Having About Risk Mitigation

Risk mitigation isn’t just about calculation; it’s about contemplation. In this episode of Cyber Security Inside, we speak with Malcolm Harkins, a Security Executive, Board Member, Advisor and Coach/Mentor whose thirty-year career in the tech industry gives him incredibly valuable insight into a whole host of key issues surrounding cyber security.

 

We covered many topics with an overarching question in mind – How can we collectively become better choice architects in the face of inevitable risk?

 

We discuss:

•  The ideal skill set for a CISO/CSO, which should include a breadth of business, risk/compliance, and technical acumen

•  The kinds of vital questions missing from board discussions, including moral and ethical concerns

•  The importance of long-range planning when it comes to risk preparedness and damage mitigation

•  What can be learned from a disaster like the recent Colonial Pipeline ransomware attack

… and more.  Join us for this fascinating discussion, and become a better choice architect.

 

Here are some key take-aways:

•  It’s physically impossible to completely eliminate risk, but you can ask better questions in board discussions to help manage it.

 

•  Similarly, you can’t know everything, but with the right group of people and data, you can forecast a variety of different risk scenarios and become better prepared to minimize damage.

 

•  Ethical and moral questions need to be coming up far more in board discussions – these issues can be a matter of life and death, and should not be ignored.

 

•  When it comes to the language of board discussions, there should be more of an even playing field – non-technical members should develop a basic understanding of security and tech nomenclature, and vice versa.

 

•  And while it’s important to train people to be on the lookout for ransomware attacks like phishing attempts, it’s not a sufficient strategy – accountability should ultimately be driven back to the security community across the vectors of risk, total cost, and control friction.

 

Some interesting quotes from today’s episode:

“I think it’s high time that we start expecting the non-technologist board members to at least be able to understand the basic nomenclatures in the security and technology space.”

 

“I think there’s an ethical and moral accountability that is missing in many of the discussions around risk; that’s a question that I can tell you has never come up in any of the board meetings I’ve ever been in, but one that should.”

 

“Before I had that dialogue with them, they were not looking at that data integrity with that lens, which would have potentially caused people to get sick or die, and it certainly would’ve had a substantial revenue brand or organizational implication if that were to occur.”

 

“I think we are doing bandaids, bubblegum and baling wire making up for dated security technologies and other technologies that don’t work.”

 

“We’ve got to start weeding and feeding our environment. Go look at the effectiveness and efficiency of control, and if it’s not effective and efficient, shut it off. Get rid of it and buy something better.”

 

“If technology companies spent more time making sure that every engineer who created code or developed technology understood security vs. just functionality, again, you would change the technology vulnerability dynamics by focusing on that training which we don’t do enough of.”

 

“I’ve always thought of my role as architecting choices for the business…if I architect choices the right way, we’ll make better business decisions.”


Announcer: [00:00:00] You’re listening to Cyber Security Inside, a podcast focused on getting you up to speed on issues in cyber security. With engaging experts and stimulating conversations. To learn more, visit us at intel.com/cybersecurityinside.

Tom G: [00:00:42] I’m Tom Garrison. And with me is my cohost Camille Morhardt. How are you doing Camille?

Camille: [00:00:48] Tom, horrible. I’m doing horrible today. I’m just kidding. Do I ever say that? I’m doing really well today, as usual.

Tom G: [00:00:55] Well, today we’re going to have a conversation with Malcolm Harkins, who is, well, amongst other things, a former Chief Security and Privacy Officer at Intel. And since then he has moved on to other companies, but he is going to share some of his thoughts on a whole host of different topics. I thought it was a really interesting conversation.

Camille: [00:01:21] Yeah. I think, you know, for me, the opportunity to sit down with a Chief Security and Privacy Officer at a Fortune 100 company is like asking that person what the best questions they heard from the board were (because he’s been on multiple boards too), and asking him what people should be asking him, or people in his role, you know, direct from the source. It’s kind of like, what sort of thing might you be trying to cover up or not answer? What sort of question should be directed to somebody in your role to really check under the hood that things are going well? So I thought it was fascinating to be able to ask those questions, and he was very open about it.

Tom G: [00:02:09] Yeah, he was; it was super open. I think that’ll come across as part of the interview that we had with him. And then also some, I think, pretty insightful tidbits of wisdom for all of us, which is, you know, how do you cut through the chatter when somebody says that they know cyber security? Or if you’re going to hire somebody onto the board, how do you get past the sound bites to really understand what’s important when it comes to hiring somebody at such a senior level in the organization? I thought his insights there were really, really interesting.

Camille: [00:02:46] I agree. And I also thought it was refreshing that he said, you know, a lot of times you’ll get senior people in their specific role (maybe even particularly security) and it’s like, well, everybody else is a fool and they need to do all this stuff just to get the basics done. And that’s really not his approach. His approach is: we need a very elegant solution to address security. People aren’t dumb. People can be tricked. It doesn’t mean that they don’t care or that they’re foolish. We need to really think about how we can help protect people who are being reasonable but are trying to get their work done. They’re not necessarily focused on security when they’re doing email, and they shouldn’t have to be; that’s our job.
And I liked that he owned that and really talked about how you can structure things in a way that works. Of course he supports training too, but he really said it’s not enough to just pass that off onto the end user. It’s really our role as security professionals to make it easy for everybody.

Tom G: [00:03:48] I think the listeners here are really going to learn a lot from listening to his perspectives. So with that, let’s start off with Malcolm Harkins. Our guest today is Malcolm Harkins. He has over 30 years of experience in the tech industry, most of it focused on security. He currently sits on the board of the Cyber Risk Alliance, as well as Trust Map. He’s also an advisor to several security startups. He was previously the Chief Security and Trust Officer at Cymatic and Cylance, and the VP and Chief Security and Privacy Officer at Intel.
[00:04:30] He is the host of the Security Speaks podcast, focused on having real and raw cyber risk dialogue with practitioners. So welcome to the podcast, Malcolm.

Malcolm: [00:04:40] Hey, thanks Tom. Thanks for having me.

Tom G: [00:04:42] That is quite a background. And of course you and I knew each other and worked together, uh, while you were here at Intel. So it’s great to see you.

Malcolm: [00:04:47] Thanks. Great to see you too.

Tom G: [00:04:53] So I’d like to start off with just your experience as somebody who sits on a board. Obviously you were a CISO. Just talk about your experiences there, and then we can go from there.

Malcolm: [00:05:06] Yeah. You know, it’s been an interesting journey. I’ve done advisory and some board work now for several years, post my departure from Intel, and all in companies that are in the security industry. So that also is a little bit different. But, you know, in the board seats you have a fiduciary accountability, right, to the shareholders. You have to have a governance role in understanding the strategy and direction of the business, and try and navigate all of those different things. It’s certainly different than Intel’s board of directors or other companies, because these are privately held, but the role is by and large the same as when I used to go present to Intel’s board of directors running security, and get questioned around different things with respect to how my organization was operating and what that meant for the company. So it’s typical of most board roles, but it’s really that fiduciary/accountability/governance role, strategy, and to some extent some coaching, right, of some of the other executives in the company.

Tom G: [00:06:08] Well, it seems like cyber security is certainly in the news all the time now. So I would imagine that, even for companies that haven’t historically had a CISO-type function on the board, there’s probably more demand for that skill set.

Malcolm: [00:06:23] There definitely is. It’s an interesting thing. There have been some proposals here and there about having a security professional on the board of directors. And while I agree with that, I think there are some different ways in which you can evaluate it. If you just went and grabbed some security professional and said, “Hey, we’ve got a security person on the board,” and they don’t have business understanding and business context, it’s a little bit odd, right? Because that board makeup needs to have people that have depth of knowledge in particular areas; you get the diversity of perspective on the board, but they need to have a broad understanding of the business aspects of things as well. And if they don’t, they’re going to be misplaced.
And then the other thing on it, you know, and I’ve seen some people do this: they’ll go get a CEO of a security company. Right. And I know a lot of them; they’re business leaders. They’ve never run security. Right. So in some cases they’re not actually a security practitioner. They’re a business executive, or maybe a technology executive who happened to become CEO. That doesn’t mean they actually know how to run security internally, because by and large, they never have.
And so it really depends upon what the company’s looking for in that expertise on their board. And I think if you were thinking about getting security expertise for the board, getting a practitioner who’s really run security and risk functions and who happens to have the business acumen, you’d be better suited with them on the board.

Camille: [00:08:00] Malcolm, in one of your books you actually described this as Z-shaped. Can you tell us a little bit more about that?

Malcolm: [00:08:07] Yeah. So there was a design company, IDEO, that used to talk about the ideal technologist, and they were T-shaped. They had a breadth of business acumen and a depth of technical acumen. And I started looking at the right skill set for a Chief Information Security Officer or Chief Security Officer, and I called it Z-shaped. They had to have a breadth of business acumen so that they could have the right dialogue with the business, understand the business velocity, the business objectives, and those types of things. They had to have a breadth of technical acumen because they had to be able to work with the IT side and the technologists; or in a company like Intel, the technology side, right? The product group technologists. Either way, you had to have that breadth of technical acumen. And then the slash that created the Z was your risk, security, controls, and compliance depth.
And so you have to grow on all aspects of that, not only as an individual, but your organization has to be Z-shaped. Because if you’re not, you’re not connecting the dots between the technology, the business, and the risk issues that could come out of either the development of technology or the management and use of it in the context of the business.

Tom G: [00:09:22] You also have to be able to speak the various languages. You know, the language of business is totally different than the language of technology, or of risk and compliance. And skill-set wise, you need to be able to converse depending on the audience that you’re talking to, in language they understand.

Malcolm: [00:09:43] Yeah, a thousand percent agree with you, Tom. And I saw that in every company I’ve worked for, including at Intel. But the thing that I think we sometimes don’t do: if I go into the board and I don’t understand gross margin, net income, if I don’t understand turnover, time to market, accounts receivable, all that type of stuff, the board looks at me as somehow less-than, right? Because I’m not speaking their language. So I think at the same time, the boards need to step up and understand the language of what an APT is, the language of some aspects of technology and some aspects of security. Because I think we’re being too soft on boards. Even if I was an attorney looking at a board (and there are a lot of attorneys on boards), I would be expected to understand the basics of a P&L, right?
And so we expect people on boards and in business to have that as acumen, regardless of their role. I think it’s high time that we start expecting the non-technologist board members to at least be able to understand the basic nomenclatures in the security and technology space.

Camille: [00:11:01] Hey, Malcolm, since you’ve interacted with a lot of boards, without disclosing which one it came from, what is one of the best questions you’ve received around security from a board member?

Malcolm: [00:11:13] There are questions, and some of them make sense on the surface, but frankly they’re a little bit irritating, you know: “Are we secure?” Well, you can’t eliminate risk, so there’s context to that, and you have to answer that question accordingly. Or “Are we spending enough money?” Well, again, depending upon the issues, that’s a challenging item, you know?
The ones that I think sometimes are more relevant would be: are we exploitable today in a way that would cause material and significant risk to the company, or material or significant risk to our customers? Or is our use of technology, or our production of technology, potentially creating a systemic societal risk? And if so, how, and what should we be doing to manage it?
Because I think there is a, uh, an ethical and moral accountability that is missing in many of the discussions around risk. And that’s a question that I can tell you has never come up in any of the board meetings I’ve ever been in, but one that should.

Tom G: [00:12:33] When I heard you speaking there, what I was thinking of is you only know what you know. Uh, so when you’re talking about risk, based on everything you’re aware of, you have been able to sort of mitigate the risk, you know, to a reasonable level. But it could be that tomorrow something comes to light and you realize that it’s been hiding in plain sight the whole time, right in front of you and nobody could see that there actually was unmitigated risk and you just didn’t know it.

Malcolm: [00:13:04] I completely agree with you, but that’s the job. Our job is not just to calculate risk; it’s to contemplate it. You know, it’s one of my biggest irritations. Even over the past 15, 16 months, people were like, “Oh my God, the pandemic,” and their hair was on fire and they didn’t know what to do. And they didn’t understand this, and “who would have thought…” Well, when I landed running security and business continuity in late 2001, we were starting to build pandemic response plans 20 years ago. So there’s that aspect of: you can’t know everything, but with a high degree of likelihood, with smart people and some data, you can truly forecast the potential of different risk scenarios, even remote ones, just to be prepared for them. You might not be able to prevent it, but at least you’re going to know how to respond to it quickly enough that you can minimize the damage potential to the company and its shareholders and potential customers.

Camille: [00:14:09] Well, isn’t part of the question then, who’s kind of your source for figuring out what sort of trends or threats might be out there?

Malcolm: [00:14:16] Yeah. And that has to come from external sources on the one hand, right? The security industry itself, and threat intelligence and those types of things. It can also come from, you know, governments, whether it be InfraGard or the ISACs; or, if you’re in certain industries, you’re going to have, in some cases, classified or confidential meetings where they’ll give you some perspective on different things.
But on the other hand, it has to come from inside. You know, again, back in my Intel days, we had an Emerging Threat and Intelligence program that we started building in 2003, 2004. Why? Because I wanted to know what might be coming. We would send people out to go figure out what vulnerability researchers were thinking about before they even published a vulnerability paper. Why? Because if I knew that, then with smart people we could predict, and therefore prevent, or be better prepared to detect and respond to things before they ever occurred.

Tom G: [00:15:21] Yeah. You want to have, basically, scouts that are out in advance of the researchers. You may not be perfectly able to predict what they’re going to find, but if you know sort of generally the direction they’re looking, you can be out there ahead of them.

Malcolm: [00:15:37] It’s no different; again, put it in the business discussion on the board. A good company is doing long-range strategic planning. They’re thinking about markets. They’re thinking about economics. They’re thinking about competitive changing dynamics. They’re thinking about interest rate and exchange rate changes. Why? Because all that has macroeconomic and microeconomic implications on buying patterns and what products you need to do. And I just think that way for security, and that’s the irritating part: the industry doesn’t do it. A lot of security professionals don’t do it. And the business folks don’t think about it in that same context. Because it’s no different; it’s just a different type of competition. Right?
You’re competing against the threat actors and threat agents and mistakes and other issues, just like in a marketplace you’re competing against the market competitors, but there are all these externalities that can affect the business that you don’t control. No difference. Just the context is different.

Camille: [00:16:36] So Malcolm, I was interested in something you said a couple of minutes ago, about the question that doesn’t often come up, or never comes up, on societal risk. I’m not sure exactly what you meant; I can imagine where you might’ve been heading, but I’m interested a little bit more in that topic.

Malcolm: [00:16:57] Yeah. Well, you can look at something like SolarWinds. I don’t have any firsthand knowledge and stuff like that, but that created a systemic societal risk, considering the pervasiveness of it in people’s compute infrastructure, and then the implications of that not only for their business, but again, how that could be weaponized in ways that could affect others. Now, SolarWinds may have purely just looked at their IT risk and said, “Okay, typical desktops, laptops, networks, and stuff like that.” They may have had a product security function that says, “Hey, do we have the security development lifecycle, and is it doing okay?” Did they ever connect the dots between a compromise of their systems that could then compromise their products? Don’t know.
Most technology companies by and large don’t do that. They silo IT security from product security assurance. That’s one angle of potential societal risk. Another example would be if I was in the grocery store industry, right? You can think of cyber risk in a wide variety of ways. I’ve got a rewards program, so I’m collecting personal information on people who buy stuff. Okay, risks: maybe some privacy implications. I’ve got payment card industry standard compliance. Well, I might have some problems there, and if I don’t do that right, I might not be able to process a credit card. That’s revenue risk.
Then you go, what might be a societal risk being in the grocery industry when it comes to cyber? Well, I don’t own the slaughterhouse, but there’s food safety data from the slaughterhouse all the way through the point of sale. If that data could be played with, people could die or get sick. And so then you have to start thinking about: is there an animal-rights whack-job activist out there who might want to save 10,000 cows tomorrow (might be an employee, might be an external person) and might want to play with the integrity of that data in order to create a food scare in order to save cows?

Well, that’s a plausible scenario, but people could die on the other side of it. And I know some folks in the food and beverage industry that, before I had that dialogue with them, were not looking at that data and that data integrity with that lens, which would have potentially caused people to get sick or die. And it certainly would have had a substantial revenue, brand, and organizational implication if that were to occur.

Tom G: [00:19:25] Let me switch gears a little bit and get to something that’s also in the news. We just had this ransomware attack that hit the pipeline over on the East Coast. What can you share about what we know so far?

Malcolm: [00:19:40] Yeah. So again, I don’t have any firsthand knowledge; a lot of this is just my opinions from what I’ve read and discussions with a bunch of other peers. On the one hand, if what’s in the news is accurate, the ransomware item was sold and weaponized for somebody to make money, and not directly by a nation-state actor. And there are reasons to believe the organized crime unit that came out and said, “Hey, we didn’t intend for the ripple effect this way,” because I think if it were truly a nation-state actor, it would have been much more precise and directed, with implications on a particular target, versus this type of thing.

Having said that, a lot of hacking organizations and organized crime (“a lot” of course being a relative term; I don’t know the exact percentage), some of them are the equivalent of pirates doing the bidding of nation-state actors. Going back to pirates in the 1700s, right: they were authorized, but if they got out of control, you dealt with them in different ways. Right? So some of those hacking groups and organized crime outfits are employed, to some extent, as cyber mercenaries by nation-state actors. So is it possible that some of these groups could be used by nation-states? Yes. Was this one likely directed by a nation-state? No.
And then you’ve got to look at Colonial Pipeline. Again, if what’s in the news is accurate, their IT environment (the typical enterprise desktops, laptops, servers, and network that their employees use for running the business) had gotten compromised, and it had a spillover into their OT environment, the operational technology for the pipeline. Now, if that were to occur, then you’ve got to ask the question: well, why were those things connected? Because a lot of security folks would say those things should be separate environments. Maybe they were, maybe they weren’t. Maybe there was a sneakernet that crossed the isolated environment. Maybe they were sloppy and had a flat network and everything was all connected on the same things. It’s hard to know, because that hasn’t been revealed.
But again, from my own experience, that pipeline is the equivalent of a factory, like an Intel factory, right? That Intel factory is connected to a wide area network. It’s also got an adjacent office building. So even though there’s a level of isolation and stuff like that that you do for factory systems, there is in most cases some level of connectivity, because your ERP system has to send an order to the factory, and there’s your inventory and your cost management system. There are always going to be things that are flowing in and out a little bit, even if it’s a narrow pipe or somebody physically going from one environment to another environment.

So depending upon how that crossover happened (and I haven’t seen that level of detail at this point), it’s entirely possible they had a decent set of controls and a decent set of isolation, and something crossed the chasm, so to speak. Or they could have had a flat network, and therefore little to no control, that allowed that spillover to occur.

Tom G: [00:22:58] I know this is so early, so it’s kind of hard for us to do this in retrospect, but are there key learnings that we know of, at least so far, that would apply for our listeners? You don’t want to be the next big ransomware attack on the news. What can we make sure people have on the front of their mind when looking at their infrastructure, so they’re not going to fall prey to the same issue?
Malcolm: [00:23:23] Yeah, well, I mean, the reality is you can’t eliminate risk, right? We can’t eliminate it physically or logically. You can’t eliminate it in financial markets. Heck, even if you just sat on cash, you would face inflation risk, even though it might be perceived as safer. So that’s true in the digital sense of the word as well.
You know, the hard part is (and I posted something on this yesterday on LinkedIn) the news articles calling this a wake-up call for critical infrastructure, ransomware, stuff like that. That just frankly irritates me. If it woke you up, you weren’t doing your job. You were asleep at the wheel. And that’s true of boards, CIOs, CTOs, and the security professionals and organizations, if this was a wake-up call for you. Or even politicians.
Ransomware has been going on now for years. We’ve had decades’ worth of cyber risk occurring. And again, getting back to our “who would have thought?”: well, it’s not that hard to think of malware going into an environment, shutting it down, and then holding you hostage. I mean, physical kidnapping and ransom happen, unfortunately, too often, too.
That’s just a failure of doing your job. Then when you say, okay, let’s say I did get woken up on this, what should I do? I’m a big believer that in some cases (and this may sound contrary to most people’s view) we’ve overspent on information security. I think we are doing band-aids, bubble gum, and baling wire, making up for dated security technologies and other technologies that don’t work; they’re insufficient and flawed controls. And what we’ve got to start doing is weeding and feeding our environment.
Go look at the effectiveness and the efficiency of a control, and if it’s not effective and efficient, shut it off, get rid of it, and buy something better. And hold the security community accountable on three vectors: risk, total cost, and control friction. Because controls are also a drag coefficient; they slow down people, data, and business processes, right? And in a high-control-friction environment, what happens? Your users and the business go around the control. Why? Because you’ve impeded the business. Right.
And so we’ve got to hold CISOs more accountable to real outcome-based metrics on risk, total cost, and control friction. And in doing that, they also need to drive accountability back to the security industry. When a real breach like this occurs (whether it be Target, Anthem, Home Depot, Colonial, SolarWinds), we need to do almost like a National Transportation Safety Board-type review: go look at what controls failed, and then publicly label it, including the company who sold you the control that didn’t work.

Camille: [00:26:15] Right. To get better, instead of kind of pinpointing it on a single company: looking at the overall controls and processes and saying, what does the greater industry need to learn and apply? Because I have a question about blame. For a lot of these things, one of the ways in for ransomware is, say, a phishing attack; it could even be a text message that you respond to. So do we drop all the fancy stuff, the technology and software and helpfulness, and just invest in training people?

Malcolm: [00:26:48] Yeah, I’m a big believer in training people. And that’s necessary, but it’s not sufficient. That whole approach has turned into blaming the users for using computing. “Be careful what you click on.” “Don’t open these attachments.” Now, on the one hand, I want employees to be cautious of that. It’s like the “do not talk to strangers” advice. But how do I use my computer? I click on things. I open things, right? If I’m afraid to go do that, I’ve just reduced what computing is about and how I use it and how I engage with it. So we’ve got to tell people to be cautious. But if every time before I opened an attachment that looked like it came from Tom (because we happened to work at Intel, but I didn’t know he was going to send it to me) I had to call him, think how inefficient that would be.
I think this is where we’ve got to start spending real time deeply understanding the control environment, and then moving to advanced technologies that can prove they can mitigate and manage the risk. But since technology is ever changing, usage models are ever changing, and threats and vulnerabilities are ever changing, you kind of have to be on the forefront of emerging security technologies, because if you’re not, you’re falling behind.
I had this discussion with the former CIO of Qualcomm a couple of weeks ago, and he’s like, “Look, given the lifecycle of security technologies, you should be pushing on that stuff every couple of years, looking at changing stuff out.” Why? Because for the company that you’ve bought stuff from, it’s a cash cow. Right? They want to increase product margin, which means they’re probably going to starve R&D, and then they’re going to market the crap out of it to make you think it’s still sufficient when it might not be. So you have to look at the people you’re buying from, and look at how they make money and what their incentives are; in some cases it’s to keep you hooked on stuff that doesn’t work anymore.

Tom G: [00:28:57] Yeah, I agree. And, you know, one of the talking points that I’ve had with folks is, when it comes to security, do the basics. Just start with the basics, knowing that it’s not a full answer. And the basics are things like: do you update your machines? Do you make sure that you’ve got the latest, greatest vulnerability fixes and so forth across your entire infrastructure? Because how many times have we heard about exploits that are taking advantage of things that have been known for five or 10 years? These are not brand new exploits that just happened. These are things that the industry has known about for five or 10 years, but the bad guys are counting on the fact that you didn’t fix it.
So we can close a lot of the vulnerabilities if we just do the basics, and most companies don’t do an adequate job at even doing things like updates and patching and so forth. That’s number one. And then number two is what you started with at Intel as an example: just training people. And yes, I agree with you. You shouldn’t blame the user, but you can prevent a lot of bad results if you have employees that are trained to at least be cautious.

And, uh, and then of course you do need tools, because you can’t have a single point of failure anywhere. So if your employees fail and they click on something they shouldn’t, that shouldn’t be the way into your entire network. So you need solutions beyond that as well.

Malcolm: [00:30:27] Yeah, I agree with you, Tom. And two other aspects on that. On the training side, think of where the lion’s share of training is occurring: it’s with the general user population. In reality, we should probably be spending more training on the technical population–the IT professionals, the app developers, the people who manage the websites. If we have them better skilled, technically, in security and how to manage the configurations, how to do these things, we’d probably have a higher payoff.
If technology companies spent more time making sure that every engineer who creates code or develops technology understood security versus just functionality, again, you would change the technology vulnerability dynamics by focusing on that training, which we don’t do enough of. And then on the other side of it, with respect to how you look at these things: again, if you hire smart people and you’re making business decisions and you ask questions around not only risks to the business, but the risk to your customer, societal risks, and you’re forecasting out, you can nudge a lot of risk out of the way just by what I call being a choice architect, because I’ve always thought of my role as architecting choices for the business. Sometimes I get to make the choice. Sometimes the CIO did, or the CTO, business unit GM, CEO, or board. But if I architect choices the right way, we’ll make better business decisions, which will then subsequently allow me to lower risk, manage total costs, and reduce the control friction–the drag coefficient that comes with the controls.

Tom G: [00:32:20] Before we go, we do have a segment that we call Fun Facts. We wanted to hear your take on something you think our listeners would find useful.

Malcolm: [00:32:30] Yeah. You know, uh, interesting fun fact: the Dr. Seuss book, Green Eggs and Ham. I love it. I still read it; occasionally we’ll reference some of the rhymes in it for different things. But it was written by Dr. Seuss to win a bet with his publisher, who bet him he couldn’t write an entire book using only 50 words. So, interesting fun fact, and then the tie into the dialogue that we just had is that level of minimization, simplicity, and focus. If we brought that into the security space, the technology space, instead of being data hoarders who just need more and more and more, and got pithy on certain things, we’d probably shrink a large amount of the risks that we’re seeing.

Tom G: [00:33:17] Interesting. I love that book. I did not know that it was 50 words or less, uh, but I know that the big, long monologue, uh, I used to try to do that entire thing in one breath. And let me tell you, it is not easy. It is not easy.

Camille: [00:33:32] Remind me what makes the eggs green?

Tom G: It doesn’t say.

Malcolm: [00:33:33] Yeah, I was going to say, I honestly don’t know.

Tom G: [00:33:44] Probably green dye number five. Uh, so Camille, what, uh, what kind of interesting fun facts do you have today?

Camille: [00:33:49] Well, I was going to dispel an old fisherman’s tale, I think, today. So I’ve been told multiple times since being in Oregon that the Willamette River, which is one of the main rivers that flows right through the center of Portland and a good portion of the rest of the state, is one of only two rivers in the world that flows north–that and the Nile. And of course, everybody knows the Nile. So I was going to have that be my fun fact. And then I went online just to verify it, and there are lots of rivers that flow north. And in fact, of course, what’s more logical is that rivers will flow whichever way is easiest, and they flow down, not north or south. So it depends on how your watershed is situated. Um, so the Willamette does flow north, but so do lots of other rivers.

Tom G: [00:34:36] Yes. So what I’ve heard–and again, I don’t know, I haven’t independently verified it–but I’ve heard it multiple times, is that the Willamette, by volume, is the largest river that flows south to north in the Northern Hemisphere. There are lots that flow south to north in the Southern Hemisphere, but, um, in the Northern Hemisphere, apparently it’s—

Camille: [00:34:57] So if we qualify it enough, we can get to (laughs) it is one of only two.

Tom G: [00:35:04] Make it special one way or another. That’s right. And it happens to be right in my backyard. So, uh, I love that river. Well, so here’s my fun fact. I went back to the dog one, uh, because everyone who knows me knows, you know, Chester, my dog, usually comes in and invades my meetings. A dog’s sense of smell is 100,000x more sensitive than a human’s, which I thought was interesting. I never knew the number. I knew that obviously it’s way more sensitive, but at the same time they have one-sixth the number of taste buds. So that kind of explains the things that they’re willing to eat, I guess.

Tom G: [00:35:49] All right. Well, hey Malcolm, thank you very much again for joining us today. I thought, you know, the various topics that we covered today were super, super cool. And like I said, we could have gone much, much further, but thank you for joining us.

Malcolm: [00:36:01] Hey, thanks Tom. Thanks Camille.

Announcer: [00:36:11] Stay tuned for the next episode of Cyber Security Inside. Follow Tom @TomMGarrison and Camille @Morhardt on Twitter to continue the conversation.
