InTechnology Podcast

#2 – The Next Security Frontier: Taking the Mystery Out of the Supply Chain

In this episode of Cyber Security Inside, we’ll discuss what we’re calling “the Next Security Frontier.” It’s a new take on cyber security that argues for taking the mystery out of the supply chain.



Tom Garrison: Hello, and welcome to the Cyber Security Inside podcast. In this podcast, we aim to dig into important aspects of cyber security, which can often be highly complex and intimidating, and break them down to make them more understandable. We aim to avoid jargon and instead use plain language for thought-provoking discussions.

Every two weeks, a new podcast will air. We invite you to reach out to us with your questions and ideas for future podcast topics.

I’d like to introduce my cohost, Camille Morhardt, Technical Assistant and Chief of Staff in Intel’s Product Assurance and Security Division. She’s a co-director of Intel’s Compute Lifecycle Assurance, an industry initiative to increase supply chain transparency. Camille’s conducted hundreds of interviews with leaders in technology and engineering, including many in the C-suite of the Fortune 500.

Hi, Camille, how are you doing today?

Camille Morhardt: I’m doing well Tom.

Tom Garrison: So what’s on your mind for today’s Security Matters segment?

Camille Morhardt: I’m wondering what kind of data we should be collecting about our own products and is there any inherent risk in collecting that data?

Tom Garrison: Hmm. In our last podcast, we got started talking about, you know, supply chain data and so forth. What elements are you thinking about now in the context of data?

Camille Morhardt: Well, I think a lot of companies stand to learn a lot about their products by collecting data on how those products are used. What happens, then, in collecting that data is that you're, inadvertently or on purpose, collecting and storing data on the users.

Tom Garrison: Let me ask you a question. When you buy products and you get the infamous little checkbox asking whether you'll share data to help the company improve their product, do you check yes or no?

Camille Morhardt: Uh, I check no. The only place I check yes is actually probably the most private: health, when a hospital or something asks, you know, can we submit a part of your tissue to some broad sample that they anonymize?

Tom Garrison: Yeah, I always check no too. You know, why do we both check no? We're both in this industry, and yet we check no for making products better. Why is it? I have my opinion, but I'll share it in a second. Why do you check no?

Camille Morhardt: Fear and the lack of trust. Is that data stored and is it traceable back to me?

Tom Garrison: Yeah. Interesting. So I'm all for making the product better, but I think that they're watching me. It's not about the product; it's about whether they are using this data in some way, intended or unintended, to watch and listen to me. That's not okay.

Camille Morhardt: How many Intel engineers do you know who put a band-aid over the camera on their computer?

Tom Garrison: Yeah, sure. A band-aid or a piece of tape or lord knows whatever else.

Camille Morhardt: Yeah, exactly.

Tom Garrison: So that is, that’s interesting because we’re in the industry and we know that that kind of information–the information about how a device is used and what works well and what doesn’t work well and maybe what features people actually value and use all the time versus ones that we put a lot of effort into putting them in place and nobody ever uses them–that kind of information is super valuable to product designers, product engineers, making the experience even better.

And yet, you and I, as a data point of two at least, choose not to share it, because we're afraid our information is going to be misused.

Camille Morhardt: Yeah. Or maybe I want something out of it rather than just a better next generation of the product. Maybe you’ve got to give me some kind of a kickback if you’re going to start watching me or collecting my information.

Tom Garrison: Uh, so Camille has a price. You have to pay Camille for her data.

Camille Morhardt: That’s right, if I’ve got 30 different devices in my house and I’m getting, you know, 1 cent, every time something’s collected, then maybe it would be worth it to me.

Tom Garrison: If you could absolutely guarantee there was no data about you, the user, would you be willing to share, for example, how many hours a day you use the device? Would that be information you could share?

Camille Morhardt: Yes. I think it would be information that I guess I would consider more generic. So if my car were collecting information on how much I drove it, you know, in a week, that would be okay, but not if they collected information on where I was driving or what specific time I drove, even though I suppose that could be even more valuable to them.

Tom Garrison: Right. Interesting. So I think within the technology space, we have a similar set of decisions to make about what kind of information we use and what kind of information we share. For example, last week in the podcast, we were talking about supply chain and that kind of information. You know, it occurs to me that that kind of information doesn't really have anything to do with the user at all. It's really about the device. To me, that's information that's valuable from the manufacturer to the user. So the value is actually not going from the user back to the manufacturer; the flow of value is going from manufacturer to user.

Camille Morhardt: Yeah. You're flipping the direction of the flow on that one, and I think that's good. I think that's only good. I think that provides the user of the device with as much information as they want; we're talking philosophically here, right? However much information they want, they should have access to, about any way a product was made or designed, or any of the specifications or capabilities, or even the collection patterns of that device.

As a user, I can choose to not read that or not care about that if I don't want to. But if you don't make it available, then I can't trust you, right? If you've made it available for those people who care more about it, and I choose not to look, I still feel better knowing you're being watched.

Tom Garrison: Well, and I think we're used to it in a different industry, the auto industry, right? We're used to knowing who manufactured the car, but also the Carfax of that car over its life. We're used to having that kind of information. We don't necessarily have that when it comes down to pieces of technology.

But I think this conversation, in general, is an episode. This is what we should do for this podcast moving forward. You agree?

Camille Morhardt: What do we collect and why?

Tom Garrison: Great.


Today’s podcast is titled The Next Security Frontier: Taking the Mystery Out of the Supply Chain. I’m happy today to introduce our guest Mike Mattioli. Mike and I have a long history working together. He is a longtime member of Intel’s Client Board of Advisors, but also he has a role as the Hardware Engineering Lead at Goldman Sachs. He’s responsible for the design and engineering of the firm’s digital experiences and technologies and is also responsible for the overall strategy and execution of hardware innovation, both within the firm and within the broader technology industry. Mike, welcome to the show.

Mike Mattioli: Thank you for having me.

Tom Garrison: There's a little bit of a story behind today's podcast. We were at one of these Client Board of Advisors sessions, and you and I sat together at lunch and started talking about supply chain security and how we thought it was an important capability that needed an industry solution. So I thought maybe it'd be good to start back at that lunch and give some of the background of what led us to working together on the white paper that we coauthored.

Mike Mattioli: It's funny. It was only six months ago, but it was a totally different world compared to where we are today. We were talking about just hardware security in general. And one of the things we landed on was a very simple question: if somebody were to take a component and put it somewhere on a board, or inline somewhere on a piece of hardware, how can you tell that was done, at any point in time, whether it was done in a factory somewhere, done in transit, or done while it's in your own data center? And then ultimately we connected with Baiju and we started writing this paper, and, you know, at the heart of the message is hardware security: how do you really have secure and trustworthy hardware systems?

But the purpose of the paper, and of walking through the supply chain, was to highlight how there are so many points throughout the entire supply chain where you're exposed to a variety of different attacks in one way or another.

Tom Garrison: As you said, the conversation started with a Bloomberg article, where there was rumored to be a chip that was added to a board. And it does turn out to be very difficult to find something that's been added to a board.

But, as you pointed out, our conversation led to this broader challenge. On one level you can ask, what if somebody adds something to my board? But on the most basic level, the question is: do I know what is in my platform, whether it's a PC or a server or whatever? It's a question that should be answerable, but today it's really, really difficult to answer. In fact, the only way to do it is through a combination of visual inspection and specialty services that are offered by certain manufacturers.

There are some people who are absolutely on board, saying they want to endorse what we're saying and really get behind an industry solution. And then there are other people who say, "This isn't a problem at all. This is a solution looking for a problem." So I wonder what your response is to that latter group.

Mike Mattioli: So I think the people who don't see the problem don't understand the problem. They feel that they're not exposed. Those are sometimes the class of people who look at a certain attack, whether it be hardware, software, or otherwise, and say, "Oh, well, that'll never happen to me," or, "I'm just John Smith, I'm not important."

But the truth is that you don't have to be a government or a high-powered company to be the target of an attack. It can happen to you just as it can to everybody else. Everyone is subject to these attacks, and everyone deserves the same level of trust and transparency.

Tom Garrison: I think you may have already partially answered it, but why is supply chain security important in general? And to Goldman Sachs?

Mike Mattioli: Hardware is, in many ways, the foundation for all of the electronic transactions that we perform today, whether they be financial, medical, or anything in between. And if there's something wrong with the foundation for those transactions, everything above it is in question. If you're exposed at the foundation, how do you know that everything on top is truthful, is honest, actually is what it says it is? For the firm, more specifically, we're trying to build out what we refer to as our "financial cloud," and this is one of the building blocks of that: how do we transact securely with our clients and our customers?

But speaking more generally, moving back to the industry, I think this is a very interesting year for two reasons. The apocalypse is upon us, and we all have to work from home now. In the midst of that, everybody started ordering hardware in droves from Amazon, CDW, Best Buy, wherever it may be. All these people ordered hardware and started doing business transactions, or now they're even doing health tele-visits with their doctors.

They're doing all these things more and more remotely, in physical places that are unknown and untrusted, over networks that are unknown and untrusted. The only thing you can have even some semblance of trust in is the hardware that you're using. But does anybody really know if it's secure if it just came off the shelf from Amazon? I'm not saying they've done anything wrong; I'm just saying, how do you know?

And then on top of that, this dovetails very interestingly into the election this year. And a lot of people are talking about remote voting, absentee ballots and things like that. And a lot of those things are still done on paper or they’re analog, if you will. How do we do voting or elections or ballot submissions, if you will, using secure hardware?

Camille Morhardt: Hey Mike. So could you talk a little bit about, you seem to be addressing kind of who is maybe going after our hardware at this point. And I’m just wondering if you could talk about how you’ve seen threats to hardware evolve since you’ve been in the business?

Mike Mattioli: Sure. I think that hardware has for a long time been overlooked, and in recent years it's become much more prominent as a target. Spectre and Meltdown came out a few years ago, and then multiple variations thereof, all these different attacks. And while today they're very, very primitive, I think what we all have to realize is that the game has changed.

People are attacking in a whole new way, and we need to be prepared, because the defenses we do have are very, very primitive. And as those attacks get more and more advanced, if we don't keep up, then we're not going to have anything to defend ourselves with.

Camille Morhardt: So the next thing I'm curious about is this: you mentioned a few minutes ago that there can be attacks at various points in the supply chain, and it seemed like you were extending the supply chain past the point the product had shipped. Can you say more about what kinds of threats you think exist, where a supply chain might have exposure, and your definition of a supply chain?

Mike Mattioli: Yes. So all the way at the beginning, when you design ICs, there are many things that are designed by hand by the designers themselves, like memory controllers, for example. But there are lots of different components that are part of a design that get sourced from companies like Cadence or Synopsys. So, right off the bat, how do you know that the in-house component, like the memory controller, doesn't have bugs in it? Or how do you know that somebody didn't do something malicious, like put a trojan in there?

Or when you source that third-party IP, how do you know there's not a bug in there, or a trojan in there? And then go further down the line to the foundry, right? When you send your design out to the foundry to be fabricated, how do you know if anything was changed?

And then let’s say, you know, you trust your foundry, you trust your designers. Now moving a little further down the line, once they’re fabricated and they get sent off to the ODM to be assembled, how do you know if something happened over there?

Finally, it goes onto the courier and gets shipped out, whether by boat, plane, or train. Couriers are interesting because there are a lot of different ways they can interact with systems. They can swap out hard drives, they can swap out memory chips, they can swap out fans, right? There are all sorts of different components that people can play with. The point being, even when the courier takes it from the factory to the reseller or your home or your data center, there's exposure there.

And then while you're operating and using it, say somewhere down the line a fan fails and you need to replace it, or a hard drive fails and you need to replace it. Whoever goes in and physically touches the device, how do you know they didn't do something malicious at that point?

Moving even further down the line: once you sell it, recycle it, refurbish it, or give it to somebody else, if you're the person buying it or receiving it, how do you know what happened to it? How do you know its history? And we were talking verbally about the analogy of almost a Carfax for PCs, or for electronic components: some way to have a history of every single thing that happens to that device along the way of its life, and it's immutable. You can't go back and erase it, and you can't go back and change it.
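The "Carfax for PCs" idea Mike describes, an append-only, tamper-evident record of a device's life, can be sketched with a simple hash chain. This is a purely illustrative sketch, not anything from the white paper; the class, event, and actor names are hypothetical. Each entry stores the hash of the previous entry, so editing or deleting any past event breaks every later link:

```python
import hashlib
import json

def entry_hash(entry: dict) -> str:
    # Canonical JSON (sorted keys) keeps the hash deterministic.
    return hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()

class DeviceHistory:
    """Append-only, tamper-evident log of events in a device's life."""

    def __init__(self, device_id: str):
        self.device_id = device_id
        self.entries = []

    def record(self, event: str, actor: str) -> dict:
        # Each new entry links back to the hash of the previous one.
        prev = entry_hash(self.entries[-1]) if self.entries else "genesis"
        entry = {
            "device_id": self.device_id,
            "event": event,      # e.g. "assembled", "shipped", "fan replaced"
            "actor": actor,      # who touched the device at this step
            "prev_hash": prev,
        }
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        # Recompute every link; any edited or removed entry breaks the chain.
        prev = "genesis"
        for entry in self.entries:
            if entry["prev_hash"] != prev:
                return False
            prev = entry_hash(entry)
        return True
```

In a real system the entries would be signed by a hardware root of trust rather than merely hashed, but even this toy version shows the property Mike is after: history can be read by anyone, yet silently rewriting it is detectable.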

Part of what we're trying to express to people is that even if you can't stop something, even if you can't prevent somebody from doing something malicious, at the absolute least you should have transparency. So if somebody was able to compromise something, at least you know, and then you can take it out of service, or unplug it, or do something. At the very least, knowing is very, very powerful.

Camille Morhardt: Okay, so how does that happen when supply chains, especially in the compute space, are obviously complex ecosystems, and sometimes competitors need to collaborate with one another? You mentioned couriers; I would say third-party logistics handoffs, especially across international borders, are definitely going to be a place where you have competitors collaborating. So what are the options for establishing or maintaining that trust and the transparency of data in these scenarios?

Mike Mattioli: Whoever's making entries into the ledger, whoever is actually saying these events occurred, you're placing trust in whoever that is. Now on the flip side, you can trust the hardware itself, which I'm personally more of a fan of. That ideology describes a way in which components on the devices themselves are actually the ones saying, "Hey, this happened or that happened," or "this is talking to this thing," or "I'm receiving a signal from there." The platform root of trust, or self root of trust, approach takes away some of the transparency, in that others can't see it, but you have a much more clear-cut, defined approach to the information you're getting.

Camille Morhardt: So can devices actually self-report these days?

Mike Mattioli: Yes, but at a higher level. And I think what we're trying to get to is a much lower hardware level. That's where the attacks are starting to come from: the foundation that everything else is built on top of.

Tom Garrison: I know, at least from my perspective, that if you've got trusted hardware, then you can start building a solution on top of that. But with active components, meaning firmware-bearing components, the hardware itself may be operating exactly as expected while the firmware that runs on the device has somehow been manipulated and corrupted. I wonder if you could speak for a moment about that class of attack?

Mike Mattioli: It's very easy to manipulate firmware. And one of the benefits of the solution we proposed is that if you at least know the firmware you have is out of date, was changed, was modified, has been tampered with, whatever it may be, then by knowing you can take action on it. I think that's what Camille was referencing earlier.

The idea here is that you're not trusting a sole entity. The proposal we had was: you have a bunch of different components, classified as active and passive. All the active components in your system are able to communicate with one centralized, call it, master component on the board. And that master component reports back to you and says, "Hey, these are certain attributes," let's call them date of manufacture, place of manufacture, firmware version, et cetera, whatever those attributes may be. At least then you can say, "Hey, I'm expecting to see these things," or "I need to know what these things are." And if those things aren't what I expect them to be, or aren't what I want them to be, then I need to act on that, mitigate it, or do something about it.
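As a rough illustration of that master-component idea, here is a hypothetical sketch. The component names and attributes below are invented for the example, and a real implementation would live in platform firmware rather than application code; the point is only the comparison step, reported attributes checked against an owner-supplied manifest of expectations:

```python
# Owner's manifest: what each active component is expected to report.
# Component names and attribute values are illustrative, not a real API.
EXPECTED = {
    "ssd": {"manufacture_place": "Taiwan", "firmware": "2.1.0"},
    "nic": {"manufacture_place": "Malaysia", "firmware": "7.4.2"},
}

def audit(reported: dict, expected: dict) -> list:
    """Return human-readable mismatches between what the active
    components report and what the platform owner expects."""
    findings = []
    for name, want in expected.items():
        got = reported.get(name)
        if got is None:
            findings.append(f"{name}: expected component is missing")
            continue
        for attr, value in want.items():
            if got.get(attr) != value:
                findings.append(
                    f"{name}: {attr} is {got.get(attr)!r}, expected {value!r}"
                )
    # A component reporting in that we never expected is also suspicious.
    for name in reported:
        if name not in expected:
            findings.append(f"{name}: unexpected component present")
    return findings
```

An empty findings list means "I know what's in my system"; anything else, a downgraded firmware version, a part from an unexpected factory, an extra component, is exactly the kind of deviation Mike says you can then act on or mitigate.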

Tom Garrison: Right, yeah. It’s not about having the end-all be-all solution that does everything automatically. It really just is as simple as “do I know what’s in my system?” I think that’s pretty powerful.

Mike Mattioli: I think people will start to think about this in a different way. In the paper, we referenced a PC or a compute device, but we did so in very, very generic terms, because people relate to PCs and computers and servers and things like that.

But the reality is that this can happen to anything. This can happen to your power grid. This can happen to the autonomous vehicle that's driving your children to school. This can happen to a self-driving tractor-trailer truck hurtling down a highway. People are going to have to start thinking about this holistically and widening the scope beyond just "my laptop" or "my iPhone."

Tom Garrison: So I want to change gears here a bit and maybe have a little bit of fun. We've been doing this in our first several podcasts, and I'd like to get your take on this one, Mike. It has to do with COVID-19 and all of the changes in how we interact now that we're supposed to be at home. In this new world we find ourselves in, what can you not wait to get back to the old way of doing? And then also, now that we're in this new world, what is it that you don't want to lose?

Mike Mattioli: Um, so the one thing that I definitely want to get back to, and this really disappointed me a week or so ago when they made the announcement, is CES. I believe they announced that CES 2021 was going to be canceled, or rather, I should say, it'll be a virtual event. I know you and I have a love-hate relationship with CES; every year it gets worse and worse. But that is definitely something I'm looking forward to in 2022.

And then the one thing I don't want to lose, I would say, is not having to get all dressed up for work every day, kind of getting to casually do whatever. You can only see me from the torso up, so I think that's an advantage (laughs).

Tom Garrison: (laughs) I can comment firsthand on all of the different t-shirts that Mike wears. He is the connoisseur of wild t-shirts!

Mike Mattioli: I'm waiting for the glow-in-the-dark one you said you were going to send me, with LED lights? Maybe you could put a silicon wafer in the center, right?

Tom Garrison: It’s on its way! Well, hey, thanks Mike for joining us today. And I think there’s a lot to really dig into when it comes to supply chain security. And I do invite all of the listeners to go on to the website and we have the supply chain white paper that I referenced at the beginning available for download. It is something that was co-written between myself and Mike and also Baiju Patel, who is one of the Intel fellows.

We spent a lot of time trying to dig into the details of why the supply chain matters and what should be on people's minds as they think about their hardware purchases moving forward. So hopefully there's a lot to learn from that, as well as from what we talked about here in the podcast. So, with that, Mike, thanks for joining us, and for everybody else, join us on our next podcast in two weeks.

Thank you so much for listening. I’ll see you next time.
