InTechnology Podcast

#26 – What That Means with Camille: Crowdsourced Security and Bug Bounty

In this episode of What That Means, we’re talking crowdsourced security and bug bounty, and we’ve got a treat for you: double the brilliance with Katie Noble and Alexander Romero (RoRo). Both are Directors of PSIRT at Intel and both have extensive experience in cyber security as DC veterans (think Department of Homeland Security and the Pentagon).

 

Our convo covers:

•  The flavors of bug bounties
•  Crowdsourced security
•  Vulnerability Disclosure Programs (VDPs)
•  Security Technical Implementation Guides (STIGs)
•  CSIRT & PSIRT
•  Hackcidents
•  Red teams, blue teams, purple teams
•  IoT

…and more! Join me for an interesting and insightful conversation.

 

Here are some key takeaways:

•  Crowdsourced security relies on the wisdom of the crowd to find vulnerabilities in systems that might otherwise be missed.
•  Bug bounties differ from VDPs in that they’re more of an invitation to find vulnerabilities and report back to the vendor. With bug bounties, there’s also an incentive (sometimes financial, sometimes not).
•  Bug bounty incentives can include money, airline miles, lunch with important people, pieces of hardware, and other things.
•  Static bug bounty programs are often open to all products and all people. Proactive bug bounty programs may be time-sensitive and only open to specific products and specific researchers.
•  Bug bounty programs aren’t for everyone. There are some steps you need to take beforehand, like deciding what you’re asking people to look at. You also need to have a strong VDP in place first, so you can deal with the submissions and effectively mitigate problems.
•  Problems and vulnerabilities with products are reported to PSIRT. Problems and vulnerabilities with infrastructure are reported to CSIRT.
•  There are legal coverage considerations that you must think about with a bounty. The scope should be well understood, but you also need to allow for the reality that you might not know everything that the system touches.
•  If you start with an internal bug bounty program, you need to teach your internal team to have a different mindset, a hacker mindset. The mindset of a builder is typically much different from the mindset of a breaker.
•  Until we get to a place where we understand that there is only one world now, a great deal of attack surface is being left unsecured.

Some interesting quotes from today’s episode:

“We had our own tools and our own way of looking at problems, but when you bring somebody from the outside in, they have a different view of the world, different frame, different lens, and that’s very helpful at times to kind of see things from the perspective of an adversary or an actual criminal hacker.”

“There is this compliance checklist, for example. We call these Security Technical Implementation Guides — STIGs. And if you follow these, you should be secure. But that’s not really always the truth. Sometimes there are other things, some other interactions between software or the services that you’re using, that then lead to vulnerability.”

“But researchers had never really been given the opportunity to talk directly to the DoD in that form before. And it turns out they had other vulnerabilities that they were aware of, that they wanted to tell us about, but we didn’t have a good way to accept those.”

“A bug bounty, for better or for worse, is going to pull attention towards your product or your company.”

“Bug bounties are not appropriate for everybody. There is kind of a push, like a ‘fear of missing out’ kind of deal. Like ‘Everyone has a bug bounty and I want one, too.’ But that may not be appropriate for your business. There are other steps that you probably need to think about before you start with a bug bounty.”

“You need to be able to decide, what is it you want people to look at? Are you asking them to look at your products or are you asking them to look at you? That’s going to be different.”

“I would say you need to have a strong vulnerability disclosure process in place…You need to be able to deal with those submissions. You need to be able to respond effectively, mitigate the problems. All of that takes policy, process, procedure, infrastructure. It’s not something that is an overnight sort of deal.”

“You need to have a method of receiving that information, triaging that information, mitigating it, and then communicating back out to the researcher that you’ve done those things.”

“I would recommend that every organization start out with a sort of ‘internal bounty’ first. So have your engineers, have your folks who understand the system, try to find vulnerabilities. And if they don’t, still pretend as though they did, and then run it through your process that way — so you can find areas where your process might have holes, or you don’t know who the system owner is, or who can take action on it. At the end of the day, that’s what you’re trying to do.”

“A lot of times when folks have designed the system, they’re looking at it from that perspective. And it’s hard to switch over to kind of an adversarial mindset, which is what these researchers bring.”

“Also, things change, implementations change all the time. So it wasn’t necessarily a flaw when the product was designed, or a weakness — it was something that was meant to be that way. But then a product changed or was implemented in a way that the original engineer didn’t anticipate.”

“The role of the PSIRT is to facilitate. It’s to be the coordinator. It’s to be the balanced voice in the room that’s kind of trying to move things along. We’re not tied to one perspective or another perspective. We’re willing to be open-minded and see all the perspectives. And you definitely need all the perspectives.”

“The goal is always to protect the user, and it doesn’t matter if you’re in government or if you’re in private sector. The goal is to keep the eyes on the prize, protect the end user, make this as strong as we can.”

Camille Morhardt: [00:00:37] Hi, and welcome to What That Means: Crowdsourced Security and Bug Bounty. Today we have with us on the phone two experts. Katie Trimble Noble, who’s Director of PSIRT and Bug Bounty at Intel, leads the Cyber Security Vulnerability Bug Bounty Program for Intel. Prior to joining Intel, Katie served as the Section Chief of Vulnerability Management and Coordination at the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency. Her team is credited with the coordination and public disclosure of over 20,000 cyber security vulnerabilities within a two-year period. Katie’s work directly impacted decision-making for government agencies in the United States, the United Kingdom, Canada and Australia.
We also have with us on the phone Alexander Romero, known as RoRo. He is a Director of PSIRT at Intel as well, and leads the resolution of high-profile vulnerability reports for Intel. Uh, prior to joining Intel, he served on a self-described “SWAT Team of nerds” for the Secretary of Defense at the Pentagon’s Defense Digital Service, where he championed the first-ever “Hack the Pentagon” bug bounty challenge in the federal government, along with expanding the first-ever vulnerability disclosure program for the Department of Defense.
He actually started his career as an enlisted U.S. Marine. And later on, he became the Chief Information Security Officer at the Defense Media Activity, overseeing the security of systems such as defense.gov, American Forces Network, Stars and Stripes, and many others. He’s also a fellow at the Aspen Institute’s Tech Policy Hub, where he worked on a policy project called HALP–Helpful Alternative Link Protocol–which creatively used an under-utilized cellular frequency spectrum with existing smartphones to allow for resilient communications during emergencies.
So basically we have the DC contingent on the phone today with us, um, who focus on PSIRT at Intel. Uh, welcome Katie and RoRo. It’s great to have you guys with us today.

Katie Trimble Noble: [00:02:43] Glad to be here.

RoRo Romero: [00:02:44] Thank you. Happy to be here.

Camille Morhardt: [00:02:47] So, uh, we’re going to start off as usual by asking the two of you guys to define crowdsourced security and bug bounty. And, uh, because you’re both experts in both, why don’t we just have, uh, RoRo go through crowdsourced security, and then we’ll, uh, pass the baton to Katie to get into bug bounty for us.

RoRo Romero: [00:03:07] I think crowdsourced security encompasses quite a few things. Um, there’s many different sorts of approaches, but, generally, what you’re looking for is getting the wisdom of the crowd applied to specific problem areas where people have expertise that you sometimes may not be able to get access to directly, either because they don’t have the time to focus a hundred percent or to be hired by that organization.
More directly, I think, my familiarity with it is in the form of bug bounties and disclosure programs. And they’ve been incredibly helpful to the DOD when I was there to find vulnerabilities in systems that, uh, we didn’t really know were there, or they brought a different approach and a different way of looking at the problem set.
You know, we had our own tools and our own way of looking at problems, but when you bring somebody from the outside in, they have a different view of the world, different frame, different lens, and that’s very helpful at times to kind of see things from the perspective of an adversary or an actual criminal hacker.
That’s sort of, my definition may not be the most perfect definition of it, but, uh, that’s how I would go defining it.

Camille Morhardt: [00:04:12] That’s good. Um, it’s a good sentiment, though. It gets across the, the meaning. And just to clarify for people, I think RoRo threw in a DOD, which is Department of Defense. And since these guys are both DC veterans, we’ll probably get a few more government acronyms. Um, hey, Katie. So could you just walk us through: what is bug bounty?

Katie Trimble Noble: [00:04:35] So bug bounty kind of falls under Vulnerability Disclosure Programs, or VDPs. So the main difference between, say, a VDP and a bug bounty is that a VDP is kind of a “see something, say something.” It’s a mechanism for an external security researcher or a member of the public, if they see something, to report it back to the vendor who owns that product, so the vendor has the opportunity to fix it.
It’s kind of just an open front door, right? Whereas a bug bounty, we kind of refer to that as more of an invitation. Um, it says, “Hey, if you find something, we’ll pay you for it.” And that’s an additional incentive on top of, um, on top of “if you see something, say something.” It’s kind of a more targeted method of providing a reimbursement or an incentive to researchers so that they will report vulnerabilities to you, rather than reporting them to, say, somewhere on the dark web or, um, Twitter.

Camille Morhardt: [00:05:31] Okay. And when you talk about reimbursement or compensation, I guess we can get into what that entails. I usually think of money, but I think that it’s probably broader than that.

Katie Trimble Noble: [00:05:42] Yeah, it can be. So bug bounties really come in lots of different flavors. Um, there’s what I often refer to as sort of your static and your, um, your proactive sort of bug bounty programs. So a static program is kind of: any product that you find a vulnerability in, you can submit it, and there’s a standing payment schedule. Um, usually that follows things like CVSS scores, or some sort of payment schedule that’s very prevalent and kind of industry accepted.
You also have a more proactive stance. So you have more proactive bug bounty programs, which are usually things like time-bound challenges that are specific to particular products or problems. And they tend to be, um, open to maybe only a handful of researchers, or maybe they’re open to the public, but for a certain amount of time. Sometimes in those kinds of setups, you would end up with incentives that are not cold, hard cash. A lot of academic researchers have a hard time accepting cash because they are part of an academic institution that has an honor system that doesn’t allow that.
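To make that “standing payment schedule” concrete, here is a minimal Python sketch of how a static program might key payouts to CVSS severity. The dollar amounts and tier labels are hypothetical, purely for illustration; the bands follow the standard CVSS v3.x severity ratings (Low 0.1–3.9, Medium 4.0–6.9, High 7.0–8.9, Critical 9.0–10.0).

# Hypothetical payout schedule keyed to standard CVSS v3.x severity bands.
# The dollar amounts are illustrative, not any real vendor's schedule.
CVSS_TIERS = [
    (9.0, "Critical", 100_000),
    (7.0, "High", 20_000),
    (4.0, "Medium", 5_000),
    (0.1, "Low", 1_000),
]

def payout(cvss_score: float) -> tuple:
    """Return (severity label, bounty in USD) for a CVSS base score."""
    if not 0.0 <= cvss_score <= 10.0:
        raise ValueError("CVSS base scores range from 0.0 to 10.0")
    for floor, label, amount in CVSS_TIERS:
        if cvss_score >= floor:
            return label, amount
    return "None", 0  # a base score of 0.0 means no severity

print(payout(9.8))  # -> ('Critical', 100000)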

Camille Morhardt: [00:06:48] So let’s dive a little deeper. So, hey, I’ve heard that, uh, for example, an airline, uh, gives miles as opposed to money. This true? (laughs).

Katie Trimble Noble: [00:07:03] Yes, that’s a thing. Um, so that’s where you get into things that may not necessarily be cash. There was a very famous government example, um, I think it was maybe the Netherlands, where people who found vulnerabilities could report them and they would get a t-shirt, and the t-shirt said something along the lines of “All I got for my report was this stupid t-shirt.” Um, but there have been a lot of instances of things other than cash: challenge coins maybe, or, um, lunch with somebody important, or, uh, pieces of hardware. Other incentives are a thing. Yeah. And there was an airline that definitely gives miles.

Camille Morhardt: [00:07:43] So you guys have both worked for the United States government, um, specifically around bug bounty. And I’m just wondering, like, how does that work? How do you set up something where you’re hacking the Pentagon? I mean, that doesn’t even sound like it should be something that’s being set up. So I guess, how did you convince, uh, the government to do it? Um, and why did you think that would be wise?

RoRo Romero: [00:08:08] Yes, I should caveat and say, like, these are just my personal views, and I happened to be there for a lot of the, the start. I think my perception early on was that we weren’t doing a good job of protecting some of the more public systems and websites that we operated. And we were using basically every tool that you can imagine, all the static and dynamic code analysis tools that were out there. And yet we were still missing things. And so, uh, basically as the Defense Digital Service was standing up, they, um, they wanted to test this whole idea of the crowdsourced model, uh, bringing in researchers to just hack on our stuff through a bug bounty. Now, everybody kind of thought it was a bad idea, actually, in terms of, maybe we shouldn’t call it “Hack the Pentagon,” that doesn’t seem like exactly the right term that we should be using; let’s call it “Secure the Pentagon.” There was a lot of back and forth on that.
Essentially, like, within the first few days we really realized this was the right thing to do. Um, and at the end of the day, I think we got huge value out of it, ‘cause they bring that entirely different approach to, um, to looking at problems that we sort of had accepted. There is this compliance checklist, for example, right? We’ve done all the things. We call these Security Technical Implementation Guides–STIGs. And if you follow these, you should be secure. But that’s not really always the truth. Um, sometimes there are other things, some other interactions between software or the services that you’re using, that then lead to a vulnerability.
So I think that’s what we discovered very quickly after the first pilot. So the pilot was just on five websites. But researchers had never really been given the opportunity to talk directly to the DOD in that form before. And it turns out they had other vulnerabilities that they were aware of, that they wanted to tell us about, but we didn’t have a good way to accept those.
So we realized that we should have a vulnerability disclosure policy of some kind that says, “Hey, if you tell us about these vulnerabilities, um, we’re going to thank you (laughs). Uh, and we’re not going to send you a letter to tell you to cease and desist,” which is what was happening before. Um, so not only were they dis-incentivized, they were actually in some, um, at some points actually sent letters to, to stop.
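For context: a now-common way to publish that kind of “good way to accept those” is a vulnerability disclosure policy plus a security.txt file served at /.well-known/security.txt, standardized in RFC 9116. A minimal sketch, where the contact address and URLs are placeholders, not any real organization’s values:

# /.well-known/security.txt (RFC 9116) -- all values are placeholders
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:59Z
Policy: https://example.com/vulnerability-disclosure-policy
Acknowledgments: https://example.com/security/hall-of-fame
Preferred-Languages: en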

Camille Morhardt: [00:10:15] It must be scary, I would think, to send a note to the Department of Defense saying that you’ve found a way to hack it. Um, it would be even scarier, I suppose, if you got a cease and desist letter. So you’re saying, you know, part of this is just making yourself available, and I guess it would apply whether you’re a government or a company, to hear the bad news and provide a way to accept that (laughs).

RoRo Romero: [00:10:38] Yeah. Yeah. Some things are just misconfigurations, or they’re small things. And so they wouldn’t necessarily go through something like Katie worked on before, um, through the SIRT process; they wouldn’t necessarily be assigned a CVE, ‘cause it’s something that is already a known issue. It’s just misconfigured on a system that is, that is public.
Um, and then this is probably around the time that I met Katie, um, a year or so after we kicked off the pilot for “Hack the Pentagon.” It was around this idea that if there’s, like, a see-something-say-something or disclosure policy that applies to DOD, why not have that applied to the rest of government? It just seems like it would make sense.
And most people don’t know every nuance of, like, where they should submit a report if they find something on a government system, because government’s huge; there’s so many different areas. So this actually led to some conversations between Katie and I, where we eventually met. And, um, I don’t know. Do you want to tell that story, Katie? (laughs)

Camille Morhardt: [00:11:37] I was going to ask you, because you guys had mentioned that you finally came to terms of respect over tacos one time. So I’m wondering where you guys were, how you met, and, um, what differences of opinion you had in the past?

Katie Trimble Noble: [00:11:53] So I think this might be one of my favorite stories: “The Katie RoRo origin story.” Um, so, uh, I feel like RoRo and I were on opposite ends of the spectrum. Uh, and you know, when you’re sitting in your perspective, you kind of see the other person’s perspective as “the other” sometimes. And I think that we interacted with a lot of people who kind of encouraged that.
So we met because we were working on a project together, and I think we were diametrically on opposite ends of the project. Uh, I think at one point we were definitely at DEF CON, and I was giving a presentation on something, and somebody told RoRo, “You need to go talk to Katie, and oh, by the way, she’s here, so go to this presentation.” And so he did, and he came up to me after the presentation and, uh, said, “Oh yeah, I’m, you know, I’m RoRo.” And I’m like, “Oh, it’s nice to meet you. You’re the person on all those emails.”
And, uh, I think both of us hadn’t eaten that day, you know, bouncing from presentation to presentation, running from hotel to hotel, and one of the two of us was like, “There’s a taco stand over there. I’m starving. I’m gonna feed myself. You can come if you want.” Um, and I think that was probably me that said that. And RoRo, in his affable kind of way, is like, “Yeah, that sounds great. Let’s do it.” So we sat down and had tacos and margaritas and talked out our differences, and found out we had a lot more in common than we had differences.
Um, we were both military. We both had that kind of “secure the nation” sort of perspective. We both had that desire to do the best that we could and help the most people. Um, and we kind of filled in a lot of the gaps on the project that we were working on. So.

Camille Morhardt: [00:13:35] By the way he told me that you were right. Whatever the disagreement was.

Katie Trimble Noble: [00:13:40] Yes! I knew it! I knew it!
RoRo Romero: [00:13:43] It turns out you were right all along.

Camille Morhardt: [00:13:45] So, um, tell me about how you guys are now in industry, at a big, big corporation. Um, how is that different, I guess, when you’re structuring these kinds of programs, than it was at the U.S. government?

RoRo Romero: [00:14:04] Yeah. So, like, in government, you know, we were taking a lot of our cues from what industry had done, for sure. And it was a proven, proven method, but just applied to, uh, to a different customer in a sense, right? So whereas you wouldn’t normally be doing this for a product, uh, the bug bounties that were run within the Department of Defense were a little bit different in that they were just against systems, or maybe specific products, but it was with the intent of understanding the risk inherent in any complex system.
But I would say, like, the customer in that sense, working in government, is obviously the American people, and trying to make sure that whatever was out there was as secure as possible. I think we learned after a lot of the, you know, kind of famous government hacks that we weren’t doing a good job of securing some of the most, um, important and sensitive information.
So in that case, the customer is the American people. In this case, you know, in industry, it’s folks who are using your products and your services. So, very similar mindset. Um, just kind of a different customer in a sense.

Camille Morhardt: [00:15:05] Can you guys talk about the different kinds of flavors of bug bounty? I feel like I have a reasonable understanding now of kind of what it is and the purpose, but how would somebody structure it?
Like what kinds of questions should somebody ask if they want to set up a program? What are the different types of programs?

Katie Trimble Noble: [00:15:24] Bug bounties are not appropriate for everybody. There is kind of a push, I guess, like a “fear of missing out” kind of deal. Like, “Everyone has a bug bounty and I want one, too.” But that may not be appropriate for your business. Um, there are other steps that you probably need to think about before you start with a bug bounty. Uh, a bug bounty, for better or for worse, is going to pull attention towards your product or your company. And we kind of look at it from a tech vendor perspective, right? So as a technology producer, we’re saying, “if you find a problem in our device,” right? But bug bounties are bigger than tech vendors, right?
There are a lot of companies, you know, mom-and-pop bakeries out there, who might want you to take a look at their digital infrastructure, their enterprise networks, and if you find something there, report it. So you need to be able to decide: what is it you want people to look at? Are you asking them to look at your products, or are you asking them to look at you? And that’s going to be different.
Um, and I would say you need to have a strong vulnerability disclosure process in place. I see this as like, um, say you bought a house that was built in 1885 and has no indoor plumbing. Uh, the city has been telling you for years that they’re going to put plumbing in, that they’re going to turn the water on. They’re digging up the trench; you can see them do it. You’ve been watching them for years, laying the pipes in front of your house. That’s kind of like the industry forces here.
So eventually what’s going to happen is, they’ve laid all that pipe, and one day they’re going to turn the water on. And if you don’t have the plumbing installed in your house, that water’s going to go straight into your basement, and you’re going to get flooded. You’re not going to know what to do. So that is, to me, the ultimate example of why you need a strong VDP in place first. You need to be able to deal with those submissions. You need to be able to respond effectively, mitigate the problems. All of that takes policy, process, procedure, infrastructure. It’s not something that is an overnight sort of deal, and you need to be able to have–

Camille Morhardt: [00:17:18] And that’s essentially, sorry to interrupt, but that’s essentially the PSIRT, the Product Security Incident Response Team.

Katie Trimble Noble: [00:17:26] Yeah. So it can be either a PSIRT or a CSIRT. So again, if you’re a product vendor and you’re asking people to take a look at your products, then it would be a PSIRT. If you are not a product vendor, say you’re a medium-sized bank and you want people to take a look at your infrastructure, then they’re going to report back vulnerabilities in your infrastructure, like “I was able to get onto your email server. I was able to do this, and I was able to do that.” Those are CSIRT issues.
So depending on what your industry is and what your company is, you need to have a method of receiving that information, triaging that information, mitigating it, and then communicating back out to the researcher that you’ve done those things.
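Katie’s receive, triage, mitigate, communicate loop is essentially a small state machine over each incoming report. A minimal Python sketch of that shape (the states, fields, and example notes are illustrative, not modeled on any particular PSIRT or CSIRT tool):

# Minimal sketch of a vulnerability-report lifecycle: receive, triage,
# mitigate, then communicate back to the researcher. States and fields
# are illustrative only.
from dataclasses import dataclass, field
from enum import Enum, auto

class State(Enum):
    RECEIVED = auto()
    TRIAGED = auto()
    MITIGATED = auto()
    COMMUNICATED = auto()  # researcher told what was done

# Each state may only advance to the next one; no skipping steps.
NEXT = {
    State.RECEIVED: State.TRIAGED,
    State.TRIAGED: State.MITIGATED,
    State.MITIGATED: State.COMMUNICATED,
}

@dataclass
class Report:
    reporter: str
    summary: str
    state: State = State.RECEIVED
    history: list = field(default_factory=list)

    def advance(self, note: str) -> None:
        if self.state not in NEXT:
            raise ValueError("report already closed out")
        self.state = NEXT[self.state]
        self.history.append((self.state.name, note))

r = Report("external researcher", "able to get onto the email server")
r.advance("triaged: reproduced, routed to the system owner")
r.advance("mitigated: configuration hardened")
r.advance("communicated: thanked the researcher, confirmed the fix")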

Camille Morhardt: [00:18:05] Right. I actually talked with, I think you may know her, Lisa Bradley, who runs PSIRT at Dell, to kind of get some insight into that. Um, so you’re talking about, you know, literally, I guess, opening the floodgates, in the case of the city. How do you put bounds, or is it better to not put any kind of scope or boundaries around what’s being looked at?
I mean, do you just say, “here’s my company, have at it”? Um, do you put, like, a window of time around it, or “don’t look at this system, only look at that system”? How do you structure that?

RoRo Romero: [00:18:42] Yeah, I’ve seen everything, I think, at this point: completely open bounties, bounties on specific system categories or services. Um, I would say, though, that in general you want to have a very well-understood scope if you’re going to have a time-boxed challenge, for sure. Um, but then also recognize that you may not know everything that that system touches, so have some coverage for that. Because at the end of the day, these researchers, they’re amazing people. They want to make sure that they’re actually covered legally if they make a “hackcident” of some kind, if they go out of that–

Camille Morhardt: [00:19:19] (laughs) Oh my God. Now I believe you were on a SWAT team of nerds.

RoRo Romero: [00:19:24] There were others that actually coined that, not me; I’m just using it. But, um, it’s a great term, because it does sort of highlight that, you know, things do happen, especially when you’re testing very complex systems and you don’t know what the interdependencies are, how they’re connected in weird, nuanced ways. So what researchers really want is to understand that you’ve got their back, right? And that the rules make sense, that the scope is what they’re supposed to be looking at, and if there are any questions, they can come to you to have them resolved.
And going back to the previous question a little bit, around the different types of bounties: there’s the private sort of model; there’s the public bounty model, where it’s just out in the open; and there’s kind of the time-boxed one as well. But I would recommend that every organization start out with a sort of “internal bounty” first. So, like, have your engineers, have your folks who understand the system, try to find vulnerabilities. And if they don’t, still pretend as though they did, and then run it through your process that way, so you can find areas where your process might have holes, or where you don’t know who the system owner is or who can take action on it. At the end of the day, that’s what you’re trying to do.

Camille Morhardt: [00:20:32] Are those called red teams, if you’re doing that internally?

RoRo Romero: [00:20:35] You know, there are definitely red teams and blue teams, purple teams.
Uh, there are all different colors of teams. But, um, yes: the red teams are more on the offensive side, blue teams more on the defensive side. But I think the best way to approach it is kind of having both perspectives: somebody who understands the system from a defensive posture, but is not limited by that. Um, ‘cause a lot of times when folks have designed the system, they’re looking at it from that perspective, and it’s hard to switch over to kind of an adversarial mindset, which is what these researchers bring.

Katie Trimble Noble: [00:21:12] Yeah, I liken that to when you were in college and you wrote an English paper, right? You reviewed it and reviewed it and reviewed it, probably endlessly, with your edits. But then as soon as you hand it to somebody else, they’re going to see things that you didn’t see, because you were so close to the problem; it’s often hard to see some of the other issues when you’re that close to it. Also, things change, implementations change all the time. So it wasn’t necessarily a flaw or a weakness when the product was designed; it was something that was meant to be that way. But then the product changed, or was implemented in a way that the original engineer didn’t anticipate. And those things happen all the time.

Camille Morhardt: [00:21:49] Well, you have this clash of cultures. So, um, I was just talking with Isaura Gaeta, who’s also at Intel and leads the Offensive Security Research team. And she was saying how, you know, she’s trained as an engineer, and now she’s working with people who are trained, like, to break things, and it’s so different. So you’ve got, and you’re saying really, you kind of have the same team if you’re on this Product Security Incident Response Team, or PSIRT, that also runs coordinated vulnerability disclosures. These are people who are there for process and triage and hierarchy and all this kind of stuff.
And then you’re pulling in, it’s like the opposite, right? You’re pulling in people who are just: what’s broken? What changed? Where’s the hole? “How is something that I can figure out now that wasn’t even able to be figured out before?”

RoRo Romero: [00:22:44] Yeah. So teaching them that hacker mindset, I guess, is what I was going for there. That internal team, if you’re doing an internal bug bounty first, sort of teaching them what that looks like, um, and having them just try to break things, you know, in a thoughtful way; typically you’re not going to bring down a production system. Um, but it’s a different mindset, yeah, between the folks who would build and the folks who would break.

Katie Trimble Noble: [00:23:07] Yeah. We always kind of said there were two categories: there were the builders and there were the breakers. But that kind of goes into what you were talking about, Camille, about the role of the PSIRT, like what we do as compared to, say, the Offensive Security Research Team. The role of the PSIRT is to facilitate. Right? It’s to be the coordinator. It’s to be the balanced voice in the room that’s kind of trying to move things along. Um, we’re not tied to one perspective or another perspective. We’re willing to be open-minded and see all the perspectives, and you definitely need all the perspectives. You need strong advocates for all the perspectives. So I’m not saying everybody should be that way; I think you definitely need strong advocates for specific perspectives. That’s how you move forward and evolve.
But the PSIRT team, um, overwhelmingly it’s mediation, more than anything else. It’s sitting in a room with people who feel very passionate about the things that they do, and trying to move past those kinds of passions and into “how do we get this fixed for the user?” The goal is always to protect the user, and it doesn’t matter if you’re in government or if you’re in the private sector, wherever you are. The goal is to keep the eyes on the prize, protect the end user, make this as strong as we can.

Camille Morhardt: [00:24:23] I love this conversation. Um, let’s see, I’m going to ask one final thing, and then, uh, we’ll sign off. Um, what do you think is the most kind of cutting-edge idea or concept, whether or not it’s been implemented yet, in crowdsourced security?

RoRo Romero: [00:24:46] Hmm. Um, if I had to take a stab at that, I would say sort of human-augmented, crowdsourced vulnerability finding of some kind. Right? And we kind of see that already today: folks have tools that they’ve gotten used to, that they know how to use very well to look for vulnerabilities in a specific area or product. Um, but more of that. Like, how can we use fuzzing at scale or other, let’s say, AI techniques, some way to augment the human’s ability to find vulnerabilities at scale more quickly, and then take action on them?
So I think we saw hints of this at the Cyber Grand Challenge that DARPA ran in 2016 at DEF CON. Essentially it was, uh, having teams use automated techniques to go out and find vulnerabilities, and then also, I think, to a certain extent, try to patch the vulnerabilities that were found and retain the functionality, all without any sort of, uh, human interaction. So there was an actual air gap.
I think we’ll see semblances of that into the future, where we have, like, you know, humans working with machines more closely and automating as much as we can. But still, people are really good at certain things: finding weird patterns and behaviors in how systems work or interact. Um, I think they’ll continue to be good at that for some time to come. That’s my stab in the dark at that question.
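For a flavor of the “fuzzing at scale” idea, here is a toy random-mutation fuzzer in Python. The buggy parse() function is a made-up stand-in for the software under test; real systems like the Cyber Grand Challenge entrants add coverage feedback, crash triage, and automated patching on top of this basic mutate-and-test loop.

# Toy random-mutation fuzzer: the core loop behind fuzzing, minus the
# coverage feedback and crash triage that production fuzzers add.
import random

def parse(data: bytes) -> None:
    # Hypothetical bug: a 4-byte length header the parser trusts blindly.
    if len(data) >= 5 and int.from_bytes(data[:4], "big") > len(data):
        raise IndexError("read past end of buffer")  # simulated crash

def mutate(seed: bytes) -> bytes:
    out = bytearray(seed)
    for _ in range(random.randint(1, 4)):  # flip a few random bytes
        out[random.randrange(len(out))] = random.randrange(256)
    return bytes(out)

seed = b"\x00\x00\x00\x08payload!"  # well-formed input to mutate from
random.seed(1)
for i in range(10_000):
    case = mutate(seed)
    try:
        parse(case)
    except Exception as exc:
        print(f"crash after {i} cases: {exc!r} on input {case!r}")
        break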

Katie Trimble Noble: [00:26:15] So I was kind of thinking about it as we were talking. One of the things I always kind of do is rephrase your question to “what keeps me up at night?” and, like, “what do I worry about?”
Um, and to me, it’s IoT. It’s this world of “you can’t buy an electric tea kettle without it having a wifi device in it.” The interconnectedness of the world is getting more and more and more, but there’s still a mindset in most of humanity, and most of our, you know, Western cultures, that there are two worlds: there’s the cyber world, and then there’s the real world. And I think that until we get to a place as a humanity where we understand that there is only one world now, um, there’s so much attack surface that is being left unsecured. Uh, and you see more and more of these IoT devices proliferating throughout the community, all the way from insulin pumps and pacemakers installed in people’s chests, to water plants in Florida, to, you know, Fitbits and your child’s little wearable device. They’re everywhere. Um, everything has a wifi chip in it, and we need to, as humanity, get to a place where we understand that those devices need to be secure.
Because there is no real world and cyber world. It’s not a nuisance anymore. There used to be this idea that if the internet goes down, it’s a nuisance, you know? “Oh no, you don’t have your Netflix for today. So sorry for you.” But it’s not that way anymore. You can’t apply for a job without the internet.
You can’t, you know, do your basic tasks. I mean, especially in the COVID world, you can’t work without the internet. The cyber world is so embedded in our daily lives that it’s not possible to separate it anymore. And I’m not sure that that would even be something we would want to do.

Camille Morhardt: [00:28:07] That’s so insightful. There is no “I abstain from the internet.” Right? You know, that’s, that’s pretty much, um–

Katie Trimble Noble: [00:28:17] Off, I mean, off-grid living is a real thing, but you take on some pretty serious hazards in that.

Camille Morhardt: [00:28:23] So, cool. I’ve really enjoyed this conversation with you guys. Um, thank you. I’m so glad you’re at Intel.
Now I can talk to you whenever I want. Um, you know, I really appreciate your insight, and it’s really cool that you came from such different and yet also similar backgrounds. Um, and I really enjoyed hearing your perspective, so thank you very much.

Katie Trimble Noble: [00:28:47] Yeah. Thanks for having me.

RoRo Romero: [00:28:47] Thanks for having me.

Camille Morhardt: [00:28:50] I appreciate it. There were a bunch of different things, uh, that Katie and RoRo referenced today that I think you can dive into on some of the other episodes. Um, hope you’ll check out more episodes of Cyber Security Inside, and also What That Means.
