InTechnology Podcast

#10 – What That Means with Camille: Human Factors

On today’s show we discuss Human Factors, both in the traditional sense and in cybersecurity specifically, with Margaret Cunningham.

Margaret has a PhD in Applied Experimental Psychology. She’s a member of Forcepoint’s X-Labs as the behavioral scientist subject matter expert. X-Labs develops scalable, human-centric security solutions.

In this episode of What That Means, Camille has Margaret Cunningham of Forcepoint on to talk about the intersection of Human Factors and Cybersecurity.

They cover a lot of ground, like:

  • The intent of human factors practitioners
  • Where the fields of human factors and cybersecurity intersect
  • The questions guiding human factors practitioners
  • Why people break the rules
  • How human factors can be used to improve training
  • How you measure a person

And more. Have a listen.

 

Here are some key take-aways:

  • Human factors practitioners focus on the evaluation and design of everything, with the intent to optimize human performance.
  • Human factors can be broken down into different areas of interest like physical environments, cognitive function, systemic issues, etc.
  • Factors like culture, language, and geography can impact design.
  • The link between the two fields of cybersecurity and human factors is relatively new.
  • Human factors practitioners are now able to use behavioral analytics to highlight anomalous past behaviors with much more specificity.
  • We learn how to break the rules from other people, so behavioral analytics tends to uncover clusters of bad actors.

 

Some interesting quotes from today’s episode:

“We have to know the limits of people, whether it’s a cognitive or a physical limit doesn’t matter.”

 

“People are amazing, but we’re not really good at everything. So what can we understand about what we’re designing to improve human performance in areas where we’re weak, while also capitalizing on what we’re good at?”

 

“We really are starting to use behavioral analytics in a much more sophisticated way, where we’re building an understanding of people’s past rule-breaking or their past exploration that doesn’t really fit with their peers.”

 

“A lot of times, we learn how to break rules from other people. We learn what the implicit rules are versus the explicit rules from our managers or supervisors or our peers. And in that case, we can start seeing that there are clusters or groups of bad apples. And that can be very meaningful in terms of understanding organizational exposure.”

 

“I think that people who truly understand how to build metrics that can capture human behavior are going to be making great strides in this industry.”


[00:00:00] Camille: Hi. With me today to discuss Human Factors, both in the traditional sense and in cybersecurity specifically, is Margaret Cunningham. Margaret has a PhD in Applied Experimental Psychology. She’s a member of Forcepoint’s X-Labs as the behavioral scientist subject matter expert. X-Labs develops scalable, human-centric security solutions. Prior to joining Forcepoint, Margaret supported the Human Systems Integration Branch at Homeland Security. One of her focus areas was [00:00:30] technology integration for the Next Generation First Responder APEX program. Welcome, Margaret.

 

Margaret C: Nice to be here. Thanks for having me, Camille.

 

Camille: Can you define human factors, uh, from the traditional human factors engineering perspective, in under three minutes?

 

Margaret C: Uh, I can give it a whirl. There are a lot of different ways to think about human factors, but, um, generally speaking, it’s an interdisciplinary scientific field focused on understanding [00:01:00] human performance within a system of interest. And when I say system, I mean any type of system. Ultimately, that means that human factors practitioners focus on the evaluation and design of everything. That means tasks, jobs, environments, products, processes, systems, uh, you name it. And this is all with the intent to optimize human performance.

We have to have a very deep [00:01:30] understanding of what people need to bolster their capabilities, while also protecting against or mitigating the impacts of our limitations. So, uh, we have to know the limits of people; whether it’s a cognitive or a physical limit doesn’t matter.

A lot of people break this down into different areas of interest. So we have people focused on physical environments, office chairs, uh, you know, and more complex [00:02:00] things, like designing a space shuttle for somebody to operate in a limited space for a long time.

Uh, we have people focused on the cognitive end. So, you know, how do we deal with mental overload? Stress and performance? Slips of memory? All these different types of errors that we make because of the limitations of our cognitive functioning.

And then we also have people who focus on the broader systemic issues that are a little bit more organizationally focused. So, [00:02:30] uh, how can we do structured communication for crew resource management in aviation, for instance?

You know, in my previous life, I focused primarily on the cognitive end, but as I’ve become more ingrained in the cybersecurity community, I’ve started to focus much more on some of the broader systemic organizational factors. So that’s the quick and dirty on what human [00:03:00] factors means to, uh, the more academic practitioners.

 

Camille: That’s a great definition. It’s very comprehensive. And yet I think we have a lot more to get into as we look at how it intersects with cybersecurity and where that’s been taken. So, uh, let’s dive a little deeper.

My first question, ’cause I know we’re going to go off for a while on how it relates to cybersecurity, but I’m really interested. Two things kind of came to mind. Is there an [00:03:30] overarching goal that’s common across human factors, like the different things you mentioned? One of them was kind of maximizing or optimizing performance.

One is maybe ergonomics. One was productivity. One was user experience. Is there a North Star, or does it depend on what you’re doing it for?

 

Margaret C: It kind of depends, but at the same time, it’s very performance oriented. You know, what can we do to do something better? How can we team [00:04:00] with a machine better? How can we use a technology better? And how can we design to, you know, push this performance? Because people are amazing, but we’re not really good at everything. So, um, what can we understand about what we’re designing to improve human performance in areas where we’re weak, while also capitalizing on what we’re good at?

 

Camille: And then another question, does this feel different depending on [00:04:30] culture or geography or language? 

 

Margaret C: Oh man, that’s a good question (laughs). I think there are a lot of commonalities across people, but, um, I would never say that all people are the same.

 

Camille: Okay. So it could be that, based on a culture or a geography or an environment that people are in, they tend to do things a different way? And so you would design it differently because of that, because of the cultural influence or [00:05:00] language influence or something like that?

 

Margaret C: I mean, or even things like traffic patterns, right? So we design our cars in a very specific way so that we can drive and understand traffic situations. Not everybody drives on the same side of the road.

Camille: Right. 

 

Margaret C: So we have to design differently for different practices. 

 

Camille: Or like when I was driving around, um, Milan in Italy, I literally had to go around the central circle like nine times before I figured out how to exit because so many people were [00:05:30] coming on and it was so busy and I wasn’t used to that. So, I mean, it’s probably very efficient and high performance for everyone there, but for me, I was stuck in this loop until I could figure out how to get off. 

 

Margaret C: I think people find in the design of cities, for instance, that, um, areas that are much more congested and much more crowded have different types of behavioral patterns that you can design to. There is a little bit of literature out there as well that focuses on cognitive differences between, uh, people who grew [00:06:00] up in rural environments versus urban environments, because the visual landscape is so different. I mean, it’s really wild (laughs).

Camille: Interesting.

 

Margaret C: You can really get into the nitty-gritty on all of those things. I think you’ll find human factors practitioners in every area.

 

Camille: Right. You’re kind of in this, I don’t know if it’s a sub-field or an adjacent field or an extended field, but you’ve taken the basics of human factors [00:06:30] engineering or human factors research and extended it to cybersecurity. Is that a big field, or are there like two dozen of you in it?

And what is that? Can you explain how that works?

 

Margaret C: I would call it an emerging field. And there are a lot of people who are now dedicating their academic and industry, uh, life to this area. I want to say that, um, the Human Factors and Ergonomics Society, which is the, you know, historical group for human [00:07:00] factors professionals, stood up a cybersecurity technical group, I want to say, within the last five years. So it’s pretty fresh. It’s pretty new, but, uh, there are a lot of people exploring this area now much more seriously as, like, a dedicated field.

 

Camille: And what does it look like? So in this case, are you focused on human failure as opposed to more optimizing performance by minimizing failure? Or…

 

Margaret C: I mean, you have a lot of people looking at improving training, for [00:07:30] instance. Because, as we know, the yearly compliance training where we just, like, let the video play in the background and click the answers isn’t very good. So we have, uh, people who are really focused on skill building and transfer of training, for instance, of cybersecurity training to the region.

 

Camille: Like don’t click on the phishing, the clickbait, or the phish? Okay.

 

Margaret C: And sort of the, um, continuous [00:08:00] evaluation of performance when you’re facing those types of challenges and how we can use those behavioral metrics to improve training and target people more specifically on what they need to learn, uh, which is a little bit better than the blanket treatment of everybody, you know, clicking through something.

 

Camille: Are you also looking at individuals, or patterns among employees, that are maybe flags?

 

Margaret C: Yeah, so one thing that I’m really fascinated with [00:08:30] is why people break rules. Um, I think that rule breaking is a type of performance, in a sense. So even really high performers break a lot of rules. If you look into the human factors literature, you’ll find a lot about workarounds in the medical field or aviation.

And, um, there’s a lot of exploration of what sort of systemic impacts different types of policies have that either force people to conform in a way that [00:09:00] is not palatable (like security friction, for instance), and then the really creative ways that people work around that, which can actually be beneficial or pretty harmful, frankly.

 

Camille: But you’re talking about people who are well-intentioned. It’s just that, uh, sometimes there seem to be processes or rules that border on unnecessary bureaucracy when you feel like you already understand the spirit of the law. So you’re going to work around it to [00:09:30] actually be more efficient and not cause any problems.

 

Margaret: Yeah. 

 

Camille: That’s different. You may inadvertently cause a problem, but that’s different than somebody who’s actually got malicious intent. Does this human factors field take into consideration rogue actors, if we want to call them that, and discovering them?

 

Margaret C: Yeah, I think. Absolutely. And, um, I don’t know. I really believe that most people are not bad actors, or at least they don’t intend or start out to be bad actors. I think that’s a [00:10:00] very, very small sliver of the population. But there are organizational things that occur that make people sour, um, or angry or disgruntled. And in those cases, that can start to feed into that malicious intent. And that is where I think that, um, sort of the systemic view of human factors comes into play.

 

Camille: I think there were a couple of different things that I’ve heard discussed before. [00:10:30] One is, you look for patterns of behavior in, let’s say, the computer that are different. So you start to notice people are exporting large files when they’ve never done that before, and that makes you wonder why. Um, and especially if that coincides with them, you know, joining a mergers and acquisitions team or something.

And then, you know, the other thing is just, you worry that they’ve been hacked and somebody else actually has possession of their system. Are those two things you’re looking for? Or…

 

[00:11:00] Margaret C: Yeah, so it can be pretty complex, but it can also be fairly simple. So what we know is that people tend to have a trail behind them of bad behavior. It’s not usually like one day somebody comes in and they do a majorly terrible thing. And we really are starting to use behavioral analytics in a much more sophisticated way, where we’re building an understanding of people’s [00:11:30] past rule-breaking or their past exploration that doesn’t really fit with their peers. Or it seems a little bit odd, like, “Hey, you know, I don’t need to have access to a finance system,” or, you know, “Why is my researcher looking into HR?”

And so we’re really starting to break it down into different types of groupings where we can highlight anomalous past behaviors with much more specificity.
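
A minimal sketch of peer-group anomaly scoring along these lines is shown below, assuming made-up users, departments, event counts, and threshold. Real behavioral analytics is far richer, but the shape is the same: build a baseline from a user’s peer group, then flag large deviations for an analyst to review.

```python
# Hypothetical sketch: flag users whose activity deviates from their peer group.
from collections import defaultdict
from statistics import mean, stdev

# Example input: per-user weekly counts of a sensitive action
# (e.g., files exported), grouped by department.
activity = {
    "alice": {"dept": "research", "exports": 2},
    "bob":   {"dept": "research", "exports": 3},
    "carol": {"dept": "research", "exports": 41},  # unusual among her peers
    "dave":  {"dept": "finance",  "exports": 30},
    "erin":  {"dept": "finance",  "exports": 28},
}

# Build a baseline per peer group (here, department).
by_dept = defaultdict(list)
for user, row in activity.items():
    by_dept[row["dept"]].append(row["exports"])

def peer_z_score(user):
    """How far a user's count sits from their peer-group average, in std devs."""
    row = activity[user]
    peers = by_dept[row["dept"]]
    if len(peers) < 2 or stdev(peers) == 0:
        return 0.0
    return (row["exports"] - mean(peers)) / stdev(peers)

# Anyone more than one standard deviation above their peers gets flagged for
# review -- a deliberately low bar for this tiny example, not a verdict.
flagged = [u for u in activity if peer_z_score(u) > 1.0]
print(flagged)  # ['carol']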

 

Camille: And do you apply that? I don’t mean you in your [00:12:00] role; I just mean in general. Um, human factors researchers who are in this field of cybersecurity, are they generally applying it across a company, or a broad swath of people within a company like the entire legal department or something? Or are they honing in on individuals?

 

Margaret C: You know, and I will say this: I don’t think a lot of people in traditional human factors are looking at insider threat. I think it’s a sort of newer link between the [00:12:30] two fields of cybersecurity and human factors. In my personal experience, I prefer to think about organizational trends until there’s a need to be more specific.

A lot of times we learn how to break rules from other people. We learn what the implicit rules are versus the explicit rules from our managers or supervisors or our peers. And in that case, we can start seeing that there are [00:13:00] clusters or groups of bad apples. And that can be very meaningful in terms of understanding organizational exposure.

So, you know, I have this big group of people, they all seem to be kind of rule breakers, but they are highly technical. They have access to a lot of stuff. It’s not just the one person; it’s a broader potential exposure for a company.

 

Camille: So then you would look at whether these are malicious actors that have somehow infiltrated, or they’re not [00:13:30] following the rules because they don’t like the way you’ve structured them, and maybe you need to go look at your processes and policies and make sure they’re more effective for, you know, helpful to these kinds of people or something?

 

Margaret C: Yeah. I mean, maybe it just doesn’t work for the marketing team to not be able to use USBs (laughs).

 

Camille: Right, right, right. 

 

Margaret C: Silly example. But it is because there can be a lot of exposure, uh, transferring data in ways that aren’t approved by the company. Especially because, when people break rules and they do these workarounds [00:14:00] in groups, we lose visibility. We can’t actually follow what’s going on because it’s outside of our lines.

 

Camille: Right. So what are some of the, um, kind of hot topics in this area of cybersecurity human factors right now?

 

Margaret C: Oh, um, you know, there will always be a lot of focus on training in human factors groups. But I think that people who truly understand [00:14:30] how to build metrics that can capture human behavior are going to be making great strides in this industry.

You know, how do you measure a person? A lot of human factors people are great at figuring out good ways to understand cognition, measure cognition, measure performance. And when we can get that more social and behavioral science baked in with traditional security indicators? Wow. That’s a whole new world of [00:15:00] possibilities.
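
As a toy illustration of baking behavioral signals in with traditional security indicators, here is a sketch that blends both into a single per-user score. The features, weights, and scaling are assumptions for illustration, not any product’s actual model.

```python
# Hypothetical sketch: blend behavioral signals with security indicators.
from dataclasses import dataclass

@dataclass
class UserSignals:
    policy_violations: int   # behavioral: past rule-breaking incidents
    off_hours_ratio: float   # behavioral: share of activity outside work hours (0-1)
    dlp_alerts: int          # security indicator: data-loss-prevention alerts
    malware_detections: int  # security indicator: endpoint detections

def risk_score(s: UserSignals) -> float:
    """Weighted blend of behavioral and technical indicators, capped at 100."""
    raw = (
        2.0 * s.policy_violations
        + 10.0 * s.off_hours_ratio
        + 3.0 * s.dlp_alerts
        + 5.0 * s.malware_detections
    )
    return min(100.0, raw)

print(risk_score(UserSignals(policy_violations=4, off_hours_ratio=0.6,
                             dlp_alerts=2, malware_detections=0)))  # 20.0
```

Even a crude blend like this makes the point in the episode: the technical indicators alone miss the behavioral trail, and the behavioral trail alone misses the technical evidence.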

 

Camille: What are people arguing over in this field right now? 

 

Margaret C: Oh, you know, I think there’s a little bit of academics wishing that they were heard (laughs). And there’s a little bit of a gap (actually a pretty big gap) between what practitioners and industry are ready to implement and what the body of knowledge in academia can provide. And that, I think, is [00:15:30] a big... not really a point of contention, but that’s where the rub is right now. Like, how can we take these best practices and standard processes from great researchers and pull them through into either products or a better way to, um, provide coverage in the expanding cybersecurity landscape?

 

Camille: Well, very interesting. And you’ve, of course, bridged academia and industry, so it’s good to hear those comments from you. [00:16:00] I really appreciate your time today. Thank you for joining us on the show.

 

Margaret: Yeah, it’s been a pleasure. 

 

Camille: If you want to learn more on this topic and specifically human-centric security, give a listen to Episode 6 of Cyber Security Inside with Alan Ross.  A reminder: he’s a Fellow and Chief Architect at Forcepoint. That episode is called “Addressing Cyber Security Threats with a Human-Centric Approach.”  And stay tuned for the next episode of “What That Means”.

 
