[00:00:35] Tom Garrison: Hi, and welcome to the Cyber Security Inside podcast. I’m your host, Tom Garrison. And with me as always is my esteemed colleague and cohost Camille Morhardt. Well, how are you, Camille?
[00:00:46] Camille Morhardt: This is good. I made it to the realm of esteem. I like it.
[00:00:50] Tom Garrison: Yeah. I’ve got to think of better adjectives before I introduce you.
[00:00:55] Camille Morhardt: I’ll send you a list.
[00:01:00] Tom Garrison: (laughs) Because you deserve all of them. So today we’re going to talk about, I think, a long overdue topic for us on this podcast. I know on the What That Means podcasts we’ve talked about privacy, but specifically how data is managed and protected is really at the forefront for the IoT business. And that’s who we’re going to be talking to today, an executive from the IoT space; but it really applies to everyone in all industries. And I think it’s an important topic for us.
[00:01:33] Camille Morhardt: It’s kind of like jumping into the wolf’s den, if I’m allowed to say that, because you’re right, on What That Means we’ve talked with, you know, PhD anthropologists about privacy and psychologists about privacy–some of our fellows here at Intel, you know, actually people who are working to set standards for privacy around the world, contributing to global standards.
Well, in the interview that you and I do now, we’re talking to somebody who’s commercially in the business of collecting information about people via cameras. And I might have thought that his perspective would be, “Too bad. You want to run through the world and you want to have any sense of security, then we’re here to protect you and you’re going to make some trade-offs.” But he does not have that take at all. He very much is concerned about privacy. So I thought it was quite interesting.
[00:02:30] Tom Garrison: Yeah. And he definitely has interesting perspectives around the type of regulations and laws, and he’s pretty opinionated in terms of how we need to hold companies and executives accountable.
[00:02:47] Camille Morhardt: Yeah. I think that it is really interesting to hear his perspective on it and what he thinks everybody should be watching out for and accountable for, especially as this is his living.
[00:02:29] Tom Garrison: Yeah. Well, let’s jump right into it.
[00:03:07] Tom Garrison: Our guest today is Pierre Racz. He is president of Genetec, a private family-owned company that develops network, physical security, data analytics, and operational efficiency solutions. Over 1,500 people are employed by Genetec around the world. With over 40 years of hardware and software development experience, Pierre has extensive knowledge of the physical security industry. Prior to founding Genetec in 1997, he worked as a Principal Engineer at DMR–that’s Fujitsu Consulting. Pierre holds an honors degree in Electrical Engineering from McGill University in Montreal, Quebec.
So welcome to the podcast, Pierre.
[00:03:53] Pierre Racz: Oh, thank you for having me.
[00:03:55] Tom Garrison: Can you tell us a little bit about Genetec and the types of security solutions that your company focuses on?
[00:04:04] Pierre Racz: Genetec, like many companies, fell into security serendipitously. In the late 1990s we built a system to manage video security cameras. And our vision was that in the future, it would all be done over the Internet, because once it’s digitized, you can do things like sharing it and transmitting it using multicast. So we sort of stumbled into that. We had other product lines going, but that was the one that took over, and we abandoned our other projects.
Since then we have built other successful software in physical security, namely an access-control system called Synergis. And we are now number one worldwide for networked video solutions for physical security, and number two worldwide for access control solutions for physical security.
If you have started traveling again, we do over 200 international airports. We do a lot of schools, hospitals, and public infrastructure, and our activities are focused in G-18 countries.
[00:05:14] Tom Garrison: Oh, that’s interesting. And I think we’ve all seen those kinds of high-powered cameras, but from a security standpoint, tell us how those differ from other cameras or other devices that we may be familiar with in our daily lives as consumers.
[00:05:34] Pierre Racz: Almost everything we have today has a computer in it, and cameras are no different. Some of the stuff that is produced for consumers is inexpensive, but it doesn’t have very much computing power to do security. And sometimes the software is a bit on the sloppy side.
When you’re starting to do security for infrastructure, they will put in higher-resolution cameras, cameras that can produce the video in multiple resolutions. And the best manufacturers have hardened their cameras against cyber security breaches. There was the famous Mirai botnet attack about four years ago. It turns out it was gamers that did it: they commandeered about 1.5 million cheap IP cameras around the world in an attempt to enhance their score at, I think, Warcraft or one of those massively multiplayer online games. And in the process they took down the DNS that served Facebook, Twitter, and a whole bunch of other important web properties. (PayPal, I believe, also went down in that botnet attack.)
I think the lesson to learn from that is that putting junk IoT devices on the internet can not only put your system at risk; it actually damages the ecosystem for all of us. There’s the Bono Pastori principle: if you’re going to be a good shepherd of your IoT devices, put quality stuff on the internet, and put stuff that comes from trustworthy manufacturers.
[00:07:16] Camille Morhardt: When you get the images, when you’re, I guess, collecting video information, like in an airport, are you doing the processing in the camera on site or are you sending all of this data to some backend server somewhere to do the processing?
[00:07:32] Pierre Racz: One of our claims to fame is that we can manage very large networks of cameras. Our current biggest system is 200,000 cameras. Now, you will not do the processing of this on a central server. Our technology lets you move the workloads around and put them where it’s optimal for your configuration. So on a lot of the high-end cameras, we can actually move some of our processing onto the camera itself. Otherwise, if the cameras are unable to do it, we can move the video to a server located near the camera, avoiding expensive network traffic, and do the processing there. Then we send the metadata to further servers that are upstream in the server room.
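The tiered placement Pierre describes, running analytics on the camera when it can handle them, falling back to a nearby edge server, and shipping only compact metadata upstream, can be sketched roughly as below. The class and the decision rule are illustrative assumptions, not Genetec's actual API.

```python
# Hypothetical sketch of tiered video-analytics placement:
# prefer the camera itself, else a server near the camera; in either
# case only metadata travels to the upstream server room.
from dataclasses import dataclass

@dataclass
class Camera:
    name: str
    onboard_compute: bool  # can this camera run analytics itself?

def place_workload(camera: Camera) -> str:
    """Decide where the analytics for this camera should run."""
    if camera.onboard_compute:
        return "camera"       # process in place; only metadata leaves the device
    return "edge-server"      # process on a server near the camera, then
                              # forward metadata upstream

print(place_workload(Camera("gate-12", onboard_compute=True)))   # camera
print(place_workload(Camera("lobby-3", onboard_compute=False)))  # edge-server
```

The point of the sketch is that the expensive payload (raw video) never crosses a costly network link; only the placement decision and the resulting metadata move upstream.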
For a lot of the work that we do for cities in the United States, we use the secure government cloud that is operated by Microsoft. About 30% of our customers work disconnected from the network, and in that case we run it in their data centers.
[00:08:41] Tom Garrison: This opens up an interesting conversation we can hopefully have, and that’s around this idea–especially when you’re talking about cameras–of privacy and security and how they relate to each other. How can, and should, companies be thinking about privacy issues and what they do with the information that gets created from these devices, while also trying to provide a level of security? I wonder if you have thoughts on that, Pierre?
[00:09:13] Pierre Racz: My house was on the Internet in 1985. And the philosophy of the Internet was that it should be free. What we would do is run the Internet from discarded or unused machines, you know, in the corner of our labs. But as the popularity of the Internet grew, it was no longer just discarded machines running it. Actually, today the Internet consumes a little more than 12% of the electricity that is produced worldwide. So somebody has to pay for this, and so we started seeing advertising and other things like that.
Then people decided, oh, well, there’s all this information that nobody really cares about, and they decided that they can, first of all, help themselves to that information and monetize it. This has positives and negatives. Certainly the old Silicon Valley people don’t think that this is what they meant by “the Internet should be free,” and I personally am of that belief. For example, that’s why I subscribe to a lot of my information sources: I pay for someone to curate it, so I don’t get junk. Certainly a couple of years ago my young son didn’t agree with me. He preferred to get it for free–even if I would pay the subscriptions for him, he wanted it for free.
This is a societal debate: when does free become too expensive? Certainly when it’s done without really thinking about the consequences, we can get into a little bit of trouble. If you remember, not too long ago there was this incident where US military personnel were wearing mobile health devices, jogging around military bases in theaters of war. This information was public on the internet, and basically it gave anyone who wanted to disrupt the military operation a nice layout of the military bases. That’s an example of a fail. We have to rethink: whose information is it, and who has the right to help themselves to it? Right now, the onus is on the information source–that’s us–to make claims about the privacy of the information, and basically whatever we don’t claim as private, other people think they can help themselves to.
I think that this is something we’re figuring out, and over the next 10 years or so–that’s my estimate–we’ll be able to domesticate this technology. We’re going to figure out what we want to give up and what we get for giving it up. I don’t think that we all want to be the product.
[00:11:50] Camille Morhardt: Well, I have to bug you just a little bit, because your son is making a conscious decision when he’s subscribing, or when he’s clicking on something to trade some of his personal, let’s say, browsing history–letting the cookies come. Whereas you’re manning cameras for cities or for airports, which people have to pass through. They’re not necessarily making that conscious decision to trade the video image of themselves. So how do you reconcile those two?
[00:12:24] Pierre Racz: You make an excellent point. And that’s why we invented the Privacy Protector, and we are the only company that now has 12 years of accreditation from the European Union privacy guards that this technology protects privacy. Certainly we have customers that deploy it, especially in public areas. What we do is keep two copies of the video: one copy that is encrypted and one copy that is blurred. Most of the security personnel watch the blurred video. Now, it’s not blurred so much that you can’t see if somebody fell. You can certainly see car accidents. I’ve seen video of people fighting, but you don’t know who’s fighting. And to access the unblurred video, the system requires that two trusted people insert their security tokens into the system.
So typically in Europe, it is the Chief Security Officer and the Chief Privacy Officer of the organization. And in manufacturing plants, that Chief Privacy Officer is typically the head of the union. Both of these people have to agree that the incident is severe enough that the right to privacy is overridden by the right to be safe. And our contention is that a liberal democracy can protect its citizens without having to resort to an unwarranted invasion of privacy. When a bomb goes off or something bad happens, our right to security and happiness temporarily overrides our right to privacy. It is not true that we have to give up our right to privacy totally in order to be secure.
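The two-person rule Pierre describes is a classic dual-control pattern, and its core check can be sketched in a few lines. The role names follow his example; the function and data shapes are illustrative assumptions, not the Privacy Protector's implementation.

```python
# Minimal sketch of dual control ("two-person rule") for unblurring:
# the encrypted copy may only be decrypted when two distinct people,
# holding two distinct authorized roles, both present their tokens.
AUTHORIZED_ROLES = {"chief_security_officer", "chief_privacy_officer"}

def may_unblur(presented_tokens: dict) -> bool:
    """presented_tokens maps role -> name of the person presenting a token.
    Returns True only when two authorized roles are present AND they are
    held by two different people (one person with two tokens is not enough).
    """
    roles = set(presented_tokens) & AUTHORIZED_ROLES
    holders = {presented_tokens[r] for r in roles}
    return len(roles) >= 2 and len(holders) >= 2

# Both officers agree the incident warrants it:
assert may_unblur({"chief_security_officer": "alice",
                   "chief_privacy_officer": "bob"})
# One officer alone cannot unblur:
assert not may_unblur({"chief_security_officer": "alice"})
```

The design choice worth noting is the second condition: requiring distinct holders, not just distinct tokens, is what prevents a single person from overriding privacy on their own.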
Now, just to jump back to my son: he was in his late teens when he made that statement. Now that he’s in his mid-twenties, he’s actually had a change of heart.
[00:14:29] Tom Garrison: (laughs) He’s seen the light.
[00:14:32] Camille Morhardt: You’re also saying, though–I have to bug you on one more thing–that the security personnel at, say, an airport aren’t viewing individuals; they’re looking for maybe a crime or a safety issue, when somebody falls down. But I would imagine that there are, like, Interpol alerts or something that are using facial recognition to locate an actual individual person. So are we allowing essentially AI or a computer program to identify an individual, but we’re not letting people identify an individual?
[00:15:05] Pierre Racz: Well, first of all, in the liberal democracies they don’t have facial recognition running on all the cameras; they do have it on a lot. And this is part of the social contract that hasn’t been established yet: where do we draw the line? So they are playing relatively cautiously by not putting it everywhere, because look, these algorithms are useful, but they’re mindless. Professor Winslow told me a long time ago, “I don’t believe in AI.” He did his PhD at MIT, and he was working on AI in the 70s and 80s. And he said, “I don’t believe that artificial intelligence exists, but I like IA–intelligent automation.”
And so I contend that AI stands for “absolute ignorance.” The computer is mindlessly doing a job. And that’s fine, because my pocket calculator also mindlessly does square roots a lot faster than I can. As long as I then get to make decisions based on what the computer has done, the computer will do the heavy lifting for me, and I will do the creative part of the job that the computer cannot do. IA–intelligent automation–puts the human in the loop, and a trustworthy human at that.
So a lot of our security professionals are well trained. They do not use the technology to gawk at the citizens. Certainly the less trained might do that, but if they’re going to gawk at a blurred video, well, more power to them.
[00:16:42] Tom Garrison: So, Pierre, I’d like to go back to something you mentioned before. You brought up an example about the US soldiers that were running around the base wearing these GPS fitness trackers, and the data wasn’t protected. I wonder what your thoughts are regarding responsibility: whose responsibility is it to manage what data is collected and how that data is going to be used, whether that’s primary use or secondary uses? Who owns that? Is it the person who is wearing the wristband running around the base, or is it the company, or is it somebody else?
[00:17:29] Pierre Racz: All right, that’s an excellent question. I will answer that question with a question: if you don’t control the software in your phone, whose phone is it? Or if you don’t control the software in the device, whose device is it?
[00:17:44] Tom Garrison: So I happen to have an Apple phone. So I guess by the nature of your question, you’re suggesting it’s Apple’s device–even though I bought it. Am I hearing this right?
[00:17:59] Pierre Racz: Well, if you don’t own the software that’s running it, it’s not your device. Now, how does Apple deal with this–and they deal with it nicely, actually; not perfectly, but certainly better than most–is that they say, look, “We will provide you with a service. We won’t steal your information. We’ll do our best to make sure that no one else steals your information, and you can choose what you share.” But in situations where you cannot choose, or where it’s not granular enough, then it’s not your device.
[00:18:32] Tom Garrison: And so for the fitness tracker example–let’s use that–based on your understanding, how would you define who owns it?
[00:18:40] Pierre Racz: Oh, unless you can opt in, you don’t own it.
[00:18:44] Tom Garrison: And so that means the fitness tracker company should be the one held accountable for how that data is managed and who has access to it?
[00:18:56] Pierre Racz: Oh yes. And eventually I think that society is going to say to them: if you take information–either without asking permission or without being clear what the value of that information is to the person from whom you took it–you are held liable for holding that information, and also liable for how that information can be misused.
If, for example, that information was misused and someone fired a missile that killed some of the personnel on the base, that manufacturer should be held liable. This is cyber malpractice.
[00:19:34] Tom Garrison: Interesting. Yeah. And based on your understanding, around the world, where are the regulations and laws regarding this? Are they still in their infancy, or are they progressing?
[00:19:48] Pierre Racz: They’re in their infancy, but it’s getting there. For example, Sarbanes-Oxley has a few provisions about the fiduciary responsibility of executives to corporations with respect to cyber information, but they don’t have very many teeth. It has a lot more teeth about their fiduciary responsibility to provide accurate reporting.
So, if you remember how Sarbanes-Oxley came about, it was basically Enron, and it was a big fraud, really. And the defense was, “Oh, accounting is so complicated. I didn’t realize all the ramifications. Therefore I’m not guilty.” And now Sarbanes-Oxley says, “No, you’re guilty of criminal stupidity. You’re going to jail.”
Well, we’re getting the same defense from executives of companies today, where they’re saying, “Oh, software is so complicated. It’s not my fault that we lost all this customer data, this personally identifiable information.” But in the future, they are going to be held liable.
An example I like to give: if my Accounting Department doesn’t remit the payroll withholding taxes or the goods and services taxes to the government, then as a board member I am personally liable. They can come and collect it from me if the company doesn’t pay. Well, we need laws where executives are going to be personally liable for the stewardship of this information, because this information does create value in our society, and if they don’t put in the proper governance, they should be held liable. We’ve already started to see a little bit of this. I was talking to executives at Lloyd’s, where they’re creating cyber malpractice insurance. Companies that don’t have good governance over their cyber practices see their insurance premiums go up. Executives that don’t understand software–that’s completely fine, but they do understand higher insurance premiums. So they will put in the governance required to lower their premiums, and we will all be better off.
[00:22:07] Camille Morhardt: What do you think are going to be some of the main line items to reduce a premium? What are some of the top things? If somebody wanted to start now, expecting these kinds of standards to come, what sort of things would you tell them they should start implementing or get ready for?
[00:22:26] Pierre Racz: Well, there were a couple of really big, bad breaches recently because of bad passwords. So the first thing is you want to have strong multi-factor authentication, ideally using hardened crypto devices. These are not expensive; you can get them in single-unit quantities for less than $50. Essentially, these are devices into which you can insert crypto secrets–or the crypto secret is even generated on the device–and you can’t easily get them out (not to say you can’t get them out at all). But they can perform crypto operations, so they can sign messages and identify you strongly. What this does is raise the bar: to carry out a successful attack, you actually have to physically have the key. You have to be in North America; you can’t do it from a Starbucks in St. Petersburg.
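The idea behind those hardened tokens, the secret never leaves the device, and the verifier only checks that the device could sign a fresh challenge, can be sketched as below. Real tokens (FIDO2/U2F, smart cards) use asymmetric signatures; the HMAC here, and the class and method names, are simplifying assumptions for illustration only.

```python
# Sketch of challenge-response authentication with an on-device secret.
import hmac
import hashlib
import os

class HardwareToken:
    """Stand-in for a hardened crypto device: holds a secret it never exports
    during normal operation, and signs challenges on request."""
    def __init__(self) -> None:
        self._secret = os.urandom(32)  # generated on-device

    def sign(self, challenge: bytes) -> bytes:
        return hmac.new(self._secret, challenge, hashlib.sha256).digest()

    def enroll(self) -> bytes:
        # A real token would export a *public* key at enrollment; with HMAC
        # the verifier must share the secret, which is the sketch's shortcut.
        return self._secret

token = HardwareToken()
server_key = token.enroll()

challenge = os.urandom(16)              # fresh per login attempt (no replay)
response = token.sign(challenge)
expected = hmac.new(server_key, challenge, hashlib.sha256).digest()
ok = hmac.compare_digest(response, expected)
print(ok)  # True: only the holder of the physical token can answer
```

This is what "raises the bar": the attacker must make the physical device sign the server's fresh challenge, which cannot be done from a Starbucks in St. Petersburg by replaying an old response.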
So identity management is probably the first place they can start. Another important place: there are basic governance practices, basic security, that they should use. For example, if you remember, a couple of years ago there was the Equifax hack, where the financial information of–what was it–142 million Americans and a million Canadians was stolen. With Equifax there were two things. The president of Equifax had underfunded cyber security: there were expired certificates on their servers. And the worst is that they were using the root password–the administrative password–to do routine administrative work for their customers. These are such bad basic practices that executives who engage in them–I mean, goodness, a sixth grader knows not to do that.
[00:24:26] Tom Garrison: Right. There’s a lot to cover there, certainly. And I know Camille and I have spent a lot of time talking about making sure platforms are updated, and about having a whole process around making sure that patches are installed. There are certainly good basic practices that people should be implementing in any company.
Before we let you go, we do have one segment that we like to call Fun Facts. I’ve heard that you have an interesting fun fact that you’d like to share with the audience. What is it?
[00:25:05] Pierre Racz: I had the privilege of having a drink with the singer Sting.
[00:25:12] Tom Garrison: No kidding.
[00:25:14] Camille Morhardt: I wanna know where! (laughs)
[00:25:21] Tom Garrison: Yeah. Where? And did you just have a casual conversation, or...?
[00:25:20] Pierre Racz: So in the early 80s, for my summer job, I was Director of Physical Resources for the World Film Festival in Montreal, which means that I was the head gopher. Among other things, we were in charge of the limousines that were used to ferry around the stars. Well, one day we had fewer limousines than we had stars. And if you remember, that was the year that the movie Brimstone & Treacle came out, starring, of course, Sting. (A very good movie, I might add.) Anyway, he took the limousine and drove up north–we have nice cottage country about an hour and three quarters north of Montreal.
And so he took the limo and explored there. Anyway, he came back like five or six hours later, and I had all these other people that wanted to be ferried around. That evening, I was sitting in the bar with my crew and we were having a drink, and he just came in, sat down, and sort of apologized for all the confusion that he had caused. And we had a great conversation for about half an hour.
[00:26:31] Tom Garrison: No, that’s great. That’s great. Cool. So Camille, how about you? What fun fact do you have to share?
[00:26:37] Camille Morhardt: I was installing something again–you can tell I’m doing some cabinetry in the kitchen–and I kept dropping the screw off the tip of the screwdriver and swearing, you know, “This is ridiculous; I can’t believe these things aren’t magnetized.” I learned that you can, in fact, magnetize a non-magnetized screwdriver simply by taking a magnet–I have a refrigerator magnet–and sliding the magnet from the handle down to the tip of the screwdriver. You do that about three times, rotating the screwdriver a quarter turn each time. And lo and behold, you can pick up the screw; it’s magnetized. Supposedly this lasts about three months. I haven’t tested that yet. But also interesting: if you go the opposite direction with the magnet, you immediately de-magnetize the screwdriver.
[00:27:30] Tom Garrison: Uh huh. So what if you start going the opposite direction? Do you repel screws?
[00:27:36] Camille Morhardt: No, you don’t. You just don’t magnetize.
[00:27:40] Tom Garrison: Interesting. Well, that’s cool. I mean, I’ve had instances where a non-magnetized screwdriver became magnetized for whatever reason, and I thought, “Well, that’s awesome. That’s useful.”
Well, let’s see. My fun fact is back to the animal kingdom, because I just think there are so many cool things about animals. This is about falcons, specifically peregrine falcons. They are smaller than other falcons, and because they’re smaller, they have smaller talons. So to get their prey, instead of grabbing and slashing like larger falcons do–and other birds of prey, really, for that matter–they fly really, really fast, up to 200 miles an hour, and they take their claws, ball them into fists, and punch their prey as they fly by. It kills them, or at least stuns them to the point where the falcon can come back and get them.
So, yeah, peregrine falcons actually punch their prey, and they do it at tremendous speeds. It’s very, very effective. I actually saw a slow-motion video of a peregrine falcon taking out a mallard duck, and, well, let’s just say it was effective.
[00:29:08] Pierre Racz: In karate, they teach us that it’s not your weight that’s important, it’s the speed, because kinetic energy is proportional to the weight but proportional to the square of the speed.
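Pierre's point is the standard kinetic-energy formula, KE = ½mv²: doubling the speed quadruples the energy, while doubling the mass only doubles it. A quick check in numbers:

```python
# KE = 1/2 * m * v^2: energy scales linearly with mass but
# quadratically with speed, which is why speed wins.
def kinetic_energy(mass_kg: float, speed_m_s: float) -> float:
    return 0.5 * mass_kg * speed_m_s ** 2

# Doubling speed quadruples the energy:
assert kinetic_energy(1.0, 20.0) == 4 * kinetic_energy(1.0, 10.0)
# Doubling mass only doubles it:
assert kinetic_energy(2.0, 10.0) == 2 * kinetic_energy(1.0, 10.0)
```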
[00:29:17] Camille Morhardt: Hmm.
[00:29:20] Tom Garrison: There you go. So Pierre, thanks for sharing your thoughts around privacy and data, and a lot of the interesting topics that cultures and societies around the world really need to be wrestling with. It was definitely an enlightening conversation. So thanks for joining us.
[00:29:39] Pierre Racz: Well, thank you for having me. And maybe I’ll leave you with one last thought. It was written on a t-shirt: I was in the Engineering Department, and next to us was the Physics Department. They had this cool t-shirt, and it read, “Fighting Entropy is Everybody’s Business.” I think that cyber insecurity is a form of entropy, and I think fighting it is everybody’s business.
[00:30:04] Tom Garrison: Great. Thanks.