InTechnology Podcast

Buy Now, Hacked Later: Security of Online Shopping (134)

In this episode of InTechnology, Camille and Tom get into “buy now, pay later” credit and security with guests Jim Ducharme, COO at Outseer, and Armen Najarian, an industry advisor in digital fraud and identity. The conversation covers the potential security risks and threats of “buy now, pay later” credit for online shopping, how AI and machine learning are being used to detect and prevent those security threats, and brand impersonation.

To find the transcription of this podcast, scroll to the bottom of the page.

To find more episodes of InTechnology, visit our homepage. To read more about cybersecurity, sustainability, and technology topics, visit our blog.

The views and opinions expressed are those of the guests and author and do not necessarily reflect the official policy or position of Intel Corporation.

Follow our hosts Tom Garrison @tommgarrison and Camille @morhardt.

Learn more about Intel Cybersecurity and the Intel Compute Life Cycle (CLA).

Buy Now, Pay Later: What’s the Risk?

As online shopping becomes more ubiquitous in everyday life, more convenient payment options are now available. Consumers who might not have been able to access traditional credit can now take advantage of “buy now, pay later” installment payment plans through popular platforms like Afterpay or Klarna. Consumers can benefit from “buy now, pay later” options by bypassing traditional credit, while companies can benefit by increasing sales and bypassing the reduced profits that come with using third-party credit companies.

Unfortunately, these benefits come with many security risks and growing threats for both companies and consumers. Cybercrime will be present wherever there is money to be made, Jim and Armen explain. Risks to companies include the potential for overall loss when customers can’t pay all of their installments, while consumers are at risk of cybercrimes like account takeover fraud, synthetic identity fraud, and identity theft.

Stopping Fraud with AI and Machine Learning

Cybercriminals now have multiple ways to gain unauthorized access to consumer accounts and personal data. They can impersonate someone to make fraudulent charges using their credentials, or they can create entirely synthetic identities (i.e., fake accounts) using someone’s personal information like their name, location, device, etc. Because “buy now, pay later” platforms are not tied to financial institutions like your bank and because they are frequently being used for the first time by many consumers, the accounts and personal data are easier for fraudsters to hack into.

Thankfully, companies like Outseer are using artificial intelligence and machine learning to stop fraud by these nefarious actors in its tracks. Jim and Armen share how their engine uses hundreds of data points to verify the identity of consumers and the legitimacy of their transactions, as well as to identify criminal patterns. It takes sophisticated AI and machine learning working behind the scenes of every transaction to ensure security.
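
As a rough illustration of the risk-based decisioning described above, the sketch below combines a handful of weak signals about a transaction into a single risk score. This is a minimal toy, not Outseer’s engine: the feature names, weights, and thresholds are all made-up assumptions, and a production system would use hundreds of features and trained models rather than hand-set weights.

```python
# Toy transaction risk scorer. Feature names, weights, and thresholds
# are illustrative assumptions, not any vendor's real model.

def risk_score(txn: dict) -> float:
    """Return a risk score in [0, 1]; higher means more suspicious."""
    score = 0.0
    if not txn.get("known_device", False):
        score += 0.35  # first time seeing this device
    if txn.get("geo_distance_km", 0) > 500:
        score += 0.25  # far from the user's usual location
    if txn.get("amount", 0) > 3 * txn.get("avg_amount", 1):
        score += 0.25  # well above this user's typical spend
    if txn.get("account_age_days", 9999) < 7:
        score += 0.15  # brand-new account, a synthetic-identity signal
    return min(score, 1.0)

def decide(txn: dict) -> str:
    """Silently approve, step up to a challenge, or block outright."""
    s = risk_score(txn)
    if s >= 0.8:
        return "block"
    if s >= 0.5:
        return "challenge"
    return "approve"
```

The key design point, echoed in the episode, is that most transactions are legitimate, so the common path approves silently and only the suspicious minority gets stepped up or blocked.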

Brand Impersonation and Phishing

Fraudsters are finding new ways to get around advanced security for online payments. One of the most prolific methods right now is brand impersonation, which puts both consumers and brands at high risk. These are often phishing attacks in which cybercriminals pose as legitimate brands, contacting consumers using the brands’ logos and graphics and similar-looking URLs, and getting consumers to enter their credentials or personal information. Nowadays, phishing attacks can even involve gaining remote access to consumer devices, with consumers unknowingly handing hackers their information.

Consumers are at risk of their personal data or identity being stolen, while brands risk gaining a negative reputation. Thankfully, these brand impersonations and phishing attacks are becoming more detectable with the multipronged security approach from companies like Outseer.
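
One common building block for spotting the lookalike (“cousin”) domains used in brand impersonation is string edit distance: a registered domain that sits one or two edits away from a well-known brand’s domain deserves scrutiny. The sketch below is a simplified illustration; the threshold and the example domains are assumptions, and real detection systems combine many more signals (registration date, hosting, page content).

```python
# Flag candidate domains that are within a small edit distance of a
# known brand domain. The max_dist threshold here is an assumption.

def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via the classic dynamic-programming table."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # delete from a
                           cur[j - 1] + 1,              # insert into a
                           prev[j - 1] + (ca != cb)))   # substitute
        prev = cur
    return prev[-1]

def lookalikes(candidates, brand_domains, max_dist=2):
    """Return (candidate, brand, distance) for near-miss domains,
    excluding exact matches with the brand itself."""
    hits = []
    for cand in candidates:
        for brand in brand_domains:
            d = edit_distance(cand, brand)
            if 0 < d <= max_dist:
                hits.append((cand, brand, d))
    return hits
```

For example, a hypothetical candidate like `k1arna.com` (digit one in place of the letter l) lands one edit away from `klarna.com` and would be flagged, while an unrelated domain would not.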

Jim Ducharme, COO at Outseer


Jim Ducharme is the Chief Operating Officer at Outseer. His career in technology began 30 years ago at Natural Intelligence after studying computer science at the University of New Hampshire. Before becoming COO of Outseer in 2021, Jim was the VP of Identity and Fraud & Risk Intelligence Products and later General Manager of the Anti-Fraud Business Unit at RSA Security, the parent company of Outseer.

Armen Najarian, Industry Advisor, Digital Fraud & Identity


Armen Najarian has over 15 years of experience in the B2B and technology marketing space and has worked with globally renowned companies including IBM. He studied accounting at the University of Massachusetts at Lowell and received an MBA from the University of Southern California’s Marshall School of Business.


[00:00:27] Tom Garrison: Hi, and welcome to InTechnology podcast. I’m your host, Tom Garrison. And with me as always is Camille Morhardt. How are you doing today, Camille?

[00:00:33] Camille Morhardt: I’m doing well today, Tom.

[00:00:35] Tom Garrison: Our listeners did not get the benefit of hearing me stumble through that introduction. I’ve done it like a hundred times now, but that was laughable.

[00:00:44] Camille Morhardt: We have to have a bloopers reel someday.

[00:00:46] Tom Garrison: There’ll be a lot of bleeps in that blooper reel.

[00:00:49] Camille Morhardt: Yeah.

[00:00:52] Tom Garrison: So today we’re going to jump into the world of credit and what credit means on an individual level. Then we’re going to go to the near cousin of credit, which is reputation for companies. I think it’s a pretty interesting topic.

[00:01:11] Camille Morhardt: Yeah. I had honestly never heard the term synthetic identity before this and I think I’ve got new levels of fear as I log into accounts. It was very interesting to hear how careful you really need to be. And of course, how artificial intelligence is being used on both sides of the equation. Once again, both to perpetrate fraud and also to discover likely fraud.

[00:01:38] Tom Garrison: That’s right. Yeah, technology doesn’t know good or bad. It’s how technology is used. And in this case it’s being used on both sides, like you pointed out. Certainly in the age we find ourselves in, the age of instant gratification and buying stuff right now, there are maybe some people that don’t necessarily have the credit to be able to do that in traditional ways. Now this buy now, pay later model introduces a whole new set of challenges. And then companies that are just trying to do marketing, to reach out to their customers directly, can be exploited by these fraudsters stealing from the customers and using it to make money. It’s scary on both sides.

[00:02:25] Camille Morhardt: Yeah, really just the sheer number of online purchases that are occurring, where people are entering credentials, just really has captivated this market and turned it into its own thing.

[00:02:36] Tom Garrison: Yeah. Well, let’s jump right into it. What do you say?

[00:02:39] Camille Morhardt: Yeah.

[00:02:44] Tom Garrison: We have two guests today. The first is Jim Ducharme; he’s COO at Outseer. Jim is responsible for product strategy and leads the associated product management and engineering teams. He has nearly two decades of experience leading product organizations in the identity market space. Armen Najarian is an Industry Advisor and Digital Fraud & Identity expert. He is a 15-year Silicon Valley veteran with deep experience leading the marketing function for fast-growing fraud protection, predictive analytics, and cybersecurity companies. His most recent leadership roles include positions at Outseer, Agari, and ThreatMetrix, establishing digital identity solutions. So welcome to the podcast, both of you.

[00:03:21] Armen Najarian: Thank you.

[00:03:22] Jim Ducharme: Thanks Tom.

[00:03:23] Tom Garrison: So we want to talk about two different topics with you guys today and just cover quite a bit of ground actually. And so the first area that I want to spend some time on is this buy now, pay later trend that’s going on in the market space and what sort of threats exist in that world. So can you first just describe what is this buy now, pay later, and then we’ll get to the threats here in a moment.

[00:03:53] Jim Ducharme: Yeah. So you’ve probably seen, as you shop online, that buy now, pay later is just yet another way to pay when you go shopping. In the past when you’re checking out, you might see “put your credit card here,” but now you see buy now, pay later, which allows you to put a purchase on basically an installment plan: pay in three installments, nine installments, whatever it may be. So it’s just a new way to pay. We at Outseer look at this as yet another emerging digital payment that is ripe for fraudsters to take advantage of, yet another way to steal your money. So that’s what we spend our time looking at: the ways in which fraudsters can take advantage of these new digital payments, like buy now, pay later, to commit fraud.

[00:04:35] Tom Garrison: So yeah, I’m aware of buy now, pay later. I think most of us have seen it when we make purchases. But I wonder, from your perspective, since you are obviously thinking about cybercrime, can you walk through the basics of where there are various threats and what those are?

[00:04:53] Armen Najarian: Yes, I’ll take this. There’s risk, and then there’s threats. The inherent risk with the buy now, pay later model is that when you’re offering this new form of credit, which is unregulated today, it creates opportunities for people who might not have the wherewithal to pay to actually take on this credit. And so there’s inherent credit risk, sometimes referred to as first-party fraud or friendly fraud, where I, as a consumer, might not have a steady job, but I have the opportunity to buy some items on effectively an installment plan. It doesn’t affect my credit today. I might do it and, lo and behold, six months later I might not be in a position to pay. So there’s that inherent credit risk that exists. In fact, just last week Afterpay, the Australian-based buy now, pay later platform that was acquired by Square last year, announced that they had a massive shortfall due to unplanned credit risk that actually came to fruition. So that’s one type of risk that exists in this new payment instrument world called installment payments or buy now, pay later.

[00:05:57] Tom Garrison: So for that one, the risk is actually to the vendors.

[00:06:01] Armen Najarian: Well, let’s talk about the merchant. The beauty of the model from a merchant perspective is they get paid instantly, because oftentimes there’s a third-party buy now, pay later provider behind the scenes that’s effectively buying the transaction from the merchant. So merchants generally like this because they get paid and they reduce their risk out of the gates. They’ll take a little bit of a hit because they’re getting the guaranteed payment up front. But the risk is typically being borne by the buy now, pay later platforms like Afterpay and Klarna and several others out there that are dominating this space.

[00:06:37] Tom Garrison: Okay. Got it. Got it. Okay.

[00:06:38] Armen Najarian: The fraud that we’re starting to see orchestrated comes in really two flavors. One is your traditional account takeover fraud. So as a consumer, I have an account, someone steals my credentials, they log into my Afterpay account. They begin executing transactions unbeknownst to me, effectively buying things using the platform’s installment payment plans, and I don’t discover this for a few months. So there’s that type of fraud, and it’s very, very real. The other type of fraud we’re seeing is synthetic identity fraud, where a new account is created that borrows bits of data from maybe an authentic human, but bits of data that are contrived to create a new identity, a person who doesn’t really exist, which establishes effectively credit with one of these buy now, pay later platforms and begins transacting and buying a whole bunch of goods and services. And then all of a sudden it disappears when the installment payments are due.

[00:07:36] Tom Garrison: The perpetrators of this fraud, are they nation state type people? Are they teenagers? Who’s doing this?

[00:07:46] Jim Ducharme: This could be basically anybody from local teenagers to professional fraudsters who are creating a whole supply chain of basically stolen goods. We tend not to see nation state actors committing this, but more your typical nefarious identity thieves, illegitimate transactions, things like that.

[00:08:03] Armen Najarian: And we are seeing some examples of organized fraud. So not nation state, but actual organizations with structure that have multiple lines of fraud business in their portfolio, with buy now, pay later schemes being a new opportunity for them to commit their fraud and make their money.

[00:08:20] Camille Morhardt: But in this case, the goal is to actually end up with the goods, right? Rather than stealing identities or credentials or bank account information, which was sort of the second example you gave. But in this case, it’s actually to end up with a good from a merchant, say…

[00:08:39] Armen Najarian: I would just quickly say the ultimate goal is to end up with the money. So if they have access to the goods, they can sell those goods and transfer those goods. They may never even take possession of those goods.

[00:08:48] Jim Ducharme: Yeah, that’s exactly right. And our fraud analysts see on the dark web all the time these online shopping stores where you can buy goods and services for 80%, 90% off. And again, to Armen’s point, some of these goods may be obtained through threat vectors of buy now, pay later, right? They’ve used buy now, pay later to get their merchandise, shove it through a supply chain, and ultimately they want to end up with the money.

[00:09:11] Tom Garrison: Interesting. So without obviously going into too many details, how do you prevent the nefarious actor, whether it’s creating identities that aren’t real people or stealing legitimate people’s information?

[00:09:27] Jim Ducharme: Yeah. So our products for decades have been designed around looking for transactional fraud, and we’ve seen new types of identities born and new types of transactions happen. But essentially we use information about the transaction, and about other transactions that we see in the world, to look for basically the level of trust or risk associated with a particular transaction. So when you talk about things like synthetic identities, or even stealing somebody’s identity: is this Armen Najarian? We want to do some level of verification that it’s him. Is he in a location where we typically see him, is this his usual device, are these his spending patterns, et cetera.

The new twist with buy now, pay later, as Armen pointed out, is that many times you’re shopping for the first time with one of these buy now, pay later instruments, as opposed to using your credit card, where you have an established relationship with your bank. So you’re almost establishing credit right then. And that’s where this notion of synthetic identity, or even just stolen identity, comes in. We have to do a level of identity verification, identity assurance, to make sure it actually is Armen. Beyond an existing relationship like you might have with your bank, we have to ask: if this is the first time I’m meeting Armen, is this actually Armen? And so our risk engine comes in and takes into play hundreds of data points as part of the transaction and our ecosystem to verify that this is indeed Armen and this is a legitimate transaction from Armen. So that’s what our risk engine and our capabilities are all about, and it applies perfectly to buy now, pay later.

[00:10:59] Armen Najarian: We live in a digital world. Decisions must be made in the moment. And so really the challenge, and the opportunity, is that these are real-time decisions, a hundred milliseconds or less. Like, Tom, you press the buy button on your favorite shopping site, and within a hundred milliseconds behind the scenes is when this is happening. To Jim’s point, it’s risk-based decisioning. Now the reality is 95% of all transactions are totally legitimate, from authentic customers just wanting to buy their goods and services. And so we don’t want to stand in the way of Tom wanting to get what he wants or Camille wanting to get what she wants. We want to treat you well; we don’t want to step you up to a challenge question or subject you to additional hurdles to get done what you need to get done. However, 5% of the transactions are either suspect or outright fraud, and it’s knowing which of those 5% are suspect or legitimate fraud and doing something about it in the moment. And therein lies the science of risk decisioning in the moment. I’m oversimplifying it, but that’s what’s happening behind the scenes after you click the buy button or the pay button. There are sophisticated models and sophisticated algorithms working behind the scenes to thread bits of data together to make a judgment call on whether Tom should be trusted, or whether the person purporting to be Camille should be trusted.

[00:12:24] Camille Morhardt: I assume you’re using algorithms. Maybe you’re using machine learning models or artificial intelligence to iterate and keep up. By the same token, are you seeing, we’ll just say the bad guys, use AI and other sorts of algorithms to create the synthetic IDs, large numbers of them. And can you talk a little bit about that back and forth and how that’s working?

[00:12:50] Jim Ducharme: Yeah, so we do use artificial intelligence and machine learning, because the fraudsters are always changing how they commit fraud. In particular, this is important for those financial institutions that don’t use it and have more of a rigid, policy-based fraud detection model. For example, some banks may check whether you’re in the same region that you live in, and so if Armen is traveling, he’s in New York, he’s from California, they may look at it and go, “Hey, Armen’s very far away from his home. That transaction seems suspect.” Well, the fraudsters are onto that sort of static, policy-based piece. So we have to really look at behavioral patterns and how they shift over time. We see everything from fraudsters changing the transaction values, so rather than go steal the $3,000 TV, they’ll go steal $109 pieces of goods because it flies under the radar. By using artificial intelligence and, more importantly, machine learning, we can watch for these trends, and the machine learning will actually tell us where the oddness is coming in, where that pattern is shifting, and we can adapt quickly to that.

[00:14:00] Tom Garrison: Yeah. This is interesting stuff. I mentioned at the beginning here that we wanted to cover two different topics, so I’m going to switch gears a little bit. We’ve been talking about identity fraud and transaction fraud, but if we can move now over to brands: well-established companies that are out there using services to make sure people aren’t impersonating them. Can you talk through that aspect of the business as well?

[00:14:34] Armen Najarian: Yeah. So there’s an epidemic taking place with brands being impersonated for the purpose of achieving two outcomes: one, financial gain, and two, spreading misinformation. And any brand is subject to this type of attack. These attacks are often very difficult to detect and can cause a lot of damage to the end consumer, by stealing credentials, stealing data, spreading misinformation, or stealing money. And they also negatively affect the brand and its reputation. They’re vicious, they’re quick hitting, they can cause a lot of damage, yet they can be stopped.

[00:15:16] Tom Garrison: And can you give an example? I don’t know if there’s one that’s been in the news or something that would help our listeners understand the damage, or what an attack like this might look like. And if not, just make one up; either one works.

[00:15:30] Armen Najarian: Yeah. So a classic attack is a classic phishing attack. Let’s say it’s your favorite retail coffee shop and you’re part of the loyalty program, and you might get an email saying, “Hey, you’ve got some points added to your account, go check them out. A special day today.” So an innocent consumer receives that message, they click through, they log into their loyalty program account, or what they think is their loyalty program account, and then the damage is done. Effectively, their credentials have just been stolen. And this happens time and time again. What’s happening on the back end is a lookalike website has been created with a lookalike domain that maybe sounds and feels like the authentic website, the name itself, the URL itself, and certainly the design of that login page looks exactly like what you would see on the authentic brand’s website. And consumers aren’t always vigilant about looking out for the signals that, “Hey, this actually might be illegitimate.” But once that consumer takes the bait and logs in, their credentials are stolen. So there’s the value right there for the fraud actor: steal those credentials, or in some cases take those credentials, log into the authentic account and maybe swipe the points, or use that same set of credentials to log into other accounts, or sell those credentials on the secondary market.

[00:16:49] Camille Morhardt: I would imagine it would be difficult to locate this because, like you said before, you can just modify, keep the artwork the same or the graphics the same, and then modify the URL ever so slightly, and you could just keep changing it, keep changing it and keep changing it so that when you’re chasing it down, you don’t even know what to look for, what you’re going after.

[00:17:10] Jim Ducharme: Yeah, that’s absolutely true. So that’s why you have to have a multipronged approach to how you prevent this and how you make it effective. So let’s say somebody is a victim of that; that’s why we put these controls on the legitimate usage of these systems. As Armen said, like the points system that you may be using, or even your bank. Now that somebody’s coming in with your credentials, is it actually you? And so that’s where a lot of the machine learning comes in: is this your normal behavior, is this when you usually access your account, is this the behavior you usually exhibit? That provides some level of effective control for when somebody steals your credentials. With these effective controls, even if somebody just has your username and password, we can still flag that as a suspect transaction.

But as we talked about before, fraudsters are onto this: sometimes just having somebody’s username and password isn’t enough, because of these artificial intelligence and machine learning controls behind the scenes. So now what we actually see them doing is remote access. They may call up and pretend: “Hey Camille, I’m Jim from your wealth management company and we think there’s something wrong with your computer. We want to walk you through it to protect your account. Would you allow us to log into your computer? And we’ll walk you through how to fix this.” And what’s happening is now they’re committing fraud from your home, from your device, many times using your fingertips to log into your account, and they’re diverting money that way, or even just getting in far enough to put malware on your machine. Again, what they’re trying to do now is, rather than just steal credentials, actually impersonate the signal or the data that many of these fraud systems are using to understand whether this is actually Camille.

[00:18:54] Camille Morhardt: So in that case, multifactor authentication doesn’t help because I’m just going to go ahead and do that myself and help the fraudster, yeah.

[00:19:03] Jim Ducharme: That’s exactly right. Yeah. But there we can look at things like: what are you doing with the transaction? So we actually look at, it may actually be you, you may be unwittingly doing a bad transaction. And so we’ll look at, even if you’re going to do a wire transfer, for example, to transfer money someplace, we’ll look and go, we know that’s a mule account. So we’ll actually look at the entire transaction, not just making sure it’s you. Again, do you typically transfer to this location? Or, as I said, with our global data network, we know there was fraud committed by people sending money from here to there. So we have to look at the whole thing, not just strong authentication. MFA is an important part of the puzzle, but it’s only one piece of how fraud is committed.

[00:19:49] Armen Najarian: Yep. So Camille, you’d asked about the creation of these phishing domains and whether the fraudster can just go out and buy a bunch of lookalike domains that all resemble one another. That happens: versions of the same phishing website are set up that all look the same but have slightly different URLs. We have detection capabilities to detect exactly that, to look for the presence of what are called lookalike domains or cousin domains. That’s typically the origin of these brand abuse attacks: one of these lookalike websites being set up, whether it’s the login page to get to your loyalty account or a page with information on it. And it’s really detecting the presence of those assets, those pages that are illegitimate, and knowing with certainty that those should not exist. That’s where it all starts and how you take down these attacks.

[00:20:39] Tom Garrison: Well, this is very fascinating. And scary as heck, to be honest with you, because I can see how difficult it would be to protect against both of these kinds of attacks. The buy now, pay later fraud is a huge technical challenge, as well as the brand identity issues. So, great topic. But before we let you go, we do have one more segment that we like to do on our podcast, which is called fun facts. So I’ll start maybe with Jim. Do you have any cool fun fact that you’d like to share with our listeners?

[00:21:11] Jim Ducharme: Yeah, well, I think this one’s a little timely because Elon Musk has been in the news lately. I saw this a couple weeks ago, and it said, if you want to appreciate how much money Elon Musk has: “If you were born in 80,000 BC, so over 82,000 years ago, and you saved $10,000 a day, you still would not have as much money as Elon Musk.”

[00:21:35] Tom Garrison: Wow.

[00:21:37] Jim Ducharme: So, that’s a lot of money.

[00:21:40] Tom Garrison: Wow. That is a lot. I’ve never heard that before. That’s cool. Armen, how about you?

[00:21:46] Armen Najarian: So I am thinking about vacations for this year and exotic locales, and one place I’ve always wanted to visit is Easter Island. So I started researching Easter Island. We’ve all seen those big figures that have been carved out of stone, those mysterious figures. And so I started researching those, and it turns out that what we see in the images is, in some cases, literally just the tip of the iceberg. Some of those figures have been determined to actually go down 30 or more feet below the ground, with full bodies. I don’t think many people realize that. And it just adds to the mystery and the intrigue of how the heck these things were even created and erected and moved to their locations. It takes what was a mystery and makes it even more mysterious.

[00:22:28] Tom Garrison: Cool. Camille?

[00:22:29] Camille Morhardt: Yeah, I enlisted some friends to help with fun facts. And unfortunately this is more of a horrifying fact, but I’m going to read it anyway because I think that people should be aware at least. There were Oregon black exclusion laws in effect starting in the late 1800s that actually made it illegal for black people to live in Oregon for more than two or three years. And they came off the books in 1925, which seems shockingly late. And I guess the very final reference to them didn’t come off the books until 2002. So you can look it up for more information.

[00:23:07] Tom Garrison: Wow.

[00:23:08] Jim Ducharme: Yikes.

[00:23:08] Tom Garrison: I’ve never heard that.

[00:23:09] Armen Najarian: Hard to believe. Wow.

[00:23:11] Tom Garrison: Well, I’m going to go much lighter in tone. The first McDonald’s drive-through was installed in a restaurant in Sierra Vista, Arizona, located near a local military base. Military rules forbade the soldiers from wearing their uniforms in public, and they weren’t going to change into their civilian clothes just to grab a burger. So the restaurant manager, named David Rich, came up with a solution: he cut a hole in the wall and allowed members of the military to pick up their orders without stepping out of their cars, and the convenience and simplicity of the idea quickly caught on.

[00:23:50] Jim Ducharme: That’s innovation right there.

[00:23:51] Tom Garrison: Change the world right there.

[00:23:53] Jim Ducharme: Hmm.

[00:23:54] Tom Garrison: All right. Well, hey Jim and Armen, thank you so much for coming in. It was a topic we hadn’t covered yet over all the episodes and I think both Camille and I found it fascinating, so thanks for sharing.

[00:24:06] Jim Ducharme: Thank you, Camille and Tom. Appreciate it.

[00:24:08] Armen Najarian: Thank you guys.
