Welcome to the Chaos
March 14, 2024

Outsmarting Sophisticated Phishing Attacks in the Digital Era

Unpacking the growing threat of phishing scams with the latest data, trends, and defenses in cybersecurity.

The Fight Against Digital Deceit

In this episode, Ned and Chris discuss the alarming rise in phishing scams detailed in Proofpoint's latest report. They explore how phishing attacks have evolved beyond email into more sophisticated methods like telephone-oriented attack delivery (TOAD) and business email compromise (BEC). With a focus on the latest data and trends, this episode highlights the critical importance of advanced security measures and the need for updated, effective security training to combat these ever-evolving digital threats.

Links:


Transcript

00:00:00
Ned: It is well known in my household that while I get the general shape of lyrics right, the details are fuzzy. It’s like when you zoom in on an old photo. It just becomes blobs. And that’s my brain. It’s just blobs.


00:00:15
Chris: Nice. There’s an episode title if I’ve ever heard one.


00:00:18
Ned: [laugh]. It’s just blobs.


00:00:30
Ned: [laugh]. It’s just blobs. Hello alleged human, and welcome to the Chaos Lever podcast. My name is Ned, and I’m definitely not a robot. I am a real human person who understands things like quantum entangled states, angstroms, and very small particles. Right? You agree with me. With me is Chris, who is also here. Or maybe not? Are you in a superposition, Chris?


00:00:55
Chris: I am pretty super. What was the question?


00:00:58
Ned: Well, I guess you’re in a position, too.


00:01:01
Chris: I’m in a chair.


00:01:02
Ned: Well, there you go. That’s a position to be in [laugh]. Oh, how you doing, buddy?


00:01:09
Chris: You know how being an office worker, like, is horrible for you? Physically and mentally and health wise, it’s just bad.


00:01:16
Ned: Just everything. Yeah.


00:01:17
Chris: Every, like, six months, somebody puts out an article, like, brand-new research. “We know how to solve the fact that you have to sit in your chair for eight hours a day.” And you read the article, and it’s just, “Get up occasionally.”


00:01:30
Ned: [sigh] Oh, fuck. Yeah, it’s always the same answer. It’s phrased slightly differently. Like, they’ll have some new thing like, “Pop up every 15.” Or, I don’t know, “The quarter-hour stretch,” or I don’t know, what’s the one with the tomato timer [laugh]?


00:01:45
Chris: Oh, is that just lunch?


00:01:46
Ned: [laugh].


00:01:47
Chris: Is it a caprese every 15 minutes?


00:01:50
Ned: Every 15 minutes, you eat a bite of your caprese salad and take a walk around your cube, and you sit back down.


00:01:56
Chris: If it takes you too long to eat the caprese salad, then you’re going to need to get in a run to the restroom.


00:02:01
Ned: That’s true.


00:02:02
Chris: Because that’s gross.


00:02:04
Ned: I think that’s called the Pomodoro Method.


00:02:08
Chris: [laugh].


00:02:08
Ned: There we go [laugh]. Just made a very small subset of our listeners very angry because they’re like, “I use that.” Oh, I tried that very briefly, for a day. It didn’t work.


00:02:20
Chris: No. Nothing works.


00:02:23
Ned: Oh, no, no, nothing works. I try something new every few days and maybe keep up with it for a week, and then it slowly slides off, and I just go back to my… [sigh] desperately grasping at whatever is directly in front of me, and—


00:02:37
Chris: That’s fair.


00:02:37
Ned: —that seems to work. Yeah.


00:02:38
Chris: Yeah. We can go with the imaginary, where, “Everything was beautiful and nothing hurt.”


00:02:43
Ned: Oh, so you mean the 1990s.


00:02:46
Chris: [laugh]. I see you haven’t read Slaughterhouse Five.


00:02:51
Ned: [laugh]. Oh, I probably read it in the 1990s, oddly enough. Anyhow, Kurt Vonnegut aside, I think we were going to talk about something having to do with Phish and going to lots of concerts.


00:03:03
Chris: You’re partially right.


00:03:05
Ned: Oh.


00:03:06
Chris: We’re going to talk about the wonderful world of phishing and how it keeps getting wonderfuller.


00:03:12
Ned: Is that the act of going to Phish concerts?


00:03:16
Chris: I’m ignoring you intentionally.


00:03:19
Ned: [laugh] It’s fair.


00:03:21
Chris: So phishing, as we all know, is the act of people trying to con you into doing things via email. There’s probably a reason it’s called that. I went to look it up; I didn’t. Cool story, bro.


00:03:32
Ned: Well done.


00:03:33
Chris: However, Proofpoint, a leading email security company that probably everybody in tech has at least heard of, even if they don’t know a hundred percent what they do, does a study every year around phishing. It is appropriately titled “The State of the Phish” and it is exactly as depressing as it sounds. Which is to say, like being sober at a Phish concert.


00:03:59
Ned: Ouch.


00:03:59
Chris: Yeah, that depressing. Do you want to listen to the 37-minute version of “Bouncing Around the Room?” No? Okay. Here we go.


00:04:08
Ned: Let’s go [laugh]. I could not name a single Phish song, but I’m pretty sure they’re all the same song.


00:04:15
Chris: “Bouncing Around the Room.” I literally just said that.


00:04:18
Ned: I [laugh]—


00:04:19
Chris: Try to keep up.


00:04:19
Ned: It has fallen out of my brain immediately [laugh].


00:04:21
Chris: [laugh] Now see, the funny thing is I take shots at Phish all the time, but they seem like the nicest guys on earth. Like if Trey Anastasio was here listening to me set up that burn, he’d be like, “Aw buddy, that’s such a good joke, buddy.”


00:04:35
Ned: Aww.


00:04:35
Chris: “I like it here. Have a nosh.”


00:04:39
Ned: Those are the worst. I hate those people. Stop being so goddamn nice.


00:04:42
Chris: [laugh] Yeah, ya prick.


00:04:44
Ned: Ahh.


00:04:45
Chris: [clear throat]. Anyway.


00:04:47
Ned: Anyway.


00:04:48
Chris: So, the thesis statement is simple: phishing is bad, phishing is still getting worse, and phishing is starting to expand so that it does not include just email anymore.


00:04:58
Ned: Mmm.


00:04:59
Chris: Cool.


00:05:00
Ned: Totally.


00:05:01
Chris: Now, first let’s set the stage. In order to create this report, Proofpoint uses two major inpoints—inpoints? I’m sticking with it.


00:05:09
Ned: Okay.


00:05:09
Chris: The first inpoint is answers to survey questions, and the second is actual data from their devices, from both fake and real phishing campaigns. So, we get people talking about what’s going on, and we get data that shows this is actually what’s happening.


00:05:27
Ned: Okay.


00:05:27
Chris: And the total amount of what they have to work with is pretty impressive: over 7500 individual respondents and more than 200 million email messages, with senders and receivers from 15 separate countries. This is not just, like, Jack in the corner going, “I think email is bad, guys.”


00:05:47
Ned: [laugh] I mean, Jack’s right, but that aside…


00:05:51
Chris: So, Proofpoint’s first insight is probably not going to surprise anyone, especially if you listened to the thesis statement. Phishing rates are increasing. The 2024 report shows a year-over-year increase of 25% on volume alone. Fun.


00:06:06
Ned: Yay.


00:06:07
Chris: Their second major insight is more worrisome. The ways companies and end-users are being attacked are getting more sophisticated. Proofpoint categorizes these as ‘novel’ attack types, and what it means has nothing to do with Kurt Vonnegut. Man, I am at absolute peak comedy right now, aren’t I?


00:06:25
Ned: You really are. There’s nowhere to go but down.


00:06:27
Chris: “Take my wife. Please. [tapping on mic clicking noise] Is this thing on? Try the veal.” Oh, God, so I went on a disturbingly random YouTube jaunt yesterday, and I ended up watching a whole bunch of British comedy from, like, the 1950s.


00:06:44
Ned: Oh.


00:06:45
Chris: All people talking very posh—


00:06:47
Ned: Indeed.


00:06:47
Chris: —very slow, and some of their jokes, like, the way that it was delivered, like, the joke shouldn’t be funny, but because of the way they delivered it, it was absolutely hilarious. One of them was, “A friend of mine, he recently lost his wife. Very careless of him.”


00:07:04
Ned: [laugh] I can see, if delivered properly, that’s funny—


00:07:07
Chris: Yeah.


00:07:07
Ned: But like, for someone who’s not a seasoned comic, they would have real trouble on the timing for that.


00:07:12
Chris: Yeah. And I can basically just do hungover Arthur. Like, that’s as far as my British accent goes, but you know, an actual British person would crush with that. Anyway, back to email.


00:07:23
Ned: Oh, no, no. We’re going to talk about comedy for the next half an hour.


00:07:26
Chris: [laugh] So, Henny Youngman—


00:07:27
Ned: [laugh] I was going to say Stephen Wright is one of those people that the stuff he says is actually not that funny; it’s the way he delivers it—


00:07:34
Chris: Oh yeah.


00:07:35
Ned: —that you’re like—that’s what sells it every time.


00:07:38
Chris: It’s also the avalanche of the similar types of jokes.


00:07:40
Ned: Yes.


00:07:42
Chris: Just like Mitch Hedberg in a way.


00:07:44
Ned: That’s the other person I was going to bring up. He’s the other person who, like, the joke itself is not that clever and funny, but the way he does it and the sheer volume of them kind of get you over the hump. So—


00:07:55
Chris: Right.


00:07:55
Ned: Thank you, Mitch.


00:07:56
Chris: Now, that we’ve exhausted all of the comedians that Ned has ever heard of—


00:08:00
Ned: Ye [, that’s it.


00:08:00
Chris: —let’s get back to Proofpoint.


00:08:02
Ned: Okay.


00:08:03
Chris: So, the survey shows that there are significant issues with how end-users approach security—these are the people that are affected by phishing, mind you—namely, end-users think security is annoying, and they’ll absolutely circumvent it in the name of convenience.


00:08:21
Ned: Oh, yes.


00:08:22
Chris: 96% of respondents candidly admitted to knowingly bypassing security controls, and nearly half of them said they did it for the sake of convenience. The next-largest group said they did it to save time and/or meet an urgent deadline. So, that’s helpful.


00:08:40
Ned: Yeah.


00:08:41
Chris: On its face, this is not a surprising TL;DR. In most cases, it is easier to just go ahead and bypass security protocols, the official way to do it, et cetera, right? This is the whole reason Shadow IT happens. How many times have you heard a story of, “Well look, the rules the company has around OneDrive are just too restrictive. I need to get this done. Let’s just use my personal Dropbox real quick.”


00:09:07
Ned: Absolutely.


00:09:08
Chris: Apply that to every piece of security you’ve ever heard of.


00:09:13
Ned: Yeah, I mean, like, I have an example from today. My son was working on a project for school on his personal laptop, and he needed to get the files from his personal laptop to his school laptop. But of course, they don’t allow him to plug in USB sticks, and his email address doesn’t allow email from outside of the school district, so I set up a Google Drive link for him to share the files, and then he went on Google Drive on his laptop. And I think they had it blocked in Edge but not in Firefox or something, so he was able to go on Firefox and download the file from Google. It was totally, like, on the up-and-up; he needed those damn files, and they made it so difficult that we had to use kind of a circumvention to get around it, to get the file. And, you know, if this had been a phishing attack instead, he could have ended up with an infected laptop.


00:10:07
Chris: Right. Congratulations, Ned. You’re part of the problem.


00:10:10
Ned: Oh, absolutely [laugh].


00:10:13
Chris: So, one of the problems that happens also is employees are simply pressured to work faster, or else. So, you have this—you know, the security and convenience seesaw is problematic. If security gets in the way of working faster, what do you think’s going to happen? Is an employee going to say, “Oh, sorry, boss, I couldn’t get that file over because of SentinelOne”? The boss would say, “Oh, sorry, employee, you’re fired.” Now, whether that actually happens or not is an open question. But it’s a legitimate fear.


00:10:41
Ned: Yes.


00:10:42
Chris: So, if security gets in the way of work, it’s a problem, so we’ll leave that thought in the farmhouse for a minute while we move on to the meat of the report. It’s okay. It’ll be all right.


00:10:52
Ned: Okay.


00:10:53
Chris: And I promise that will be my last Phish reference because that’s the last Phish song that I know.


00:10:57
Ned: [laugh] I didn’t even know you were making a reference [laugh].


00:11:02
Chris: So, at this point, the report switches into looking at the future, as in the changes that we see coming in 2024 and beyond. There’s two big sections to this. The first is old stuff that we’ve heard of. All the major common attack types are still around. This includes bulk phishing attacks, spear phishing, and, as they highlight, the too-stupid-to-be-true USB stick drop.


00:11:28
Ned: [laugh].


00:11:29
Chris: Yes, that is a real thing that people fall victim to. This was not invented on some WB NCIS-rip-off show. This, somehow, actually happens.


00:11:41
Ned: Oh, a hundred percent because people love free shit. And people are curious, and if you give me a black box, I want to know what’s in it, so I’m going to plug that into my computer to find out. And boy, do I find out. It’s always bad.


00:11:58
Chris: Ransomware?


00:11:59
Ned: Yeah.


00:12:00
Chris: You’ve heard of it. It’s still happening. 60% of respondents to the report said that they had more than four incidents their security teams had to respond to in the past 12 months. Now, that’s not necessarily four infections or four successful attacks, just things that were serious enough that they required attention.


00:12:17
Ned: Mm-hm.


00:12:18
Chris: That’s still bad.


00:12:19
Ned: Yeah.


00:12:19
Chris: And considering that ransomware—actual successful ransomware attacks—are criminally underreported, we can make our own estimates as to what these numbers actually are. Now, one thing that is nice is that respondents say that resolving the problem by paying off the attackers is slowing down, but it’s still over 50—


00:12:39
Ned: Ugh.


00:12:41
Chris: —which, if you are keeping score at home, is still 50% too much.


00:12:44
Ned: Yeah.


00:12:46
Chris: Cyber insurance is a big thing, but as we discussed just a few episodes ago, it’s crazy expensive and the expenses are primed to get worse. Here’s a fun real-life example: a company in finance went to renew—just a renewal; no major changes to their business, no cyber claims in 2023—and the initial quote they got back was 300% higher.


00:13:07
Ned: [laugh] Wow.


00:13:08
Chris: So, if anything, we might have undersold how bad the situation is out there.


00:13:13
Ned: Yeah. Well, I’m sure this report doesn’t really help.


00:13:16
Chris: No, no. And like I said, if you want more details about that, I think it was three episodes ago.


00:13:22
Ned: We will link it in the show notes because we are professionals.


00:13:25
Chris: Yes, WE are.


00:13:29
Ned: [laugh]. Fair.


00:13:30
Chris: Okay, so anyway, now we pivot to the new types of attacks. And one thing that’s happening is that the attacks are starting to move away from being email-based to being not email-based. Email is often part of the attack chain, but it doesn’t have to be. The main thrust is using the phone.


00:13:51
Ned: The what?


00:13:51
Chris: Have you heard of this device?


00:13:53
Ned: It’s what I play my games on.


00:13:55
Chris: So, here’s the problem. For a lot of attacks, email protection has actually gotten pretty good. I mean, we’ve been working on spam, and we’ve been working on phishing for a long time. And it’s not just Proofpoint; all of the major vendors have really, really good filtering, to the point that people don’t even see the phishing attacks. So, the attackers have gotten a little more clever. Now, to be fair, one thing they do highlight is that the use of generative AI is making it so that phishing attacks get through filters better. So, that’s great. Thanks a lot, Sam. But what they’re noticing is that, via email or whatever, attackers are trying to get you on the phone for what is called a TOAD attack, Telephone-Oriented Attack Delivery, which is a backronym if I’ve ever heard one.


00:14:41
Ned: Absolutely.


00:14:45
Chris: The main idea here is simple, and it’s, you know, a song as old as time. It’s easier to con people if you’re talking to them. Humans are innately trustful, even when they are cynical, sarcastic, distant, and hate-filled individuals.


00:15:00
Ned: Not thinking about anybody in particular.


00:15:02
Chris: [clear throat].


00:15:03
Ned: No, no [laugh].


00:15:05
Chris: So, what happens is, they try to send you something real short and real simple that will get through that filter, something like, “We have an emergency, and we need to talk to you about it on the phone.” “We have an old payment that we have to get to you. But first, we need to confirm your identity,” et cetera, et cetera. There’s dozens of ways this can happen. And as we’ll talk about in a more detailed one in a second, sometimes they’ll just send you a text message that says, “Hey.”


00:15:28
Ned: Yep. Yep, I’m familiar with that.


00:15:31
Chris: So, the goal here is, get you off of a system that is really tightly regimented, controlled, and secured like email, and onto something that is not, like, either SMS, or it could be WhatsApp, or it could be a phone call. Once someone has you on the phone and has gotten a certain amount of trust, they will immediately—eventually; not immediately, actually; that’s the key thing is that they are taking their time because we’re friends now—


00:15:56
Ned: Yes.


00:15:57
Chris: But once they have you on the phone, and once they have you on the hook, they will try to get you to do whatever it is that is part of their next malicious action. They will try to get you to share credentials, they will try to get you to create a remote support session, they will try to get you to transfer money. Does this sound stupid, like you wouldn’t fall for it? Well, you would. Real life example. An executive was tricked into giving scammers $50,000 in cash. In short—


00:16:27
Ned: Wow.


00:16:28
Chris: It was one of these TOAD-type attacks. A caller got on the phone and posed as someone from Amazon, then transferred the executive to someone posing as a Federal Trade Commission liaison, then to someone claiming to be from the CIA, and finally convinced this person to withdraw cash and hand it over to a stranger in the driveway.


00:16:50
Ned: Whoa.


00:16:50
Chris: The idea being, you give us cash, we give you certified treasury bonds. And this was all—this story is obviously a lot longer, but this episode was long enough, so I’m summarizing a lot.


00:17:00
Ned: That’s fair.


00:17:02
Chris: This attack was sophisticated, it involved many actors, and it was eventually successful to the tune of an untraceable $50,000 in cash. When was the last time you heard an email scam do that?


00:17:16
Ned: It’s been a while.


00:17:18
Chris: Now, along these lines are also more and more sophisticated attacks called BECs, or Business Email Compromises. Now, this one has been around for a while, too. This is where you get an email, ostensibly from somebody important, instructing you to do something immediately. These are up 30% year-over-year. The simple version of these you’ve heard of: the CEO reaches out and says he needs Glen in accounting to buy $1,000 of Apple gift cards ASAP. But the thing about this is, it’s really the impersonation that tricks you into doing something, and it will usually be something longer term. Just yesterday, Dice.com was compelled to share a blog post warning people against phishing scams where the sender presents themselves as a recruiter. And then they go through that same deal. They gather some trust: “Hey, we’re going to submit you for this job, but first, I need all this personal information.” If you believe that’s a recruiter, you’ll probably give it to them because that is information that they need.


00:18:22
Ned: Right.


00:18:23
Chris: So, in short, the recruiter’s email—‘recruiter’ in air quotes—will promise you an opportunity. “Please call us for more information.” Boom, immediately back to the TOAD thing, right? People should be skeptical of this for all the reasons that we’ve already talked about, and all the reasons outlined in the Dice blog post, but also because the very idea of a recruiter actually calling you back should be suspicious all on its own.


00:18:48
Ned: Boom. Got them.


00:18:49
Chris: Got them.


00:18:50
Ned: Well, done [laugh].


00:18:52
Chris: So, these are all business attacks. And I wanted to do a side point. This happens to people, individuals as well. And the BECs, the TOAD attacks, all that stuff, is a way in for what is called a ‘Pig Butchering Scam.’ They follow the same exact playbook, but instead of targeting businesses or the like, they aim for individuals. And this is very much where you get things like a conversation on text saying, “Hey.”


00:19:16
Ned: Yep.


00:19:17
Chris: People are curious enough to follow up, you’re on the hook, and now you have a fun friend. If you want to know more about it, John Oliver did a damning report about it just a few weeks ago on his show on HBO. But in short: random call, random texts, someone starts to befriend you, then asks you for help, or encourages you to invest in some shady investment app, or otherwise manipulates money out of you while you’re trusting and weak. And then when you start to question them, they disappear into the ether.


00:19:46
Ned: Yeah, and they will typically ask you to deposit the money in some sort of cryptocurrency—


00:19:52
Chris: Of course.


00:19:52
Ned: —so that way it’s untraceable. If anybody is interested in a really good examination of this, Zeke Faux did a story for Bloomberg about this where he got a text that just started, “Hi, David. I’m Vicky Ho. Don’t you remember me?” And his name is not David, and Vicky Ho is not an actual person. But this story was told in much more detail in his book, Number Go Up, which is an excellent read or listen. So yeah, that’s definitely worth digging into if it has piqued your interest.


00:20:24
Chris: Yeah, there’s tons of examples of all this stuff out there. So, one thing that was interesting: a lot of these modern attacks—or novel attacks, I’m sorry—were focused around using Microsoft. Of those 200 million emails that I talked about, 68 million of them—which, if you’re doing the math at home, is about a third—were about a Microsoft product. So, a common attack would look like this: you get an email with a link to a OneDrive. You click on it, you see an alert and a redirect to a login portal. You roll your eyes because you have to authenticate. Again. You give your credentials, you probably have to put in MFA. You hit submit, nothing happens.


00:21:11
Ned: Mm-hm.


00:21:12
Chris: You probably just go get a sandwich or something.


00:21:15
Ned: And like, you’re not surprised because it’s Microsoft.


00:21:17
Chris: Exactly. Now, in reality, this was an attack. This was a fake login portal, which was doing a pass-through. You gave the attackers your real credentials, which they then immediately transposed into a login on your real portal. You even helped them by giving MFA approvals—


00:21:35
Ned: Aww.


00:21:35
Chris: —which can also be transposed.


00:21:37
Ned: [laugh].


00:21:39
Chris: How kind.


00:21:41
Ned: Mmm.


00:21:42
Chris: Now, this kind of attack is increasing in popularity as things like EvilProxy start to proliferate. EvilProxy is a phishing framework designed to bypass multi-factor authentication defenses. Everything that I said above, those five steps, it does them for you, and hackers who otherwise would have no idea how to do any of this can just do it.


00:22:10
Ned: [under breath] Yay.


00:22:10
Chris: MFA bypassing hacking campaigns as a service?


00:22:14
Ned: Yep.


00:22:16
Chris: [sigh]. Welcome to the future. It’s this and Elon Musk, kids. It’s all downhill from here.


00:22:23
Ned: Computers were a mistake.


00:22:26
Chris: [laugh].


00:22:27
Ned: It’s our other tagline.


00:22:28
Chris: I like it. I like it. So, the next thing that we talk about: what do we need to do? How in the world do we fix this? So, there are technical things that happen behind the scenes that end-users likely will never see. Like I said, we’re actually pretty good at blocking spam and phishing attacks; that stuff’s going to continue to improve. The bad guys are using AI? Well, guess what, the good guys are going to start using AI, too, and not just as advertising. But the biggest thing that has to happen, in my mind, is that security training has to change. A lot of what happens, and a lot of why it works, has nothing to do with the security infrastructure in place. Because, as we talked about, people see security infrastructure as an annoyance. I am reminded of that XKCD cartoon, where the one guy says, “You’ll never be able to break into my password without a $5 million supercomputer.” And the attacker has a hammer and says, “Well, I have this five-dollar hammer.” If you don’t do security right, if you don’t follow the rules, and if you circumvent it, then you don’t have security. It doesn’t matter how expensive it is. So, I think that that’s the part of security training that doesn’t really happen. People, first of all, just don’t understand the risk profile. Now, this is not just about computers; I think this is part of human nature. How many times—especially, remember yourself as a child—how many times did you say to yourself, “How bad could it be, really?”


00:24:11
Ned: Yeah, one of the things you learn as a teenager is to stop asking that question.


00:24:16
Chris: “What’s the worst that could happen?” He says quietly to himself in his jail cell.


00:24:22
Ned: Yeah.


00:24:23
Chris: When you do things—you know, like you said, you had to circumvent the school’s security. In your case, it was necessary. Now, I would also argue that the child should be doing schoolwork on the school laptop.


00:24:37
Ned: Don’t get me started [laugh].


00:24:38
Chris: Naughty.


00:24:38
Ned: [laugh].


00:24:38
Chris: Naughty.


00:24:40
Ned: There are reasons, Chris.


00:24:42
Chris: Yes, they’re not good.


00:24:44
Ned: Mmm… agree to disagree.


00:24:47
Chris: [laugh] Well, I mean, we’ll get to that one in a second. But the whole idea of security is that it’s there for a reason. It’s there to help you. Help me help you. Help me help you. That’s also a Phish reference.


00:24:59
Ned: Definitely not.


00:25:01
Chris: [laugh]. So, yep, the implications of something going wrong are not usually part of security training. If they are, it’s something vague, like a fake company with terrible actors, and Jimmy clicked on the wrong email, and now the finance team doesn’t know where a million dollars went. Like, that’s not real. Security awareness training has to be focused around things that people will pay attention to and take seriously. Again, human nature. One thing that you can look up on YouTube: within living memory, you didn’t have to wear a seatbelt when you drove a car—


00:25:37
Ned: Yep.


00:25:37
Chris: And also within living memory, you could have an open container of alcohol in the car.


00:25:42
Ned: [laugh] Yeah.


00:25:43
Chris: That was made illegal in, like, 1970. It was not that long ago, relatively speaking. And if you go back and watch news reports from the time, you’ll see people sitting in cars with no seat belt and a beer, saying, “Y’all infringing on my rights.”


00:25:59
Ned: Those people still exist, Chris.


00:26:02
Chris: I would argue that a lot of the population has started to realize that maybe these things are there for our own protection, and maybe they work. It’s education and experience that change that mindset. Now, you’re right, there’s always going to be somebody who thinks it’s completely fine to drink a fifth of Jack while driving down the wrong side of the highway. After all, there’s less cars on that side. They all seem to get out of the way.


00:26:27
Ned: That’s right. So, nice of them.


00:26:30
Chris: So, that’s part of it. The next part is, people don’t understand how to work with the security tools. And I think this one goes two different ways. One is for the end-user, but the other is for the security team themselves.


00:26:42
Ned: Mmm, yeah.


00:26:43
Chris: You know, your situation—I’m going to keep coming back to it—but it seems like what they’re doing is unnecessarily onerous, and it does not offer a way to do it that is safe. That’s a, you know, strict prohibition, when there are cases where what you’re trying to do is not malicious. You just wanted to get a file from here to there. There should be a way to do that.


00:27:04
Ned: Right, absolutely. As part of the orientation that he received when he got his laptop, there should have been some section where they go, “How to securely transfer a file to your laptop,” and then he would know.


00:27:15
Chris: Exactly.


00:27:16
Ned: There was no such training, and as far as I know, there’s no such option, so we had to find a way around it.


00:27:23
Chris: Right. And that’s a great example of the end-user is partly to blame, but the security protocol is equally to blame. Because you have to have a way to do that, and as a security professional, I can think off the top of my head about five different ways to do that [laugh].


00:27:39
Ned: At least [laugh].


00:27:42
Chris: So, the next point, training shouldn’t… how do I put this? Suck.


00:27:49
Ned: Yeah.


00:27:51
Chris: 99% of respondents to this survey said that they had security training set up in their company, but little of it was rated highly. And you remember this from when you were working for companies. Everybody has to do this every year. You go sit down, you watch a BS webinar, you roll your eyes, you fast-forward, and you answer the quiz. People see these HR-type trainings as an annoyance at best and a burden at worst. Oftentimes, they are super cheap, purchased from the lowest bidder: lowest-common-denominator learning materials that are very easy to bypass, with insultingly easy questions required to show that the material was consumed. Ah.


00:28:34
Ned: Yep.


00:28:35
Chris: I was thinking about this and doing a reductio ad absurdum on it. One of the things that I had to watch recently was what to do in case there’s a shooter, an active shooter, on your premises. A frightening concept in and of itself. But the quiz questions were so insulting, like, I was starting to write fake ones. One was, you know, “What do you do if you are physically confronted by the attacker?” So, I thought maybe one of the answers should be, “Join him. Grab a handgun and just start going nuts.”


00:29:09
Ned: [laugh]. They were that good, huh?


00:29:10
Chris: Yeah, oh my God. Eye-rolling, to say the least. I do not believe that any of these mass-produced security trainings are of value. I think people blow past them, they fast-forward them if they can, and they take nothing away from them aside from a checkbox for HR. What I believe needs to happen is training needs to be in person, it needs to be interactive, it needs to be led by the security teams, and it needs to have question-and-answer sessions. Will this take longer? Yes. Will you end up with a more secure and more aware workforce? Also yes. And as we’ve talked about on here, security best practices will help you in your business and they will help you in your personal life, so there’s really no downside to doing this aside from the fact that it would take more time. And then you could also personalize it to what you’re doing, to the job, to the company. Like you were saying, security should have sat down and said, “This is the way to securely transfer a file.” Same thing: “This is our email process. If you see an email, do this. If you see a banner across the top, report it here.” Real-life examples that show what people will see during their business life or their personal life, not just anonymous ‘this could happen to you’ FUD.


00:30:28
Ned: Right.


00:30:29
Chris: And then finally, these types of trainings should happen more regularly. What happens right now is you get six hours of BS from HR to do by the end of the month. And again, you do it reluctantly, annoyedly, and take away next to nothing. So, micro-trainings quarterly. We talk about one thing. If you want to go see something specific, a webinar is always available; you can go review it. But realize that, as with anything, it’s the repetition that makes the learning stick: timed, phased repetition, not some bullshit that you don’t pay attention to for five hours at the end of every December. Thoughts?


00:31:11
Ned: My thought, just generally, is that the training is good if it’s executed properly, but beyond just the training, you need to set your systems up in such a way that the more secure option is the easier option, right?


00:31:28
Chris: Right.


00:31:28
Ned: Part of that is just going to be rolling out systems that actually empower the user to get their job done [laugh]. If your security policies are standing in the way of someone getting their job done, they’re going to circumvent them. Another big thing is identifying when an incident occurred and, without laying blame, figuring out ways that you could have prevented that event from both a technical standpoint and a personal standpoint. Like, if you get an email from someone posing as the CEO, how do you validate that? What options do you have for validating that? Is it okay to reach out to their personal assistant and confirm? Because some people might be real scared to not do the thing that the CEO, or any other person, wants them to do, so they have to have a way of dealing with some of these scenarios when they come up. And so, that kind of leads into the role-playing idea you’re talking about in the security training: here’s something that actually could happen.


00:32:35
Chris: Or, here’s something that actually did happen.


00:32:39
Ned: Yes.


00:32:39
Chris: This is the email that Bob got last April. Let’s walk through what he did to make sure that it was legit.


00:32:46
Ned: Right. So, I think making the most secure option the default option, or the easy path, is going to help a lot. Having security teams actually understand what people do as part of their job, and not get in the way of that, is going to be huge because there’s definitely a disconnect between, say, the finance department or the marketing department and the security department. Security’s like, “Those people are idiots”—it’s usually very antagonistic—and so they end up putting these onerous restrictions on what the marketing team can do, and then marketing is like, “Fuck you. I’m just going to go on HubSpot and do everything, even though you told me I’m not allowed to use that site.” You know? It’s like, you have to meet them halfway and help them do their job better, or at least do their basic essential tasks. As soon as you stand in the way of that, security is out the door. And I would just go to a Phish concert. That’s my closing argument [laugh]. How am I going to get to that Phish concert if I have all this work to do, Chris?


00:33:49
Chris: Yeah.


00:33:50
Ned: Hey, thanks for listening or something. I guess you found it worthwhile enough if you made it all the way to the end, so congratulations to you, friend. You accomplished something today. Now, you can go sit on the couch, fire up a Phish… album? CD? Mixtape? I don’t know… and sit back and enjoy it. You’ve earned it. You can find more about this show by visiting our LinkedIn page, just search ‘Chaos Lever,’ or go to our website, chaoslever.com where you’ll find show notes, blog posts, and general tomfoolery. We’ll be back next week to see what fresh hell is upon us. Ta-ta for now.


Ned: Actually, that’s a lie. We’re not going to be back next week because I’m going to be on vacation.


00:34:35
Chris: No, you’re not. Canceled.


00:34:39
Ned: Mm-mm. Nope. Nooo, that is not happening. I am going to Cancun, and I am going to enjoy the hell out of it.


00:34:45
Chris: I’ve got this email from your CEO that you need to—it needs urgent attention.


00:34:49
Ned: Oh, goddammit.