Welcome to the Chaos
March 7, 2024

The Reality of 'Secure by Design' and the Future of Cybersecurity


Ned and Chris discuss the 'Secure by Design' initiative, debating its effects on tech innovation and cybersecurity in the fast-paced tech world.

The Secure by Design Debate

In this Chaos Lever episode, Ned and Chris tackle the "Secure by Design" concept, inspired by a report from the Cybersecurity and Infrastructure Security Agency (CISA). They discuss how security can be built into software from the start and the challenges this poses for developers under pressure to deliver quickly. They debate whether government rules help or hinder tech progress, and what this all means for the future of safe and innovative software. 

Links:


Transcript

00:00:00
Ned: While I was waiting for you, because you’re late, I was watching the Vengaboys’ The Vengabus. And now [laugh] you have that stuck in your head.


00:00:10
Chris: Wow [sigh]. And you seem disturbingly happy about it.


00:00:14
Ned: Okay, don’t just listen to the song though. Go and watch the video on YouTube, and it will just bring joy to your life. Like the ’90s was so fucking weird. And it’s back.


00:00:26
Chris: [laugh].


00:00:26
Ned: I think we’re all better for it. Hello, alleged human, and welcome to the Chaos Lever podcast. My name is Ned, and I’m definitely not a robot. I am a real human person with the capacity for joy, amazement, and soul-crushing disappointment with the human race. Sure is fun being human. Very glad I’m not a robot who can just watch on with dispassionate rationality. Good times. With me is Chris, who is also here. Hi, Chris.


00:01:07
Chris: Wasn’t that the name of your rap album? Dispassionate Rationality?


00:01:12
Ned: It was. It was during my Nerdcore phase.


00:01:15
Chris: I didn’t think that the first song should have been 700 minutes long, but you know, you take advantage of the digital world in your way and… the rest of us will just cry.


00:01:25
Ned: I cannot be confined to the restrictions of a CD, all right? That’s just, that’s the way I work, that’s the way I rap. I throw down with my bros, like, MC Frontalot, and we have a good time. Yeah.


00:01:43
Chris: Yeah. Let’s go ahead and run away from everything that just happened.


00:01:47
Ned: [laugh]. Well, I guess we can talk about something else that is equally nerdy and ridiculous, and that is cybersecurity. Are you ready?


00:01:57
Chris: Uh… I’ve heard of it.


00:02:00
Ned: Mmm. Now, you actually are more involved in cybersecurity than me, so this is sort of the same role-reversal that we had last week, where I’m going to talk about a thing that I’ve kind of known about for a while, and then you can correct me on all my bullshit.


00:02:15
Chris: I can’t wait.


00:02:17
Ned: I know you can’t.


00:02:18
Chris: It’s the sort of thing I live for.


00:02:20
Ned: We don’t even have to make that up. That is just demonstrably true. All right. So, in late December last year, the Cybersecurity and Infrastructure Security Agency—they love security so much they had to include it twice in their name—released a request for information about their report, “Shifting the Balance of Cybersecurity Risk: Principles and Approaches for Secure by Design Software.” I assume you’ve read this front to back, Chris, and you’re well versed in the entire thing?


00:02:52
Chris: Oh, I have an autographed copy, obviously.


00:02:56
Ned: [laugh]. It’s impressive that you got them all to sign it, considering there were, like, 20 different governmental agencies and countries involved.


00:03:04
Chris: Well, admittedly, the book signing did take a while.


00:03:08
Ned: As it does. Well, you’ll be glad to know that the commentary period has now ended as of February 20th. So, it’s closed. There’s going to be no more initial contributions; you’ve got all your signatures. But now you have to go out and collect the 83 different comments and PDFs that were submitted and get those signed, too.


00:03:33
Chris: [sigh].


00:03:33
Ned: [laugh]. So, I took the time to read the initial document that CISA posted, along with some of the commentary. And I thought, you know, maybe we could take a look at what the original document proposes, and one of the more substantive responses that serves as a bit of a foil to the, let’s say, lofty goals that CISA espoused, and I’ll also share some of my own thoughts on the topic. The response I’m going to be referencing is from Shortridge Sensemaking, and was written by Kelly Shortridge and Ryan Petrich, both well known folks in the realm of software development and cybersecurity. They disagree heavily with some of the key ideas of the paper, and also the general tone, which I’ll get to momentarily. And Chris, feel free to chime in with your nuanced takes, like, “Computers were a mistake,” and, “Burn it with fire.”


00:04:28
Chris: I mean, so far, I’m already disappointed. The place is called Shortridge Sensemaking. Their names should be Kelly Shortridge and Ryan Sensemaking.


00:04:37
Ned: [laugh]. I don’t disagree. Kelly Shortridge has her own LLC, and then Ryan Petrich has his own organization that he works for. So, they both contribute to the paper, and it was submitted under her LLC, but he doesn’t actually work for her. Does that help?


00:04:55
Chris: I still think my idea is better.


00:04:57
Ned: [laugh] Okay well, I will reach out to Ryan and let him know that he has to change his last name immediately.


00:05:02
Chris: Thank you.


00:05:02
Ned: Okay [laugh]. You’re welcome. Obviously, I will link both the original document and the response in the [show notes 00:05:08]. But, um, let’s start with the core idea. I was not familiar with the Secure by Design idea. I certainly hadn’t heard the phrasing before, but um, it’s pretty literal in its interpretation. A system that is secure by design is—wait for it—designed with security as a guiding principle. So—


00:05:33
Chris: [laugh]. It’s right there in the name, man. What more did you need to know?


00:05:37
Ned: It is. But to elaborate a little bit more, according to the document, Secure by Design means that, quote, “Products that are secure by design are those where the security of the customers is a core business goal, not a technical feature,” end quote. To which I say, I would also like a unicorn that burps diamonds and shits gold bricks.


00:06:01
Chris: Well, I am certainly not loaning you Larry.


00:06:04
Ned: [laugh]. Why not? I thought we had, like, sort of, um… joint custody of Larry, since we found him together on the Gumdrop hills?


00:06:13
Chris: No. Well, I mean, I was there first, and you’re also forgetting about the security principle of finders-keepers.


00:06:19
Ned: I guess I will have to weep then, as it were. All right, so that really brings me to the tone of the paper overall. If I had to sum up the document, it would be, “Shame, shame, knows your name.” Yeah. It directly targets the software manufacturing industry as a whole, and the entirety of the document lays the blame for our current cybersecurity woes at the feet of the software development houses, and basically holds customers entirely blameless. Now, Chris, I don’t know about you, but I’ve been to customer sites. I’ve seen their implementations, and to hold them blameless for cybersecurity is pretty fucking wild.


00:07:05
Chris: Yeah, I think this falls into the shared responsibility ethos.


00:07:10
Ned: It does. You usually see that applied to Software as a Service or cloud-type deployments where, here’s what the vendor’s responsible for up to a point, and then, here, customer, here’s what you’re responsible for. That still applies if you’re buying your software from someone, though I feel like in those cases, we don’t hold the software vendors’ feet to the fire as much as we should, so maybe they are trying to tip the balance a little bit, but they may have overcorrected, I guess is what I’m saying.


00:07:41
Chris: I mean, we will see what they say, but I mean, this is not—if we take this out of the security realm and just into software efficacy, the same situation applies, right? The rush to market, features that don’t work right, betas sold at full price, that we can absolutely put at the feet of software vendors.


00:08:02
Ned: Indeed. Now, beyond the shaming of software vendors, the paper also heavily favors larger software development shops, with seemingly endless capital to burn, and it has a very academic view on business drivers when it comes to security, that is to say a view that is not grounded in reality. If I might quote from Shortridge, quote, “Assuming our economy will continue to embrace market competition, CISA’s objective of forcing companies to prioritize something neither customers nor shareholders value over the things they do value—as demonstrated by their willingness to pay—will be not only farcical but also alienating to private industry,” end quote. So, if I could summarize that, “Capitalism. Have you heard of it? It’s a thing.” The company that prioritizes building secure software over everything else is not going to be a company for very long. With that in mind, let’s dig into the meat of the paper. So, the first principle that they bring up is actually not Secure by Design; it’s Secure by Default, which is something I can very much get behind. So, the core idea here, Chris, is that the defaults of any system or software should be the more secure option, with less secure options either being removed entirely or difficult for a customer to achieve. And in those cases where a customer desires a less secure configuration, the software should make it extremely clear what the potential risk is by, like, turning off that feature and also confirm their intent. Basically, “Are you sure? Like, are you really sure? Like, don’t be a dumbass, Hank. ‘admin123’ is not a good root password. Well, all right, fine. Screw it. It’s your neck. Don’t say I didn’t warn you.”
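To make that “are you really sure?” flow concrete, here’s a minimal sketch of what a secure-by-default settings loader could look like. The option names, the acknowledgement mechanism, and the warning text are all hypothetical; nothing in the CISA document prescribes this exact shape.

```python
# Hypothetical sketch of a secure-by-default settings loader: the secure value
# is the default, and choosing a less secure value requires an explicit,
# separate acknowledgement before it takes effect.

SECURE_DEFAULTS = {
    "require_tls": True,              # encrypted transport on by default
    "allow_default_password": False,  # no admin123 out of the box
}

def load_settings(overrides, acknowledged_risks):
    settings = dict(SECURE_DEFAULTS)
    for key, value in overrides.items():
        secure_value = SECURE_DEFAULTS.get(key)
        if secure_value is not None and value != secure_value:
            if key not in acknowledged_risks:
                # "Are you sure? Are you really sure?" -- refuse the insecure
                # value unless the operator has explicitly accepted the risk.
                raise ValueError(
                    f"{key}={value!r} is less secure than the default "
                    f"({secure_value!r}); acknowledge the risk to proceed."
                )
            print(f"WARNING: {key} set to less secure value {value!r} at operator request.")
        settings[key] = value
    return settings

# The secure path costs the operator nothing; the insecure path is a
# deliberate, visible decision.
settings = load_settings({"require_tls": False}, acknowledged_risks={"require_tls"})
```

The point is that the secure value is free, while the less secure value requires a deliberate, logged decision.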


00:10:04
Chris: Right. And for a real life example of this as it played out in everybody’s memory, let’s talk about S3.


00:10:12
Ned: [laugh]. You mean the fact that originally S3 buckets were public by default?


00:10:19
Chris: That’d be the one.


00:10:21
Ned: Yeah. And then they changed that, but they didn’t necessarily force everybody to change their scripts or automations—


00:10:29
Chris: Correct.


00:10:29
Ned: So, then they changed it to tell you with bright red warnings in the UI that you’ve set it to public.


00:10:38
Chris: Which is, you know, to the point that this paper is making, what they should have done in the first place.
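For the S3 case specifically, the secure posture fits in a few lines. This is only a sketch: it assumes boto3 is installed and AWS credentials are configured, “example-bucket” is a placeholder name, and newer buckets get this blocking behavior from AWS by default anyway.

```python
# Sketch: explicitly enforcing the block-public-access posture on an S3 bucket.
# Assumes boto3 is installed and AWS credentials are configured;
# "example-bucket" is a placeholder name.
import boto3

s3 = boto3.client("s3")
s3.put_public_access_block(
    Bucket="example-bucket",
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,        # reject new public ACLs
        "IgnorePublicAcls": True,       # treat any existing public ACLs as private
        "BlockPublicPolicy": True,      # reject bucket policies that grant public access
        "RestrictPublicBuckets": True,  # limit cross-account access to the bucket
    },
)
```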


00:10:44
Ned: Another great example: the original implementation of Microsoft SQL as a Service—Azure SQL—had a public, non-firewalled endpoint. By default.


00:10:57
Chris: Neat.


00:10:58
Ned: This is a great idea. I love it.


00:11:03
Chris: [laugh].


00:11:03
Ned: The document lists a few choice quotes, and the one I really like is, quote, “A secure configuration should be the default baseline.” Yes, this. A hundred-thousand times this. Too often, the installation options for a piece of software favor, let’s say, convenience over everything else—Microsoft, I’m looking at you—leading to less secure installations and a largely unaware customer. I’m not saying that the installation process should be onerous; it should still be relatively easy, but the default options in that process should be the more secure options with the less secure options being more difficult to find and implement.


00:11:46
Chris: Right. It can still be next, next, next, finish—


00:11:50
Ned: Absolutely.


00:11:51
Chris: —just that all of the security checkboxes are pre-checkboxed for you.


00:11:55
Ned: There may be some customer frustration when they realize after they deploy the thing that, oh, I need to jump through an additional hoop to access it, but I think that’s better than just leaving all of your private data hanging out in the S3 wind.


00:12:09
Chris: Agreed.


00:12:11
Ned: Another quote says, “The complexity of security configuration should not be a customer problem.” This one’s a little more nuanced. For SaaS operations, yes, this should absolutely be true. The whole reason I’m using SaaS is to remove or relieve administrative burden. The vendor deals with the security complexity of things like key rotation, audit logs, and single sign-on. Not me. I don’t want to deal with that; it’s why I’m paying you money. But for self-hosted software, I don’t think there’s a way to take the operator out of the complexity, especially as the software probably needs to integrate with other components of your environment, but the software manufacturer should absolutely make their side of the integration as simple and secure as possible. The last one I want to bring up, the quote says, “Security should not be a luxury option, but should be considered a right customers receive without negotiating or paying more.” Uh, you know, at first blush, it might sound good, but I think with this one, there’s a lot more nuance. As Shortridge and Petrich mentioned, there are many ways to price products in the marketplace. There is perceived value from the customer, actual cost for the vendor, or keeping up with the competition, but regardless of which pricing model you choose, some security features will necessarily be in a more expensive tier. Take single sign-on, for instance. There’s going to be a lot of small and medium businesses that don’t care about SSO, but enterprises absolutely require it. By perceived value, it will go into the enterprise tier of products. The actual cost will be the creation and maintenance of the integration, something you need to charge a little for, and looking at the landscape, most of the vendors charge for their SSO integration. Asking for vendors to simply include all of their security features in every version of their product is, let’s say, wishful thinking at best, and actively hostile to companies at worst.


00:14:21
Chris: Mmm… I’ll hold my thoughts for now.


00:14:23
Ned: Well, I think there is an argument to be made here that there are some… there’s, like, a bare minimum that should be part of any product tier. Like, your free tier of your product can’t just be horribly insecure. [laugh] I’m not saying that. But I am saying there are some features, security or otherwise, that it’s okay to reserve for the paid or enterprise tier of your product.


00:14:48
Chris: Sure. And really, I guess the end goal here is, what is that line?


00:14:53
Ned: It is a difficult line to draw, and I don’t think they can do it in the document because there’s so many different kinds of software out there, so it’s more of a—


00:15:01
Chris: Yeah, there’s bad software, there’s bloated software, there’s software that doesn’t work.


00:15:09
Ned: [laugh]. I feel like you’re taking an adversarial position against the software vendors.


00:15:13
Chris: [high pitched] Whaaat?


00:15:14
Ned: [laugh]. Which is fair. There are some bad actors out there who have really skimped on the security side, or put all of their security features behind a paywall, and that’s bad, right? I’m not saying that’s good behavior. But I think a blanket statement of everything should be included? Mmm, not loving that.


00:15:34
Chris: Right.


00:15:35
Ned: Yeah. So overall, secure by default, I think it’s good. I like the general concept, and I don’t think it’s just for the end-user that they’ve been talking about so far. I think secure by default should be implemented by all of the upstream providers that software developers use. So, thinking of, like, open-source library maintainers, the platform engineering teams that might be in your organization, and the business-to-business SaaS providers, all of them should guide their customers to the more secure option whenever possible and provide some type of warning or notification if the customer strays off the golden path. So, that was secure by default. Mostly good. I like it. There’s a whole thing about hardening guides that I think we will get into later and how they’re terrible, so that’s also part of the secure by default discussion. We can dig into that a little bit. Let’s get into the meat, or the rest. That was sort of a preamble. The rest of the document talks about secure by design, and they have three guiding principles for this idea. They are: take ownership of customer security outcomes, embrace radical transparency and accountability, build organizational structure and leadership to achieve goals, and—my number four—wish upon a star.


00:16:58
Chris: [laugh]. I would love it if that was actually what it was called.


00:17:02
Ned: God, [laugh] I wish it was. It’s about as realistic as some of the other principles, so… there we are. The first one, take ownership of customer security outcomes, is a bold statement, and it immediately made me think of the legal ramifications of taking ownership. Like, what does it actually mean to take ownership of customer security outcomes? I thought we could put this into a metaphor that was used several times by the authors, and that is of seatbelts in cars. I think we can all agree that seatbelts in cars are a net-good thing; they make people generally safer. You’re going to fight me on that, or we—


00:17:46
Chris: No. No, I’m nodding knowingly because I know where this argument is headed.


00:17:50
Ned: So, let’s start with, does the vehicle manufacturer—which would be sort of the software developer in this case—have a responsibility to install seatbelts and verify their functionality? Absolutely, yes. You put the seatbelts in; they should work as advertised. Does the vehicle manufacturer have a responsibility to notify drivers and passengers when they aren’t using the seatbelt? I mean, my car certainly seems to think so, very loudly. And that’s not a person; it’s just a bag of dog food. Dear God.


00:18:24
Chris: Why aren’t you strapping in the dog food?


00:18:27
Ned: It’s a valid point, I guess. Despite all that, maybe. We’ll say maybe on that. But is the vehicle manufacturer responsible for passenger injury when they choose not to wear the seatbelt? I don’t think so. So, in that regard, it seems foolish on its face to say that software manufacturers should take ownership of the customer security outcomes. Should they provide a secure environment with secure features and some helpful nudges when you’re doing something profoundly stupid, Hank? Sure. Are they responsible that you got hacked because you committed your SSH keys to a public repository and left your system running on a public IP address with no firewall rules? Probably not.


00:19:15
Chris: To close the loop—no pun intended—on the seatbelt argument, one thing that they are responsible for is if you’re using the seatbelt and it fails.


00:19:25
Ned: Yes, absolutely. So, if you’re using their security feature and the security feature doesn’t do what it’s supposed to do, they should be responsible. And I think that’s somewhere we’re lacking a lot when it comes to software manufacturers: when their security features don’t work—


00:19:44
Chris: Right.


00:19:45
Ned: We should have at least civil penalties back on them, and possibly criminal penalties, depending on the severity of the failure.


00:19:55
Chris: I agree. And this is actually where Security by Design works because the very concept of Security by Design is proactive. Everything you do should be secure from the jump, which makes programming harder, which is one of the reasons programmers hate it. They don’t want to bake security in at the beginning. They want to tack it on at the end. Security by Design flips that on its head. You know the famous example for, uhhh, system administrators of a certain age is Perl taint mode, which is an unfortunate title, but just stick with me.


00:20:32
Ned: [laugh]. Okay.


00:20:32
Chris: What it does is require rigorous security and management of your variables, of your inputs and outputs, and if you don’t have that in your script, the script will not run. Therefore, Security by Design. You have to write a secure script if you use it in that mode. Now, does that mean that your shell script will be twenty lines long instead of five? Yes. In a lot of cases, it will end up being a hundred lines long, but in the miracle of software development, you only have to write it once.
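Python doesn’t have an equivalent of Perl’s -T switch, but a rough sketch of the same discipline, where nothing from the outside world gets used until it has been explicitly validated, might look like this; the username rule here is only an example.

```python
# Rough analogue of the taint-mode discipline: external input is "tainted"
# until it passes an explicit allow-list check.
import re
import sys

ALLOWED_USERNAME = re.compile(r"[a-z][a-z0-9_-]{0,31}")

def untaint_username(raw):
    """Return the input only if it matches the allow-list pattern."""
    if not ALLOWED_USERNAME.fullmatch(raw):
        raise ValueError(f"refusing to use unvalidated input: {raw!r}")
    return raw

if __name__ == "__main__":
    # Anything arriving on the command line stays untrusted until validated.
    user = untaint_username(sys.argv[1])
    print(f"looking up account for {user}")
```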


00:21:05
Ned: That’s true. Yeah. Even better would be if you invoked a library that did all that stuff for you that someone else maintains, and it is, you know, properly doing all the things that you might do incorrectly in your script. But we’ll get to that. So, the first principle is actually very poorly named, but the actual contents that they put inside the principle are not that bad. The authors specifically mention application hardening, application features, and default settings. The application hardening piece boils down to doing things like using a memory-safe language, sanitizing inputs, and incorporating security into the software development lifecycle. These are all nice-to-haves, but as Shortridge and Petrich rightly note, there has to be some tangible business benefit to implementing these changes. In particular, switching to a different programming language that is memory safe, while good, is incredibly costly for any organization and possibly ruinous to a smaller shop. So instead, they recommend that shops focus on modularizing monoliths and making use of external libraries and services as much as possible. Doing so can speed up development, make developers more productive, and it’s an actual business benefit with enhanced security as a nice bonus.
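As one concrete version of “invoke a library that does it for you,” parameterized queries through Python’s built-in sqlite3 driver hand the dangerous part to the library instead of gluing strings together. The table and the hostile input are made up for illustration.

```python
# The driver binds the value as data, so a hostile string can't rewrite the
# query the way string concatenation would allow.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES (?, ?)", ("hank", "hank@example.com"))

attacker_supplied = "hank' OR '1'='1"

# Unsafe pattern (commented out): interpolation turns input into SQL syntax.
# conn.execute(f"SELECT email FROM users WHERE name = '{attacker_supplied}'")

# Safe pattern: the value is passed as a bound parameter, never parsed as SQL.
rows = conn.execute(
    "SELECT email FROM users WHERE name = ?", (attacker_supplied,)
).fetchall()
print(rows)  # [] -- the injection attempt matches nothing
```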


00:22:27
Chris: Right. And as I discussed earlier this week, I hate this argument that you have to use a memory-safe programming language in order to be safe. It is a programming practice that you can absolutely adopt that will make your code safe, so just do it. Do the thing.


00:22:45
Ned: Do the thing.


00:22:46
Chris: If I wrote the document, that would be the name of this section: “Just do it.”


00:22:52
Ned: Right. And I don’t want to say that developers are avoiding security on purpose. I don’t want to cast aspersions that way. They have a set of features that are in their next sprint that need to get implemented, or they don’t get paid, and so the thing that they’re incentivized to do is get those features to work. If you can offer them tools and libraries and services that help them do that, and do it securely, even better, but that’s not their core goal, it’s not their core principle, and if you really want to implement this secure by design thing, you can either try to do the top-down approach of having, like, a head of secure by design who pushes all these policies down and makes sure that you rubber-stamp all the things. That company is eventually going to go out of business because they haven’t made any features in six months.


00:23:45
Chris: Right. Well, that’s always the problem, right? And you’re right, I think it is important to highlight we’re not just shitting on programmers for the sake of it, as fun as that is, that’s not—


00:23:56
Ned: I mean, it’s a great time.


00:23:56
Chris: —what we’re doing. A lot of this does, in fact, come down to the fact that business demands require programmers to work as fast as possible, and working as fast as possible means not working as efficiently as they could. So, that’s where I do think that there is some runway here for us to make security more important, and make it a bedrock of all products that come out by mandating stuff like this. Just like the seatbelts. People did not wear them with any regularity until it became a federal law.


00:24:29
Ned: Right. Some of it is going to require a carrot and some of it is just going to require a stick, and I think we can use both.


00:24:37
Chris: Right.


00:24:38
Ned: Speaking of which, when you think of the stick and the carrot, do you think of someone dangling a carrot with a stick, or do you think of them like whipping the mule with the stick and then holding out a carrot?


00:24:49
Chris: Well actually, I like to pivot that completely and whip the mule with a carrot.


00:24:55
Ned: [laugh].


00:24:55
Chris: Then he doesn’t know what to do.


00:24:57
Ned: [laugh]. Just sit down. Give up. All right, so to demonstrate the principle of taking ownership, the document has a series of recommendations, some of which are excellent and others are… buck wild, [would be the way of 00:25:09] [laugh] describing it. So like, eliminate default passwords. Love it. Yes. Please more. Reduce or remove hardening guides. Yes. Your product should be hardened out-of-the-box. Why wouldn’t you do that? If you have a 30-page guide on how to properly harden your software, like, just do that already. Why are you making me do all your work? Do you have feelings about hardening guides? Because I hate them.


00:25:41
Chris: No, I think you’re absolutely right. And I also think that the language that is used matters. So, if you have a hardening guide, then you’re putting the onus on the customer to do the responsible thing, but let’s pivot all that to the opposite, right? Let’s say it comes out-of-the-box a hundred percent secure, and you hand out an un-hardening guide, or a security loosening guide.


00:26:03
Ned: That’s exactly what they call it. A loosening guide.


00:26:05
Chris: Yeah, that’s going to make people think, “I probably shouldn’t do these things.”


00:26:10
Ned: Yeah, and part of the job of the loosening guide, if that exists, is to explain what the consequence of making this change is, as well as how to make that change. So, “You are removing TLS certificate validation. That’s extremely bad. Here are the reasons why. If you still need to do it, here’s how you do that.” But out-of-the-box, it should have that feature turned on. [sane 00:26:37] [laugh]. Another one is, provide secure defaults for developers and create secure templates. Perfect. Love it. Make security the easy option for developers. Oh, you need to write a web front-end? Here’s a secure template for a web front-end that already has all the good things baked into it. Just fill in your code as needed. Love it. Perfect. But then we have quote, “Foster a software developer workforce that understands security,” end quote, and, “Document conformance to a secure SDLC framework,” and my favorite, “Conduct field tests.” What? You want my six-person startup to conduct customer field tests? That’s wishful thinking.
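Here’s a sketch of that loosening-guide idea expressed in code, using only the Python standard library: certificate validation is on by default, and turning it off states the consequence before it takes effect. The function name and warning text are invented for illustration.

```python
# Secure by default, loosen only deliberately: full TLS validation unless the
# caller explicitly opts out, and opting out triggers a plain-language warning.
import ssl
import warnings

def make_tls_context(disable_validation=False):
    if not disable_validation:
        # Default path: hostname checking and certificate verification enabled.
        return ssl.create_default_context()

    warnings.warn(
        "TLS certificate validation disabled: connections can be intercepted "
        "by anyone on the network path. Only do this against test systems.",
        stacklevel=2,
    )
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    return ctx
```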


00:27:25
Chris: Eh, is it though?


00:27:26
Ned: Yes.


00:27:27
Chris: It’s not that hard to do the bare minimum. I mean, you can fuzz a software package relatively easily and relatively inexpensively. The real question would be, like, exactly how intense are they saying these field tests have to be? I think that’s where a lot of this is going: people were thinking worst-case scenario, and I’m thinking, let’s just [laugh] move the ball forward… one yard.


00:27:50
Ned: Yeah. I get where you’re coming from. Like, conduct field tests could be construed in a couple of different ways. One could just be, like, have a couple of users try it and let them know how they did, or it could be, like, rigorous tests, where you go out to multiple customers and have them walk through the entire UX with you, and point out all the security issues. Like, that’s pretty expensive to do.


00:28:12
Chris: Right. I mean, there are external places you can go to do it, but you’re right, it’s expensive, and it slows everything down in a way that might not be practical or cost-effective right now.


00:28:23
Ned: Yeah. And then stuff like, “Documenting conformance to a secure SDLC framework.” That just feels like security theater. It’s a document no one’s ever going to read.


00:28:32
Chris: Sounds like Mad Libs.


00:28:34
Ned: [laugh]. It kind of does a little bit. There’s also another gem, which is, “Publish a commitment to never charge for security or privacy features or integrations.” That kind of gets back to my previous point of, like, are you saying that I should never charge for a feature? Because that seems wrong.


00:28:53
Chris: This would definitely go into the category of, “Y’all are pushing it.”


00:28:58
Ned: [laugh]. Yeah. It’s an official category. Yeah, I’m not going to just, like, stop charging for all my support and software updates and put up a GoFundMe page. That’s not a good business plan. In the same paragraph, they try to justify that previous statement by saying, “Yet we do not see many manufacturers charging extra for availability or data integrity,” to which I was like, “What?” I mean, everyone charges more for availability and data integrity. vSphere clustering, S3 bucket multi-region, HashiCorp’s Vault HA mode. Like, everybody charges more for additional availability and data integrity.


00:29:40
Chris: Yeah. That’s just how it works, kid.


00:29:44
Ned: [laugh]. Okay, so enough of that. On to the next principle. “Embrace radical transparency and accountability.” And I feel like someone just dropped me into a cultist retreat.


00:29:56
Chris: That explains the patchouli.


00:29:58
Ned: Oh, it is… awful.


00:30:01
Chris: It is dense in there.


00:30:03
Ned: Can’t breathe [sigh]. Anyway, the idea here is that software manufacturers should be transparent about their software development processes and share them with others in the industry, and that helps to establish conventions and update them over time, leading to overall more secure practices in the same way that open-source tends to be more secure than closed-source software. Which all sounds great. I love all of this. The document claims, quote, “There are few opportunities for software manufacturers to see how peer organizations structure their SDLC programs, and how those programs hold up in the customer environments against real attackers,” end quote. To which I’m like, you’ve heard of conferences, right? Like, people get up at DEFCON and RSA and talk about what they do in their company. That’s a thing. Shortridge et al. agree with me. They said, quote, “We suspect that federal agencies are not often exposed to the spaces where these learnings are shared,” end quote, going on to note conferences, meetups, blogs, social media, and of course, all the open-source software out there with robust documentation and active Slack communities. It’s all pretty rad. One might even say radical.


00:31:24
Chris: It’s radical, and it’s also kind of a surprise, because CISA, in particular, absolutely goes to conferences. They have a presence on the dais, at least. So, I think that there is certainly some—let’s just say, maybe this needed a second edit.


00:31:41
Ned: [laugh]. A little bit. And remember, this document is, like, they put it out as a request for commentary, so we are giving commentary. Even though the deadline has passed. What we could actually use more of is transparency and disclosure when a vulnerability is exploited or a customer is hacked. Well-documented CVEs and root-cause analyses can help other organizations avoid the same fate.


00:32:06
Chris: Yep, I agree, and I’m really curious to see what happens with this after last year’s December mandate to disclose became active.


00:32:15
Ned: Four days, SEC. That’s what they said. And it’s been upheld so far, which is—


00:32:20
Chris: Yes.


00:32:21
Ned: —wild. Oh, the recommendations for this section fall into the security theater trap a bit as well. We have things like, “Publish detailed secure SDLC self-attestations,” “Publish high-level threat models,” and, “Publish SBOMs.” Not that any of these are, like, necessarily bad, but there’s probably, I would say, more productive stuff you could be doing to enhance your security, like improving your actual SDLC rather than writing an attestation that literally no one is ever going to read. Like, just saying the word attestation puts me to sleep.
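For reference, here’s a minimal sketch of the kind of record an SBOM holds, in the CycloneDX JSON shape as best I can recall it; the component list is made up.

```python
# A made-up, minimal CycloneDX-style SBOM: every shipped component, its
# version, and a package URL that scanners can match against vulnerability feeds.
import json

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "version": 1,
    "components": [
        {
            "type": "library",
            "name": "openssl",
            "version": "3.0.13",
            "purl": "pkg:generic/openssl@3.0.13",
        },
    ],
}

print(json.dumps(sbom, indent=2))
```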


00:32:58
Chris: The only one there that I think is significant and necessary and important is publishing SBOMs.


00:33:03
Ned: We did a whole show on SBOMs, didn’t we?


00:33:05
Chris: We did.


00:33:06
Ned: Yeah.


00:33:07
Chris: I didn’t listen.


00:33:08
Ned: That’s fine. I talked a lot. We’ll link it if I can find it. Anyway. So, that brings us to the last principle: lead from the top. And it’s by far the shortest of the three sections. Thank God because I’ve been talking for way too long. They talk about the need for support from the top-down when it comes to security initiatives, that you have to incentivize your workers, but not provide perverse incentives, like Free Porn Fridays.


00:33:36
Chris: That’s not a perverse incentive. That’s a perverted incentive. It’s different.


00:33:40
Ned: Oh, I must have misread. The guidance around these principles is purely aimed at, let’s say, giant software manufacturers and focuses on having a dedicated secure by design executive, like, one person whose job is secure by design, and also a secure by design council, and customer councils. That is a lot of overhead for really any but the largest corporations out there. Now, this principle is such an absolute nothing-burger, that Shortridge and Petrich don’t even give it a cursory note in their comments.


00:34:17
Chris: Nice.


00:34:18
Ned: I don’t feel any need to comment further on that either. So, thoughts and feelings. Let’s share, Chris. As hard as I am on this 36-page report, overall, its heart is in the right place. Cybersecurity is a pressing issue, and I do think we need to not let software manufacturers off the hook so often for their shoddy applications and their morass of vulnerabilities. I sensed an undertone of frustration in the document, and I get it. Many of the issues that they bring up—default admin passwords, SQL injection attacks, and data sanitization—these are, like, InfoSec fundamentals. These are solved problems that manufacturers seem to mysteriously and consistently fail at. The academic looks agog at this mess and goes, “Why?” You know what the answer is, Chris? Do you want to say it [laugh]?


00:35:18
Chris: Dog races.


00:35:20
Ned: [laugh]. Sort of. It’s money. Capitalism.


00:35:23
Chris: Oh, yeah. That other kind of race.


00:35:25
Ned: It’s always that. It’s about incentives, both at the vendor and the customer level. Vendors, they’re not trying to ship insecure software, but they’re also not trying to go out of their way, and they’re also not trying to go out of business. Security needs to be the easy choice and the default choice when one of their developers reaches out for a tool. Security breaches and vulnerabilities don’t seem to do long-term damage to most companies. SolarWinds is still a going concern. They should not exist anymore as a company.


00:36:01
Chris: Fair.


00:36:02
Ned: Oracle releases patches in the hundreds every month. Cisco hits a 9.0 on the CVSS scale about once a quarter.


00:36:13
Chris: Hey, they had a 10 last quarter. Don’t forget that.


00:36:17
Ned: [laugh]. Exceeding expectations. You can either punish these companies with massive fines—which is something the federal government has been fairly reluctant to do, but can actually do—or make it a lot easier for them to do secure things. Now, the federal government here can actually throw their weight around quite a bit when it comes to all the government contractors. The government is a huge employer. They have, like, a lot of employees and a lot of contractors, and they can regulate all manner of security requirements and force companies to comply with those requirements or get the fuck out. If done thoughtfully, that can actually help move the needle in a useful direction when it comes to security. Beyond that, based on the response from Shortridge Sensemaking, the better approach is to adopt modern software development principles that favor developer productivity and velocity, and as a result, security is dragged along in its wake. Faster iteration, modularization, open-source standards and libraries, these can all enhance security much more than a 50-page attestation or letter to the board of directors. And, bonus, it’s in line with making money, the actual driving force behind all of these companies. Your thoughts?


00:37:38
Chris: Largely, I agree. I think there is a necessity for an actionable government mandate, not just a letter, not just ‘this is what we should do,’ we have to change the rules of the road. And it has to be changed for everyone, and it has to become table stakes. Much like seatbelts, much like speed limits. And if you violate those rules, there are consequences. One of my all-time favorite quotes is, “Anyone can build a bridge that works, but it takes an engineer to build a bridge that barely works.” And that is why regulations are important because they will mandate things like how much load must the bridge carry, how much traffic, how much sway, all these types of things, that if you just put rocks upon rocks upon rocks upon rocks would just be solved, but it would be over-engineered to the point of absurdity, and it wouldn’t work in a capitalistic society.


00:38:45
Ned: Right.


00:38:45
Chris: The Roman aqueducts still exist for a reason. It is not because these guys were good at math. Even though they were good at math. You know what I mean.


00:38:53
Ned: I got you. Being able to set baseline requirements, but also with the knowledge that people are only going to meet those minimum requirements. Your engineer example—


00:39:03
Chris: Right.


00:39:04
Ned: —there were certain tolerances, and he’s like, “Well, they were within tolerances.”


00:39:08
Chris: Yeah.


00:39:08
Ned: “I don’t know if those were good or not.” But yeah, we need to set some sort of baseline minimums that software manufacturers should adhere to, and if they don’t, then there are actual consequences to failing to live up to those. But creating security theater is definitely a potential pitfall, if you just make it all about documentation, as we’ve seen with countless other security directives in the past, like PCI DSS. You could pass the audit and still be horribly insecure because if you’re just going for the checklist, there was stuff you could skip or just kind of fudge.


00:39:45
Chris: No, I agree. And, you know, I know we’re going over, so the last thing I want to state is, everybody said that GDPR was going to bankrupt every company in Europe. And you know what? It didn’t.


00:39:55
Ned: No.


00:39:56
Chris: All it did was cause a lot of complaining, and made people’s personally identifiable information a little bit more secure. It works. Let’s do the thing.


00:40:07
Ned: Let’s do the thing. Hey, thanks for listening or something, I guess you found it worthwhile enough if you made it all the way to the end, so congratulations to you, friend. You accomplished something today. Now, you can go sit on the couch, read that 36-page report, and think about how security impacts you every day. You’ve earned it. You can find more about this show by visiting our LinkedIn page, just search ‘Chaos Lever,’ or go to our website, chaoslever.com where you’ll find show notes, blog posts, and general tomfoolery. We’ll be back next week to see what fresh hell is upon us. Ta-ta for now.


00:40:45
Chris: I’m still shocked that this was only 36 pages long. As you were talking, I’m like, “This is 150 pages of nightmare hellscape, isn’t it?”


00:40:52
Ned: It’s not. The response was 45 pages.


00:40:56
Chris: [laugh].