Shiska Bobby Tables [43]

Posted on Tuesday, Jan 31, 2023
Chris is peeved about PII problems, Ned is excited about Small Modular Reactors, and we both agree Google should be run by Larry Flynt.

Transcript

[00:00:00] Ned: It hasn’t gotten any better, Chris.

[00:00:03] Chris: In conclusion, it hasn’t gotten better.

[00:00:05] Ned: That should really just go on a shirt. Chaos Lever. It hasn’t gotten any better. Are you talking about the show or society? Yes.

[00:00:15] Chris: Yes.

[00:00:19] Ned: Oh, God. Speaking of things that haven’t gotten any better, keyword SEO optimization of YouTube descriptions is awful and I hate it.

[00:00:30] Chris: Yeah. I’m sorry, were you looking for an argument?

[00:00:34] Ned: No, I’m not fishing for an argument. I’m just purely here to complain to you and our three listeners, and I’m trying to bolster the YouTube channel that I have a little bit, maybe get some sponsorship since Pluralsight decided to screw me over. That’s a separate rant. That’s not for today. And so I had someone who provides SEO optimization as a service take a look at three of my videos and put together updated descriptions and titles and all of that jazz, and it’s just keyword stuffing. Now the description for my video is three paragraphs long with keywords just mushed in there so that you can get an SEO rating of 95. But it sounds like it’s written by a computer. I can’t imagine a human reading it. And plus, it makes wild claims that are not true about the video, like saying that it’s a comprehensive guide on cloud security fundamentals. It’s eleven minutes long, Chris.

[00:01:47] Chris: Come on. Maybe you just talk real fast, or…

[00:01:50] Ned: Maybe. I’d say, turn everything off, put it in a tank, and then blow up the tank, and you’ve gone off the grid.

[00:01:59] Chris: Become a farmer, invest in gold.

[00:02:02] Ned: Oh, God. And it’s like one part of me says, I just want to go back to the way it was, where it was a simple summarization of what’s in the video, along with some timestamps, and that’d be good. But I know, like, if I want to get more subscribers and grow the channel, I need to be more discoverable. And that’s where SEO comes in. So you got to play the game. And like, I hate it. I hate it all, and I don’t know who to blame, so I blame…

[00:02:36] Chris: I was going to say Sergey Brin.

[00:02:38] Ned: Yeah, that works, too. All right.

[00:02:40] Chris: And I forget what the other guy’s name is. I think it was Yul Brynner.

[00:02:46] Ned: That sounds correct. We’ll go with that. Isn’t it Larry something? Larry… You’re talking about the founders.

[00:02:53] Chris: Larry Fine, right.

[00:02:56] Ned: Larry Flynt. No, that is a different founder, different company entirely. Well, I mean, sort of. Think about it. Don’t. Don’t think about it.

[00:03:12] Chris: Don’t think about it.

[00:03:14] Ned: Oh, dear. Well, hello, alleged human, and welcome to the Chaos Lever podcast. My name is Ned, and I’m definitely not a robot. I am a real human person with a name. And what’s in a name? What’s in a keyword? Wouldn’t a barely coherent Markov chain with any other 44-character alphanumeric identifier derivate to the third order just as sweetly? You can even use non-Euler notation if you want to. After all, it’s SoulCycle 4,543,000,435,332, plus or minus 55 billion. I don’t know if I got any of those numbers right.

[00:03:59] Chris: Have you ever heard of math?

[00:04:00] Ned: It’s not like it’s the Dark Ages anymore. Or is it? Is it the darkest age? That’s a pretty wide margin of error, now that I think about it. Anyway, with me is Chris, who is also here. You have a lot of letters in your name. How’s that going?

[00:04:19] Chris: Well, it was much better than when I was a child. I had to fill out forms for standardized tests and stuff like that. I think for five or six years the state thought that my full name was Christop.

[00:04:32] Ned: Christoph?

[00:04:34] Chris: No, we just stopped at the P. Oh, I see. Yeah.

[00:04:37] Ned: Because how could a name possibly be more than, what? Ten characters? Mad, right?

[00:04:42] Chris: Those last three, forget about it.

[00:04:45] Ned: I never really considered that before because my first name is seven letters… six letters. Six letters. I can do math, I can count.

[00:04:52] Chris: Wow, what is happening right now?

[00:04:55] Ned: I’ve never had to think about it before. My actual legal name is six letters and there’s more than enough room on most of those forms, but yeah. And god, what do you do if you have, like, a non-ABC-type character? What if you have an apostrophe in your name? Is that even an option?

[00:05:17] Chris: This is a legitimate problem that has been wrestled with for at least 40 years. All computer systems handle it differently, which…

[00:05:28] Ned: …is the fun part, and they all handle it badly. The other fun part: there’s a fantastic episode of, I want to say it was Radiolab? Could have been one of those shows. It was talking about people with the last name Null and the fun adventures they’ve had working with computer systems.

[00:05:50] Chris: Is this a real thing or a Bobby Tables situation?

[00:05:53] Ned: This is a real thing where people just have the last name Null. That word existed before computers, and some people…

[00:06:01] Chris: Nothing existed before computers.

[00:06:04] Ned: Hard to believe. And apparently one of them lived in a state where the DMV, the way that they did tickets (maybe it wasn’t the DMV, but it was whatever organization was in charge of traffic tickets) would assign a null value to tickets that didn’t have a valid name associated with them. But the null value was assigned as a string. And so he ended up with hundreds of thousands of dollars in tickets all being assigned to him. And he was getting these voicemails and letters warning him that he had, like, these hundreds of thousands of dollars that he was going to be taken to jail for not paying his traffic fines. Pretty entertaining. I mean, not for him, no. But entertaining, right? Yeah. Dealing with people’s personal information and the variation of people’s last names can get pretty tricky. The other big one was people with really short last names, because whoever set up the database, when they were creating the fields, assumed that all last names would be at least, like, three characters, let’s say. And so it would throw an error saying not long enough, or something along those lines. Yeah, or it’ll grab characters from the next field, which is fun.

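For readers following along at home, here’s a minimal sketch of the failure mode Ned is describing, using a hypothetical SQLite ticket table (the schema and numbers are illustrative, not from the actual story). The bug is storing the literal string "NULL" as a placeholder instead of a real SQL NULL, which then collides with an actual Mr. Null:

```python
import sqlite3

# Hypothetical ticketing schema, just to illustrate the bug.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE tickets (id INTEGER PRIMARY KEY, last_name TEXT, fine REAL)")

# The system stores the *string* 'NULL' when a ticket has no valid name,
# instead of a proper SQL NULL.
cur.executemany(
    "INSERT INTO tickets (last_name, fine) VALUES (?, ?)",
    [("NULL", 250.0), ("NULL", 980.0), ("Smith", 40.0)],
)

# Mr. Null registers, and a lookup by his real last name matches every
# orphaned placeholder row. (Parameterized queries also keep Bobby Tables
# and the O'Briens of the world from breaking anything.)
cur.execute("SELECT SUM(fine) FROM tickets WHERE last_name = ?", ("NULL",))
print(cur.fetchone()[0])  # 1230.0 -- all the unclaimed fines land on Mr. Null

# A proper SQL NULL never equals anything, so it can't collide with a person.
cur.execute("INSERT INTO tickets (last_name, fine) VALUES (NULL, 500.0)")
cur.execute("SELECT COUNT(*) FROM tickets WHERE last_name = 'NULL'")
print(cur.fetchone()[0])  # still 2 -- the real NULL row doesn't match
```
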
[00:07:25] Ned: Databases are fun, said no one ever. Oh, they’re garbage. Let’s talk about some other tech garbage.

[00:07:34] Chris: Fantastic. Let’s talk about the state of privacy. Ooh. Particularly in 2023, which, if you’re keeping score at home, is now.

[00:07:48] Ned: No, it doesn’t sound right.

[00:07:51] Chris: So privacy, that’s a thing we’ve all heard about, right? Especially recently.

[00:07:58] Ned: I feel like I’ve heard it mentioned once or twice.

[00:08:00] Chris: I mean, the funny thing is, when the Internet started (and if you can believe it, there was a time before the Internet), it seems privacy was not, like, a big deal or a concept that people thought about. We just gave everybody all our information because they asked for it. Why not? In 2023, we’ve all learned… we’ve learned that Facebook is poison, Google is an advertising company, and individuals’ personally identifiable information is basically digital currency.

[00:08:35] Ned: Yeah.

[00:08:37] Chris: And companies, finally catching on to this trend, are working hard to right the ship and protect consumers’ privacy as a bedrock of their mission statements. Right? Right? How come you’re not saying right?

[00:08:52] Ned: I mean, it’s probably in the mission statement, and we know how well people follow mission statements.

[00:09:01] Chris: It’s in, like, air quotes, though, is the problem.

[00:09:05] Ned: Yes.

[00:09:10] Chris: So it’s 2023, and this week ISACA, or the Information Systems Audit and Control Association, published their 2023 State of Privacy report. And gee whiz, I sure wish it had some better things to say.

[00:09:29] Ned: This is going to be one of those upbeat episodes, isn’t it?

[00:09:32] Chris: It’s going to be keen. And before anybody starts firing off any angry emails: yes, I know that ISACA does not go by Information Systems Audit and Control Association at all anymore. Their acronym has become their name. Side point: that’s weird.

[00:09:52] Ned: It is a little strange just going.

[00:09:54] Chris: …by ISACA. Or “Isaka,” as I’m going to say it, because it’s faster. It is weird. And you don’t need to email me about that either.

[00:10:04] Ned: Please.

[00:10:04] Chris: I’m just going to say Isaka because it makes me happy. We’re here to talk about privacy, not the shockingly hard game of naming organizations.

[00:10:12] Ned: All right.

[00:10:14] Chris: All right.

[00:10:15] Ned: So you heard them, everybody. Delete those drafts and let’s move on.

[00:10:19] Chris: Thank you. Any argumentative things can be sent to Ned repeatedly from a robocaller. So let’s talk. First of all, let’s establish some terms and the playing field, and let’s do that by talking about security’s codependent relationship with privacy, and vice versa. Security and privacy, simply put, go hand in hand. Think about all of the breaches that we’ve covered on this show. All of them had the word security in them, but what else did they have? They had the exposing of clients’ personally identifiable information to hackers. This mostly happened because the company did something dumb and avoidable, and chaos ensued. A violation of security is thus, in most cases, a violation of consumer privacy.

[00:11:21] Ned: Well, as you mentioned before, customer information is very valuable. It’s the money that greases the modern Internet. If you’re going to compromise somebody’s security to go after something valuable, that’s probably going to be the private data of the people who use that service. Not always, but very often.

[00:11:46] Chris: Right. And if you want to get really down to the brass tacks on this one, here are two official definitions for you to chew on. Privacy involves, quote, protecting personal information from unauthorized access or use. This means that only those with permission should have access to that kind of information. Security means, quote, safeguarding against malicious attacks and unauthorized access to systems and data. As you just alluded to, how many times is it that the kind of data we’re talking about is users’ personally identifiable information?

[00:12:26] Ned: Yeah, I mean, there are certainly times when the thing that is purloined from the enterprise or whatever is trade secrets or something equally important to the internal organization, or money. So some sort of espionage, or ransomware is the other big example. But yeah, the vast majority of security incidents we see have to do with stealing people’s private data.

[00:12:55] Chris: Right. And one of the reasons that also happens is that usually it’s easier.

[00:13:04] Ned: Are you implying it’s not as well secured?

[00:13:08] Chris: You remember back to our discussion about LastPass? The biggest thing that was stolen was what? People’s information and their passwords. What was not stolen? Credit card information or money.

[00:13:24] Ned: Seems like you could have applied the same security that you applied to the credit cards to the rest of their data.

[00:13:30] Chris: Now you’re talking crazy.

[00:13:34] Ned: I’m a crazy guy.

[00:13:37] Chris: So the Isaka poll asked a lot of questions. The total poll was something like 55 questions, and we’re not going to go through all of them. I just wanted to hit sort of the high points. What the questions were targeted at was, effectively, privacy staffing, privacy policy, and privacy philosophy in companies. In total, there were 1,890 respondents from companies all over the globe, 75% of which represented companies that had revenue of $50 million per year or more. So while you would like to have more representation than that for this kind of poll, that’s not a bad return.

[00:14:21] Ned: No. 50 million is actually not that high in terms of revenue. If we’re talking about size of a company, you and I worked for an SMB that was employing, what, 200 some odd people, and their revenues were well north of 50 million.

[00:14:35] Chris: Yeah, they were all pretty odd.

[00:14:38] Ned: Yes. Was that the revenues or the people? Yes, also.

[00:14:44] Chris: Yes. So the first category of questions that were asked was actually about staffing, and the results in this one were pretty clear. Not only are the legal compliance and technical privacy teams, which are the two categories that they used in the survey, both understaffed, but the number of staff in both categories went down year over year.

[00:15:12] Ned: Seems like the wrong trend.

[00:15:15] Chris: To be fair, it only went down a small amount, approximately four and a half to 5%. But considering the absolute plethora of security breaches that involve PII loss over the past year or so, you would have hoped that those numbers would go healthily up. And of course, we also have to remember that the breaches that involve PII loss that we know about are probably a tiny percentage of the total that actually happened.

[00:15:45] Ned: Yeah. Not every breach is discovered for certain.

[00:15:48] Chris: Right. So one of the reasons for this understaffing trend, which the Isaka report doesn’t exactly call out but definitely implies, is that privacy as a practice needs more executive support. One reason this might be happening is that executives at companies still do not readily acknowledge a relationship between users’ privacy and the company’s larger objectives. If it’s not one of the pillars in the mission statement, then it’s just not important. And if it’s not important, it doesn’t get funded. And if it’s not funded, well, I think you get the idea.

[00:16:31] Ned: Yeah, I’m picking up on a trend here. It does have to be essential to the business. That’s the problem with security, too: if security gets in the way of the company making money, the security immediately gets dropped. You have to take the long view, that security is going to make your organization a healthier organization and more profitable over the long term, and a lot of people don’t like to take the longer-term view. Also, it’s ethically right. And…

[00:17:06] Chris: That’s not part of my bonus.

[00:17:09] Ned: That’s right.

[00:17:12] Chris: So for all of the reasons highlighted, executive support, and most importantly sponsorship, of security and privacy policies, initiatives, what have you, is lacking. That is a problem. Another problem the Isaka report highlighted is also on the staffing front, and that’s actually skilled job hunters. What Isaka identifies as missing are not high-level architects designing privacy practices, nor is it low-level engineers patching systems and responding twenty-four-seven to alerts. It’s that weird in-between person that can do both: someone who has an understanding of the system from top to bottom, can handle concepts like governance, IT architecture, and data lifecycle management, and can implement those solutions and see to it that those implementations are actually working.

[00:18:14] Ned: Is it outlandish to ask for such a person? Are we looking for a unicorn in this case? Because that is a large breadth of knowledge to have in a single human being.

[00:18:27] Chris: Right. And I kind of wonder if they are kind of using this as an opportunity to carve this person out. Because all the things that I just talked about are super important, but have they ever been put into a single job description in any kind of regular way before? I would argue possibly not. And from Isaka’s perspective, this is a tad self-serving, because the example that they use for this position is someone certified as a Certified Data Privacy Solutions Engineer, which is a CDPSE, and you want to guess which organization runs that particular certification?

[00:19:10] Ned: Sucker.

[00:19:12] Chris: You got it. Yay. But on the other hand, like I said, they’re not wrong. That position needs to be filled by somebody competent in order to bridge the gap, and respondents to the poll indicate that that’s not happening. The time to fill one of those legal compliance and/or technical privacy positions is likely to be three to six months (and this is horrifying) or longer. Now, one of the things they did not talk about in here was salary. So one of the things that might be a problem is that people don’t have the full breadth of skills that these positions are looking for. The other might be that companies are only offering 60 grand a year.

[00:20:02] Ned: Right. And in the most recent economic environment, most technical people could command a salary well above that. That, however, is changing a little bit. So perhaps these jobs will be filled a little quicker as layoffs continue at some of the larger tech firms.

[00:20:23] Chris: Right. And I will definitely be interested to see how that goes from now until the end of the year, because I do think that what you’re talking about is a trend that is going to happen. But anyway, yeah. So these people are necessary, and A, they might not have those skills; B, they might not know these jobs exist; and C, these positions might not pay enough. But they still need to exist. We need people with security and privacy backgrounds who can be in a corporation to understand the policies and enforce them where the rubber meets the road. How many times have you been in or around a company that wrote bulletproof privacy policy documents that are never read, never implemented, and never shared with anyone that’s not an auditor?

[00:21:15] Ned: I felt like when we had to write compliance and policy documents for the firms we consulted for, it was written specifically for the compliance officer and the auditors to read and not necessarily to be implemented. We just need the checkboxes.

[00:21:34] Chris: Exactly. And we could go into a conversation (and maybe we should, later in the year) about the difference between being audit compliant and being actually compliant.

[00:21:47] Ned: Slight differences.

[00:21:48] Chris: And I mean, if you look at the way that the various audits are written, sometimes they actually make a distinction. The audit will be about the policies that are in place, not the implementation of technology to support them.

[00:22:03] Ned: I’d say just as important as the architecture and the tooling is the training of the staff to effectively implement those policies that have been written.

[00:22:14] Chris: True. Not just those policies, but security in general. Which leads me smoothly and seamlessly into my next major point: the false promise of security training. So according to this survey, something that is increasing year over year is the inclusion of a yearly security training at most of these companies. A frankly impressive 65% of respondents reported receiving privacy training annually. That’s kind of a lot.

[00:22:50] Ned: Yeah, not bad.

[00:22:54] Chris: The trouble is, we have known for years that doing training yearly is not enough to cause long-term changes in security and privacy behavior.

[00:23:07] Ned: Yeah.

[00:23:08] Chris: You actually need to do this three to four times a year in order for it to be reinforced to the point that individuals will change their behaviors and their subconscious thought. If you don’t, the danger is, as the report succinctly puts it, that these annual trainings become check-the-box exercises. For what? For audits?

[00:23:32] Ned: Yes.

[00:23:34] Chris: They have no effective way of evaluating if employees are learning anything from them. However, sociological studies over years and years and years can tell you pretty clearly: they’re not. This is even worse because most respondents do believe these programs have a benefit, with well over half of them saying they have at least, quote, some positive impact. Aside from the fact that yearly is not sufficient, the other problem with these kinds of trainings is that they are often, for lack of a better word, super easy. Okay, that was two words.

[00:24:11] Ned: Yeah.

[00:24:11] Chris: Anyway, the way these trainings are put together, they’re implemented as a check-the-box exercise because of a regulation. Executives don’t think they’re important, so they’re content with bottom-of-the-barrel, lazy, next-next-finish kinds of, quote, tests that take five minutes and don’t even have to be paid attention to because it’s impossible to fail.

[00:24:33] Ned: Yeah, I may have taken one of these trainings before. I’m familiar with the format.

[00:24:38] Chris: So while there is an increasing belief that security and privacy training in general is a good idea, it’s also pretty clear from the commenters that the trainings they’ve taken, in specific, are in most cases next to useless.

[00:24:57] Ned: How much of this… because you said that a lot of the people report that it’s having some positive impact, and I’m wondering how much of that is the sunk cost fallacy of, well, I spent three hours or however long in this lecture, so it must have done some good. Otherwise my time was completely wasted.

[00:25:16] Chris: I don’t even think it goes that far. In most cases, this is more like: watch a 20-minute video, answer a ten-question true-or-false quiz, and you’re done with your security training for the year.

[00:25:25] Ned: Right? But nevertheless, I’ve spent time on it, and I want to think that my time is valued. And so I think there must be something positive coming out of the fact that I spent time doing it, right?

[00:25:38] Chris: And one of the other things about the survey that would have been more helpful (but I think they had to do this for anonymity) is they don’t separate out the answers based on job title or job description. Because I would guess that line engineers would answer these questions differently than CXOs. Probably at least more sarcastically.

[00:26:03] Ned: Yeah, we’re “secure.” How did he add the scare quotes to the yes in the survey? I don’t even know.

[00:26:11] Chris: That wasn’t even an option, Bobby Tables. So really, in conclusion, there are some glimmers of hope, but it just doesn’t feel like a ton has changed or is changing, which is a bummer. Yeah, but I have some ideas. Let’s close the book on the survey and just kind of ideate about how to fix all these things.

[00:26:44] Ned: Well, I will say before you get to the ideas, the one thing that I have noticed is that the average person on the street is more aware of privacy as a concern than they were, say, five years ago. And so while there may not be as much progress as we would like on the vendor or the company side of things, just as a public notion, it’s definitely increasing awareness now whether people are actually willing to change their behaviors in regards to that. Maybe, I don’t know. Facebook isn’t doing so well lately, so I guess people are having some kind of reaction to it. But yeah, I think at the very least we’ve made it a more public conversation instead of it being something that’s just decided on and argued over behind closed doors.

[00:27:34] Chris: Yeah, I think that that’s true. I think a lot of the breaches that have come out have actually started to affect people. And especially the breaches that say, oh, I don’t know, exposed huge amounts of credit information for a company that faced basically no consequences whatsoever. That’s the kind of thing that’s going to make the public pay attention.

[00:28:00] Ned: Yeah, a little bit.

[00:28:04] Chris: So that’s actually another good segue. We are crushing the segues. Crushing it. Good. My number one idea for making companies take privacy and security more seriously is to enact truly punishing fines for companies that lose customer data. And I’m talking bad, like actually-on-the-verge-of-bankruptcy bad. There was an old thought experiment that I like to think about when it comes to these kinds of things. The thought experiment was around the discussion of designing a safer car. One commentator at the time, I think this was the… I meant to look it up and I totally didn’t.

[00:28:51] Ned: Okay.

[00:28:51] Chris: But one commenter said that the best way to make drivers more careful on the roads would be to install an 18-inch-long steel spike in the center of the steering wheel that pointed directly at the driver’s sternum.

[00:29:04] Ned: Wow.

[00:29:05] Chris: The most common accident on the roads at the time, and now, is actually rear-ending the car in front of you. If there’s a giant 18-inch steel spike pointing directly at the center of your chest, you’re going to stop tailgating. The imminent danger of being eviscerated, the thinking went, would stop these kinds of accidents in their tracks.

[00:29:29] Ned: You would think so.

[00:29:31] Chris: Extreme, but probably effective. Probably. At the very least, that one guy is not going to tailgate anymore. He’s now a shishka bob.

[00:29:40] Ned: That’s true. Shishka Bobby Tailgate.

[00:29:43] Chris: Those are the level of fines that I think would actually instigate change. I alluded to the infamous Equifax breach, but let’s remind everyone so we can all get mad again: those fines were pathetic. Toothless doesn’t even begin to describe it. Also, by saying toothless, it makes me think of How to Train Your Dragon. And that was a fun movie. And Equifax is not fun.

[00:30:11] Ned: Decidedly not. Equifax, if people don’t remember…

[00:30:16] Chris: …way back in 2017, lost the credit information for over 150 million Americans. Two and a half years of… no, it was more than that. Four and a half years of court machinations, and they were finally fined a grand total of $700 million. I remember getting my $5.63 settlement check. What a proud day for the justice system. Equifax, remember, earned $3.4 billion in revenue in 2017, a yearly number that has only increased since then. That $700 million pales in comparison. But at least in that case, there was something in place that caused a lawsuit and eventually a fine. There are situations around privacy and security where the laws are so toothless they can’t even be enforced. One particularly outrageous case happened recently and involved Home Depot Canada. The privacy commissioner of Canada found that Home Depot was, quote, routinely sharing customer data with Facebook in order to help fine-tune, you guessed it, targeted advertising. What was the company doing? They were sending e-receipt data, along with users’ identifiable email addresses, back to Facebook without consent from the consumer.

[00:31:48] Ned: Wow.

[00:31:49] Chris: Did you click on a link for a lawn mower? A week later, did you buy a lawn mower? Facebook knows now. Congratulations, everyone involved. This was in clear violation of Canada’s Personal Information Protection and Electronic Documents Act. Or PIPEDA. That sounds fun.

[00:32:08] Ned: It sounds like a topping for salad.

[00:32:11] Chris: But guess what? The privacy commissioner’s office has no power to levy fines. All it can do is investigate and issue recommendations.

[00:32:26] Ned: Wow, how very Canadian of them.

[00:32:30] Chris: Imagine if the police were set up like that. Like, if you were shoplifting, and a cop watched you do it, and you walked out the door, and the cop went up to you and said, you shouldn’t do that, sonny. Okay, but I like Legos, so I’m going to go.

[00:32:45] Ned: I feel like we just got a little view into your childhood.

[00:32:48] Chris: I don’t know what you mean. Let’s move on.

[00:32:52] Ned: Fair enough.

[00:32:52] Chris: Oh, and just in case anybody was getting all high and mighty and wanted to mock Home Depot and/or Canada for this kind of horrific behavior, well, guess what? H&R Block did it too. Right here in the good US of A.

[00:33:05] Ned: To the shock of no one. H&R Block is terrible. Awful and terrible. Again.

[00:33:13] Chris: You’re not wrong.

[00:33:14] Ned: Thanks.

[00:33:16] Chris: Anyway, my point is, make the fines and the penalties actually matter. That will go a long way towards making executives get on board with security and privacy initiatives. I know this is not a new idea, but it’s a good idea. And my second idea is a little bit more along the lines of what we as individual consumers can do to, A, protect ourselves, and B, give a stiff middle finger to all these companies who don’t give a shit about us in any way, shape, or form. And that is: number one, we should stop allowing companies to insist upon PII, personally identifiable information, from consumers for no…

[00:33:57] Ned: Reason, especially an excessive amount of that information.

[00:34:01] Chris: Right. But the other thing is, as consumers, what we should probably do is just start lying about it.

[00:34:10] Ned: I do like that idea.

[00:34:11] Chris: If companies are not interested in keeping my data private, then I’m simply going to stop giving them information that needs to be private. Seriously. Here’s a hypothetical: all I want to do is play fantasy football. Why in the hell does Yahoo need my home address for me to play fantasy football?

[00:34:32] Ned: Where are they going to send the trophy, Chris?

[00:34:35] Chris: First of all, I never win the trophy. But they’re never going to mail me anything. No, the answer clearly is they don’t need that data, but they want it so they can bundle it up and sell it to advertisers. Stop it. Until they stop it, like I said, we can lie to them. For example, according to my account on Yahoo, I live in Serbia.

[00:35:06] Ned: Congratulations.

[00:35:07] Chris: According to other sites, I’m Canadian. And in at least one case from a few years ago, I’m pretty sure I live in Mumbai. Now, there are some sites that will complain and require you to be a US citizen, so why not move to Oregon for the purposes of Pinterest? In every single case, the cell phone number I give them is different, because they’re not going to call me either. Some sites have my middle name as fun things like Melchior, and on other sites, I don’t have a middle name at all. Just like James Bond.

[00:35:47] Ned: His middle name is Bond. I thought we established this. James Bond. Bond.

[00:35:55] Chris: Bond. James Bond. Bond. So now, it is true that you can’t do this everywhere. You can’t lie about your information to, for example, the government or your bank, because they do have rules about that. If you lie to your bank, you can go to jail. Isn’t that fun?

[00:36:13] Ned: Super fun.

[00:36:15] Chris: If the bank loses your information, they don’t go to jail.

[00:36:17] Ned: No.

[00:36:20] Chris: But in almost every other case, there is absolutely no justifiable reason for them to ask for this information. Now, if you want to take it even a step further, there are services that exist that can mask your email address as well. Apple famously introduced a service last year where you would get an Apple.com email account that was completely anonymous. The emails would be shifted behind the scenes into your inbox, but the website, whatever you signed up for, would never see your real account name. So if you’re within the Apple walled garden, you can do that for free. It’s super fun. If you’re not, there are dozens of other services that do it for you. And in a perfect world, I would love it if everyone on Earth started to do this. Let’s muddy the waters of data about ourselves so that targeted advertising as an industry goes back to being what it should be in the first place: none-the-fuck-existent. This would have two benefits. Number one, it would reduce the interest in PII in general. Why try to exfiltrate customer data if everybody starts to realize that it’s all fake and useless?

[00:37:40] Chris: And two, it would reduce advertising overall. And I think we can all admit that this would be a better world if there was less of that.

[00:37:53] Ned: You’re not going to get any disagreement out of me. I think the central problem here…

[00:37:58] Chris: This episode has been brought to you by Casper.

[00:38:00] Ned: Damn. It’s supposed to wait till the end.

[00:38:03] Chris: Damn it.

[00:38:04] Ned: I think the larger problem here, and the one that we’ll continue to struggle with, is that the average consumer needs it to be easy to not give up PII, because they’re just trying to play some fantasy football. They’re just trying to fill out a form on a website to get to something. And they don’t want to sit there and be all creative and come up with fun street names or anything like that garbage. They just want to get the form out of the way. And a lot of browsers have built-in form fillers, so it’ll just do it all for you. Great. That’s actually a really good opportunity, because you can add multiple addresses and just make up a few fake ones and use those where you don’t want your real address used. But again, that’s a level of effort that I think most people are not willing to undertake to protect their privacy. The cost-benefit reality for them just isn’t there. So while I laud and appreciate your security and privacy focus, Chris, I hate to say that the rest of the world might not be there with you.

[00:39:08] Chris: Well, I mean, the other thought is, maybe this is a business opportunity. Somebody puts together some type of a plug-in that automatically fills forms with things that pass muster in terms of what might be a real street address. If you do, can you please thank me in the liner notes?

[00:39:24] Ned: Absolutely. There is a website that will generate fake demographic data for you. So fake names, fake Social Security numbers, fake addresses for testing out a database or something like that. So it’ll just generate a bunch of fake data for you. I’ve used it to create accounts for labs and whatnot. You could do that for yourself. Just come up with like 100 different aliases, all with things that will pass the smell test on most form validation, but are not actually linked to any human being. You could have fun with it.

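For the curious, here’s a quick sketch of what Ned is describing, done in Python with the third-party Faker library (pip install Faker); the field names are just illustrative, not tied to any particular form:

```python
from faker import Faker  # third-party library: pip install Faker

fake = Faker("en_US")

def make_alias() -> dict:
    """Mint one plausible-but-fake identity for form filling."""
    return {
        "name": fake.name(),
        "street": fake.street_address(),
        "city": fake.city(),
        "state": fake.state_abbr(),
        "zip": fake.zipcode(),
        "phone": fake.phone_number(),
        "email": fake.free_email(),  # or pair with an email-masking service
    }

# A hundred aliases, none of them attached to an actual human being.
aliases = [make_alias() for _ in range(100)]
print(aliases[0])
```
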
[00:39:59] Chris: Have fun with it. Lightning round.

[00:40:02] Ned: Lightning round. Web3 company raises $10 million because reasons. You would think, after the beating that crypto, NFTs, and the Metaburse took last year, we might see investment in the space cool off. And yet Web3 startup Spatial Labs has raised $10 million in seed funding to design, quote, next-gen technologies to connect brands to younger demographics that shop and interact with brands in completely new ways, end quote. Boy, that sounds like it was written by ChatGPT, doesn’t it? New ways like wearing pants on your head or belts on your feet? No, by embedding tracking chips into clothing so that they can be turned into a traceable asset in the metaverse using blockchain. If there’s one thing I know about the youth of today, they love constantly being tracked by advertisers and vendors. The chip in question is the LNQ, or Link, chip that was designed by Spatial Labs, with each chip corresponding to an NFT on a blockchain that can be tracked and verified by vendors. I suppose for high-end goods this could be used to help prove authenticity or provenance. Is your Birkin bag authentic? Scan here to find out. Oh, it’s definitely not. But otherwise, this just seems like a gross invasion of privacy with little or no benefit to the consumer.

[00:41:41] Ned: Not only that, but selling the item to someone else would require you to update the blockchain as part of the sales process. No, thank you. And of course, vendors can add a virtual version of the item that users can leverage in the metaphorse. Sure, Chad. Whatever you say. After reading through their product pitch, I completely failed to understand why they were able to secure $10 million in funding. Chris, you and I are clearly in the wrong business. Do you think we could sell NFTs of Chaos Lever episodes for an additional fee and then track listeners across their audio metaverse journey? I feel like that should net us at least a cool $5 million.

[00:42:27] Chris: Advertising company Google sued for their monopolistic practices around selling advertising. Yay. This week the Justice Department sued Google, again, for monopolizing digital advertising technologies through serial acquisitions and anti-competitive auction manipulation in order to, quote, subvert competition and monopolize advertising revenue, unquote, from basically everyone on the Internet. It appears they have been successful. Google brought in $209 billion in ad revenue in 2021, making them the largest ad company in the world by far. Their closest competitor, propaganda purveyor Facebook, only made $115 billion in comparison. This is especially significant in terms of income for advertising company Google, as in 2021, 81% of their total revenue came from, you guessed it, advertising. Google, of course, immediately disputed the claims made by the lawsuit, stating, quote, DOJ’s lawsuit ignores the enormous competition present within the industry and attempts to unwind acquisitions nearly 15 years old, which could potentially harm publishers and advertisers while also stifling innovation, unquote. The DOJ has not yet responded to these statements by Google, probably because at this point they will let everything go to the courts without further comment, but also possibly because everything Google just said in that quote was, what’s the word?

[00:44:05] Chris: Wrong and dumb.

[00:44:07] Ned: Wait, that’s three words all told. It’s the end of Intel as we know it, and I feel fine. Last week Intel held their Q4 financial meeting, and things were, in finance speak, bad. We all knew that Intel wasn’t doing great and they had brought Pat Gelsinger in to turn the Itanic around, but one could rightly call these results abysmal. Intel posted a $700 million loss on revenues that were down 32%. The hardest-hit groups were the Client Computing Group and Data Center and AI, both slipping over 30% versus last year’s Q4 revenue. Their Network and Edge group saw severely curtailed growth, and the Accelerated Computing Systems and Graphics Group didn’t fare any better. You know what? I remain optimistic about Intel for several reasons. First, Intel is taking on massive spending to build out new fabrication plants, and that’s an investment that is going to take a minimum of three to five years before it sees any return. It’s also being funded by the government. Yay. Second, the decrease in revenue is mostly a correction from the profligate spending during the pandemic on desktops, laptops, and data center gear, which has very much cooled off.

[00:45:37] Ned: Third, Intel finally rolled out its newest generation of processors, Sapphire Rapids, at the data center level, with desktop processors coming in Q2 of 2023. There’s no doubt that cloud providers and data center admins, as well as some consumers, were waiting for the new processor to drop before a hardware refresh. Intel has certainly lost ground to AMD and Arm in the past two years, and I doubt they will ever be the behemoth they once were. But with Pat at the helm, I fully expect them to be riding high once again in two or three years’ time.

[00:46:17] Chris: Zero Trust still struggling to gain effective widespread adoption despite its awesome name. So, according to, like, everybody, everybody loves Zero Trust. So sayeth Forrester. The number of organizations implementing a Zero Trust security model has doubled in the past two years. This is of course necessary for a lot of reasons: distributed computing models, the edge, remote workers, et cetera, et cetera. It seems that even in places where Zero Trust is deployed, which is somewhere between 30% and 50% of IT shops, depending on whose survey you choose to believe, it’s not being deployed thoroughly or reliably. This week, Gartner put out their own report that stated this in frankly bleak terms. According to their definition, only about 1% of organizations currently have a, quote, mature program that meets the definition of Zero Trust. Ouch. Basically what they’re saying is that people are embracing the term Zero Trust and the idea of Zero Trust, but are not actually competently deploying Zero Trust. It’s kind of like if you have The Club for your car. You remember The Club, right? The big orange thing that locks the steering wheel in place? It’s like if you have that, but you don’t actually lock it. In fact, it’s not even on the steering wheel.

[00:47:52] Chris: It’s sitting on the passenger seat next to your wallet, and the windows are open. I can’t believe I’m saying this out loud, but in this case, I agree with Gartner. I know Zero Trust is extremely hard to implement properly and, most importantly, to maintain. Gartner’s prediction is a bit optimistic, though. They say that by 2026, that 1% of companies with a mature Zero Trust program will rocket up to 10%. Wow. Let’s hope that’s an underestimation.

[00:48:33] Ned: The US is about to go nuclear again. There are 93 active nuclear power plants in the US that produce about 19% of all electricity. While the Nuclear Regulatory Commission, NRC, has approved licenses to build eight new power plants, most of these ventures have stalled for lack of funding or political viability. Nuclear power is a polarizing technology, with such public incidents as Chernobyl, Three Mile Island, and Fukushima reminding folks how dangerous the technology can be. That’s without even mentioning the difficulty of disposing of nuclear waste from the power generation process. Our best solution so far is to bury it under a mountain and hope for the best, which is essentially the same strategy my six-year-old would employ to hide spilled milk. However, the massive plants with their steaming cooling towers are not the only approach to nuclear power, and the NRC has finalized their rules regarding the construction of small modular reactors, or SMRs. As implied by the name, these reactors are much smaller in scale, producing 50 to 75 MW, as opposed to our gigantic facilities today that produce closer to 1,000 MW per plant. They are also modular, meaning that they can be produced in a central location and then moved into place instead of being built entirely on site.

[00:50:03] Ned: And they can be chained together to increase net output. The NRC rule finalization clears the way for construction of SMRs, with the first plant construction by NuScale at the Department of Energy’s Idaho National Laboratory in 2029. That’s not exactly swift, but then, building nuclear power plants isn’t a process I want rushed. There are still concerns surrounding the disposal of nuclear waste, but with the rules finalized, the door is open for competing solutions to emerge in a more competitive landscape. My bet, however, is still on cold fusion. Keanu Reeves would never steer me wrong.

[00:50:42] Chris: Can we please pump the brakes on articles about how AI will be taking everything over by this time next week? Look, I get it. ChatGPT is, like, super neat. And in terms of regurgitating facts, it is something of a miracle that it only gets, like, 30% of what is asked of it wrong. I have bounced ideas off of ChatGPT regularly since it became an open beta, and I have been absolutely gobsmacked by the dumb shit it says. I was doing very important research recently, and I asked GPT what James Bond’s middle name was, and its response, I shit you not, quote: James Bond’s middle name is Bond. What? So you’ll forgive me if I’m less than impressed when I hear reports about how ChatGPT passed a med school exam, or ChatGPT passed an MBA test, or, the latest, ChatGPT wrote a bullshit article abstract that was indistinguishable from one that a person wrote. So effing what? You don’t get into med school just because of your MCAT. There’s an interview, a body of work, courses you have to pass. And you don’t get an MBA from a good school just from an exam.

[00:52:19] Chris: You also have to do an actual dissertation or a full-scale final project and defend it. And writing bullshit abstracts that make it past the half-listening gatekeepers at, yes, even journals as august as Nature has been going on for decades. None of this is new, none of it’s exciting, and none of it is going to change a single freaking paradigm. So let’s just calm down a little bit, yeah? And then we can get back to actual important stuff, like shaming Google and the other tech titans for firing tens of thousands of people for no justifiable reason. Here’s a fun fact: did you know that Google also fired their head of mental health and well-being as part of the random 12,000 that were let go? Go get mad about that instead of caring how good ChatGPT is at fucking Sporcle. What? Sporcle. That’s the reference I went with. Good God, I’m old.

[00:53:22] Ned: You really are. It’s okay. You can ask ChatGPT what you can do about that. I’m sure it has some really stellar advice. Oh, hey, thanks for listening or something. I guess you found it worthwhile enough if you made it all the way to the end, so congratulations to you, friend. You accomplished something today. Now you can sit on the couch, optimize your user profile for maximum SEO discovery, and monetize your presence to feed the algorithm. You’ve earned it. You can find me or Chris on Twitter at Ned1313 and at Heiner80, respectively. You can follow the show at chaos underscore lever, if that’s the kind of thing you’re into. You can sign up for our newsletter and check out the show notes that are available at chaoslever.com, if you like reading things, which you shouldn’t. Podcasts continue to be better in every conceivable way. We’ll be back next week to see what fresh hell is upon us. Ta-ta for now.

[00:54:23] Chris: I have nothing to add.

[00:54:25] Ned: For now.

Hosts

Chris Hayner (He/Him)

Our story starts with a young Chris growing up in the agrarian community of Central New Jersey. Son of an eccentric sheep herder, Chris’ early life was that of toil and misery. When he wasn’t pressing cheese for his father’s failing upscale Fromage emporium, he languished on a meager diet of Dinty Moore and boiled socks. His teenage years introduced new wrinkles in an already beleaguered existence with the arrival of an Atari 2600. While at first it seemed a blessed distraction from milking ornery sheep, Chris fell victim to an obsession with achieving the perfect Pitfall game. Hours spent in the grips of Indiana Jones-esque adventure warped poor Chris’ mind and brought him to the maw of madness. It was at that moment he met our hero, Ned Bellavance, who shepherded him along a path of freedom out of his feverish, vine-filled hellscape. To this day Chris is haunted by visions of alligator jaws snapping shut, but with the help of Ned, he freed himself from the confines of Atari obsession to become a somewhat productive member of society. You can find Chris at coin operated laundromats, lecturing ironing boards for being itinerant. And as the cohost on the Chaos Lever podcast.

Ned Bellavance (He/Him)

Ned is an industry veteran with piercing blue eyes, an indomitable spirit, and the thick hair of someone half his age. He is the founder and sole employee of the ludicrously successful Ned in the Cloud LLC, which has rocked the tech world with its meteoric rise in power and prestige. You can find Ned and his company at the most lavish and exclusive tech events, or at least in theory you could, since you wouldn’t actually be allowed into such hallowed circles. When Ned isn’t sailing on his 500 ft. yacht with Sir Richard Branson or volunteering at a local youth steeplechase charity, you can find him doing charity work of another kind, cohosting the Chaos Lever podcast with Chris Hayner. Really, he’s doing Chris a huge favor by even showing up. You should feel grateful Chris. Oaths of fealty, acts of contrition, and tokens of appreciation may be sent via carrier pigeon to his palatial estate on the Isle of Man.