Welcome to the Chaos
Feb. 1, 2024

How AI Is Reshaping The Internet As We Know It

Ned and Chris dig into how AI is reshaping the internet, from the fragility of its foundational protocols to the rise of user-focused, AI-driven tools that integrate diverse applications.

ARPANET to User-Centric Futures

Explore the impact Artificial Intelligence (AI) has made on the internet in this episode of Chaos Lever! Our hosts, Ned and Chris, discuss the internet's evolution from ARPANET roots to its present complexity, highlighting the shortcomings of its foundational protocols and the role of AI in both exacerbating and potentially rectifying these issues. Ned also explains how emerging technologies like the Rabbit R1 work, suggesting a shift towards AI-driven tools that prioritize user needs and integrate diverse applications. 

Transcript

[00:00:00] Ned: This week, I’ve been fighting with Camtasia because every time it renders my videos, the audio and the video get slightly out of sync. I have years of YouTube videos that I created with Camtasia where the audio and video are fine, and they broke something. Wasn’t me.


[00:00:21] Chris: [laugh] And this is one of those secret reasons why people get fed up and just don’t update.


[00:00:29] Ned: [laugh] God, is there any more apropos theme for our podcast than all of this was a mistake [laugh] ? Hello, alleged human, and welcome to the Chaos Lever podcast. My name is Ned, and I’m definitely not a robot. I’m a sentient, carbon-based life form who loves puppies and bovines. And it’s not weird that you eat one and pet the other. Yeah, that’s normal. With me is Chris, who loves bovines?


[00:01:07] Chris: They’re petable.


[00:01:08] Ned: They’re terrifying, is what they are.


[00:01:10] Chris: Well, there’s that.


[00:01:12] Ned: Being the non-rural boy that I am, the first time I encountered a cow up close, I was like, dear God, this thing is huge. It could kill me.


[00:01:23] Chris: Yeah. And a lot of times they have more energy than you would expect, especially if they have room to range. They’re kind of like 900-plus pound Labradors.


[00:01:34] Ned: [laugh] Yeah. And my Labrador, who is 90-something pounds, can already do a significant amount of damage to me by just, you know, stepping on my foot.


[00:01:47] Chris: Well, that’s willful. I trained her to do that.


[00:01:50] Ned: I know you did. And she really got me because I was, like, going down the stairs and so was she. And so it was like her full weight just right on my foot, and it was not comfortable. Anyway [laugh] , that has nothing to do with what we’re going to talk about today, because she is a natural intelligence. We’re going to talk about how artificial intelligence is going to destroy the internet, and really, that’s for the best.


[00:02:18] Chris: It actually fits with the intro.


[00:02:19] Ned: Yeah, I didn’t do that on purpose, but that has been, like, a prevailing theme of the entire week is, not just AI, it’s just technology is awful, and I hate it.


[00:02:30] Chris: Right. When in doubt, set the flag on fire.


[00:02:34] Ned: [laugh] That’s my grandma and your grandma, right?


[00:02:38] Chris: Two 90-year-old arsonists.


[00:02:40] Ned: [laugh] Oh, I don’t know. There’s something adorable about the idea of a grandmother who’s also an arsonist. Like, you’d think she’d be caught at this point, but no, no.


[00:02:50] Chris: Yeah, it sort of sounds like a book title, My Grandma the Arsonist.


[00:02:54] Ned: Oh, well, maybe we should stop doing this and go write a book.


[00:02:58] Chris: It’s about time we did something productive.


[00:03:01] Ned: [laugh] Fair. Well, this is going to be a little off the cuff, because it’s just ideas that have been swirling around in my head, which is a dangerous place for all of us to swim, yet I invite you to jump into the deep end. And, Chris, you’re going to try to throw people a lifesaver when I wander off too far. So I want to start with reminding everyone out there that the internet, as we know it today, is less than 30 years old. And I can probably make a pretty good argument that it’s less than 20 years old considering how people actually consume it today. And, if I could sum up two major themes of the internet era, it’s been acceleration and expansion. The internet and the applications that rely on it are bigger than anyone ever imagined, weirder than anyone could have comprehended, and the tech behind it is, let’s say, ever changing and evolving. Some of it relies on protocols and technologies that were developed 60 years ago, and some of it relies on protocols and technologies that were invented 6 minutes ago.


[00:04:10] Chris: That’s only slightly exaggerated.


[00:04:14] Ned: [laugh] I know. Right? There is a new JavaScript framework being developed every day, and I don’t know a lot about JavaScript except that it’s awful.


[00:04:25] Chris: Yeah. And then the other day, because YouTube likes to give me wonderful and weird suggestions, I watched a video about how JavaScript frameworks are now passe.


[00:04:37] Ned: [laugh] Of course they are. I will say I just redid my Ned in the Cloud site. Well, I had somebody else do it. Let’s be honest. And it uses a static site generator, and there’s a little bit of JavaScript on some of the pages to do, like, very specific things. But for the most part it’s all native HTML. And it’s really nice because it renders quickly. It loads quickly. It’s very tiny. I really like it. I know that doesn’t work for every website. Some really do need the heavier handedness of it. But compared to WordPress, which required its own, like, virtual private server to run on and many gigs of memory, this is just hosted on Netlify, and it’s static, so it basically takes nothing for them to host it. What was I talking about?


[00:05:29] Chris: JavaScript at some point.


[00:05:30] Ned: Okay [laugh] . What I want to say is we can track the evolution of the internet in a few major waves, and that there’s been change at different layers. But the thing I really want to focus on is how the waves seem to be coming faster and more furious, and AI is really going to fuck things up in a multitude of ways.


[00:05:52] Chris: Yeah. I mean it’s kind of a variation on a theme, which is that technology for the last 50, 60 years, has been moving faster and faster year over year than it ever has before. To paraphrase a great philosopher, it’s faster than it ever was, and now it’s even faster.


[00:06:11] Ned: [laugh] So I guess part of the question is, at what point do we outpace people’s ability to keep up with it? And I would say to a certain degree, in some regards, we already have. We have outpaced what a human can actually keep up with.


[00:06:27] Chris: Yeah. I was going to say, for this one, you have to give a competent definition of ‘people’.


[00:06:32] Ned: [laugh] Even a good programmer has no way of keeping up with all the different programming languages and web technologies out there. The only way to do this is to work as a team, where you have experts in different areas that work together to create a cohesive whole.


[00:06:49] Chris: Right.


[00:06:51] Ned: And it’s only getting more complexified-er.


[00:06:56] Chris: Good word.


[00:06:57] Ned: Thanks. So, all right. Let’s start with a little bit of history because that’s kind of what we tend to do. The internet, honestly, it’s already broken at many, many layers.


[00:07:07] Chris: That’s not history, that’s a conclusion.


[00:07:09] Ned: [laugh] Okay. All right. The premise is the internet was never meant for any of this.


[00:07:15] Chris: There we go.


[00:07:17] Ned: [laugh] I mean, you know, if you look back in the long scope of the internet, it all started as, like, ARPANET, and in the early days it was just a handful of systems that were connected together over dial-up connections. Everybody knew everyone. Like, if there was a problem, you could just literally call the person who ran the other system and be like, “Hey... there’s a problem.” And there was a certain element of trust between all these different nodes that were on the network. And this is where we established a few important standards, things like Internet Protocol, IP addressing, and how two things can connect. We created Ethernet for the actual physical connection. We created TCP to handle transmission control and BGP to handle the interaction and routing of traffic across autonomous systems, or ASes. And that’s kind of how the larger internet was birthed: we standardized on these protocols. But again, it was very small, and none of these protocols took security into account at all because you knew everybody. Who was going to do something weird with BGP and hijack your routes? That’s ridiculous. Why would anybody do that? We still use all those protocols.


[00:08:35] Chris: Yeah. I mean, when these things were created, it’s important to remember that the very first web directories were curated by hand.


[00:08:41] Ned: Yes.


[00:08:42] Chris: Like, it was just some guy who went, “One dot one dot one dot one; nothing? Okay. One dot one dot one dot two.”


[00:08:51] Ned: Well, yeah, that’s literally—there was a hosts file, and that hosts file—this was, like, pre-DNS. Right? So there was no server you could hit and say, “I have this name. What’s the IP address that corresponds to it?” There was just a file that someone kept up to date [laugh] .


[00:09:09] Chris: Right.


[00:09:10] Ned: And eventually that did have to be handed off to something slightly more centralized. And that’s how we got DNS. But again, DNS was created pre this larger era of the internet, so security was not taken into account at all. It was assumed that you could trust the DNS servers and that people weren’t going to do weird, malicious things, like stick something in the middle that would just lie to you about what the addresses were. So we didn’t have security to begin with, and we tacked on security, and it hasn’t been great. And we could talk through that for a while, but I think we all know that there have been major security issues with all these different protocols. And I didn’t even get to HTTP, which has its own host of issues, and it’s basically what the entire modern internet was based on is the invention of the hypertext transfer protocol and the language behind it, HTML. Okay, so that’s our little history lesson in terms of the underpinnings of the internet. It was never meant for any of this. It’s already broken in all these different layers, and there’s just decades of spackle [laugh] —been applied to all of these standards to keep everything working.
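[Editor's note: the hosts file Ned describes still exists on modern systems (`/etc/hosts` on Unix-like machines), and its format is essentially unchanged from the pre-DNS HOSTS.TXT era. Here's a minimal sketch of the name-to-address lookup it performed; the hostnames and addresses below are made up for illustration.]

```python
# A hosts file maps addresses to names as plain text, one entry per line.
# Before DNS, a single centrally maintained copy of a file like this
# (HOSTS.TXT) was distributed to every machine on the network.
HOSTS_TXT = """
10.0.0.1    gateway.example
10.0.0.2    mail.example     mailhost
# comments and blank lines are ignored
10.0.0.3    www.example
"""

def parse_hosts(text: str) -> dict[str, str]:
    """Build a name -> address lookup table from hosts-file-style text."""
    table = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # strip comments and whitespace
        if not line:
            continue
        addr, *names = line.split()
        for name in names:  # a line may list a canonical name plus aliases
            table[name.lower()] = addr
    return table

table = parse_hosts(HOSTS_TXT)
print(table["mailhost"])  # -> 10.0.0.2
```

DNS replaced the "one file somebody keeps up to date" model with a distributed, hierarchical version of the same lookup, but as Ned notes, it inherited the same assumption that everyone answering queries could be trusted.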


[00:10:25] Chris: Right. And we’ve done, you know, some different deep dives on different aspects of this. One that you didn’t mention but is in the same exact ballpark is email.


[00:10:34] Ned: I was going to get there.


[00:10:35] Chris: Oh, well then, I’ll shut up. Carry on.


[00:10:39] Ned: [laugh] Then, once the actual modern internet started, there became the question of, how do people use it? How do they find things? And we had the rise of internet service providers that created walled gardens. Right? So you had your AOL, your Prodigy. There was a bunch of other smaller ones. CompuServe was another one. So all of them said the internet’s a big scary place. So what we’re going to do is just create a walled garden environment where you can do things that are safe. And maybe we’ll help you find things that are out on the greater internet, but we’d really like to keep you here where, you know, we can manage things. And then Google and, like, Yahoo and other search engines happened, and people started to realize there is a much wider internet out there. And so the walled garden started to come down, and people just started to browse the internet using their web browser and these search engines to find cool sites. Did you ever—I did this, and I’m curious, Chris, if you ever did this. Did you ever just sit there, when you were in your, like, teenage years, and type in different web addresses to see if it was a real domain?


[00:11:54] Chris: Brave of you to think that I can remember that far back.


[00:11:58] Ned: [laugh] Because that’s something I did. I sat there, and I was like, I wonder if these are actual websites. And I just wrote down a whole bunch of websites, and then I went to each one, one by one, to see if it was a website. And that’s what we did for fun [laugh] in the late ‘90s. But that was kind of where we were at for a while, was things disaggregated and kind of blew up. And the search engine became the means by which you found a lot of things, but you also found a lot of things by word of mouth. You know, you were on an IRC channel, or you were on whatever predated Reddit, and someone would just share a site, and you’d go check it out. Or you had stuff like—what was that? StumbleUpon, which would just serve you up a random site. You had stuff like Digg, where people would upvote things. So you had these different aggregation platforms, but they were sending you to sort of a more decentralized internet. There’s a really good book called How the Internet Happened that sort of documents this phase of the internet. And that was—I don’t know, I kind of consider that to a certain degree the golden era of the internet, before Facebook and some of the other major social media companies stomped in and recreated the walled garden we had before.


[00:13:13] Chris: Right. People fondly recall it as the wild west period of the internet—


[00:13:19] Ned: Mm-hm


[00:13:19] Chris: —when the most dramatic thing you could do is have a Blogspot and a LiveJournal.


[00:13:25] Ned: Whoa. [laugh] Yeah, exactly. And then we had this sort of, like, consolidation of things, in part because the bigger media organizations finally got their act together and realized the internet was a real thing. And so they started pouring money into it. And a lot of that money went to gamifying the search algorithms to optimize their sites to have the best SEO. So now Google started to break because organic search stopped working as well. People were keyword loading, and then when that stopped working, they would find other ways to increase the search engine optimization of their sites. And the companies that had the most money to do that were these traditional media players. And then we also had the rise of stuff like Facebook, where Facebook’s goal is to keep you captive on Facebook for as long as possible because everything is ad driven, and retaining eyeballs for as long as possible is the end goal. And that’s true of every platform that I can think of.


[00:14:37] Chris: Yeah. Depressing, but accurate.


[00:14:41] Ned: Once you understand that, you can understand why sites do things like punish you in the ranking algorithm if you link to an external site. So, for instance, if I, on LinkedIn, write an article on LinkedIn and then write a post promoting that article, that will do pretty well in the rankings. That’ll get pushed up. But, if instead I write a quick post that links to a blog post I wrote on my own site, that will get pushed down in the ranking.


[00:15:15] Chris: Favoritism towards things that keep you within the walled garden.


[00:15:20] Ned: Right.


[00:15:20] Chris: LinkedIn, in this case, which was never intended to be a social media platform.


[00:15:25] Ned: [laugh] Not originally, but it basically is now, more so since the implosion of Twitter.


[00:15:33] Chris: Right. That’s fair.


[00:15:35] Ned: But they all follow this model because they make their money off of ads. Keeping you on the app or keeping you on the website for as long as possible is their goal because that’s more ads they can serve to you, whether you care about those ads or not. And that’s the experience on everything, whether it’s Instagram or Twitter or, I don’t know, YouTube even. Like, all of them, their only goal is to keep you in that portal for as long as possible. And it sucks. And AI is going to make it worse.


[00:16:07] Chris: Mm-hm. Mm-hm. Mm-hm.


[00:16:09] Ned: I think what’s happened is we’ve developed a lack of trust between those who create and, more importantly, publish content and those who consume it. It’s become almost adversarial to a certain degree. And I feel like we’ve seen this pattern repeat over and over, not just for places that serve up content, but for all kinds of different technologies. So, for instance, to go back to the roots of the internet and BGP, BGP used to be fun for some approximation of fun, and then we ruined it because people would start doing things like injecting and hijacking routes, and then we had to develop technologies to fix that. Email used to be fun, and then we ruined it—and when I say “we,” I mean just everybody—with things like spam. Social media used to be fun, and then we ruined it with ads and spam and disinformation. I’m sure you’ve seen the talk that Cory Doctorow gave around the enshittification of platforms.


[00:17:21] Chris: Absolute masterclass.


[00:17:23] Ned: Yeah. So, if you want to know more about that idea and what he had to say about it, we’ll have a link in the [show notes] . Definitely watch the talk, or there’s an accompanying article that he wrote I forget for which website it was. I think it got reposted on Boing Boing or something. But, anyway, that term has actually made its way into the general population because it so perfectly captures what has happened with each of these different platforms for technology. It starts off exciting and new and shiny. It attracts a bunch of people because it actually provides some value or a useful service. And then, over time, it starts mistreating those people because it’s beholden to earning revenue, and the way it’s going to do that is by serving ads or selling your data or both. And, over time, the platform becomes worse and worse because you are not the customer; you become the product, and they’re selling ‘you’ as a person to other things. And so the whole thing gets shittier and shittier until we eventually abandon that platform and go to a new platform where the cycle repeats.


[00:18:38] Chris: Right. I mean, the biggest problem there is—I mean, you talked about a couple of different things. One is the inherent insecurity of protocols that have never been adequately replaced. They’ve only been patched.


[00:18:49] Ned: Right.


[00:18:50] Chris: But the other thing is, people put things online for fun in the good old days.


[00:18:54] Ned: [laugh] .


[00:18:55] Chris: And then, when things become more corporate, the goal becomes profit, which is fine. This is, you know, a profit-driven economy and all that. But the only thing that people have come up with to make a reliable profit is advertising.


[00:19:10] Ned: Mm-hm


[00:19:11] Chris: And advertising as much as humanly possible, using advertising for known bad malware, known bad products, known scams, all in the name of more and more eyeballs, which means more and more dollars. So you also end up with this vicious cycle of people only relying on advertising to make a profit but not policing advertising itself. Therefore, people trust advertising even less, forcing people to use ad blockers, which makes the advertisers even more desperate.


[00:19:45] Ned: [laugh] .


[00:19:45] Chris: And thus that cycle continues.


[00:19:48] Ned: Right. I do have a point I want to make about how the newer generation is used to ads, and they have become pretty good at ignoring them for the most part and just accepting that they’re there. But the other thing I want to say is, like, it’s not that advertising is inherently bad. I mean, it can be distasteful. You can do too much of it. You can do the wrong type of it. But it’s not like—I’m not going to get on my high horse and say that all ads are terrible, since part of the way that I make a living is by [laugh] having vendors buy ads or sponsor episodes. So, like, it’s not—


[00:20:24] Chris: Yeah, that’s fair.


[00:20:25] Ned: It’s not terrible. It’s just, like, there’s a responsible way to do it, and there is an irresponsible way to do it. And this goes back to the email used to be fun, and we ruined it. Social media was fun, and we ruined it. Like, content creation was fun, and then we ruined it. And AI, well, we talked about how, like, the—there’s the accelerating waves of change and cycles that are happening with technology, and this enshittification of things feels like it’s been accelerating. The average lifespan of a given platform I feel like has been shrinking over the last 20 years. AI is going to make everything that we just talked about worse.


[00:21:08] Chris: Agreed.


[00:21:10] Ned: Because it can do stuff at scale, massive scale. So starting with just the core foundational technologies that underpin the internet, what is AI really—one of the things it’s really good at is writing programs. And you can—with a little bit of engineering, you can get it to write malicious programs. You can get it to be adversarial in nature and just try to break stuff. And it can do it at a speed and a scale that is much faster than some of the traditional ways of hacking.


[00:21:50] Chris: Right. And that’s just assuming that you’re only using ChatGPT.


[00:21:55] Ned: [laugh] Right.


[00:21:57] Chris: If you go out of your way to program specifically based on malicious programs and malicious ideals, it gets even easier.


[00:22:08] Ned: When you consider that the largest sort of malware and all these different kind of scams, this whole criminal underground, right, is largely funded by well-organized institutions. Like, this is not like your brother sitting in a basement hacking away. We’re talking about people that have, like, corporate structures and charters, and, like, this is what they do.


[00:22:36] Chris: They can afford it.


[00:22:37] Ned: They can afford to train. Especially as AI becomes available as a service, they can afford to train a large language model, specifically on hacking-related or programming-related stuff, and remove the safeguards that would typically be on a ChatGPT. Like, if I ask ChatGPT to write me some malicious software, it’s going to say, “I can’t do that.” They’ve added protections. With a private AI model, you can remove those protections. So I think, as the cost of training these language models drops, we are going to see a rise in cyberattacks and cybercrime because it’s going to become that much easier to develop these exploits. So that’s one way it’s going to break everything. The next one is content farms.


[00:23:34] Chris: [sigh] .


[00:23:34] Ned: [laugh] I think, previously, the main thing holding content farms back was that humans, in theory, had to write the copy and post the copy behind these content farms. You could use AI a little bit. ChatGPT has made it trivial to write just millions of lines of whatever and publish that whatever at scale. You can just spew out content day in, day out, using ChatGPT or any of these other tools. So you don’t need an informed human or even a semi-informed human to write this content anymore. You can just have AI write it for you and stuff your website full of it.


[00:24:22] Chris: And if you do any Google searches at all, you’ll find evidence of this.


[00:24:26] Ned: Yes. Same thing is now happening with video. So there are video services that will say we can help you generate your videos using AI and stock footage. So you literally write a script, probably have ChatGPT write [laugh] the script for you, and then you submit that script to the service, and it uses an AI voice to read the script along with some images that you might submit and stock footage to create a video. And then you can have it do that 20 times to create 20 variations on the same video and then publish all of them to different YouTube channels. And then you can have AI help you write the most optimized titles for all of these different videos and the thumbnails for them. And, if you don’t care about stealing other people’s work, which you probably don’t, you can also crib somebody else’s videos while you’re at it. And all of this is low effort, fairly easy to do at a massive scale.


[00:25:34] Chris: And you’re also neglecting having bot farms then populate the comments—


[00:25:39] Ned: [laugh] .


[00:25:40] Chris: —to simulate engagement.


[00:25:42] Ned: Thank you. I forgot about that. [sigh] So the problem with all of these content creation engines is it ruins the search algorithm. Google has not caught up with this. YouTube has not caught up with this. Facebook—like, none of them have caught up with being able to detect AI generated content and filter it out of the search results. And they don’t really want to because they’re not incentivized to do that. Because if that content keeps people on the platform for longer, they’re happy.


[00:26:13] Chris: Even if those people are quote unquote “people.”


[00:26:16] Ned: [laugh] .


[00:26:17] Chris: And by that I mean “not people.” A click is a click.


[00:26:21] Ned: A click is a click. A view is a view.


[00:26:23] Chris: Even if it happens exactly every point two six seconds from now until the end of time.


[00:26:30] Ned: I will say that advertisers over time have demanded that, like, platforms like YouTube and Twitter and others detect and remove bot farms that are artificially inflating things. Because they—the advertising companies want real people engaging with those ads and buying their products. And these bot farms are not going to do that. So that’s where the pressure actually comes from to fix things. But AI is going to make it so much worse.


[00:27:01] Chris: Right. And not helping is the fact that those platforms are not going to try super hard.


[00:27:08] Ned: [laugh] They’ll try just hard enough to keep the advertisers.


[00:27:12] Chris: Wasn’t the report last year that, on a conservative scale, 80 percent of all traffic on Twitter was automated?


[00:27:21] Ned: [laugh] I don’t know about that. Yeah. That seems too high. But I’m sure it’s a non-trivial amount of traffic and users on Twitter are not humans. I’m sure that hasn’t changed, either. So what’s the future look like for the average person? And then how do we fight this? And I have a few theories. I’m curious to hear your thoughts. In terms of how I think things are going to turn the tide is moving back to a more decentralized setup. It’s not that we’re not going to use some of these platforms like YouTube or LinkedIn to post our content, but I think what’s going to happen is, like, I’ve been paying more attention to my website, my personal website now, because that’s a thing I control, and I control what goes on that, and I control what doesn’t go on that more importantly. Whereas I have no control over LinkedIn or YouTube and the algorithms that get posted there. But again, what I do have control over is what gets posted on my channel. And what I think is going to happen is—and this is probably already happening—people are going to develop a list of trusted, real people that they believe are creating legitimate content and just focus on those people and anything that those people share. So, to a certain degree, influencers are going to become even more important than they already are because people can’t trust the search algorithm. So what can they trust? A human who actually curates a collection of content for them that they might enjoy, and it’s not AI creating all this stuff.


[00:29:03] Chris: I think that you’re right. I think that you’re already overestimating or underestimating how long it’s going to take for that to happen. Because I think, first of all, people need to understand the problem. And right now, I think there’s just not a lot of discernment still because it’s a lot easier to do a YouTube search, find that perfectly curated title, and not realize until you’re halfway through the video that you’ve not seen a human face talking.


[00:29:34] Ned: And even if you have, it might be an AI human.


[00:29:37] Chris: Right. And it’s like right now you can still pick out—every once in a while, an AI voice will pronounce a word weird.


[00:29:44] Ned: [laugh] Hey... I was watching a video on—I’m trying to remember exactly what the topic was. It was something having to do with hydrogen powered cars or something. And two minutes into the video, I’m like, I think this whole thing’s written with AI because I’m two minutes in and it has yet to say anything relevant. Like, it’s just—it’s doing that thing where AI repeats the same phrase in three different ways.


[00:30:10] Chris: Yep.


[00:30:11] Ned: Or uses the same start to each sentence. And that’s like a telltale mark. And I realized pretty quickly, oh, this whole video has been generated using AI. And for that reason, I probably can’t trust anything that’s on it. And it was a channel I don’t normally watch, so to that—and I was like, well, if I want to know more about this hydrogen powered car thing, I need to go find someone who’s a real human that I trust or that somebody else refers me to. So I think that’s one big trend, is that the individual humans who either create the content and are trusted or curate a collection of things for others is kind of where it’s going to be at for a little while.


[00:30:58] Chris: Yep. I just think you’re being too optimistic here. I think that AI is going to ruin everything first, and then we’re going to have to crawl back through the rubble to get to someplace reasonable like what you’re talking about. And this is especially prevalent in areas where it’s not just, like, opinions. Like, Kyle Hill has a great video about how these unscrupulous YouTube farms are creating copies of work done by real creators, basically stealing their script, throwing it into an AI generator, saying rewrite this, and then creating a YouTube video exactly like you described earlier, posting it in ten different channels, and then just sitting there and taking ad revenue from the person that busted their ass to build it in the first place.


[00:31:43] Ned: Right. And my point is—


[00:31:43] Chris: It’s gotten to the point where a lot of influencers, which is a term that I hate, are starting to quit YouTube because it’s just not worth the aggravation.


[00:31:52] Ned: Right. And they may end up on a platform that is—it doesn’t allow just anyone to submit and has maybe a paywall behind it. For instance, I use Nebula to watch some of the content creators because I know that Nebula is all curated. You have to apply to be a creator on Nebula. It’s a paid service. There’s no ads. And I know that if I go and watch a video there, it’s not going to be AI generated, and the money actually goes to the person creating this stuff. It’s kind of like the Patreon thing all over again. Right?


[00:32:29] Chris: Well, that’s the problem. The problem and solution are hand in hand. The problem is being driven by unrelenting desire for eyeballs, no matter where they come from. And the reason that that happens is because advertising drives all revenue. And the reason that that happens is that we don’t have a good way to do micropayments online that people trust or think is convenient enough to use.


[00:32:53] Ned: Right. But we’ve gotten used to subscriptions. And to a certain degree, I think that’s kind of where some of it might go. But the other thing I’m going to say is that, while AI is going to destroy and burn down the internet as we know it, it is also probably going to be the thing we end up using to build back up.


[00:33:14] Chris: Right.


[00:33:15] Ned: We’re going to have this massive decentralization of content and platforms and things of interest and applications. We already have this with SaaS. If you think about all the different SaaS products that are out there to do things and how none of them really work together, even if they should, with the exception of stuff that’s all under the same umbrella, like all of Microsoft’s products, which they mostly work together. Debatable, but I don’t want to get into that. It would be really nice if there was something that glued all that stuff together. And the thing that’s going to glue all that stuff together is probably going to be AI, but AI working for us, as opposed to working for these platform companies or the people generating endless content. And so personalized AI, I feel like, is the next step in the evolution. We’ve tried personalized AI a few times. It’s always been terrible. We might finally be hitting the point where it’s not terrible, less terrible-er. Huh?


[00:34:21] Chris: I’ll allow it.


[00:34:22] Ned: Okay [laugh] . And kind of what made me think of this is, at CES a couple of weeks ago—and we covered this last week—there was a new product category introduced called the Rabbit R1. And, while I don’t think that’s going to be the be-all and end-all, they had an interesting approach, which was: all these disparate things kind of suck. Having 100 apps on your phone kind of sucks. What if we could have an AI intelligently work with those different apps to accomplish a goal for you, and train it in such a way that it can just navigate the UI of all these different apps, so it’s not relying on each app writing an API for it to work with? And that’s kind of what they’ve tried to build, is an operating system that has both large language models and—I forget what they called the other one—like, large—


[00:35:17] Chris: Large action model?


[00:35:18] Ned: Action model, yeah. It has these two components to it that allow you to train it if it’s unfamiliar with an app, and also just understand the most popular apps out of the box. So, when you want to listen to a song or you want to book a trip somewhere, it can talk to American Airlines and Hilton and Uber and, like, book the whole thing for you and understand the context of you and how to actually perform these actions on the various apps instead of having you go in and do everything yourself. I can imagine extending that model out to “go find me a collection of videos on this topic, and make sure they’re not AI generated.”


[00:36:01] Chris: Right.


[00:36:01] Ned: And maybe it’ll be able to do that.


[00:36:05] Chris: Yeah, I mean, what you’re describing is a curating tool that is completely decentralized and completely fine-tuned by the end user.


[00:36:15] Ned: Right. And it’s not beholden to optimizing for advertisers. It’s optimized to make you use that tool more. And so it’s in their best interest to have the tool tune itself to be your indispensable personal assistant, as opposed to assisting you in finding more ads.


[00:36:39] Chris: Right.


[00:36:41] Ned: So anyway, I ordered one. I don’t know when it’s going to ship, but it was only $200, which is a lot less than the $600 iPhone that launched back in 2005 or whatever. And I feel—I felt okay about it [laugh] .


[00:36:56] Chris: Yeah, we’ll see. I am curious to start getting real world reports, because based on the marketing, it seems like it’s a good idea. And it seems more coherent than any other product like it so far.


[00:37:11] Ned: Right.


[00:37:12] Chris: Now, how well it works and any outcomes in particular around security, I’m super curious about.


[00:37:20] Ned: Yeah.


[00:37:20] Chris: And how they’re going to sustain revenue when they’re selling a product for 200-odd dollars with no subscription.


[00:37:26] Ned: There will be a subscription of some kind. I guarantee it. But it not requiring a subscription from the start is really nice.


[00:37:36] Chris: Yeah.


[00:37:37] Ned: Because the closest thing we could compare it to is the Humane Ai Pin, and that thing was what, like $600 or $700?


[00:37:46] Chris: With a subscription.


[00:37:47] Ned: Yeah. And you had to have the subscription for it to work at all.


[00:37:51] Chris: Right.


[00:37:53] Ned: This is a little bit different. Also, the guy who was doing the presentation was actually personable and, like, interesting [laugh] .


[00:38:00] Chris: Oh, the R1? Yeah.


[00:38:01] Ned: Yeah [laugh] . But, yeah, I think my ultimate conclusion to this whole thing is that the internet is broken. It has been broken for a while. The breakage is going to accelerate to potentially what looks like a full collapse under the weight of AI. And, from the rubble, AI is going to help us rebuild possibly something better.


[00:38:26] Chris: Skynet 2.0. This time without ads.


[00:38:31] Ned: [laugh] Well, at least it doesn’t show me an ad before it kills me. Hey... thanks for listening or something. I guess you found it worthwhile enough if you made it all the way to the end. So congratulations to you, friends. You accomplished something today. Now you can sit on the couch, surf YouTube, and watch some AI-generated video fodder for your personal enjoyment. You’ve earned it. You can find more about the show by visiting our LinkedIn page. Just search Chaos Lever, or go to our website, chaoslever dot cow, where you’ll find show notes, blog posts, and general tomfoolery. We’ll be back next week to see what fresh hell is upon us. Ta ta for now.


[00:39:16] Chris: Yeah, I mean I’m curious about the Rabbit R1 for sure. But this might shock people. I’m not usually a technology early adopter. So I’m just going to wait three months until the Rabbit R100 comes out.


[00:39:29] Ned: Or you just wait until I try it and tell you whether it’s good or not.


[00:39:34] Chris: Yeah, but that would require me to trust your opinion.