Featured in this episode of Tech News of the Week
Because of course they are. Note that this is apparently completely separate from Microsoft's own AI-chip efforts. Note also that it doesn't make much sense for OpenAI to do this: CEO Sam Altman's chief complaint is that there's a global scarcity of GPUs, which is true. But designing and building their own chips would take OpenAI far longer than simply signing a deal with one of the companies that already does this, because building your own chips is hard.
Facebook, in particular, has tried and failed at this, because of course they did. Still, it makes sense that OpenAI is getting a tad desperate. ChatGPT, unquestionably the company's biggest success, is really, really computationally expensive to run. According to analyst firm Bernstein, each query costs about 4 cents, which would mean "If ChatGPT queries grow to a tenth the scale of Google search, it would require roughly $48.1 billion worth of GPUs initially and about $16 billion worth of chips a year to keep operational." So… that's a lot.
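For a sanity check on Bernstein's yearly figure, here's a back-of-envelope sketch. The ~9 billion Google searches per day is my own assumed input, not something from Bernstein or the article; only the 4-cents-per-query number comes from the quote above.

```python
# Rough sanity check on Bernstein's estimate.
GOOGLE_QUERIES_PER_DAY = 9e9   # assumption: ~9B Google searches/day
COST_PER_QUERY = 0.04          # 4 cents per ChatGPT query, per Bernstein

# "a tenth the scale of Google search"
chatgpt_queries_per_day = GOOGLE_QUERIES_PER_DAY / 10
daily_cost = chatgpt_queries_per_day * COST_PER_QUERY
annual_cost = daily_cost * 365

print(f"${daily_cost / 1e6:.0f}M per day, ${annual_cost / 1e9:.1f}B per year")
# → $36M per day, $13.1B per year
```

That ~$13B/year in raw inference cost lands in the same ballpark as Bernstein's ~$16B/year chip figure, which at least makes the quoted numbers feel internally consistent.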
OpenAI is also reportedly looking into acquiring a company that builds chips, but I dunno. There's just no way some kind of unicorn chip gets created in the short term, right? Right? Something tells me we're going to be dealing with this shortage for the rest of the decade.