Notes from this interesting BG2 podcast: AI Demand / Supply – Models, Agents, the $2T Compute Build Out, Need for More Nuclear & More
A very good discussion. Some notes below.
Some would argue that while LLMs are indeed good at coding, chatting, etc., that "amazingness" was extrapolated too far. In other words, the leap from pre-ChatGPT models to ChatGPT is not going to repeat itself. Expectations are now so high that the room to beat them is diminishing.
Some would question the "moat" of most LLMs. The most advanced, "leading edge" models are good, but the rest could be commoditized. In that case, all those R&D dollars are more or less duplicated and the return is low.
GPT-5 is undoubtedly powerful, and could be more powerful than people think, but what's next? What cards haven't been played, or even thought of, after GPT-5?
Beyond technological advancement, there seems to be a problem with the regulatory environment for nuclear power in the US, which imposes a lot of restrictions.
By contrast, France and China have lowered nuclear power costs over the years.