We finally reached critical mass on seeing money as an arbitrary construct, so now we're converting it into real, physics-based heat. Entropy at its finest.
I've been saying roughly the same thing about cryptocurrencies (just a good way to waste fuckhuge amounts of resources on digital tulips to enable crime), but it never seems to stop anyone from plowing ahead on being stupid.
Water is one I really don't get, if anyone would be willing to explain it to me. Is it because of manufacturing? We already have these manufacturing plants; it's not like more are being started, since that process takes years to decades. Hell, Intel couldn't even finish the few they were working on. Because of concrete for datacenters? I'm sure developing countries are using significantly more of that to build houses, so much so that it completely eclipses the one or two datacenters you might see built in the next 5 years in your area. Power generation? That water goes straight back to the atmosphere... water cycle. Even on the open loops, water becomes steam, which becomes rain.
And the water is being pumped out of aquifers that aren't being refilled. That water took thousands, if not millions, of years to get there. Once gone, the aquifers collapse and cannot be replenished, since the space once occupied by the water compacts into solid ground.
The issue is that they need fresh water for evaporative cooling.
Much of the world experiences fresh water scarcity, so it's not the best ethics to divert this resource from people in need to tech of uncertain value.
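For a rough sense of the scale involved, here's a back-of-the-envelope sketch using only the latent heat of vaporization of water (~2.26 MJ/kg). Real cooling towers consume more than this thermodynamic floor, since they also lose water to drift and blowdown:

```python
# Back-of-the-envelope: minimum water evaporated per MWh of heat
# rejected by an evaporative cooling system. Real towers use more
# (drift losses, blowdown); this is just the thermodynamic floor.

LATENT_HEAT_J_PER_KG = 2.26e6  # latent heat of vaporization of water
MWH_IN_JOULES = 3.6e9          # 1 MWh = 3.6e9 J

kg_per_mwh = MWH_IN_JOULES / LATENT_HEAT_J_PER_KG
print(f"~{kg_per_mwh:.0f} kg (liters) of water evaporated per MWh of heat")
# → ~1593 kg
```

So every megawatt-hour of heat rejected evaporatively consumes at minimum about 1.6 tonnes of fresh water, which is why the siting of these facilities matters so much.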
Check out the impact xAI is having on environment and health in Memphis if you want to go further down the rabbit hole.
Maybe people shouldn't be building data centers in deserts. In Toronto, the Deep Lake Water Cooling system draws cold water from Lake Ontario (the city's drinking-water intake) to cool a number of buildings, most notably 151 Front Street West, which houses the data centers routing most of the Internet in Ontario.
For all the issues with power use, I do grant that a lot of materials and labor are needed at the same time. It isn't JUST burning up cash to make Sora videos.
But by the same measure, the Great Egyptian pyramids would have been a huge boon for workers even if the final product didn't achieve much for most people.
Thanks for the correction. The experts think the builders were not slaves. Their life in the barracks was too good for slaves, and they were treated similarly well in death (which was especially important there).
I have a hypothesis that it was AI, not COVID/sanctions/etc, that was mainly responsible for the 2020-to-ongoing "chip shortage." Ignoring companies with their own fabs (Intel) and companies with pre-existing reserved-capacity contracts with fabs (Apple), everyone else is stuck waiting in line behind batch after batch of fab orders from Nvidia.
Downstream of that, AI is effectively also responsible for the current generation of game consoles never declining in price.
Because game consoles are fixed platforms that continue to be manufactured over 5+ years, the most expensive parts in the system (the CPU and GPU) would normally get cheaper to manufacture [and in turn, cheaper to buy] over the course of the console's lifetime, which was often passed on to the consumer in the form of a gradually decreasing MSRP. Either the process node for the console's silicon design would stay fixed, and demand for that node would gradually decrease as larger fab customers moved on to newer nodes, decreasing the (effectively auction-based) pricing for fab time on the older node; or the console manufacturer['s silicon vendor] would put the silicon design through a process shrink and, while still paying top dollar for the fab's newest node, would get more chips per wafer out of it, and again could charge less.
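The "more chips per wafer" effect can be sketched with the standard dies-per-wafer approximation. The die sizes below are purely illustrative (a hypothetical console SoC on a 300 mm wafer, with a ~40% area shrink), not actual figures for any real chip:

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Standard approximation: usable dies = wafer area / die area,
    minus an edge-loss term proportional to the wafer circumference."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

old = dies_per_wafer(300, 300)  # hypothetical SoC on the old node
new = dies_per_wafer(300, 180)  # same design after a ~40% area shrink
print(old, new)  # → 197 343
```

With wafer cost roughly fixed per run, getting ~1.7x the dies per wafer is what historically let per-chip cost fall even while paying leading-edge prices.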
But instead, what we've seen since the start of the AI boom is that there are no longer any price-reduced timeslots to be sold for manufacturing the low-BOM parts these price-sensitive console-maker customers need. Both Nvidia and AMD are now getting such high value out of even the older nodes that 1. the fabs know they can squeeze them, charging full price for those slots as well; and 2. both Nvidia and AMD, in their roles as silicon vendors to the console makers, haven't been able to justify using (very much of) the fab time they're paying so much for to produce low-BOM-cost parts against their pre-existing outstanding purchase orders, when they could instead be fulfilling much-higher-margin new POs from the hyperscalers.
Thus every console of the ninth generation (PS5, Xbox Series, Switch) is still selling at its launch price (with no help from process shrinks); thus none of these consoles has been produced in excess of demand to the point that supply and demand ever drove retail prices below MSRP; and thus the only tenth-gen console so far, the Switch 2, took ~4 years longer than anticipated to release†.
---
† Nintendo were very likely waiting for Nvidia to run off enough of the Tegra T239 with a sufficiently low passed-on fab cost, for Nintendo to both 1. be able to build a backstock and non-run-dry pipeline of Switch 2s, and 2. be able to be positive-margin on charging the same price as the Switch 1 for them. They waited four years, and neither thing ever happened; so they eventually just gave up and priced the Switch 2 higher, and also built out an entirely novel D2C + online-marketplace-partner distribution pipeline so they could ration the tiny initial supply of units they had been able to build with the chips Nvidia had supplied them so far.
Though, that being said, Nintendo actually got a double whammy here. They were also waiting for fast NAND to come down in price, so that they could have physical game cards manufactured for a trivial BOM price while still enabling the "direct GPU disk-streamed assets" pipeline that games of the last generation had begun relying on. Obviously, as today's article points out, that hasn't been happening either! Thus game key cards; thus SD Express cards only beginning to trickle out, with no sizes above 128GiB available at the Switch 2's launch time; and thus those SD Express cards being ridiculously priced for their capacity compared to equivalent transfer-speed + die-size NAND (as seen in e.g. low-profile/flush-mount flash drives.)
> I have a hypothesis that it was AI, not COVID/sanctions/etc, that was mainly responsible for the 2020-to-ongoing "chip shortage."
Might be for higher-end chips like GPUs, but for smaller ones like microcontrollers it was very much COVID that threw a huge wrench in the gears.
Things are closer to normal now, but the toilet-paper effect hit the electronics supply chain hard during COVID. People resorted to buying dev boards just to desolder the microcontrollers for use in their commercial products, and similar desperate moves. With lead times measured in months to years even when factories are operational, sudden hoarding is not what you want.
It wasn't COVID directly, it was the car companies canceling and then uncanceling their massive chip orders that caused the shock in the semiconductor supply chain.
Something I think contributes is that TSMC seem to be so good at making chips now that the defect rate must be low enough to produce, in volume, huge chips that a decade ago would have been burning money. If you can do that, why bother making less lucrative smaller chips? And while there are still benefits to shrinking an existing chip with a revised design, I'd guess we're into diminishing returns for what it offers a console, whether the chip itself gets cheaper or the shrink lets them cut costs elsewhere in the bill of materials.
This is also the first generation where prices have gone up IIRC.
The defect rate always drops like that. But usually the big lucrative chips move on to the next node, and instead they're filling up more nodes in parallel.
The main cure is to build more fabs, but that's hard too, and they're not in a rush to ruin their own margins by scaling up too fast in the face of an uncertain future.
That sounds reasonable, but I suspect it was a bit of both: initially a COVID slowdown, followed by a surge in demand. A lot of older nodes (300-400nm) were shut down during the COVID slowdown, and a lot of chips ended up having to be moved over to newer nodes once demand picked up again. That led to a big swing in the flow of production and would have had knock-on effects.
Combine that with a huge GPU boom and you have the setup for production issues.
But the basic chips, like the ones that go into cars, are the ones that were the most scarce--the low-margin stuff was not being made. You're saying that's because of Nvidia? I'm not totally tracking here.
The game console thing is also related to monetary inflation--the price of everything just keeps jumping.
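To put a number on that (with a hypothetical cumulative inflation figure standing in for actual CPI, which depends on the period and country), a console holding its nominal launch MSRP is actually getting cheaper in real terms:

```python
# A flat nominal price is a falling real price under inflation.
# The 22% cumulative inflation here is a hypothetical stand-in,
# roughly in the ballpark of US CPI over 2020-2024.
launch_msrp = 499.0          # nominal launch price, dollars
cumulative_inflation = 0.22  # hypothetical cumulative CPI over the period

real_price_today = launch_msrp / (1 + cumulative_inflation)
print(f"${launch_msrp:.0f} nominal is ~${real_price_today:.0f} in launch-year dollars")
# → $499 nominal is ~$409 in launch-year dollars
```

So even a console that never sees an official price cut has, in real terms, already dropped meaningfully in price.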
We should be making some effort to quantify the amount and cost of slop produced by both AI and simpler automated systems (spinners etc), it's a huge negative externality.
* HDDs
* SSDs
* DRAM
* GPUs, obviously
* Power to hook up to our datacenters
Anything I'm missing? What a crazy world we're living in.