I would bet that it's far lower now. Inference is expensive, but we've made extraordinary efficiency gains through techniques like distillation. That said, GPT-5 is a reasoning model, and those are notorious for high token burn. So who knows, it could be a wash. But selective pressures to optimize for scale/growth/revenue/independence from MSFT/etc make me think that OpenAI is chasing those watt-hours pretty doggedly. So 0.34 is probably high...
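For a sense of scale, here's a quick back-of-envelope on what 0.34 Wh per query adds up to. The query volume is my assumption (a round 1 billion/day), not a figure from anyone in this thread:

```python
# Back-of-envelope: fleet-wide energy at 0.34 Wh per query.
WH_PER_QUERY = 0.34            # the per-query figure discussed above
QUERIES_PER_DAY = 1_000_000_000  # assumed round number, not a real stat

daily_kwh = WH_PER_QUERY * QUERIES_PER_DAY / 1_000
yearly_gwh = daily_kwh * 365 / 1_000_000
print(f"{daily_kwh:,.0f} kWh/day, ~{yearly_gwh:.0f} GWh/year")
# -> 340,000 kWh/day, ~124 GWh/year
```

Even at a billion queries a day that's on the order of 100 GWh/year, which is why the per-query number matters so much: small efficiency gains multiply across enormous volume.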
a) training is where the bulk of an AI system's energy usage goes (based on a report released by Mistral)
b) video generation is very likely a few orders of magnitude more expensive than text generation.
That said, I still believe that data centres in general - including AI ones - don't consume a significant amount of energy compared with everything else we do, especially heating and cooling and transport.
Pre-LLM data centres consume about 1% of the world's electricity; AI data centres may bump that up to 2%.
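A rough sanity check on that 1% → 2% claim. All three input figures here are my own ballpark assumptions (global electricity use, non-AI data centre load, and AI load), chosen only to show the arithmetic, not sourced numbers:

```python
# Sanity check: do plausible TWh figures land near the 1% -> 2% range?
GLOBAL_TWH = 27_000  # assumed: rough global electricity generation per year
DC_TWH = 300         # assumed: pre-LLM data centre consumption per year
AI_TWH = 300         # assumed: additional AI data centre consumption

baseline_pct = 100 * DC_TWH / GLOBAL_TWH
with_ai_pct = 100 * (DC_TWH + AI_TWH) / GLOBAL_TWH
print(f"data centres: {baseline_pct:.1f}% -> {with_ai_pct:.1f}% with AI")
# -> data centres: 1.1% -> 2.2% with AI
```

So the claim hangs together if AI roughly doubles existing data centre load, which is the interesting question, not the percentages themselves.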
You gotta start thinking about the energy used to mine and refine the raw materials used to make the chips and GPUs. Then take into account the infrastructure and data centers.
...but then Sora came out.