Semi-non-sequitur argument aside, your math seems to be off.
I double-checked my quick phone calculations, and using the figures provided, Rockstar Games' office-space energy use is roughly 18,000,000 (18 million) kWh, not 18,000,000,000 (18 billion).
I put the final answer in watt-hours, not kilowatt-hours, to match. ChatGPT used 10 billion watt-hours, not 10 billion kilowatt-hours.
Ahh, I was wondering where the factor of 1,000 came from.
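For anyone else following the back-and-forth, the whole disagreement is just a Wh-versus-kWh mix-up. A minimal sketch of the unit math, using the 18 million kWh and 10 billion Wh figures quoted above (everything else is plain arithmetic):

```python
# 1 kWh = 1,000 Wh. Put both figures in the same unit before comparing.
rockstar_kwh = 18_000_000        # ~18 million kWh (figure from the thread)
chatgpt_wh = 10_000_000_000      # ~10 billion Wh (figure from the thread)

rockstar_wh = rockstar_kwh * 1_000   # 18 billion Wh
chatgpt_kwh = chatgpt_wh / 1_000     # 10 million kWh

print(f"Rockstar: {rockstar_wh:>14,.0f} Wh   ChatGPT: {chatgpt_wh:>14,.0f} Wh")
print(f"Rockstar: {rockstar_kwh:>14,.0f} kWh  ChatGPT: {chatgpt_kwh:>14,.0f} kWh")
# Comparing 18 million kWh against 10 billion Wh without converting makes
# the numbers look three orders of magnitude apart when they aren't.
```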
Without turning this into a complete shootout, I can sort of see the point of comparing energy usage, but as others have said, with these massive data centers it's like comparing two similar but ultimately very different beasts.
Beyond the energy used to train generative AI models in data centers, there's also the energy needed to fulfill requests once the models are deployed (running 24/7, serving thousands of prompts per second).
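To put rough numbers on that inference side, here is a back-of-envelope sketch; the per-prompt energy and request rate are illustrative assumptions, not measured figures:

```python
# Back-of-envelope inference energy. Both inputs are assumptions for
# illustration; real figures vary widely by model, hardware, and load.
wh_per_prompt = 3.0         # assumed energy per request, in Wh
prompts_per_second = 1_000  # assumed steady request rate

seconds_per_year = 365 * 24 * 3600
annual_wh = wh_per_prompt * prompts_per_second * seconds_per_year
print(f"~{annual_wh / 1e9:,.0f} GWh/year at {prompts_per_second:,} prompts/s")
# Under these assumptions, a single year of serving (~95 GWh) dwarfs a
# one-off training run in the ~10 GWh range discussed above.
```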
Just like everyone playing the 3D game once it's finished development and sold. A few hours of gaming or a few hours of making AI slop photos draws roughly the same watts. No one notices the energy when it's spread across millions of homes, compared to when it's centralized in a data center. A few years ago Nvidia, Microsoft and others were pushing gaming as a streaming service (the games ran remotely, your keyboard/gamepad inputs were transmitted to their servers, and the video was streamed back). Those used massive data centers, yet no one was screaming to stop gaming.
Now it will be PCs spread out across homes, in addition to large data centers, consuming energy in combination.
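A quick illustration of that distributed-versus-centralized point; the wattage, session length, and player count below are assumptions picked purely for the sake of the comparison:

```python
# The same total energy looks invisible when spread out and alarming when
# centralized. All three inputs are illustrative assumptions.
gpu_watts = 300         # assumed draw of one gaming PC under load
hours = 3               # one evening's session
players = 1_000_000     # assumed concurrent players

per_home_kwh = gpu_watts * hours / 1_000    # 0.9 kWh: nobody notices this
total_mwh = per_home_kwh * players / 1_000  # 900 MWh across all players

print(f"Per home: {per_home_kwh} kWh, combined: {total_mwh:,.0f} MWh")
# Concentrate that 900 MWh in one building and it reads like a headline;
# spread it across a million homes and no one gives it a second thought.
```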
And I do remember that phase of game/device streaming! I was a bit skeptical of it all and never ended up using those services, but it did lead me to learn about alternatives like Moonlight/Sunshine.