• jimjam5@lemmy.world · 3 hours ago

    Ahh was wondering where the factor of 1000 came from.

    Without turning this into a complete shootout, I can kind of see the point of comparing energy usage, but as others have said, with these massive data centers it’s like comparing two similar but ultimately different kinds of beasts.

    Beyond just the energy used in training generative AI models in data centers, there’s also the energy it needs to fulfill requests once implemented (24/7, thousands of prompts per second).
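
    To put that in rough numbers, here’s a quick back-of-envelope sketch in Python; the request rate and watt-hours per prompt are made-up placeholder figures for illustration, not measurements:

    ```python
    # Back-of-envelope estimate of datacenter inference load.
    # Both numbers below are assumptions, not measured values.
    PROMPTS_PER_SECOND = 1_000   # low end of "thousands of prompts per second"
    WH_PER_PROMPT = 3.0          # assumed watt-hours per prompt (placeholder)

    SECONDS_PER_DAY = 24 * 60 * 60
    prompts_per_day = PROMPTS_PER_SECOND * SECONDS_PER_DAY
    kwh_per_day = prompts_per_day * WH_PER_PROMPT / 1_000  # Wh -> kWh

    print(f"{prompts_per_day:,} prompts/day -> {kwh_per_day:,.0f} kWh/day")
    # With these assumptions: 86,400,000 prompts/day -> 259,200 kWh/day
    ```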

    • Blue_Morpho@lemmy.world · 3 hours ago

      > here’s also the energy it needs to fulfill requests once implemented

      Just like everyone playing a 3D game once it’s finished development and been sold. A few hours of gaming or a few hours of making AI slop photos is the same watts; no one notices the energy when it’s spread across millions of homes instead of centralized at a data center.

      A few years ago Nvidia, Microsoft, and others were pushing gaming as a streaming service (the games ran remotely, your keyboard/gamepad inputs were transmitted to their servers, and the video was streamed back). Those used massive data centers, yet no one was screaming to stop gaming.
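
      As a rough sketch of the “same watts” point (the GPU wattage and hours are assumed ballpark values, not benchmarks):

      ```python
      # Same GPU, same draw: gaming vs. generating images locally.
      # 300 W is an assumed ballpark for a consumer GPU under load.
      GPU_WATTS = 300
      HOURS = 3  # "a few hours" of either activity

      kwh = GPU_WATTS * HOURS / 1_000  # watt-hours -> kilowatt-hours
      print(f"Gaming for {HOURS} h: {kwh:.2f} kWh")
      print(f"AI images for {HOURS} h: {kwh:.2f} kWh")
      # Identical per-user energy; the difference is only where it's drawn.
      ```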

      • jimjam5@lemmy.world · 2 hours ago

        Now it will be both: PCs spread out everywhere plus large data centers, consuming energy in combination.

        And I do remember that phase of game/device streaming! I was a bit skeptical of it all and ended up never using those technologies, but it did lead me to learn about alternatives like Moonlight/Sunshine.