• Grimy@lemmy.world · 5 hours ago

    I should have specified it was an earlier LLaMA model; they have scaled up to more than a flight or two since. You are mostly right, except for how much a house uses: it's about 10,500 kWh per year, so you're off by a factor of a thousand. In an hour it still uses about as much as a house does in eight hours, which is a lot, especially when you consider Musk's 1 million GPUs.
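
    Rough sanity check on that comparison (just a sketch converting the figures above into hourly terms; Python used for the arithmetic):

    ```python
    # Napkin math for the house comparison, using the figures in this thread.
    HOUSE_KWH_PER_YEAR = 10_500        # average US home, per the correction above
    HOURS_PER_YEAR = 24 * 365          # 8,760 hours

    house_kwh_per_hour = HOUSE_KWH_PER_YEAR / HOURS_PER_YEAR
    print(f"House draw: ~{house_kwh_per_hour:.1f} kWh per hour")  # ~1.2 kWh/h

    # "Eight hours of house time per hour" then implies a continuous draw of:
    implied_kw = 8 * house_kwh_per_hour
    print(f"Implied draw: ~{implied_kw:.1f} kW")                  # ~9.6 kW
    ```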

    https://kaspergroesludvigsen.medium.com/facebook-disclose-the-carbon-footprint-of-their-new-llama-models-9629a3c5c28b

    Their first model took about 2,600,000 kWh; a flight uses about 500,000 kWh. The actual napkin math was ~5 flights. I had done the math about two years ago, but yeah, I was mistaken and should have at least specified it was for their first model. Their more recent ones have been a lot more energy-intensive, I think.
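
    The flights figure is just that ratio (a sketch using the numbers above; the 500,000 kWh per flight is the rough value quoted here, not from the linked article):

    ```python
    # Napkin math: training energy expressed as flight equivalents.
    TRAINING_KWH = 2_600_000   # first LLaMA model, per the linked article
    FLIGHT_KWH = 500_000       # rough per-flight figure quoted above

    print(f"~{TRAINING_KWH / FLIGHT_KWH:.1f} flights")  # ~5.2, i.e. about 5 flights
    ```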

    • boor@lemmy.world · 3 hours ago

      Thanks for catching that. You are right that the average US home uses about 10.5 MWh/year, not kWh; I was mistaken. :)

      Regarding the rest, my point is that the scale of modern frontier-model training, and the total net-new electricity demand that AI is creating, is not trivial. Worrying about traditional sources of CO2 emissions like air travel is reasonable, but I disagree with the conclusion that AI infrastructure is not a major environmental and climate concern. The latest projects are on the scale of 2-5 GW per site, and the vast majority of that new electricity capacity will come from natural gas or other hydrocarbons.
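
      For a sense of scale (a rough sketch only; it assumes a 2 GW site running flat out all year, which is an upper bound rather than a realistic load factor, and reuses the 10.5 MWh/year household figure from above):

      ```python
      # Napkin math: annual energy of a 2 GW site vs. average US homes.
      SITE_GW = 2
      HOURS_PER_YEAR = 24 * 365            # 8,760 hours
      HOUSE_MWH_PER_YEAR = 10.5            # average US home, per the thread above

      site_mwh_per_year = SITE_GW * 1_000 * HOURS_PER_YEAR  # GW -> MW, times hours
      homes_equivalent = site_mwh_per_year / HOUSE_MWH_PER_YEAR
      print(f"~{homes_equivalent:,.0f} home-equivalents of annual demand")  # ~1.7 million
      ```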