• Lulzagna@lemmy.world · 12 hours ago

      This is a dumb misconception. The high emissions and energy consumption come from training models, not from resolving prompts.

      • vrighter@discuss.tchncs.de · 9 hours ago

        And models are being trained all the time; it's the only way to assimilate new data. So your point is moot.

        • mika_mika@lemmy.world · 38 minutes ago

          No, what he's saying is that the models are being trained whether or not you, as a user, mess around with the AI.

          It's like how I didn't kill the chicken on the store shelf. Whether I purchase it or not doesn't bring the chicken back. Either way, the training has already happened or is already happening.

          • tankfox@midwest.social · 21 minutes ago (edited)

            That's a really savvy insight! To extend the analogy: it's as if your phone or computer dispensed a free chicken nugget from a small container attached to the side of the device any time you searched for anything at all. It's room temperature and often spoiled; it's your choice whether to eat it, but you're going to get it either way. So you can't easily avoid chicken in hopes that doing so will disincentivize further chicken slaughter.

      • Kilgore Trout@feddit.it · 12 hours ago (edited)

        False. It's been shown that resolving prompts also drives significant energy consumption, though perhaps not much higher than regular search queries.

        • Honytawk@lemmy.zip · 2 hours ago

          A prompt uses something like 1/1000 of the power a microwave draws over the same amount of time.

          So the difference between a normal query and an AI query is negligible.
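          The comparison above can be sanity-checked with rough numbers. All figures below are assumptions for illustration, not from the thread: a ~1,100 W microwave, ~0.3 Wh per AI prompt, and ~10 s to resolve a prompt. Published per-prompt estimates vary by orders of magnitude, so the resulting ratio shifts accordingly.

```python
# Back-of-envelope energy comparison: one AI prompt vs. running a
# microwave for the same length of time. All constants are assumed
# illustrative figures, not measurements.

MICROWAVE_W = 1100      # assumed microwave power draw, watts
PROMPT_WH = 0.3         # assumed energy per AI prompt, watt-hours
PROMPT_SECONDS = 10     # assumed time to resolve one prompt

# Energy the microwave would consume over the same interval (Wh)
microwave_wh = MICROWAVE_W * PROMPT_SECONDS / 3600

# How the prompt's energy compares to the microwave's
ratio = PROMPT_WH / microwave_wh

print(f"microwave over {PROMPT_SECONDS}s: {microwave_wh:.2f} Wh")
print(f"prompt/microwave energy ratio: {ratio:.3f}")
```

          With these particular assumptions the prompt comes out closer to a tenth of the microwave's energy than a thousandth; the exact factor depends entirely on which per-prompt estimate you believe.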

    • Galapagon@sh.itjust.works · 12 hours ago

      I think you should be more concerned about the automatic AI responses on every other search than about people having a bit of fun with these.

      • Echo Dot@feddit.uk · 9 hours ago

        This is my problem with it as well. I wish it were an optional toggle I could click when I wanted an AI summary, which would be basically never.

        At one point I was looking for a pinout diagram for a chip, and the first result I got was the AI summary. I wanted a picture, not text; how is text helpful? All it did was give me a list of the pins. I know what pins it has; I want to know where they are.

      • pedz@lemmy.ca · 12 hours ago

        I am. That’s why I switched to DDG and deactivated it.

    • GamingChairModel@lemmy.world · 17 hours ago

      AI drives 48% increase in Google emissions

      That’s not even supported by the underlying study.

      Google's emissions went up 48% between 2019 and 2023, but a lot changed in 2020 generally, especially the explosion of video chat and cloud collaboration, which dramatically expanded demand for data centers for storage and processing. Even without AI, we could have expected data center electricity use to go up dramatically between 2019 and 2023.