• Pacattack57@lemmy.world
    link
    fedilink
    English
    arrow-up
    32
    arrow-down
    1
    ·
    4 hours ago

    When I’m told there are power issues and to conserve power, I drop my AC to 60 and leave all my lights on. The only way to get them to fix the grid is to break it.

  • merc@sh.itjust.works
    link
    fedilink
    English
    arrow-up
    34
    ·
    5 hours ago

    Worse is Google, which insists on shoving a terrible AI-based result in your face every time you do a search, with no way to turn it off.

    I’m not telling these systems to generate images of cow-like girls, but I’m getting AI shoved in my face all the time whether I want it or not. (I don’t).

    • jjmoldy@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      ·
      2 hours ago

      I am trying to understand what Google’s motivation for this even is. Surely it is not profitable to be replacing their existing, highly lucrative product with an inferior alternative that eats up way more power?

      • AA5B@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        ·
        1 hour ago

        Their motivation is always ads. The AI response is longer and takes time to read, so you spend more time looking at their ads. If the answer is sufficient, you might not even click through to the search results.

        AI is a potentially huge bonanza for search sites, letting them suck up the ad revenue that used to go to the search results.

    • rdri@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      ·
      3 hours ago

      There is a way to “turn it off” with some search parameters. However, there is no guarantee that the AI isn’t still consuming resources on the backend.
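
      For example, one widely shared (but undocumented) parameter is udm=14, which restricts Google to plain “Web” results and currently skips the AI Overview; there’s no guarantee it keeps working. A minimal sketch of building such a URL:

      ```python
      # Minimal sketch: build a Google search URL with the undocumented "Web" filter
      # parameter (udm=14), which currently skips the AI Overview. This could change
      # or be disabled at any time, and says nothing about what runs server-side.
      from urllib.parse import urlencode

      def web_only_search_url(query: str) -> str:
          params = {"q": query, "udm": 14}  # udm=14 selects the plain "Web" results tab
          return "https://www.google.com/search?" + urlencode(params)

      print(web_only_search_url("example query"))
      ```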

      • merc@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        2
        ·
        2 hours ago

        Also the search parameters are undocumented internal things that can change or be disabled at any time.

  • Allonzee@lemmy.world
    link
    fedilink
    English
    arrow-up
    28
    arrow-down
    9
    ·
    edit-2
    6 hours ago

    We’re going away, folks, and nothing of any true value will be lost, except all the species that did live in homeostasis with the Earth and that we’re taking with us in our species’ avarice-induced murder-suicide.

    • millie@slrpnk.net
      link
      fedilink
      English
      arrow-up
      5
      arrow-down
      2
      ·
      2 hours ago

      Carlin had some good material, but this is an absolutely stupid mindset. We can cause an extreme level of ecological damage. Will the planet eventually recover? Quite possibly. But that’s not a certainty, and in the meantime we’re triggering a mass extinction precisely because irresponsible humans figure there’s no way we can hurt the Earth and that it’s self-important hubris to think that we can.

      But the time we’re living through and the time we’re heading into are all the proof we should need that it’s actually hubris to assume our actions have no meaningful impact.

    • oni ᓚᘏᗢ@lemmy.world
      link
      fedilink
      English
      arrow-up
      10
      arrow-down
      2
      ·
      6 hours ago

      I’ve been trying to write this comment as concisely as possible; I’m trying my best. “We’re going away”: yes, that’s true. No matter what, we are leaving this place, but that doesn’t mean that the last days of humanity have to be surrounded by pollution and trash. All I can get out of that quote in the image is that we should let big companies shit on us till we die.

      • Allonzee@lemmy.world
        link
        fedilink
        English
        arrow-up
        7
        arrow-down
        3
        ·
        edit-2
        3 hours ago

        “let”

        The sociopath fascist capitalists won. They have multilayered protection from us, from propaganda that endlessly divides us and turns us on one another, to government capture, to exclusive use of state violence to protect the capital markets, to literally state-sanctioned murder for private profit. This isn’t a war; this is a well-oiled Orwellian occupation. The people surrendered half a century ago without terms and received the delusion that they’ll be the rich ones one day, herp derp.

        We can’t do anything about the misery they spread, and that sucks. We don’t have to add to our misery by pretending there’s hope and that we can turn any of it around. They’re going to do what they’re going to do, and short of being a lone “terrorist” who takes a pot shot at them like Luigi, there’s nothing to be done, because half of us are too cowardly and addicted to social opiates (fast food, social media, literal opiates, etc.), or too deluded and actually on the robber barons’ side out of a pick-me mentality, to take it as a rallying cry.

        Only the planet itself, our shared habitat, can stop them. And it will, regardless of all the studies they kill, the ecological treaties they betray, and all the propaganda they spread. The capitalists’ reign of terror will end when enough of their suckers are starving, not before.

      • TriflingToad@sh.itjust.works
        link
        fedilink
        English
        arrow-up
        7
        arrow-down
        1
        ·
        edit-2
        6 hours ago

        It’s from https://perchance.org/welcome and it’s super cool, because it’s like half a soulless AI and half a super cool tool that gets people into programming. They actually care about the Internet, because they encourage people to learn how to code their own AIs and have fun with it. I would absolutely have DEVOURED it when I was 13 on Tumblr. (I forgot my ADHD meds today, sorry if I’m rambling.)

  • leftthegroup@lemmings.world
    link
    fedilink
    English
    arrow-up
    7
    ·
    6 hours ago

    Didn’t some legislation come out banning states and cities from making laws against AI? (Which I realize is a fucking crazy sentence in the first place; nothing besides rights should just get immunity to all potential new laws.)

    So the cities aren’t even the bad guys here. The Senate is.

    • Corkyskog@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      3
      ·
      3 hours ago

      From what I can tell it got stripped from the Senate version that was just approved. They barely have the heads to pass it, so they aren’t going to play volleyball to add it back.

    • MotoAsh@lemmy.world
      link
      fedilink
      English
      arrow-up
      3
      ·
      edit-2
      5 hours ago

      It’s both. Also, don’t let the House, the Supreme Court, or the orange buffoon and his cabinet escape culpability. Checks and balances can work… when they aren’t all bought and paid for by rich fucks.

  • burgerpocalyse@lemmy.world
    link
    fedilink
    English
    arrow-up
    6
    ·
    6 hours ago

    i feel it would actually kill some people to just say “yes, AI uses a lot of power” with no other qualifying statements tacked on

  • some_guy@lemmy.sdf.org
    link
    fedilink
    English
    arrow-up
    81
    arrow-down
    2
    ·
    11 hours ago

    Yeah, that thing that nobody wanted? Everybody has to have it. Fuck corporations and capitalism.

    • bridgeenjoyer@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      18
      arrow-down
      1
      ·
      9 hours ago

      Just like screens in cars, and MASSIVE trucks. We don’t want this. Well, some dumbass Americans do, but intelligent people don’t need a 32-ton, 6-wheel-drive pickup to haul Jr. to soccer.

      • Madzielle@lemmy.dbzer0.com
        link
        fedilink
        English
        arrow-up
        10
        ·
        7 hours ago

        Massive trucks? They need those trucks for truck stuff, like this giant dilhole parking with his wife to go to Aldi today. Not even a flag on the end of that ladder, it filled a whole spot by itself.

        My couch wouldn’t fit in that bed, and every giant truck I see is sparkling shiny and looks like it hasn’t done a day of hard labor, much like the drivers.

      • Schadrach@lemmy.sdf.org
        link
        fedilink
        English
        arrow-up
        4
        arrow-down
        2
        ·
        8 hours ago

        You underestimate the number of people you wouldn’t class as intelligent. If no one wanted massive trucks, they would have disappeared off the market within a couple of years because they wouldn’t sell. They’re ridiculous, inefficient hulks that basically no one really needs but they sell, so they continue being made.

        • moakley@lemmy.world
          link
          fedilink
          English
          arrow-up
          10
          ·
          8 hours ago

          It’s actually because small trucks were regulated out of the US market. Smaller vehicles have more stringent mileage standards that trucks aren’t able to meet. That forces companies to make all their trucks bigger, because bigger vehicles are held to a different standard.

          So the people who want or need a truck are pushed to buy a larger one.

      • IsThisAnAI@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        6
        ·
        edit-2
        8 hours ago

        Do you have any data to support that this is actually the case? I see this claim all the time, but absolutely zero evidence beyond a 2015 Axios survey with no methodology or dataset. Nearly every article cites this one industry group’s three questions, which clearly aren’t exclusive categories and could be picked apart by a high school student.

        I ask this question nearly every time I see this comment, and in 5 years I have not found a single person who can actually cite where this came from, or even give a complete explanation of how they got to that conclusion.

        The truck owners I know, myself included, use them all the time for towing and like the added utility of having the bed as a secondary feature.

        • Schadrach@lemmy.sdf.org
          link
          fedilink
          English
          arrow-up
          7
          ·
          7 hours ago

          The truck owners I know, myself included, use them all the time for towing and like the added utility of having the bed as a secondary feature.

          Then you put it beside a truck from 30 years ago that’s a quarter the overall size but has the same bed capacity and towing power, along with much better visibility instead of not being able to see the child you’re about to run over. And then you understand what people mean when they say massive trucks: giant, ridiculously unnecessary things that are all about being a status symbol and dodging regulations rather than practicality.

          • IsThisAnAI@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            arrow-down
            3
            ·
            6 hours ago

            Absolutely 100% incorrect on towing. The top 1995 F-150 towed about 7,700 lbs, compared to 13,500 lbs today; that was F-350 territory in ’95. It’ll also fit a family of 4 comparably to a full-size sedan, eliminating any need for a secondary vehicle. The old F-150s/1500s were miserable in the back.

            As for safety, I find the argument disingenuous and not based on reality. Roughly 160 kids were killed in 2023 across the EU-27; it was 220 in the US. Much of that could be correlated with traffic density as well.

            | Country / Region | Est. Fatalities/Year | Child Pop. (0–14) | Fatalities per Million |
            | --- | --- | --- | --- |
            | United States | ~225 | ~61 million | ~3.7 |
            | United Kingdom | ~22 | ~11.5 million | ~1.9 |
            | Canada | ~12 | ~6 million | ~2.0 |
            | Australia | ~11 | ~4.8 million | ~2.3 |
            | Germany | ~20 | ~11 million | ~1.8 |
            | France | ~18 | ~11 million | ~1.6 |
            | Japan | ~18 | ~15 million | ~1.2 |
            | India | ~3,000 (est.) | ~360 million | ~8.3 |
            | Brazil | ~450 | ~50 million | ~9.0 |
            | European Union (EU-27) | ~140–160 | ~72 million | ~2.0–2.2 |

            I think we should offer incentives for manufacturers to start reducing size and weight, but the things you are saying here aren’t really based on any data, nor are they what I was asking about.

            I just wish I could find one person to show me what they are referencing when they repeat that seemingly false fact.

        • Bytemeister@lemmy.world
          link
          fedilink
          English
          arrow-up
          3
          arrow-down
          1
          ·
          6 hours ago

          Let me express it to you with some numbers… The US is ~3.81 million square miles in size.

          The F150 has sold 8.810 million units in the US in the last 10 years.

          There are ~ 2.3 F150s fewer than 10 years old for every square mile in this country.
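
          The arithmetic behind that figure, as a quick sanity check using the numbers above (a back-of-envelope sketch, nothing more):

          ```python
          # Back-of-envelope check of the figures above.
          us_land_area_sq_mi = 3.81e6     # ~3.81 million square miles
          f150_units_sold_10yr = 8.81e6   # ~8.810 million F-150s sold in the US over 10 years

          trucks_per_sq_mi = f150_units_sold_10yr / us_land_area_sq_mi
          print(f"~{trucks_per_sq_mi:.1f} F-150s under 10 years old per square mile")  # ~2.3
          ```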

          There is no way the majority of those trucks are going to job sites, hauling junk, or pulling a trailer; just look around. That’s not even all trucks. That’s just one model, from one brand, for a single 10-year period.

          These trucks are primarily sold as vanity vehicles and minivan alternatives, and that’s what I think when I see one.

            • Bytemeister@lemmy.world
              link
              fedilink
              English
              arrow-up
              3
              ·
              5 hours ago

              No, Trump-style math would be saying that the number of Trucks Towing has gone DOWN 400% PERCENT after the EVIL AMERICA HATING COMMUNIST Dems elected a soon-to-be-illegal Migrant Gang member as Mayor of New York NYC.

      • Tlaloc_Temporal@lemmy.ca
        link
        fedilink
        English
        arrow-up
        1
        ·
        4 hours ago

        Do the new models even have non-“smart” fittings? I thought all the electronic chip plants closed during covid.

  • anarchiddy@lemmy.dbzer0.com
    link
    fedilink
    English
    arrow-up
    130
    arrow-down
    2
    ·
    edit-2
    10 hours ago

    I had my energy company remove their LVTC smart meter this week after they started using it to shut off our condenser unit during our 100-degree days.

    The fact that it exists at all is bad enough, but they were doing this at a time when our AC was already malfunctioning due to low refrigerant. On the day they first shut it off, our house reached 94 degrees.

    The program that the previous owner signed up for that enabled them to do this gave them a fucking two-dollar-a-month discount.

    I use a smart thermostat to optimize my home conditioning; having a second meter fucking with my schedule ends up making us all miserable. Energy providers need to stop fucking around and just build out their infrastructure to handle worst-case peak loads, and enable customers to install solar to reduce peak loading to begin with.

    The other thing that kills me about this is that our provider administers our city’s solar electric subsidy program themselves. When I had them come out to give us a quote, they inflated their price by more than 100% because they knew what our electricity bill was. All they did was take our average monthly bill and multiply it by the repayment period. I could have been providing more energy to the grid at their peak load if they hadn’t tried scamming me.

    FUCK private energy providers.

    • Zwiebel@feddit.org
      link
      fedilink
      English
      arrow-up
      23
      ·
      edit-2
      11 hours ago

      How tf can a meter shut off an appliance? Did you also have smart breakers from them?

      Anyway, absolutely ridiculous.

      • anarchiddy@lemmy.dbzer0.com
        link
        fedilink
        English
        arrow-up
        42
        ·
        12 hours ago

        It’s separate from the main meter and connected directly at the condenser unit.

        It monitors power draw and acts as a relay when the provider sends a shutoff signal. The thermostat thinks the system is still going, and the fans still push air through the vents, but the coils aren’t being cooled anymore so the air gets hot and musty.

    • illusionist@lemmy.zip
      link
      fedilink
      English
      arrow-up
      8
      ·
      10 hours ago

      Peak load of households is not during peak solar power generation. Households installing PV isn’t a solution to what you described.

      Today, you could also use a battery to buy power during midday and use it in the evening when you need it the most.

      • anarchiddy@lemmy.dbzer0.com
        link
        fedilink
        English
        arrow-up
        13
        arrow-down
        2
        ·
        edit-2
        10 hours ago

        In moderate climates in the US, peak loads are typically the hottest and sunniest hours of the day since condenser units are the most energy-hungry appliance in most homes. Clouds notwithstanding, peak solar generation would typically align (or closely align) with peak load time.

        Batteries would also help a lot; they should definitely be subsidizing the installation of those as well, but unfortunately they aren’t yet (at least not in my state).

        • ayyy@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          3
          ·
          9 hours ago

          This is incorrect. Look up the “duck curve”, or, if you prefer real-world examples, look at the California electricity market (CAISO), where they have an excellent “net demand curve” that illustrates the problem.
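
          Roughly speaking, the net demand CAISO plots is total demand minus solar and wind output. A toy sketch of why the midday dip and evening ramp appear (illustrative numbers only, not CAISO data):

          ```python
          # Toy illustration of the "duck curve": net demand = total demand - renewable output.
          # All numbers are made up for illustration; see CAISO's net demand charts for real data.
          hours =         [6,  9, 12, 15, 18, 21]
          demand_gw =     [22, 25, 27, 28, 30, 26]  # total load, peaking in the evening
          solar_wind_gw = [1,  8, 14, 12,  3,  1]   # renewable output, peaking midday

          for h, d, s in zip(hours, demand_gw, solar_wind_gw):
              print(f"{h:02d}:00  demand={d} GW  renewables={s} GW  net={d - s} GW")
          # Net demand sags at midday, then ramps steeply as the sun sets while load is still high.
          ```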

          • Naz@sh.itjust.works
            link
            fedilink
            English
            arrow-up
            3
            ·
            6 hours ago

            I watch big state and national grid loads (for fun) and I see two distinct peaks: 7-8AM when everyone goes to work, and then around 5-7 PM when people commute home and heat up dinner.

            Otherwise it’s a linear diagonal curve coinciding with temperatures.

            I personally try to keep my own energy usage a completely flat line so I can benefit from baseload generator plants like nuclear (located not that far away).

          • anarchiddy@lemmy.dbzer0.com
            link
            fedilink
            English
            arrow-up
            5
            ·
            9 hours ago

            This curve has changed somewhat since this study in 2016. More efficient home insulation, remote working, and energy-efficient cooling systems have a large impact on this pattern. But assuming you have a well-insulated home, setting your thermostat to maintain a consistent temperature throughout the day will shift this peak earlier and lower the peak load at sunset, when many people are returning home. More efficient heat pumps with variable-pressure capabilities also help a lot.

            Given just how many variables are involved, it’s better to assume peak cooling load to be mid-day and work toward equalizing that curve, rather than reacting to transient patterns that are subject to changes in customer behavior. Solar installations are just one aspect of this mitigation strategy, along with energy storage, energy-efficient cooling systems, and more efficient insulation and solar heat gain mitigation strategies.

            If we’re discussing infrastructure improvements we might as well discuss home efficiency improvements as well.

              • anarchiddy@lemmy.dbzer0.com
                link
                fedilink
                English
                arrow-up
                1
                ·
                3 hours ago

                I’m not really saying that the curve itself is changing (sorry, I was really not clear), only that those other variables reduce actual energy demand later in the day because of the efficiency gains and thermal banking that happens during the peak energy production. The overproduction during max solar hours is still a problem. Even if the utility doesn’t have a way of banking the extra supply, individual customers can do it themselves at a smaller scale, even if just by over-cooling their homes to reduce their demand after sundown.

                Overall, the problem of the duck curve isn’t as much about maxing out the grid, it’s about the utility not having instantaneous power availability when the sun suddenly goes down. For people like me who work from home and have the flexibility to keep my home cool enough to need less cooling in the evening, having solar power means I can take advantage of that free energy and bank it to reduce my demand in the evening.

                I get what you were saying now, but having solar would absolutely reduce my demand during peak hours.

              • anarchiddy@lemmy.dbzer0.com
                link
                fedilink
                English
                arrow-up
                4
                ·
                9 hours ago

                Ok now go just one step further and ask yourself what variables factor into this.

                There’s a reason that pattern exists, and it isn’t because solar and cooling hours don’t align.

                • sqw@lemmy.sdf.org
                  link
                  fedilink
                  English
                  arrow-up
                  0
                  ·
                  4 hours ago

                  the difference between demand and net demand in that graph is purely solar/wind generation, isn’t it?

        • illusionist@lemmy.zip
          link
          fedilink
          English
          arrow-up
          1
          ·
          8 hours ago

          Why do you want a subsidy for batteries? Installing batteries at scale in homes is incredibly expensive compared to an off-site battery, especially with regard to the move toward hydrogen.

          • anarchiddy@lemmy.dbzer0.com
            link
            fedilink
            English
            arrow-up
            3
            ·
            7 hours ago

            For the same reason we want to subsidize solar production in residential construction even though it’s more efficient and cost-effective to do it at scale: having energy production and storage at the point of use reduces strain on power infrastructure and helps alleviate the types of load surges ayyy is talking about.

            It’s not a replacement for modernizing our power grids either; it simply helps make them more resilient.

            • illusionist@lemmy.zip
              link
              fedilink
              English
              arrow-up
              1
              ·
              6 hours ago

              That’s understandable, but do we need it now? Neither PV nor batteries last forever. I’m just not sure we need them now (or in the short-to-medium-term future). But I’m not in a position to decide upon it.

    • Serinus@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      arrow-down
      1
      ·
      12 hours ago

      our city’s solar electric subsidy program

      It sounds like there are two different things there. There’s a solar installation (hardware, etc.), and there’s likely some kind of net metering program (where they pay you or give you credit for electricity you generate). That paragraph sounds like the first, but the phrase sounds like the second.

      You shouldn’t have to go through them for the solar installation, if your conditions accommodate it. Granted, the conditions don’t apply to everyone. You’ll want to have a suitable roof that ideally faces south-ish, own your home, and plan to stay there for at least 10 years. In the US, you also kind of need to get it done within this calendar year, before the federal 30% tax credit goes away, which is a rough ask. But maybe you can find an installer that isn’t trying to scam you quite as much.
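
      As a rough illustration of the payoff math (every number below is hypothetical, before any state or utility incentives):

      ```python
      # Rough payoff-time sketch for a residential solar install.
      # All figures are hypothetical examples, not a quote or real pricing.
      gross_install_cost = 20_000    # panels + inverter + labor, USD
      federal_tax_credit = 0.30      # 30% US federal credit (scheduled to end)
      net_cost = gross_install_cost * (1 - federal_tax_credit)

      avg_monthly_bill_offset = 110  # USD of grid electricity the system displaces each month
      annual_savings = avg_monthly_bill_offset * 12

      payoff_years = net_cost / annual_savings
      print(f"net cost ${net_cost:,.0f}, payoff in ~{payoff_years:.1f} years")  # ~10.6 years
      ```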

      (It’s early and cloudy today.)

      Solar system stats, Home Assistant panel

      • anarchiddy@lemmy.dbzer0.com
        link
        fedilink
        English
        arrow-up
        4
        ·
        10 hours ago

        Sorry, maybe I wasn’t being clear.

        My area has solar incentive programs that are run through the energy utility, meaning the state makes zero-interest loans available for solar installation, but those loans are only available through an entity partnered with our utility. They limit the number of homes in each area that are eligible through this program so that solar generation never exceeds demand. Our home was eligible through the program, so I had them come out to give us a quote. Our utility is also transitioning to surge pricing and smart metering, so there’s pretty high demand for solar installation in my area, and they know they’d lose out on a lot of revenue if everyone installed their own solar systems.

        A part of that process was them asking for the last year of energy bills, along with taking measurements and doing daylighting analysis on our roof area. At the end, they gave us a quote for a 15-year loan for the equipment and installation, and it just so happened that the monthly payment was the same as our average energy bill. I work in AEC and I know what solar panels cost, and they had inflated their price by more than double what it would cost at market rate.
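
        Illustrating that pricing trick with made-up numbers (the pattern described above: a monthly payment pegged to the old average bill over the full loan term):

        ```python
        # Sketch of the quoting pattern described above, with hypothetical numbers.
        avg_monthly_bill = 150        # USD, taken from a year of energy bills
        loan_term_years = 15

        quoted_total = avg_monthly_bill * 12 * loan_term_years  # payment pegged to the old bill
        market_rate_install = 13_000  # plausible market-rate install cost (hypothetical)

        print(f"quoted: ${quoted_total:,}  vs  market: ${market_rate_install:,}")
        print(f"markup: {quoted_total / market_rate_install:.1f}x")  # > 2x in this example
        ```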

        Of course I could install my own panels, but it would be out of pocket and I would have to seek out and apply for out-of-state incentive programs myself; I can’t afford the up-front costs, and the loan terms don’t make sense for how long we’ll be in this house. I’d love nothing more than to do it myself, even at a loss if that’s what it took, but I have a spouse who is less spiteful than I am.

        • Serinus@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          1
          ·
          10 hours ago

          more than double what it would cost at market rate

          I definitely paid more for labor than for materials. My payoff time is about 13 years with a Tesla Powerwall 3, maybe a bit less now that I have an EV. I had a team of 4 guys plus an electrician here for about five days.

          I did go with a slightly more reputable company that charged slightly more, but I would have gone elsewhere if it was a huge difference.

          Maybe I should get around to making a post in !Solarpunk@slrpnk.net or something, even though it isn’t very punk.

          • anarchiddy@lemmy.dbzer0.com
            link
            fedilink
            English
            arrow-up
            4
            ·
            10 hours ago

            I’m factoring in labor. It was an extremely bad deal; they were preying on the fact that most homeowners are not familiar with solar installation pricing.

            Like I said, I would love to still do it on my own, but it just doesn’t make sense for our household.

      • aeiou_ckr@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        9 hours ago

        Your HA dashboard derailed this conversation for me. lol.

        I would love to know more about the equipment you are using to push this info into your HA.

    • TriflingToad@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      8
      arrow-down
      1
      ·
      edit-2
      6 hours ago

      I have llama 3.2 on my phone and it’s really funny because it’s so low powered and dumb but so sweet.

      it’s like a little friend to talk to when I don’t have Internet. he’s a lil stupid but he got the spirit

      • WorldsDumbestMan@lemmy.today
        link
        fedilink
        English
        arrow-up
        1
        ·
        4 hours ago

        Use Qwen 2.5, that’s my recommendation. You can also set “pals”. And the best part is, I have a portable battery and solar charger, so I could theoretically (and have in the past) run it from solar alone.

  • jsomae@lemmy.ml
    link
    fedilink
    English
    arrow-up
    28
    arrow-down
    16
    ·
    edit-2
    8 hours ago

    I know she’s exaggerating, but this post yet again underscores how nobody understands that it is training AI that is computationally expensive. Deploying an AI model draws power comparable to running a high-end video game. How can people hope to fight back against things they don’t understand?

      • MotoAsh@lemmy.world
        link
        fedilink
        English
        arrow-up
        4
        ·
        5 hours ago

        Well you asked for six tits but you’re getting five. Why? Because the AI is intelligent and can count, obviously.

    • PeriodicallyPedantic@lemmy.ca
      link
      fedilink
      English
      arrow-up
      6
      arrow-down
      1
      ·
      5 hours ago

      Right, but that’s kind of like saying “I don’t kill babies” while you use a product made from murdered baby souls. Yes, you weren’t the one who did it, but your continued use of it caused the babies to be killed.

      There is no ethical consumption under capitalism and all that, but I feel like there’s a line we’re crossing here. This fruit is hanging so low it’s brushing the grass.

      • jsomae@lemmy.ml
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        1
        ·
        4 hours ago

        Are you interpreting my statement as being in favour of training AIs?

        • PeriodicallyPedantic@lemmy.ca
          link
          fedilink
          English
          arrow-up
          1
          ·
          1 hour ago

          I’m interpreting your statement as “the damage is done so we might as well use it”
          And I’m saying that using it causes them to train more AIs, which causes more damage.

          • jsomae@lemmy.ml
            link
            fedilink
            English
            arrow-up
            1
            ·
            59 minutes ago

            I agree with your second statement. You have misunderstood me. I am not saying the damage is done so we might as well use it. I am saying people don’t understand that it is the training of AIs which is directly power-draining.

            I don’t understand why you think that my observation people are ignorant about how AIs work is somehow an endorsement that we should use AIs.

            • PeriodicallyPedantic@lemmy.ca
              link
              fedilink
              English
              arrow-up
              1
              ·
              17 minutes ago

              I guess.

              It still smells like an apologist argument to be like “yeah but using it doesn’t actually use a lot of power”.

              I’m actually not really sure I believe that argument either, though. I’m pretty sure that inference is hella expensive. When people talk about training, they don’t talk about the cost to train on a single input, they talk about the cost of the entire training run. So why are we talking about the cost to infer on a single input?
              What’s the cost of running training, per hour? What’s the cost of inference, per hour, on a similarly sized inference farm, running at maximum capacity?

    • domdanial@reddthat.com
      link
      fedilink
      English
      arrow-up
      25
      ·
      8 hours ago

      I mean, continued use of AI encourages the training of new models. If nobody used the image generators, they wouldn’t keep trying to make better ones.

    • FooBarrington@lemmy.world
      link
      fedilink
      English
      arrow-up
      21
      arrow-down
      1
      ·
      8 hours ago

      It’s closer to running 8 high-end video games at once. Sure, from a scale perspective it’s further removed from training, but it’s still fairly expensive.

      • jsomae@lemmy.ml
        link
        fedilink
        English
        arrow-up
        5
        arrow-down
        3
        ·
        8 hours ago

        It really depends. You can locally host an LLM on a typical gaming computer.
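
        For instance, a minimal sketch of talking to a locally hosted model through Ollama’s HTTP API (assuming an Ollama server is already running on the default port and a small model like llama3.2 has been pulled):

        ```python
        # Minimal sketch: query a locally hosted LLM via Ollama's HTTP API.
        # Assumes `ollama serve` is running locally and the llama3.2 model has been pulled.
        import json
        import urllib.request

        payload = {
            "model": "llama3.2",
            "prompt": "In one sentence, why does AI inference still use electricity?",
            "stream": False,
        }
        req = urllib.request.Request(
            "http://localhost:11434/api/generate",
            data=json.dumps(payload).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            print(json.loads(resp.read())["response"])
        ```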

        • FooBarrington@lemmy.world
          link
          fedilink
          English
          arrow-up
          10
          arrow-down
          1
          ·
          7 hours ago

          You can, but that’s not the kind of LLM the meme is talking about. It’s about the big LLMs hosted by large companies.

        • floquant@lemmy.dbzer0.com
          link
          fedilink
          English
          arrow-up
          5
          ·
          edit-2
          7 hours ago

          True, and that’s how everyone who is able should use AI, but OpenAI’s models are in the trillion-parameter range. That’s 2-3 orders of magnitude more than what you can reasonably run yourself.

          • jsomae@lemmy.ml
            link
            fedilink
            English
            arrow-up
            4
            arrow-down
            4
            ·
            edit-2
            7 hours ago

            This is still orders of magnitude less than what it takes to run an EV, which is an eco-friendly form of carbrained transportation, especially if you live in an area where the power source is renewable. On that note, it looks to me like AI is finally going to be the impetus to get the U.S. to invest in and switch to nuclear power. Isn’t that altogether a good thing for the environment?

        • Thorry84@feddit.nl
          link
          fedilink
          English
          arrow-up
          4
          ·
          edit-2
          7 hours ago

          Well, that’s sort of half right. Yes, you can run the smaller models locally, but usually it’s the bigger models that we want to use, and those would be very slow on a typical gaming computer, or even a high-end one. To make them go faster, the hardware used in datacenters is not only more optimised for the task, there is also simply more of it: both a speed increase per unit and more units than you would normally find in a gaming PC.

          Now, these things aren’t magic; the basic technology is the same, so where does the speed come from? The answer is raw power: these things run insane amounts of power through them, with specialised cooling systems to keep them cool. This comes at the cost of efficiency.

          So whilst running a model is much cheaper compared to training a model, it is far from free. And whilst you can run a smaller model on your home PC, it isn’t directly comparable to how it’s used in the datacenter. So the use of AI is still very power hungry, even when not counting the training.

        • CheeseNoodle@lemmy.world
          link
          fedilink
          English
          arrow-up
          4
          arrow-down
          1
          ·
          7 hours ago

          Yeh but those local models are usually pretty underpowered compared to the ones that run via online services, and are still more demanding than any game.

      • brucethemoose@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        1
        ·
        edit-2
        4 hours ago

        Not at all. Not even close.

        Image generation is usually batched and takes seconds, so 700W (a single H100 SXM) for a few seconds for a batch of a few images to multiple users. Maybe more for the absolute biggest (but SFW, no porn) models.

        LLM generation takes more VRAM, but is MUCH more compute-light. Typically one has banks of 8 GPUs in multiple servers serving many, many users at once. Even my lowly RTX 3090 can serve 8+ users in parallel with TabbyAPI (and modestly sized model) before becoming more compute bound.

        So in a nutshell, imagegen (on an 80GB H100) is probably more like 1/4-1/8 of a video game at once (not 8 at once), and only for a few seconds.

        Text generation is similarly efficient, if not more so. Responses take longer (many seconds, except on special hardware like Cerebras CS-2s), but it’s parallelized over dozens of users per GPU.


        This is excluding more specialized hardware like Google’s TPUs, Huawei NPUs, Cerebras CS-2s and so on. These are clocked far more efficiently than Nvidia/AMD GPUs.


        …The worst are probably video generation models. These are extremely compute intense and take a long time (at the moment), so you are burning like a few minutes of gaming time per output.

        ollama/sd-web-ui are terrible analogs for all this because they are single user, and relatively unoptimized.
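
        As a back-of-envelope sketch of the per-image claim above (a ~700 W H100 running for a few seconds, shared across a batch) against an hour of gaming on a high-end consumer GPU; the numbers are illustrative, not measurements:

        ```python
        # Back-of-envelope comparison based on the figures above (illustrative, not measured).
        h100_power_w = 700   # a single H100 SXM under load
        gen_time_s = 4       # a few seconds per batched image-generation step
        batch_size = 4       # several users' requests served in one batch

        wh_per_image = h100_power_w * gen_time_s / batch_size / 3600

        gaming_gpu_w = 350   # high-end consumer GPU under game load
        wh_per_hour_of_gaming = gaming_gpu_w * 1.0

        print(f"~{wh_per_image:.2f} Wh per generated image")           # ~0.19 Wh
        print(f"~{wh_per_hour_of_gaming:.0f} Wh per hour of gaming")   # ~350 Wh
        ```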

        • FooBarrington@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          1
          ·
          6 hours ago

          I compared the TDP of an average high-end graphics card with the GPUs required to run big LLMs. Do you disagree?

            • FooBarrington@lemmy.world
              link
              fedilink
              English
              arrow-up
              2
              arrow-down
              1
              ·
              5 hours ago

              They are; it’d be uneconomical not to use them fully the whole time. Look up how batching works.

              • Jakeroxs@sh.itjust.works
                link
                fedilink
                English
                arrow-up
                3
                arrow-down
                2
                ·
                edit-2
                5 hours ago

                I mean, I literally run a local LLM, and while the model sits in memory it’s really not using up a crazy amount of resources. I should hook up something to actually measure exactly how much it’s pulling, instead of just looking at htop/atop and guesstimating based on load, TBF.

                Versus when I play a game: the fans start blaring, it heats up, and you can clearly see the usage increasing across various metrics.

                • PeriodicallyPedantic@lemmy.ca
                  link
                  fedilink
                  English
                  arrow-up
                  3
                  ·
                  5 hours ago

                  He isn’t talking about running locally; he’s talking about what it takes for the AI providers to provide the AI.

                  Whether “it takes more energy during training” is true depends entirely on the load put on the inference servers and the size of the inference server farm.

                • MotoAsh@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  3
                  arrow-down
                  1
                  ·
                  5 hours ago

                  One user vs a public service is apples to oranges and it’s actually hilarious you’re so willing to compare them.

                • FooBarrington@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  2
                  arrow-down
                  1
                  ·
                  5 hours ago

                  My guy, we’re not talking about just leaving a model loaded, we’re talking about actual usage in a cloud setting with far more GPUs and users involved.

      • jsomae@lemmy.ml
        link
        fedilink
        English
        arrow-up
        3
        ·
        8 hours ago

        there is so much rage today. why don’t we uh, destroy them with facts and logic

        • Jakeroxs@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          1
          ·
          8 hours ago

          Hahaha at this point even facts and logic is a rage inducing argument. “My facts” vs “Your facts”

  • HeyListenWatchOut@lemmy.world
    link
    fedilink
    English
    arrow-up
    23
    ·
    10 hours ago

    Classic neoliberalism: privatize the benefits, socialize the costs.

    Corporations: “We should get to gobble up all the power with our projects… and you should have the personal responsibility to reduce power usage, even though it would, at best, only improve things at the very edges of the margins… and then we can get away with whatever we want.”

    Just like with paper straws. You get crappy straws, and they hope you feel like you’re helping the environment (even though plastic straws account for like 0.00002% of plastic waste generated)… meanwhile the 80% of actual pollution and waste generated by like 12 corporations gets to continue.

    • gandalf_der_12te@discuss.tchncs.de
      link
      fedilink
      English
      arrow-up
      7
      ·
      7 hours ago

      I feel like I’ve read a very similar argument somewhere recently, but I have difficulty remembering it precisely. It went something like this:

      • If a company kills 5 people, it was either an accident, an unfortunate mishap, a necessity of war (in the case of the weapons industry), or some other bullshit excuse.
      • If the people threaten to kill 5 billionaires, they’re charged with “terrorism” (see Luigi Mangione’s case).