• just_another_person@lemmy.world · 2 days ago

    Low-rent comment.

    First: https://www.corsair.com/us/en/explorer/gamer/gaming-pcs/rtx-5090-5080-and-5070-series-gpus-everything-you-need-to-know/

    Second: you’re apparently unaware, so just search up the phrase, but as this article very clearly explains…it’s shit. It’s not innovative, interesting, or improving performance, it’s a marketing scam. Games would run better and more efficiently if you just lowered the requirements. It’s like saying you want food to taste better, but then they serve you a vegan version of it. AMD’s version (FSR) is technically more useful, but it’s still a dumb trick.

    • FreedomAdvocate@lemmy.net.au · 16 hours ago

      First: https://www.corsair.com/us/en/explorer/gamer/gaming-pcs/rtx-5090-5080-and-5070-series-gpus-everything-you-need-to-know/

      What exactly am I supposed to be looking at here? Do you think that says that the GPUs need their own PSUs? Do you think people with 50 series GPUs have 2 PSUs in their computers?

      It’s not innovative, interesting, or improving performance, it’s a marketing scam. Games would run better and more efficiently if you just lowered the requirements.

      DLSS isn’t innovative? It’s not improving performance? What on earth? Rendering a frame at a lower resolution and then using AI to upscale it so it looks the same as, or better than, rendering at full resolution isn’t innovative?! Getting an extra 30fps vs native resolution isn’t improving performance?! How isn’t it?
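      To put rough numbers on that (back-of-envelope figures, all assumed, not benchmarks):

      ```python
      # Why rendering fewer pixels buys framerate.
      # Resolutions and scale factor are illustrative assumptions.
      native = 2560 * 1440     # output pixels at 1440p
      internal = 1707 * 960    # ~0.67x per axis, a typical "Quality"-style scale

      print(f"pixels actually shaded: {internal / native:.0%} of native")
      # -> ~44%, so a shading-bound GPU has roughly 2x the per-frame headroom,
      #    minus whatever the upscaling pass itself costs.
      ```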

      You can’t just “lower the requirements” lol. What you’re suggesting is making the game worse so people with worse hardware can play at max settings lol. That is absolutely absurd.

      Let me ask you this - do you think that every new game should still be made for the PS2? The PS3? Why or why not?

      • just_another_person@lemmy.world · 13 hours ago

        Like I said…you don’t know what DLSS is, or how it works. It’s not using “AI” - that’s just marketing bullshit. Apparently it works on some people 😂

        You can find tons of info on this (which is why I told you to search it up), but it uses rendering tables, inference sorting, and pattern recognition to quickly render scenes, plus tricks that video formats have used for ages, to render images at a higher resolution cheaply from the GPU’s point of view. You render a scene a dozen times once, then it regurgitates those renders from memory again if they are shown before they’re ejected from cache on the card. It doesn’t upsample, it doesn’t intelligently render anything new, and there is no additive anything. It seems you think it’s magic, but it’s just fast sorting memory tricks.

        Whether you think it makes games better is subjective, but it solely works to run games with the same details at a higher resolution. It doesn’t improve rendered scenes whatsoever. It’s literally the same thing as lowering your resolution and increasing texture compression (same effect on cached rendered scenes), since you bring it up. The effect for the user is a higher FPS at a higher resolution, which you could achieve by just lowering your resolution. It absolutely does not make an otherwise unplayable game playable by adding detail and texture definition, as you seem to be claiming.

        Go read up.

        • FreedomAdvocate@lemmy.net.au · 11 hours ago (edited)

          I 100% know what DLSS is, though by the sounds of it you don’t. It is “AI” as much as any other thing is “AI”. It uses models to “learn” what it needs to reconstruct and how to reconstruct it.
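          If it helps, here’s the idea in toy form. This is a deliberately tiny sketch with made-up weights, nowhere near the real network, just to show “model with learned weights” rather than “cache lookup”:

          ```python
          import numpy as np

          # Toy version of "a trained model reconstructs a higher-res image from a
          # low-res one": naive 2x upsample, then one 3x3 filter whose weights a real
          # system would learn from data. These weights are made up; actual DLSS is a
          # large network that also consumes motion vectors and depth.
          kernel = np.array([[0.0,   0.125, 0.0],
                             [0.125, 0.5,   0.125],
                             [0.0,   0.125, 0.0]])

          def reconstruct(low_res):
              up = low_res.repeat(2, axis=0).repeat(2, axis=1)   # nearest-neighbour 2x
              out = np.zeros_like(up)
              for dy in range(-1, 2):                            # apply the 3x3 filter
                  for dx in range(-1, 2):
                      out += kernel[dy + 1, dx + 1] * np.roll(up, (dy, dx), axis=(0, 1))
              return out

          frame = np.random.rand(540, 960)      # stand-in for a 960x540 render
          print(reconstruct(frame).shape)       # (1080, 1920)
          ```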

          What do you think DLSS is?

          You render a scene a dozen times once, then it regurgitates those renders from memory again if they are shown before they’re ejected from cache on the card. It doesn’t upsample, it doesn’t intelligently render anything new, and there is no additive anything. It seems you think it’s magic, but it’s just fast sorting memory tricks.

          This is blatantly and monumentally wrong lol. You think it’s literally rendering a dozen frames and then just picking the best one to show you out of them? Wow. Just wow lol.

          It absolutely does not make an otherwise unplayable game playable by adding detail and texture definition, as you seem to be claiming.

          That’s not what I claimed though. Where did I claim that?

          What it does is allow you to run a game at higher settings than you usually could at a given framerate, with little to no loss of image quality. Where you could previously only run a game at 20fps at 1080p Ultra settings, you can now run it at 30fps at “1080p” Ultra, whereas to hit 30fps otherwise you might have to drop everything to Low settings.
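          Rough frame-time arithmetic for that example (the 80/20 split and the reconstruction cost are assumptions, just to show the mechanism):

          ```python
          # 20fps -> ~30fps from rendering fewer pixels: illustrative numbers only.
          frame_ms = 1000 / 20              # 50 ms/frame at 20fps, native 1080p Ultra
          shading_ms = 0.8 * frame_ms       # assumed resolution-dependent portion
          fixed_ms = frame_ms - shading_ms  # assumed resolution-independent portion

          pixel_ratio = 0.44                # internal render at ~0.67x per axis
          upscale_ms = 2.0                  # assumed cost of the reconstruction pass

          new_frame_ms = fixed_ms + shading_ms * pixel_ratio + upscale_ms
          print(f"{1000 / new_frame_ms:.0f} fps")   # ~34 fps under these assumptions
          ```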

          Go read up.

          Ditto.

          • just_another_person@lemmy.world · 10 hours ago

            I 100% know what DLSS is, though by the sounds of it you don’t. It is “AI” as much as any other thing is “AI”. It uses models to “learn” what it needs to reconstruct and how to reconstruct it.

            No, you don’t. https://en.m.wikipedia.org/wiki/Deep_Learning_Super_Sampling

            This is blatantly and monumentally wrong lol. You think it’s literally rendering a dozen frames and then just picking the best one to show you out of them? Wow. Just wow lol.

            Literally in the docs: https://raw.githubusercontent.com/NVIDIA/DLSS/main/doc/DLSS_Programming_Guide_Release.pdf
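            For anyone not opening the PDF: the inputs the guide has the engine hand over every frame look roughly like this. The field names are a hypothetical Python stand-in; the real SDK is a C/C++ (NGX) API operating on GPU resources:

            ```python
            from dataclasses import dataclass
            from typing import Any, Tuple

            # Hypothetical stand-in for the per-frame inputs the DLSS Programming
            # Guide requires from the engine; names are illustrative, not the SDK's.
            @dataclass
            class DlssFrameInputs:
                color: Any                          # frame rendered at the LOW internal resolution
                depth: Any                          # depth buffer for the same frame
                motion_vectors: Any                 # per-pixel motion since the previous frame
                jitter_offset: Tuple[float, float]  # sub-pixel camera jitter applied this frame
                reset: bool = False                 # True on a scene cut, to drop temporal history
            ```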

            What it does is allow you to run a game at higher settings than you usually could at a given framerate, with little to no loss of image quality. Where you could previously only run a game at 20fps at 1080p Ultra settings, you can now run it at 30fps at “1080p” Ultra, whereas to hit 30fps otherwise you might have to drop everything to Low settings.

            No it doesn’t. It allows you to run a game at a higher resolution for no reason at all, instead of dropping to a lower resolution that your card can handle natively. That’s it.

            Keep claiming otherwise and you’re just denying reality, with the Nvidia docs linked right in front of you.