Top AI expert ‘completely terrified’ of 2024 election, shaping up to be ‘tsunami of misinformation’::“I can’t prove that,” says Oren Etzioni, professor emeritus at the University of Washington. “I hope to be proven wrong. But the ingredients are there.”

  • FarFarAway@startrek.website (+25/-7) · 1 year ago

    Maybe an unpopular opinion, but I feel like anything produced by AI should be somehow watermarked at the source. At this point there’s only a handful of companies. It wouldn’t be too hard to have them all insert something into the final product that is easily identifiable. Something like a microscopic signature in a corner, with model info and date produced…idk. Not anything that ruins the image, but something that can be seen by anyone, if looked for.
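
A minimal sketch of the kind of at-the-source signature described above, using least-significant-bit steganography on a raw pixel buffer. This is a toy illustration, not how any real generator watermarks output; the model name, date, and buffer format are all made up, and (as replies below note) low-bit marks like this are trivially stripped.

```python
# Toy watermark: stamp a model/date signature into the least-significant
# bits of raw pixel bytes, then read it back. Pure-Python illustration on
# a flat byte buffer, not a real image codec. All names are invented.

SIGNATURE = b"model=toy-gen-v1;date=2024-01-15"

def embed_watermark(pixels: bytearray, payload: bytes) -> bytearray:
    """Write the payload, bit by bit, into the LSB of successive bytes."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("image too small for payload")
    out = bytearray(pixels)
    for pos, bit in enumerate(bits):
        out[pos] = (out[pos] & 0xFE) | bit
    return out

def extract_watermark(pixels: bytearray, length: int) -> bytes:
    """Read `length` bytes back out of the LSBs."""
    payload = bytearray()
    for byte_index in range(length):
        value = 0
        for i in range(8):
            value |= (pixels[byte_index * 8 + i] & 1) << i
        payload.append(value)
    return bytes(payload)

image = bytearray(range(256)) * 2                 # stand-in for raw pixel data
marked = embed_watermark(image, SIGNATURE)
print(extract_watermark(marked, len(SIGNATURE)))  # -> b'model=toy-gen-v1;date=2024-01-15'
```

Note the weakness: zeroing the low bit of every byte (or just re-encoding the image lossily) erases the mark without visibly changing the picture.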

    If nothing else, there should be a large push to inform the public about telltale features to look for (e.g., too many appendages) to help them determine whether something was created by AI. While not foolproof, if it can discount even a portion of the misinformation, imo, it’s worth the effort.

    To me, it seems irresponsible of the companies running the AI to just unleash it upon the world without training us humans to understand what we’re looking at. Letting us see how realistic everything is while letting us know it’s been produced by AI at least helps us comprehend the scope of the matter and adapt to the situation at hand. Especially for those who don’t fully grasp what AI can and cannot do.

    • wahming@monyet.cc (+28) · 1 year ago

      The technology is open source. Anybody can run it themselves and disable the watermarking.

    • ugjka@lemmy.world (+16/-3) · 1 year ago

      It is just math, and most of it is public. If you can buy a $100K datacenter GPU you can have your own ChatGPT; heck, you can even do shit with regular consumer GPUs. It is like trying to stop encryption.

      • FarFarAway@startrek.website (+3/-7) · 1 year ago

        Well, shit. This explains a lot. But also, what chucklehead thought that was a good idea?

        I know, now that you mention it, I vaguely remember something about how they didn’t think it should be kept only by some corps or something. Which is commendable but at the same time, ugh.

        I have no problem with everyone being able to use it, but there should have been an introductory period, if nothing else. Jeeze.

        Whelp, fake everything here we…are.

        • deranger@sh.itjust.works (+6) · 1 year ago

          Feels like people probably said the same thing about the printing press when it came around. Imagine the sheer increase in volume of printed lies after its invention.

        • ugjka@lemmy.world (+1) · 1 year ago

          Social media corps just need to use AI to cross-check what is in videos/photos against various established news organizations. Pretty sure that will be the solution: moderating AI content with AI.
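
One common technique behind that kind of cross-checking is a perceptual hash: similar pictures get similar hashes, so a platform could flag uploads that nearly match doctored copies of known photos. Below is a toy "average hash" sketch; real systems (pHash, Facebook's PDQ) are far more robust, and the trusted index here is invented for illustration.

```python
# Toy average hash: 64-bit fingerprint of an 8x8 grayscale image, where
# each bit records whether a cell is brighter than the image mean. Small
# edits (brightness shifts, mild compression) barely change the hash.

def average_hash(gray: list[list[int]]) -> int:
    flat = [v for row in gray for v in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for v in flat:
        bits = (bits << 1) | (1 if v > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# 8x8 "images": an original and a slightly brightened copy of it.
original = [[(x * y) % 256 for x in range(8)] for y in range(8)]
tweaked = [[min(v + 10, 255) for v in row] for row in original]

# Hypothetical index of hashes of verified newsroom imagery.
trusted_index = {average_hash(original): "wire photo #1234 (hypothetical)"}

h = average_hash(tweaked)
for known_hash, source in trusted_index.items():
    if hamming(h, known_hash) <= 8:   # small distance = near-duplicate
        print("near-match for known image:", source)
```

Uniformly brightening an image shifts every cell and the mean by the same amount, so the above-mean bit pattern (and thus the hash) is unchanged, which is exactly the robustness a cross-checker needs.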

    • just_change_it@lemmy.world (+11/-4) · 1 year ago

      I pinky promise to watermark my AI works!!!

      Come on. If I use an AI tool to generate something and incorporate it into a released product… how is that any different from googling for an idea and incorporating it into my released product? Why is a search aggregator (a thing that takes all the information you allow it to off a site and presents it to the public) ANY different? You’re using an algorithm to get an output that you desire based on an input.

    • daltotron@lemmy.world (+7) · 1 year ago

      They already do that; it’s just invisible to the naked eye and only identifiable to other AIs, which can pretty easily distinguish real from fake. Adversarial networks.

      The distinction between what’s real and what’s fake, as always, will just end up coming down to who has the most resources, and who has the luxury of constructing their own reality. It’s an arms race; both algorithms need active maintenance in order to supersede each other.

    • Human@lemmy.dbzer0.com (+6) · 1 year ago

      I initially thought this was the way to go too, but imo there’s a problem: the only individuals who could produce high-level unwatermarked content would be those with access to GPU clusters (state actors and corpos), who would undoubtedly use it to manipulate the masses that have been trained to trust the watermark.

      I think in the best-case scenario, we’re just going to have to ride out a couple of very strange years while people adjust to a new reality. Shit’s gonna get weird.

    • nutsack@lemmy.world (+4) · edited · 1 year ago

      A lot of the bad actors here would probably not be complying with such a policy. There is no way to enforce it.

    • grayman@lemmy.world (+1/-3) · 1 year ago

      I bet you’re also the kind of person that thinks putting up “no guns” signs keeps bad people from shooting innocent people.

      • FarFarAway@startrek.website (+1/-1) · edited · 1 year ago

        Not at all. Back to that educating the public bit.

        I’m not a tech wiz, but I do know my way around the basic functions of a computer. If I have no idea how it works or what it’s capable of, how are people who know next to nothing supposed to figure it out?

    • FlaminGoku@reddthat.com (+1/-4) · edited · 1 year ago

      I appreciate this take and think it’s a great idea. You have everything written on an immutable distributed ledger (dare I say blockchain) so that no matter what is created and shared, it can be traced back.

      You still allow its capabilities to evolve, but you will always be able to confirm with a check.
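
The core mechanism behind that kind of ledger can be sketched in a few lines: each provenance record embeds the hash of the previous one, so editing any past entry breaks every later link. A real system would add signatures and distribution across nodes; the record fields below are invented for illustration.

```python
# Toy hash-chained provenance ledger. Each record stores its entry, the
# previous record's hash, and a SHA-256 over both, so tampering with any
# earlier record invalidates the whole chain from that point on.
import hashlib
import json

def add_record(chain: list[dict], entry: dict) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = {"entry": entry, "prev": prev}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain: list[dict]) -> bool:
    prev = "0" * 64
    for rec in chain:
        body = {"entry": rec["entry"], "prev": rec["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev"] != prev or rec["hash"] != digest:
            return False
        prev = rec["hash"]
    return True

ledger: list[dict] = []
add_record(ledger, {"model": "toy-gen-v1", "content_sha256": "abc123"})
add_record(ledger, {"model": "toy-gen-v1", "content_sha256": "def456"})
print(verify(ledger))                          # True
ledger[0]["entry"]["model"] = "someone-else"   # tamper with history
print(verify(ledger))                          # False
```

The "confirm with a check" step is just `verify`: anyone holding a copy of the chain can recompute the hashes and detect rewrites.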

      It will be similar to the pictures of diseased lungs and hearts on cigarettes. People will still “buy” the “news” even though it’s fake.

      At this point, though, you can run a deepfake off a laptop, so there would need to be a complete fork of existing code with heavy regulation.