Twitter will remove nonconsensual nude images within hours if the media is reported as violating someone’s copyright. If the same content is reported only as nonconsensual intimate media, Twitter may take weeks to remove it, and might never remove it at all, according to a pre-print study from researchers at the University of Michigan.

  • ShaggySnacks@lemmy.myserv.one · 3 hours ago

    Shocked! That the guy who owns Twitter isn’t making this a priority.

    Wait, it all makes sense when the owner, Elon Musk makes a tone deaf joke about impregnating Taylor Swift.

  • Dizzy Devil Ducky@lemm.ee · 1 day ago

    In other words, twatter is probably gonna pull the bullshit where they do business as usual and do nothing until police or any government goes after them.

  • blackbelt352@lemmy.world · edited 6 hours ago

    It sucks that this is the mechanism we have to use for this but a person’s likeness is their own copyright and posting images of someone without permission could be seen as copyright infringement. Granted this also opens a lot of doors to just completely eliminating almost all images from the internet, like imagine going to a tourist destination and having to get permission from anyone who might be in your overdone posed tourist photo.

    Edit: Since some of yall are dense motherfuckers and/or just arguing in bad faith, I’m pointing out how using copyright as the enforcement mechanism opens the door for these already flawed copyright systems to be abused even further. I’m specifically pointing to Right of Publicity, which protects your likeness from commercial use unless you give permission. It’s why any show or movie filmed in a public place blurs out anyone on camera who hasn’t signed a release form.

    • wizardbeard@lemmy.dbzer0.com · 19 hours ago

      My guy, you seriously aren’t pretending that clothed people in the background of a photo is the same as pictures of someone naked taken or posted without their consent, right?

      Just ignoring the core context?

      Come on.

      • blackbelt352@lemmy.world · 6 hours ago

        I’m not making a comparison between the two; I’m pointing out how resolving the posting of non-consensual nudes through copyright systems could be abused in other instances. I’m also not saying there shouldn’t be a system for having non-consensual nudes taken down. We absolutely need one, but it needs to be a system dedicated to taking down non-consensual images, not a patchwork workaround using copyright.

    • givesomefucks@lemmy.world · 1 day ago

      but a person’s likeness is their own copyright and posting images of someone without permission could be seen as copyright infringement

      Whut?

      • John Richard@lemmy.world · 1 day ago

        Yeah, that just isn’t true. If it were, I could charge every business that has ever stored video of me.

        • communism@lemmy.ml · 2 hours ago

          If they were publicizing those videos, that sounds illegal to me. If I printed off a copyrighted book for my own personal use, that would be legal. If I started distributing my own reprints of a copyrighted book without permission, the copyright holder could go after me. Businesses can hold copyrighted material without distributing it and not be in breach of the law.

          • John Richard@lemmy.world · 1 hour ago

            Many of those companies use third parties to store those videos and use them to train AI for products they sell.