• Catoblepas@piefed.blahaj.zone · 1 month ago

      If someone manufactured a knife that told schizophrenics they’re being followed and people are trying to kill them, then yeah. That knife shouldn’t be able to do that and the manufacturer should be held liable.

        • wondrous_strange@lemmy.world · 1 month ago

          A nuclear football or a knife is not a stochastic parrot capable of stringing coherent sentences together.

          I agree LLMs are in the category of tools, but on the other hand they are not like any other tool, and they require an adjustment which needs to happen too fast for most people.

      • data_science_rocks@scribe.disroot.org · 1 month ago

        Allowing the unmedicated schizophrenic near the internet was the mistake.

        Blaming the AI which just plays along with whatever fantasy is thrown at it is what mongoloids do.

    • atopi@piefed.blahaj.zone · 1 month ago

      If the knife is a possessed weapon whispering to the holder, trying to convince them to use it for murder, blaming it may be appropriate.

    • davidgro@lemmy.world · 1 month ago

      Imagine a knife that occasionally and automatically stabs people trying to cook with it or those near them. Not user error or clumsiness, this is just an unavoidable result of how it’s designed.

      Yes, I’d blame the knife, or more realistically the company that makes it and considers it safe enough to sell.

        • data_science_rocks@scribe.disroot.org · 1 month ago

        ChatGPT is physically incapable of stabbing. It is incapable of lying, since that requires intent, and incapable of manipulation, since that also requires intent.

        It is deterministic, flavoured by the temperature setting allocated to it. All of what it says depends on the input.

        Funny how you creeps disregard what the obvious lunatic might have been telling ChatGPT before ChatGPT followed along.
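        The "deterministic, flavoured by the temperature setting" claim above can be made concrete. A minimal sketch of temperature sampling (generic illustration, not OpenAI's actual decoding code): at temperature 0 the model always picks the highest-scoring token, while higher temperatures flatten the distribution and introduce randomness.

```python
import math
import random

def sample_token(logits, temperature):
    """Pick a token index from raw model scores (logits).

    temperature == 0 means greedy decoding (fully deterministic);
    higher temperatures flatten the distribution, adding randomness.
    """
    if temperature == 0:
        # Greedy decoding: always return the highest-scoring token.
        return max(range(len(logits)), key=lambda i: logits[i])
    # Temperature-scaled softmax over the logits.
    scaled = [score / temperature for score in logits]
    peak = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - peak) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Draw one token index according to the softmax probabilities.
    return random.choices(range(len(logits)), weights=probs)[0]
```

        So "deterministic" only holds at temperature 0; any positive temperature makes the output a weighted random draw over the same inputs.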

          • Rivalarrival@lemmy.today · 1 month ago

          > All of what it says depends on the input.

          The user is not the only entity supplying input. The operators of the system provide the overwhelming majority of the input.

          > It is incapable of lying since that requires intent, incapable of manipulation since that requires intent.

          The operators of the system certainly possess intent, and are completely capable of manipulation.