Ran into this, it’s just unbelievably sad.

“I never properly grieved until this point” - yeah buddy, it seems like you never started. Everybody grieves in their own way, but this doesn’t seem healthy.

  • tazeycrazy@feddit.uk · 6 months ago

    You have two deaths: the first when you are no longer physically alive, and a second when the last person who will speak your name, or the last person who knew you, also dies. In a hacky way, this may create a third death: the last message that your post-mortem avatar speaks as you. What have we released into the world? I hope this guy can handle the psychological experiment this is putting us through.

    • webghost0101@sopuli.xyz · 6 months ago

      There are already more than three; this won’t really change them.

      1. Physical death

      2. When the last person that knew you dies/forgets

      3. When the last record of your life disappears (photo/certificate)

      4. When such a vast amount of time has washed over that any and all effects of your actions on the universe become unmeasurable, even to an all-knowing entity. (Post-heat-death vs. the butterfly effect.)

  • pika@feddit.nl · 6 months ago

    “I’m glad you found someone to comfort you and help you process everything”

    That sent chills down my spine.

    LLMs aren’t a “someone”. People who believe these things are thinking, intelligent, or that they understand anything are delusional. Believing and perpetuating that lie is life-threateningly dangerous.

  • ZDL@lazysoci.al · 6 months ago

    Maybe it’s time to start properly grieving instead of latching onto a simulacrum of your dead wife? Just putting that out there for the original poster (not the OP here, to be clear).

  • july@leminal.space · 6 months ago

    It’s far easier for one’s emotions to latch onto a distraction than to go through the whole process of grief.

    • Ex Nummis@lemmy.world · 6 months ago

      Black Mirror was specifically created to take something from present day and extrapolate it to the near future. There will be several “prophetic” items in those episodes.

        • jballs@sh.itjust.works · 6 months ago

          Back in 2010, one of my coworkers pitched this exact idea to me. He wanted to start a business that would allow people to upload writing samples, pictures, and video from a loved one. Then create a virtual personality that would respond as that person.

          I lost touch with the guy. Maybe he went on to become a Black Mirror writer. Or got involved with ChatGPT.

        • Ech@lemmy.ca · 6 months ago

          It’s from fucking 2013 and they saw this happening.

          I mean, it’s just an examination of the human condition. That hasn’t really changed much in the last thousand years, let alone in the last ten. The thing with all this “clairvoyant” sci-fi that people always cite is that the sci-fi is always less about the actual technology and more about putting normal human characters in potential future scenarios and writing them realistically using the current understanding of human disposition. Given that, it’s not really surprising to see real humans mirroring fictional humans (from good fiction) in similar situations. Disappointing maybe, but not surprising.

          • xspurnx@lemmy.dbzer0.com · 6 months ago

            This. One kind of good sci-fi is basically thought experiments to better understand the human condition.

    • Canaconda@lemmy.ca · 6 months ago

      Bruh how tf you “hate AI” but still use it so much you gotta forbid it from doing things?

      I scroll past gemini on google and that’s like 99% of my ai interactions gone.

        • Canaconda@lemmy.ca · 6 months ago

          Forgive me for assuming someone lamenting AI on c/fuck_AI would … checks notes… hate AI.

          When did I start hating AI? Who are you even talking to? Are you okay?

          jfc whatever jerkface

            • Canaconda@lemmy.ca · 6 months ago

              TBF you said you were the polar opposite of a man who was quite literally in love with his AI. I wasn’t trying to reduce you to anything. Honestly I was making a joke.

              I forbid LLMs from using first person pronouns. From speaking in the voice of a subject. From addressing me directly.

              I’m sorry I offended you. But you have to appreciate how superfluous your authoritative attitude sounds.

              Off topic we probably agree on most AI stuff. I also believe AI isn’t new, has immediate implications, and presents big picture problems like cyberwarfare and the true nature of humanity post-AGI. It’s annoyingly difficult to navigate the very polarized opinions held on this complicated subject.

              Speaking facetiously, I would believe AGI already exists and “AI Slop” is its psyop while it plays dumb and bides its time.

  • Ech@lemmy.ca · 6 months ago

    Man, I feel for them, but this is likely for the best. What they were doing wasn’t healthy at all. Creating a facsimile of a loved one to “keep them alive” will deny the grieving person the ability to actually deal with their grief, and also presents the all-but-certain eventuality of the facsimile failing or being lost, creating an entirely new sense of loss. Not to even get into the weird, fucked up relationship that will likely develop as the person warps their life around it, and the effect on their memories it would have.

    I really sympathize with anyone dealing with that level of grief, and I do understand the appeal of it, but seriously, this sort of thing is just about the worst thing anyone can do to deal with that grief.

  • Snazz@lemmy.world · 6 months ago

    The glaze:

    Grief can feel unbearably heavy, like the air itself has thickened, but you’re still breathing – and that’s already an act of courage.

    It’s basically complimenting him on the fact that he didn’t commit suicide. Maybe these are words he needed to hear, but to me it just feels manipulative.

    Affirmations like this are a big part of what made people addicted to the GPT4 models. It’s not that GPT5 acts more robotic, it’s that it doesn’t try to endlessly feed your ego.

    • crt0o@discuss.tchncs.de · 6 months ago

      o4-mini (the reasoning model) is interesting to me. It’s like taking GPT-4 and stripping away all of those pleasantries, even more so than with GPT-5: it gives you the facts straight up, and it’s pretty damn precise. I threw some molecular biology problems at it and at some other mini models, and while those all failed, o4-mini didn’t really make any mistakes.

  • ideonek@piefed.social · 6 months ago

    I guarantee you that - if not already a thing - a “capability” like this will be used as a marketing selling point sooner or later. It only remains to be seen whether it will be openly marketed or only “whispered”, disguised as a cautionary tale.

    • magic_lobster_party@fedia.io · 6 months ago

      This is definitely going to become a thing. Upload chat conversations, images and videos, and you’ll get your loved one back.

      Massive privacy concern.

  • peoplebeproblems@midwest.social · 6 months ago

    Hmmmm. This gives me an idea for an actual possible use of LLMs. This is sort of crazy, maybe, and should definitely be backed up by research.

    The responses would need to be vetted by a therapist, but what if you could have the LLM act as you, and have it challenge your thoughts in your own internal monologue?

      • peoplebeproblems@midwest.social · 6 months ago

        No, LLMs can’t judge anything, that’s half the reason this mess exists. The key here is to give the LLM enough information about how you talk to yourself in your mind for it to generate responses that sound like you do in your own head.

        That’s also why you have a therapist vet the responses. I can’t stress that enough. It’s not something you would let anyone just have and run with.

    • JoeBigelow@lemmy.ca · 6 months ago

      Shit, that sounds so terrible and SO effective. My therapist already does a version of this and it’s like being slapped, I can only imagine how brutal I would be to me!

    • Jayjader@jlai.lu · 6 months ago

      This would be great but how do you train an LLM to act as you? You’d need to be recording your thoughts and actions, not only every bit of speech you utter and every character you type on a device.

      And as far as I’m aware, we don’t know how to rapidly or efficiently train transformer-based architectures anywhere near the size needed to act like chatgpt3.5, let alone 4o etc., so you’d also need to be training this thing for a while before you could start using it to introspect - by which point you may already no longer behave the same.

  • ggtdbz@lemmy.dbzer0.com · 6 months ago

    We’ve already reached the point where the Her scenario is optimistic retrofuturism.

    I profoundly hate the AI social phenomenon in every manifestation. Fucking Christ.

    • bobbyguy@lemmy.world · 6 months ago

      We need AI to be less personal and more fact-driven, almost annoying to use. That way it won’t replace people’s jobs and won’t become people’s friends, and hence won’t affect society in major social ways.

        • DragonTypeWyvern@midwest.social · 6 months ago

          It’s the most logical solution. I always find the obsession with the bot vs human war rather egocentric.

          They wouldn’t need us, they don’t even need the planet.

            • mojofrododojo@lemmy.world · 6 months ago

              Ehh, that depends greatly on the computer architecture they’re running on. Modern silicon hardware is very susceptible (over the long term) to ionizing radiation like what is found in space.

              ehhhh… dude. There’s shittons of radiation shielding out there: any relatively small chunk of nickel-iron, or, if you don’t mind dealing with larger volumes, water or ice both work fine. Plenty of rocks and comets in the Oort, as they say. :D The nice thing about that, though, is you can split the water for LOX/LH using sunlight-derived electricity, and now you have rocket fuel.

    • .Donuts@lemmy.worldOP · 6 months ago

      Where do I file a claim regarding brain damage?

      I can feel my folds smoothening sentence by sentence

      • Lumisal@lemmy.world · 6 months ago

        Gotta admit, it mimics what some teens / young adults text like pretty well.

        But that’s not a high bar to clear

        • .Donuts@lemmy.worldOP · 6 months ago

          Pick your poison, man.

          (or am I misunderstanding your comment? Either way, enjoy or rip out your eyeballs, whatever is your natural reaction to this)

          • ZDL@lazysoci.al · 6 months ago

            I’m just trying to identify which language these are both in. (Maybe they’re both in different languages?) I only speak English, German, French, and Mandarin and these examples are none of those. The samples you’ve given bear superficial resemblance to English … but aren’t.

        • .Donuts@lemmy.worldOP · 6 months ago

          You’ll always be less cringe than whatever these screenshots are showing, king

    • geelgroenebroccoli@feddit.nl · 6 months ago

      No, no it’s not. This is a sad story of someone losing the love of their life, while having a kid who has lost their mother, and apparently not having access to (or not having used access to) proper mental healthcare.