• 0 Posts
  • 27 Comments
Joined 3 months ago
Cake day: June 25th, 2025

  • Took a look because, as frustrating as it’d be, it’d still be a step in the right direction. But no, they’re still adamant that these are just a “quirk”.

    Conclusions

    We hope that the statistical lens in our paper clarifies the nature of hallucinations and pushes back on common misconceptions:

    Claim: Hallucinations will be eliminated by improving accuracy because a 100% accurate model never hallucinates. Finding: Accuracy will never reach 100% because, regardless of model size, search and reasoning capabilities, some real-world questions are inherently unanswerable.

    Claim: Hallucinations are inevitable. Finding: They are not, because language models can abstain when uncertain.

    Claim: Avoiding hallucinations requires a degree of intelligence which is exclusively achievable with larger models. Finding: It can be easier for a small model to know its limits. For example, when asked to answer a Māori question, a small model which knows no Māori can simply say “I don’t know” whereas a model that knows some Māori has to determine its confidence. As discussed in the paper, being “calibrated” requires much less computation than being accurate.

    Claim: Hallucinations are a mysterious glitch in modern language models. Finding: We understand the statistical mechanisms through which hallucinations arise and are rewarded in evaluations.

    Claim: To measure hallucinations, we just need a good hallucination eval. Finding: Hallucination evals have been published. However, a good hallucination eval has little effect against hundreds of traditional accuracy-based evals that penalize humility and reward guessing. Instead, all of the primary eval metrics need to be reworked to reward expressions of uncertainty.

    Infuriating.
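    The last claim above is about incentives: under plain accuracy scoring, guessing always beats abstaining, so models are trained toward confident wrong answers. A minimal sketch of the kind of reworked metric the paper argues for (this is a hypothetical illustration, not code from the paper) is a scoring rule that penalizes wrong answers while treating abstention as neutral:

    ```python
    # Hypothetical eval scoring rule: reward abstention over confident wrong guesses.
    # Under plain accuracy, a guess is never worse than "I don't know"; with a
    # wrong-answer penalty, guessing only pays off above a confidence threshold.

    def score(answer: str | None, correct: str, wrong_penalty: float = 1.0) -> float:
        """+1 for a correct answer, 0 for abstaining (None), -penalty for a wrong one."""
        if answer is None:  # model said "I don't know"
            return 0.0
        return 1.0 if answer == correct else -wrong_penalty

    # Expected value of guessing at confidence p: p*1 + (1-p)*(-1) = 2p - 1,
    # so with penalty 1.0, abstaining is the better policy whenever p < 0.5.
    print(score("Paris", "Paris"))  # 1.0
    print(score("Lyon", "Paris"))   # -1.0
    print(score(None, "Paris"))     # 0.0 (abstaining is neutral)
    ```

    With `wrong_penalty = 0` this collapses back to ordinary accuracy, which is exactly the regime the paper says rewards guessing.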


  • I strugglebused on BoI for a while, trying to force myself to experience it the “right” way and just hating it the whole time. Legitimately thought I didn’t like the game for a long time. Then I finally gave in and just looked up the items as I found them, and that made it so much better. I ended up playing enough to get every ending (up through Rebirth), and I stopped worrying about “ruining” games by looking things up after that. Barring legit spoilers, I look up everything now. Turns out that knowing how the game functions is pretty important for enjoying it. Glad they finally got around to addressing that.







  • Ech@lemmy.ca to Games@lemmy.world — (Rant) Don't buy Rockstar games.

    Lol, YDI. When asked to provide the identifiable information, you just say “Stop and do the thing I want.” You seriously expect them to just hand over an account to someone who can’t provide basic information about it? You owned yourself, pal.

    Edit: I misread the order of the interaction, which painted a much more antagonistic view of OP. Sorry about that. That said, the actual interaction plus their further reaction here is still not good. “Brainstorming how to automate the requests” is not an appropriate response to any of this.






  • It really just means a one-sided relationship with a fabricated personality. Celebrities being real people doesn’t really factor into it too much since their actual personhood is irrelevant to the delusion - the person with the delusion has a relationship with the made up personality they see and maintain in their mind. And a chatbot personality is really no different, in this case, so the terminology fits, imo.


  • Is it actually that hard to talk to another human?

    It’s pretty cruel to blame the people for this. You might as well say “Is it that hard to just walk?” to a paraplegic. There are many reasons people may find it difficult, even nigh impossible, to engage with others, and these services prey on that. That’s not the fault of the users, it’s on the parasitic companies.



  • It’s from fucking 2013 and they saw this happening.

    I mean, it’s just an examination of the human condition. That hasn’t really changed much in the last thousand years, let alone in the last ten. The thing with all this “clairvoyant” sci-fi that people always cite is that the sci-fi is always less about the actual technology and more about putting normal human characters in potential future scenarios and writing them realistically using the current understanding of human disposition. Given that, it’s not really surprising to see real humans mirroring fictional humans (from good fiction) in similar situations. Disappointing maybe, but not surprising.



  • Man, I feel for them, but this is likely for the best. What they were doing wasn’t healthy at all. Creating a facsimile of a loved one to “keep them alive” will deny the grieving person the ability to actually deal with their grief, and also presents the all-but-certain eventuality of the facsimile failing or being lost, creating an entirely new sense of loss. Not to even get into the weird, fucked up relationship that will likely develop as the person warps their life around it, and the effect on their memories it would have.

    I really sympathize with anyone dealing with that level of grief, and I do understand the appeal of it, but seriously, this sort of thing is just about the worst thing anyone can do to deal with that grief.