I'll admit I got a little curious about local LLM hosting, and was playing with a few models to look for obvious censorship, etc.

I’ve tended to make some assumptions about Deepseek due to its country of origin, and I thought I’d check that out in particular.

Turns out that Winnie the Pooh thing is just not happening.

😂

  • octopus_ink@slrpnk.netOP
    9 months ago

    I’ve got fairly low end hardware, this was just easily installable via gpt4all so I gave it a shot. :)

    I will play with some others…

    Thanks!

    • brucethemoose@lemmy.world
      9 months ago

      If you mean like a laptop, look for MoE models like Qwen3 A3B. And pay attention to sampling, try low or zero temperature first.
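
      In case it helps to see why low temperature matters: temperature just scales the model's logits before the softmax that picks the next token, so low/zero temperature concentrates probability on the top token (less rambling, more deterministic). A small numpy sketch of that math, not tied to gpt4all or any particular runtime:

      ```python
      import numpy as np

      def sample_probs(logits, temperature):
          """Next-token probabilities after temperature scaling."""
          if temperature == 0:
              # Zero temperature is greedy decoding: all mass on the argmax.
              p = np.zeros(len(logits))
              p[np.argmax(logits)] = 1.0
              return p
          z = np.asarray(logits, dtype=float) / temperature
          z -= z.max()  # subtract max for numerical stability
          e = np.exp(z)
          return e / e.sum()

      logits = [2.0, 1.0, 0.5]  # made-up scores for three candidate tokens
      print(sample_probs(logits, 0))    # greedy: all mass on the top token
      print(sample_probs(logits, 1.0))  # softer distribution
      print(sample_probs(logits, 10.0)) # high temp: nearly uniform
      ```

      Higher temperatures flatten the distribution toward uniform, which is why a chatty or incoherent local model often improves immediately when you drop the temperature.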