Let me preface by saying I despise corpo llm use and slop creation. I hate it.

However, it does seem like it could be an interesting, helpful tool if run locally in the CLI. I’ve seen quite a few people doing this. Again, it personally makes me feel like a lazy asshole when I use it, but it’s not much different from web searching commands every minute (other than that the data used in training it was obtained by pure theft).

Have any of you tried this out?

  • alecsargent@lemmy.zip · 2 days ago

    I’ve run several LLMs with Ollama (locally) and I have to say that it was fun, but it’s not worth it at all. It does get many answers right, but that doesn’t come close to compensating for the time spent generating bad answers and troubleshooting them. Not to mention the amount of energy the computer uses.

    In the end I’d rather spend my time actually learning the thing I’m supposed to solve, or just skim the documentation if I only want the answer.