kersploosh@sh.itjust.works to Programmer Humor@programming.dev · 1 day ago
It works tho [video]
entropicdrift@lemmy.sdf.org · 16 hours ago
Yeah, if they were just running it locally off a GPU it would be cooler.
psud@aussie.zone · 11 hours ago
Running an LLM isn't expensive, whether locally or in the cloud; all the cost is in the training.