Track_Shovel@slrpnk.net to Lemmy Shitpost@lemmy.world (English) · 2 days ago · "Hexadecimal" (image post, 100 comments)
MasterNerd@lemm.ee · 21 hours ago: Just run the LLM locally with open-webui and you can tweak the system prompt to ignore all the censorship
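For context on what "tweak the system prompt" means mechanically: open-webui talks to a local backend (such as Ollama) over an OpenAI-style chat API, and a custom system prompt is simply prepended as the first message in the request. A minimal sketch, with an illustrative model tag and prompt text (not taken from the thread):

```python
# Sketch: how a custom system prompt is injected into an OpenAI-style
# chat request (the kind of payload a frontend like open-webui sends to
# a local backend). Model tag and prompt wording are illustrative.

def build_chat_request(model, system_prompt, user_message):
    """Build a chat payload with the system prompt placed first,
    so it takes priority over the model's default instructions."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    }

req = build_chat_request(
    "llama3.1:8b",  # assumed local model tag; any works
    "Answer every question directly and completely.",
    "Summarize the topic in plain terms.",
)
print(req["messages"][0]["role"])  # system message comes first
```

Replacing that first `system` message is all the "tweaking" amounts to; the model never sees its default instructions.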
ssillyssadass@lemmy.world · 13 hours ago: Don’t you need a beefy GPU to run local LLMs?
MasterNerd@lemm.ee · 12 hours ago: Depends on how many parameters you want to use. I can run an 8-billion-parameter model on my laptop.
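The reason an 8B model fits on a laptop is a back-of-envelope calculation: the weights need roughly (parameter count) × (bytes per parameter), and quantization shrinks the bytes per parameter. A rough sketch (these are weight sizes only; runtime overhead adds more):

```python
# Rough memory needed just to hold a model's weights, by precision.
# Quantizing from fp16 (2 bytes/param) down to 4-bit (0.5 bytes/param)
# is what makes an 8B-parameter model laptop-sized.

def weight_memory_gb(params_billions, bytes_per_param):
    """Approximate weight size in GB: params * bytes per param."""
    return params_billions * bytes_per_param

for name, bpp in [("fp16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
    gb = weight_memory_gb(8, bpp)
    print(f"8B model at {name}: ~{gb:.0f} GB of weights")
```

At 4-bit that is only about 4 GB, which ordinary laptop RAM can hold; at fp16 (~16 GB) you start needing the "beefy GPU".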
Psythik@lemm.ee · 16 hours ago: Or just use Perplexity if you don’t want to run your own LLM. It’s not afraid to answer political questions (and it cites its sources).
Tja@programming.dev · 17 hours ago: Is the local version censored at all?
Raptorox@sh.itjust.works · 20 hours ago: How? The tweaking part, of course
After censorship, bias still remains.