petsoi@discuss.tchncs.de to Linux@lemmy.ml · 2 months ago
Running Generative AI Models Locally with Ollama and Open WebUI - Fedora Magazine (fedoramagazine.org)
19 comments
lelgenio@lemmy.ml · 2 months ago
ollama runs on the 6700 XT, but you need to add an environment variable for it to work… I just don’t remember what it was and am away from my computer right now
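For readers hitting the same issue: the commenter doesn’t name the variable, but the workaround commonly used for RDNA2 cards that ROCm doesn’t officially list (the RX 6700 XT is gfx1031) is to override the reported GPU target with `HSA_OVERRIDE_GFX_VERSION`. This is a hedged sketch of that approach, not a confirmation of what the commenter meant; verify the value against ollama’s GPU documentation for your card.

```shell
# Assumption: 6700 XT (gfx1031) needs to masquerade as the
# officially supported gfx1030 target for ROCm to use it.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Then start ollama with the override in its environment, e.g.:
#   ollama serve
# (If ollama runs as a systemd service, set the variable in a
# service override instead, e.g. via `systemctl edit ollama`.)
echo "HSA_OVERRIDE_GFX_VERSION=$HSA_OVERRIDE_GFX_VERSION"
```

Running the snippet just prints the exported value; the effect only shows up once the ollama server is restarted in that environment.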