If you’re like me and you work with computers for a living and you don’t really want to put in the hard work of fixing computers at home, you can do what I did: download an abliterated local AI, tell it what the problem is and what specs you’re working with, and it will almost always fix the issue for you in like five minutes.
And when it doesn’t fix it in five minutes, it will destroy your operating system with whatever commands it tells you to paste in a terminal, and you were going to be wiping and reinstalling it anyway, so nothing lost.
all this time spent on setting up a local llm and reinstalling a whole system + setting it up again instead of reading the documentation 😭😭😭
There are apps for it now like LM Studio.
It makes setting up and running an LLM super easy.
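For a sense of what "ask the local model" looks like in practice: LM Studio exposes an OpenAI-compatible HTTP server on localhost (port 1234 by default) once you load a model. A minimal sketch, assuming that server is running; the model name and the troubleshooting prompt are placeholders, not anything LM Studio requires:

```python
# Sketch: asking a locally hosted model for a fix via LM Studio's
# OpenAI-compatible chat-completions endpoint (default port 1234).
import json
import urllib.request


def build_request(problem: str, specs: str) -> dict:
    """Assemble a chat-completion payload describing the problem and hardware."""
    return {
        # Placeholder name; LM Studio answers with whichever model is loaded.
        "model": "local-model",
        "messages": [
            {"role": "system", "content": "You are a troubleshooting assistant."},
            {"role": "user", "content": f"Problem: {problem}\nSpecs: {specs}"},
        ],
        "temperature": 0.2,  # keep answers focused rather than creative
    }


def ask_local_llm(
    problem: str,
    specs: str,
    url: str = "http://localhost:1234/v1/chat/completions",
) -> str:
    """Send the payload to the local server and return the model's reply text."""
    payload = json.dumps(build_request(problem, specs)).encode()
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Usage would be something like `ask_local_llm("Wi-Fi drops after suspend", "laptop, Fedora, Intel wireless card")`; nothing leaves your machine, since the request only goes to localhost.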
And with how fucked search is, thanks to the enshittification of the last few years, it has become rather difficult to find specific fixes for specific issues.
All of that information has been shoved into the LLMs, and since nobody profits from running the open-weight models, they’re not stealing your personal information and sending it off somewhere.
It’s just a leftover from some other process that they don’t want anymore and they’re making it available to everyone else to help drum up interest in AI shit.
I don’t like it.
I’m just sharing a way to still make things work in 2025.
This is how Microsoft develops Azure, except they don’t have the option of starting from scratch.
Mfs can’t fix a wifi and are asked to install and maintain a local LLM server.