

The screen tearing from UI scaling being set to an uneven interval has been fixed with the switch to Wayland. Screen tearing can still happen, but when it does it’s because something is messed up in the rendering pipeline, not because of an issue particular to Mint.


Please, I beg of you, just recommend people Mint. Cachy is great; it’s very easy and smooth as far as Arch goes.
But if you have someone who is under the illusion that Linux is hard, the moment they hit any issue it might frustrate them enough to bounce off. I know so many people who got recommended some flavor of the week like Manjaro, Bazzite, Pop!_OS, or Nobara, and that’s exactly what happened to them. I’ve never talked to anyone who was recommended Mint with Cinnamon, used it, and then decided it was too hard and went back to Windows. Plenty of people will say “well, I used XYZ and didn’t have any issues,” or their issues were minor enough and the answers easy enough that they stuck around, but that’s survivorship bias: the people who couldn’t deal with it aren’t around to say otherwise.
So just send them to Cinnamon Mint; there will be no hiccups, it will just work. Maybe later they’ll be like, “yeah, I kind of want to see what else is out there,” and then they can try other things. I get that Cinnamon Mint is limited in some ways, but not in ways a first-time Linux user is going to care about.


It’s not so much that the distro is bad; it’s that the project’s leadership, according to a lot of the community working on it, is very unresponsive, bad at administration, doesn’t make decisions that need to be made in a timely manner, and isn’t really doing its job. The community basically wants to cut them out and move on.


The first DE I used on Linux was Cinnamon, and I was like, “wow, this is great, everything makes sense to me out of the box.”
Then I tried GNOME and was incredibly put off by it, like, “why the hell is this over here? This layout is strange to me. Why are all these unconventional features on by default? This is very annoying.”
Then I tried KDE and I was like, “wow, this is great, everything makes sense to me out of the box, and also there are all these features and options. I don’t know what they do, but I don’t have to interact with them if I don’t want to.”


Honestly, I’m just surprised this is the first time someone has dared to put a phone SoC in a laptop chassis.
It seemed kind of obvious to me that a laptop experience on phone hardware (but like… with a bigger screen, keyboard, and mouse/trackpad) was sort of perfect for most use cases. I just assumed it would come in the form of a phone docked into a hollowed-out laptop. The core issue was just that the software was awful with such a setup. Apple kind of bypassed that by having their whole OS and everything on it switch over to ARM, so they’re just running a non-mobile OS on a phone SoC.
It seems like Google is edging that way by merging ChromeOS into Android. And Microsoft was maybe flailing in that direction with Windows on ARM… but… I think that was mostly just them trying to copy Apple without really thinking too hard about it.


Gee, maybe there might be some practical, social, and legal problems with always-recording camera glasses…
Would Tails go in “cool and good” or “schizo”?


I’ve not had any issues with LibreWolf slowing down or crashing, even when it’s left open for prolonged periods.
It’s a fork of Firefox, but with a lot of stuff pulled out and some ad-blocking built in. The devs have said they’re not gonna implement the upstream Firefox AI stuff.


The companies making TVs don’t want to sell simple displays; they want to expand their businesses beyond one-time sales of hardware. So they and the storefronts don’t offer the average consumer a simple display. People can still find them, but they need to actually be looking for a dumb TV and know what to look for.
Also: how to know if you have a special interest in Cold War-related history topics.


They have about a two-thirds majority of the consumer and workstation market, and that’s not insignificant, but it’s also not a significant part of their revenue at this point, nor is it why their stock makes up a terrifying percentage of the S&P 500.
If their revenue returned to just being that, they’d basically cease to be a relevant company and their stock price would crater. It’s a decent business to be in, but it’s not an infinite-growth, line-goes-up-forever business to be in.


They have a near-monopoly on GPUs for cloud genAI data centers. They don’t make the semiconductors themselves; they just hand the designs for those chips to TSMC and then sell what TSMC makes for them. The vast majority of their revenue right now comes from selling to new genAI data centers; if those stop getting built, they lose 80% of their revenue. And their current valuation is based on an assumption of an order of magnitude more such data centers being built year over year.
I think it’s very likely that demand for new chips drops to zero, because the capacity of the data centers already using their chips is overbuilt relative to realistic demand. No one other than Nvidia is making money on these data centers, and there is no path to profitability.


I don’t think most corporations would be interested in buying a computer that doesn’t include a Windows license, unless they intend to use it for, like… server stuff, but then they’d be way better off buying, like… actual server hardware, if only for the operating cost.


It depends on the type of productivity TBH. Like, sure some productivity use cases need CUDA, but a lot of productivity use cases are just using the cards as graphics cards. The places where you need CUDA are real, but not ubiquitous.
And “this is my personal computer I play games on, but also the computer I do work on, and that work needs CUDA specifically” is very much an edge case.


I’d say that, in general, the advantages of Nvidia cards are fairly niche even on Windows. Like, multi-frame generation (fake frames) and upscaling are kind of questionable in terms of value added most of the time, and most people probably aren’t going to be doing any ML stuff on their computer.
AMD in general offers better performance for the money, and that’s doubly so given Nvidia’s lackluster Linux support. AMD has put the work in to get their hardware running well on Linux, both in terms of work from their own team and in collaborating with the open source community.
I can see why some people would choose Nvidia cards, but I think, even on Windows, a lot of people who buy them probably would have been better off with AMD. And outside of some fringe edge cases, there is no good reason to choose them when building or buying a computer you intend to mainly run Linux on.


“We’re sorry… we’ll definitely stop, definitely don’t look at anything else we might have done for them.”


“Hey guys, our sales are falling through the floor faster than Tesla’s. Maybe we should rework our pricing and reconsider our policies towards buyers to move more volume and improve our image.”
“Uh… nah. How about instead we put ads on the center console!”


You can probably get a pirate hat online for a few bucks. And there are plenty of discoverability systems not based on integration with a subscription service.


You may not need a dedicated GPU; the iGPUs on AMD CPUs have gotten really good lately, like better than a mid-range dedicated GPU from even four years ago.
My laptop has a 780M integrated GPU on its 7840HS CPU, and I’ve been blown away by its ability to run modern games. Like, sure, it’s not running Cyberpunk 2077 at max settings at 120 FPS in 4K, but running it at medium settings at 60 FPS in 1080p? It’ll do that just fine.
My laptop was a bit pricey, but I did a little searching and saw that there’s something called the GPD WIN Mini; the 2023 model was listed as having the same CPU/iGPU as my laptop for $700. It’s an odd form factor, though, and I couldn’t see it in stock anywhere.
You might be able to get the kind of performance you’re looking for from the iGPU on an AMD Ryzen 7 or later CPU, and something without a dedicated GPU will probably be a lot cheaper.
If you don’t mind a non-traditional form factor, as others have mentioned, definitely check out the Steam Deck. It’s very affordable and very capable, and other than the form factor, it’s just a computer running Linux. You could even boot other distros onto it if you don’t want to run SteamOS (Valve’s Arch-derived distro).
Most drivers are just in the Linux kernel, so updating it will fix issues with them, unless a driver can’t be bundled into the kernel for license-conflict reasons, in which case it needs to be updated separately.
Mint has a GUI program for managing drivers that aren’t in the kernel. It actually has a GUI program for most things that would normally need commands in the terminal, which is why I think it’s kind of insane to recommend anything else to people who aren’t familiar with using a command-line interface.
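If you’re curious what “most drivers are just in the kernel” looks like in practice, here’s a minimal sketch (Python is just my choice here; the only assumption is a standard Linux /proc filesystem) that lists the kernel modules currently loaded, which is where most driver code actually lives. Out-of-tree drivers installed through something like Mint’s Driver Manager show up in the same list once they’re loaded.

```python
# Minimal sketch: list currently loaded kernel modules (most of which are drivers)
# by reading /proc/modules, which exists on any standard Linux system.
# Fields per line: name, size in bytes, reference count, dependencies, state, address.

def loaded_modules(path="/proc/modules"):
    modules = []
    with open(path) as f:
        for line in f:
            name, size, refcount = line.split()[:3]
            modules.append((name, int(size), int(refcount)))
    return modules

if __name__ == "__main__":
    for name, size, refcount in sorted(loaded_modules()):
        print(f"{name:<32} {size:>10} bytes  in use by {refcount}")
```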