• 1 Post
  • 73 Comments
Joined 1 year ago
Cake day: June 14th, 2023



  • My last 4 employers have used desktop Linux to some extent:

    • Ericsson (Swedish telecoms), default was to have a Windows laptop with X server (Citrix?) but a few of us were lucky enough to get a Linux laptop.
    • Vector (German automotive), Linux dev. environment in a VM on Windows laptops.
    • Opera Software (Norwegian web browser), first day I was given a stack of components and told to assemble my PC and then install my Linux distribution of choice.
    • And a smaller company, which shall remain unnamed, also used Windows laptops with Linux dev. env. in VM.

    Sure, most of it was on top of Windows, but if you fullscreen it you can barely tell the difference :)


  • My first couple of computers ran AmigaOS, and even from the start Windows felt like complete garbage in comparison, but eventually I had to buy a PC to keep up with the times. After that I kept looking for alternative OSes and tried dual-booting Linux, but I kept going back to Windows since all the programs and hardware I needed required it. When I finally went full-time Linux, some time between 2005 and 2010, it was because I felt like I was just wasting my life in front of the computer every day. With Windows it was too easy to fire up some game when I had nothing else to do, and at that time there were barely any games for Linux, so it removed that temptation. That has of course changed now, and pretty much all Windows games work equally well on Linux :)





  • I think a 650 W PSU should be enough for a workload of 490 W idle. Please, correct me, if I am wrong.

    You mean 490 W under load, right? One would hope that your computer uses less than 100 W idle, otherwise it’s going to get toasty in your room :) Whether a 650 W unit is enough depends on how much cheaper it is than the next size up, and how likely you are to upgrade your GPU. It really sucks saving up for a ridiculously expensive new GPU and then realizing you also need to fork out an additional €150 to replace your fully functional PSU. On the other hand, going from 650 W to 850 W might double the cost of the PSU, and it would be a waste of money if you never buy a high-end GPU. For PSU quality, check out https://cultists.network/140/psu-tier-list/ . If you’re buying a decent quality unit I wouldn’t worry about efficiency loss from running at a lower percentage of its rated maximum; I doubt it would be noticeable on your power bill.
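    To make the headroom point concrete, here is a quick sketch of the sizing arithmetic. The ~80% figure is a common rule of thumb, not an official spec, so treat it as my assumption:

    ```shell
    # Rule of thumb (assumption): keep estimated peak draw at or below
    # ~80 % of the PSU's rated wattage, leaving headroom for transient
    # spikes and a possible future GPU upgrade.
    peak_w=490
    awk -v p="$peak_w" 'BEGIN { printf "need >= %.1f W rated\n", p / 0.8 }'
    # -> need >= 612.5 W rated, so a 650 W unit fits with a little margin
    ```

    By the same arithmetic, a future 350 W GPU on top of the rest of the system could push you past 650 W territory, which is the upgrade trap mentioned above.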

    I’ve always had Nvidia GPUs and they’ve worked great for me, though I’ve stayed with X11 and never bothered with Wayland. If you’re conscious about power usage, many cards can be power limited + overclocked to compensate. For example I could limit my old RTX3080 to 200W (it draws up to 350W with stock settings) and with some clock speed adjustments I would only lose about 10% fps in games, which isn’t really noticeable if you’re still hitting 120+ fps. My current RTX3090 can’t go below 300W (stock is 370W) without significant performance loss though.
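    The power limiting described above can be done with `nvidia-smi`; the 200 W figure is just the example from my 3080 and the supported range varies per card:

    ```shell
    # Query the card's supported power limit range first (varies per model):
    nvidia-smi -q -d POWER

    # Set a 200 W limit (requires root; resets on reboot unless you make
    # it persistent, e.g. via a systemd unit that re-runs this at boot):
    sudo nvidia-smi -pl 200
    ```

    Clock offsets for the "overclock to compensate" part are set separately (e.g. through GreenWithEnvy or nvidia-settings on X11).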

    If you have any interest in running AI stuff, especially LLM (text generation / chat), then get as much VRAM as you possibly can. Unfortunately I discovered local LLMs just after buying the 3080, which was great for games, and realized that 12GB VRAM is not that much. CUDA (i.e. Nvidia GPUs) is still dominant in AI, but ROCm (AMD) is getting more support so you might be able to run some things at least.
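    A rough way to sanity-check whether a model fits in VRAM: weights dominate, so size scales with parameter count times bits per weight. The ~1.2 overhead factor (KV cache, activations, framework) is my own rough assumption:

    ```shell
    # Back-of-envelope VRAM estimate (assumption: weights dominate,
    # plus ~20 % overhead for KV cache / activations):
    #   GB ~= params_in_billions * bits_per_weight / 8 * 1.2
    vram_gb() { awk -v p="$1" -v b="$2" 'BEGIN { printf "%.1f\n", p * b / 8 * 1.2 }'; }
    vram_gb 13 4    # 13B model at 4-bit:   7.8 GB -> fits in 12 GB
    vram_gb 13 16   # same model at 16-bit: 31.2 GB -> needs offloading
    ```

    This is why 12 GB feels small quickly: anything beyond ~13B parameters forces aggressive quantization or partial CPU offloading.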

    Another mistake I made when speccing my PC was to buy 2×16 GB of RAM. It sounded like a lot at the time, but once again when dealing with LLMs there are models larger than 32 GB that I would like to run with partial offloading (splitting work between GPU and CPU, though usually quite slow). It turns out that DDR5 is quite unstable with four sticks, and I don’t know if my motherboard or the Ryzen CPU is to blame, but I can’t just add 2 more sticks: there are 4 slots, but populating them all would drop the speed to 3800 MHz instead of the 6200 MHz the individual sticks are rated for. I don’t know if Intel motherboards can run 4× DDR5 sticks at full speed.

    And a piece of general advice, in case this isn’t common knowledge at this point: be wary when trying to find buying advice using search engines. Most of the time they’ll only give you low-quality “reviews” written only to convince readers to click on affiliate links :( There are still a few sites which actually test the components rather than AI-generating articles. Personally I look for tier lists compiled by users (like this one for mobos), and when it comes to reviews I tend to trust those which get very technical, with component analyses, measurements and multiple benchmarks.


  • It’s not that bad. Of course I’ve had a few games that didn’t work, like CoD:MW2, but nearly all the multiplayer games my friends play also work on Linux. The last couple of years we’ve been playing Apex Legends, Overwatch, WoWs, Dota 2, Helldivers 2, Diablo 4, BF1, BFV, Hell Let Loose, Payday 3, Darktide, Isonzo, Ready or Not, and Hunt: Showdown, to name a few.










  • If you’re using btrfs then you might need to rebalance it. I had the same problem, i.e. “no free space” errors while tools like df reported that there should be available disk space, and it confused the hell out of me until I found the solution.

    See manual: https://btrfs.readthedocs.io/en/latest/Balance.html

    These are the commands I run every now and then, especially if my drive has been close to full and I’ve deleted a bunch of files to make more space:

    # Relocate only block groups that are at most 10/20/30 % full,
    # packing their data into fewer, fuller block groups:
    sudo btrfs balance start -dusage=10 /
    sudo btrfs balance start -dusage=20 /
    sudo btrfs balance start -dusage=30 /
    

    The / at the end is the path, since it’s my root mount that uses btrfs. The example in the manual goes up to 40 and 50 as well, but higher numbers take longer, even on an NVMe SSD.
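    To see whether you’re actually in this situation before balancing, you can compare allocated vs. used space; if Data is almost fully allocated but far from fully used, a balance usually helps:

    ```shell
    # Allocated vs. used space per block-group type (Data/Metadata/System):
    sudo btrfs filesystem df /

    # More detailed view, including unallocated device space:
    sudo btrfs filesystem usage /
    ```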



  • Maybe it’s changing now with Windows 10/11, but I think historically Windows has had just as difficult a learning curve as Linux. People who complain about Linux being more difficult than Windows only think so because they have already spent years learning how to deal with Windows, while switching to Linux would mean learning new things. If someone who had used macOS 100% of their life were to start using either Windows or Linux, I don’t think there would be much difference in difficulty.

    I’ve come across plenty of bugs and usability issues in Windows, and despite having 10+ years of experience with the OS I sometimes found them very difficult to solve, often requiring copy-pasting cryptic text into the command prompt and/or regedit. I also think troubleshooting on Windows is made worse by it saying witty things like “oops, something went wrong!” instead of giving you a useful error message, and many issues are of course unfixable due to its proprietary nature. At best you get an error code which you can look up online, but the OS is not made to be debugged by the user.

    In the past Microsoft had really good support you could chat with, but the last time Windows refused to authenticate after an upgrade, all the human support appeared to have been replaced by automated troubleshooters. It got stuck in an endless loop of “run local troubleshooter” -> “you should try rebooting” -> “run online troubleshooter” -> “you should try rebooting” -> back to the local troubleshooter again. At work I still have a help desk I can call, staffed by people who have taken countless hours of Microsoft training to get certified.

    “just so I wasn’t choosing between 100% and 200% scaling. That’s just beyond the average computer user.”

    So if I understood you right, Fedora lets you choose either 100% or 200% scaling, but you wanted more options than that? I.e. you wanted to overcome a limitation of the OS, rather than having to fix something which was broken? I don’t think the average computer user could do something similar in Windows. For example, when I got my work computer with Windows 11, AFAIK there was no option to only show the task bar on one monitor, so it was always visible and taking up space on all monitors. IIRC Microsoft added this feature last year, but I think it would’ve been extremely difficult for the average user to find a way to do it before that.
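    For what it’s worth, on GNOME under Wayland (Fedora’s default desktop) the intermediate scaling steps can be enabled with an experimental mutter flag. I’m assuming that’s the setup in question; other desktops have their own settings:

    ```shell
    # Enable fractional scaling steps (125 %, 150 %, 175 %) on GNOME/Wayland
    # (assumes GNOME with mutter; takes effect after logging out and back in):
    gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer']"
    # Then pick the intermediate scale under Settings -> Displays.
    ```

    Still a command-line incantation rather than a checkbox, which rather proves the point about it being beyond the average user.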

    I’d guesstimate that 99% of the Windows users I know would just accept that kind of thing: “it’s annoying, but this is how computers are.” I have friends, family members and coworkers who use Windows, and I’ve found them all to be extremely forgiving towards computer issues.