• 0 Posts
  • 78 Comments
Joined 1 year ago
Cake day: August 8th, 2023

  • I was just about to post the same thing. I’ve been using Linux for almost 10 years, and I never really understood the folder layout to this level of detail. My reasoning was always that /lib was more system-wide and /usr/lib was for stuff installed for me only. That never made sense though, since there is only one /usr and not one for every user. But I never thought about it further, I just let it be.


  • I try to steer as many people as I know to Signal, but I don’t want to be the type of person who accepts no compromise, so I also use a bunch of others. WhatsApp is the most common, as pretty much everyone here in the Netherlands uses it. I used to use Telegram, but nowadays I trust it less than WhatsApp, and all my Telegram chats have moved to Signal. SMS is only there for backup and for older people who don’t use other apps. And Discord is there for people who want their messages to never be read, because that app is a dumpster fire that constantly makes me miss messages.



  • No, people just don’t like crypto because it’s a huge waste of energy that has no use for the average person at the moment and is only used by rich people to get richer without much regulation. Don’t get me wrong, it might definitely be useful when used correctly in the future. Not wasting as much energy by ditching proof of work, becoming actually useful for normal transactions, etc. But right now it’s just an overhyped technology for obnoxious cryptobros.



  • Machine learning and compression have always been closely tied together. It’s trying to learn the “rules” that describe the data rather than memorizing all the data.

    I remember implementing a paper older than me in our “Information Theory” course at university that treated the creation of a decision tree as compression. Their algorithm considered the cost of sending both the decision tree itself and all the exceptions to it. If a node in the tree increased the overall message size, it would simply be pruned. This way they ensured that you wouldn’t draw conclusions from very little data and would only encode the big patterns in the data.

    Fundamentally it is just compression, it’s just a way better method of compression than all the models that we had before.

    EDIT: The paper I’m talking about is “Inferring decision trees using the minimum description length principle” - L. Ross Quinlan & Ronald L. Rivest
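    The pruning idea above can be sketched in a few lines. This is a toy illustration, not the paper’s actual algorithm: the `Node` class, the bit costs (`NODE_COST`, `LEAF_COST`, the per-exception cost) and all names are my own assumptions, chosen just to show the “prune if the subtree costs more bits than a leaf plus its exceptions” comparison.

    ```python
    # Toy sketch of MDL-style pruning in the spirit of Quinlan & Rivest.
    # All costs are made-up illustrative constants, not values from the paper.

    class Node:
        def __init__(self, majority, errors, children=None):
            self.majority = majority      # majority class at this node
            self.errors = errors          # samples misclassified if this node became a leaf
            self.children = children or []

    NODE_COST = 4.0   # assumed bits to encode one internal split
    LEAF_COST = 1.0   # assumed bits to encode a leaf

    def exception_cost(n_errors):
        # assume each exception takes a fixed number of bits to transmit
        return 6.0 * n_errors

    def description_length(node):
        # total bits to send this subtree plus its exceptions
        if not node.children:
            return LEAF_COST + exception_cost(node.errors)
        return NODE_COST + sum(description_length(c) for c in node.children)

    def prune(node):
        """Replace a subtree with a leaf whenever that shortens the message."""
        if not node.children:
            return node
        node.children = [prune(c) for c in node.children]
        as_leaf_cost = LEAF_COST + exception_cost(node.errors)
        if as_leaf_cost <= description_length(node):
            node.children = []   # cheaper to send a leaf plus its exceptions
        return node
    ```

    A split that actually separates the classes (few exceptions as a leaf would be many) survives, while a split whose children all agree with the parent gets collapsed, which is exactly the “don’t conclude from very little data” behaviour described above.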


  • I’m on Arch (actually a converted Antergos) and I have an NVIDIA card as well. My first attempt at Wayland a few months ago was horrible, bricking my system and requiring a bootable USB and a whole evening to get Linux working again.

    My second attempt was recently, and it went a lot better. X11 no longer seems to work, so I’m kinda stuck with Wayland, but it feels snappy as long as my second monitor is disconnected. I’ve yet to try some gaming. My main monitor is a VRR 144Hz panel with garbage-tier HDR. The HDR worked out of the box on KDE Plasma, with the same shitty quality as on Windows, so I immediately turned it off again. When my second monitor is connected I get terrible hitching: every second or so the screen just freezes for hundreds of milliseconds. Something about it (1280x1024, 75Hz, DVI) must not make Wayland happy. No settings seem to change anything, only physically disconnecting the monitor helps.



  • I bought a ThinkPad new in 2014 for my study for like 1200 euros. She’s still happily purring today. Around 2019 I made the mistake of accidentally emptying a cup of tea into the ThinkPad and then holding it upside down to get the liquid out. I think I should’ve just let it leak out of the bottom, since the laptop has holes for that, but I panicked. This broke the keyboard, but not the rest of the laptop. I got an official new keyboard for like 100 euros, which came with a tool and simple instructions, and since then everything has been working flawlessly.

    So I recommend ThinkPads, although I can’t really say anything about the compatibility of newer models.


  • I’m not so sure that the laypeople will, but I do expect a shift. Personally I’m still running Windows 10 next to Linux currently. Most of my time is still spent on Windows, because it’s generally a bit more stable and hassle-free due to the Windows monopoly. Software is written for Windows, so sadly it’s usually just a better experience.

    But so many things I read about Win 11 (and beyond) piss me off. It’s my computer, I don’t want them to decide things for me or farm my data. I’m mentally preparing for the transition to Linux-only. 90% of the software I use will work out of the box, and I think with some effort I can get like 8% of the rest to work. It’ll be a lot of effort, but Micro$oft has pushed so far that I’m really starting to consider it.

    Multiple friends and colleagues (all programmers) I’ve spoken to feel the same way. I think Linux may double its full-time desktop users in a few years if this goes on.



  • My first experience with the Sims was jumping behind a random computer at some kind of event that was running the Sims 1. Most of the family had just died because the previous person behind the PC had let the house burn down. Needless to say, I was a bit confused. I’ve played the Sims quite a bit after that, and I honestly like messing around with it.

    I don’t think I’ve ever played a game without cheating in a lot of money. I don’t like that the Sims I made have to go off to work or school, so usually I just build a big fence around the property to keep them all there. From there on it used to devolve into chaos when I was younger: building huge mazes to access basic necessities, launching fireworks indoors, etc. Nowadays I’m a bit more behaved, though.

    Imo the Sims 4 is the best nowadays. The older ones are showing their age. That being said, the Sims 4 is definitely in need of some competition. It’s inexcusably buggy sometimes, and I personally think there’s a lot more that can be done with a game like this. Hopefully the upcoming competitors can spark some fire into this genre.



  • gerryflap@feddit.nl to Android@lemdro.id · “Syncthing saved my ass” · 4 months ago

    I wouldn’t be so sure if I were you. Everyone, and I mean everyone, uses WhatsApp here. Friends, family, work, doctors, landlords, etc. Not using WhatsApp will make you miss get-togethers with friends, make it way harder to communicate with colleagues, and take away a lot of convenience when talking to your doctor or landlord.

    I have Signal groups with friends, but you’re never going to be able to fully lose WhatsApp here unless you’re prepared to be “that person” everywhere and miss a lot of convenience.






  • Yuppers. I need CUDA for my machine learning projects, both for hobby and professionally. I considered AMD and their alternative at the time, but it wasn’t supported on their consumer cards back then, and I also didn’t fully trust their commitment. It’s getting better though, so hopefully AMD can convince me for my next GPU in a few years.