• 0 Posts
  • 199 Comments
Joined 3 months ago
Cake day: June 9th, 2024

  • schizo@forum.uncomfortable.business to Linux@lemmy.ml · Corel Linux 1.1.2, 1999 · 15 hours ago

    Linux was the NFT or Blockchain or AI of 1999, so every tech company was jumping on board.

    The sales pitch, as I remember it, was that you could run your WordPerfect or CorelDraw shit on it and not need Windows to use it, and could instead join the future, which was Linux. Though, amusingly, their version of the future was running Windows binaries via Wine on Linux, which, eh, okay, but…

    Of course, pretty much nobody still used WordPerfect or CorelDraw at that point in history, so I’m not entirely sure how that was supposed to sell you on buying not-Word and not-Photoshop.


  • That’s been my take on the whole ‘use gopher/gemini!’ bandwagon. Nice idea, but the solution to the problem creates more problems that need solutions of their own, and we’ve already come up with those solutions, just on other protocols.

    And I mean, if I stab someone in the face with a screwdriver, the misuse isn’t somehow specific to the screwdriver, and it doesn’t follow that nobody should use screwdrivers.

    Same thing with all the nonsense a modern website does: HTTP is fine, it’s just being used by shitheads. You could make a privacy-respecting website that’s not tracking you or engaging in any sort of shifty bullshit, but someone at some point decided tracking was the only way to make money on the Internet, and here we are.


  • Alternately, what’d be really neat would be an easy way to do most of the webpage setup for someone, using the free hosting options that do exist.

    Like, a tool that handles deploying something to GitHub Pages or Cloudflare Pages or whoever else offers basically free web hosting, without being nerdy to the point that you need a 6,000-word document to explain the steps to get a webpage from an HTML editor to actually being hosted.
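
    Something in that spirit, as a minimal Python sketch of what such a tool would wrap (the repo URL and ./site layout are placeholders I made up; a real tool would also need auth, first-run setup, and error handling):

    ```python
    #!/usr/bin/env python3
    """One-command static-site deploy to GitHub Pages: a minimal sketch.

    Assumptions (mine): git is installed, a GitHub repo already exists,
    and the finished HTML sits in ./site. The repo URL is a placeholder.
    """
    import subprocess
    import sys

    REPO = "git@github.com:example-user/example-user.github.io.git"  # placeholder
    SITE_DIR = "site"

    def git(*args: str, check: bool = True) -> None:
        """Run a git command inside the site directory."""
        result = subprocess.run(["git", "-C", SITE_DIR, *args])
        if check and result.returncode != 0:
            sys.exit(f"git {' '.join(args)} failed")

    def deploy() -> None:
        # Everything a one-click "deploy" button would hide from the user:
        git("init")
        git("add", "-A")
        # --allow-empty so re-running without changes still succeeds
        git("commit", "-m", "deploy", "--allow-empty")
        git("branch", "-M", "main")  # GitHub Pages can serve straight from main
        git("remote", "remove", "origin", check=False)  # fine if none existed
        git("remote", "add", "origin", REPO)
        git("push", "-f", "origin", "main")

    if __name__ == "__main__":
        deploy()
    ```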

    Or, IDK, maybe going back to ye olde domain.com/~username/ web hosting could be an interesting project to take on, since handling file uploads like that should be trivial (lots and loooots of ways to do that). Just have to avoid going Straight To Jail for offering hosting to people, I suppose.
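
    To show how trivial the serving half is, here’s a toy sketch using nothing but Python’s standard library, mod_userdir style. The ./users/<name>/ layout is my own made-up convention, and uploads (scp, SFTP, a web form) are the "lots of ways" part left out:

    ```python
    #!/usr/bin/env python3
    """Toy sketch of ye olde http://host/~username/ hosting.

    Assumption (mine): each user's files live under ./users/<name>/,
    and we refuse any path that escapes that directory.
    """
    from http.server import HTTPServer, SimpleHTTPRequestHandler
    from pathlib import Path

    USER_ROOT = Path("users").resolve()

    class UserDirHandler(SimpleHTTPRequestHandler):
        def translate_path(self, path: str) -> str:
            # Map /~alice/foo.html -> ./users/alice/foo.html
            if path.startswith("/~"):
                name, _, rest = path[2:].partition("/")
                target = (USER_ROOT / name / rest).resolve()
                # resolve() plus this containment check blocks ../ escapes
                if target.is_relative_to(USER_ROOT / name):
                    return str(target)
            # Everything else maps to a path that won't exist -> 404
            return str(USER_ROOT / "nonexistent")

    if __name__ == "__main__":
        print("Serving user pages on http://localhost:8000/~<name>/")
        HTTPServer(("", 8000), UserDirHandler).serve_forever()
    ```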


  • Fair points on VR games being fairly social. I was thinking more of the in-person social experience, which still involves some portion of the people sitting around stuffing their faces into a headset and wandering off into their own worlds.

    IMO, this is something AR/MR could do a great job of making more social by adding the game to the world, rather than taking the person out of the world and into the game. Of course, that also restricts what kinds of games you can do, so it’s probably only a partial solution and/or improvement on the current state of affairs.

    I also agree that it’s way too expensive still, and probably always will be because the market is, as you mentioned, small.

    PCVR is pretty much dead, despite its proponents running around declaring it’s just fine like it’s a Monty Python skit. And the tech for truly untethered headsets is really only owned by a single (awful) company, and only because the god-CEO thinks it’s a fun thing to dump money on, which means it’s subject to sudden death if he retires, dies, is ousted, has to take time off to molt, or has enough shareholder pressure put on him.

    Even then, it’s only on its second generation (the original Quest was… beta, at best) and is expensive enough that you really have to have a reason to be interested, rather than it being something you could just add to your gaming options.

    I’d like VR to take off and the experiences to more closely resemble some of the sci-fi worlds that feature, or take place in, a virtual reality world. But honestly, I’ve thought that would be cool for like 20 years now, and we’re only very slightly closer than we were then; we just have smaller headsets and somewhat improved graphics.



  • Train to Busan, Parasite, Unlocked, Wonderland, Anatomy of a Fall and Close have been ones I’ve seen recently that I liked.

    I think some of those are available on Netflix, but as I don’t use Netflix, I can’t say which ones for certain.

    Edit: I just realized some of those titles are vague and will lead to a billion other movies lol. The first four are S. Korean, the last two are French, and they’re all from 2020 or newer, so anything not matching that isn’t the right one.


  • You’re not wrong (and those are freaking enormous dies that have to cost Apple a goddamn fortune to make at scale), but it also isn’t an apples-to-apples comparison.

    nVidia/Intel/AMD have gone for maximum performance and fuck any heat/noise/power-usage concerns. They haven’t given a shit about low-power optimizations or about investing in designs suited to low-power implementations (an M3 Max will pull ~80 W if you flog the crap out of it, so let’s use that number). IMO the wrong choice, but I’m just a computer janitor that uses the things, I don’t design them.

    Apple picked a uarch that was already low power (fun fact: ARM was so low power that the first test chips would run off the board’s standby power and would boot BEFORE they were actually turned on) and then focused on making it as fast as possible with as little power as possible: the compute cores came from the mobile side before being turned into desktop chips.

    I’m rambling, but: until nVidia and the x86 vendors prioritize power usage over raw performance (which AMD did with Zen 5, and you saw how that spiraled into a fucking PR mess), you’re going to keep getting next year’s die shrink with more transistors, using the same power, with slightly better performance. It’s entirely down to design decisions, and frankly, x86 (and to some degree nVidia) has painted itself into a corner by relying on process-node improvements (which are very rapidly going to stop happening) and modest IPC uplifts to stay ahead of everyone else.

    I’m hoping Qualcomm does a good job staying competitive with their ARM stuff, but it’s also Qualcomm and rooting for them feels like cheering on cancer.


  • Power consumption numbers like that are expected, though.

    One thing to keep in mind is how big the die is and how many transistors are in a GPU.

    As a direct-ish comparison, there are about 25 billion transistors in a 14900K, and 76 billion in a 4090.

    Big die + lots and lots of transistors = bigly power usage.

    I wouldn’t imagine the 5000-series GPUs are going to be smaller or have fewer transistors, so I’d expect them to land in the ‘die shrink lowers power usage, but more transistors raise it’ zone.
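
    Back-of-the-napkin math on that, using the transistor counts above plus commonly cited power limits (my numbers, not gospel: ~253 W PL2 for the 14900K, ~450 W board power for the 4090; real draw varies by workload):

    ```python
    # Watts per transistor, rough numbers only
    chips = {
        "14900K":   (25e9, 253),  # (transistors, watts)
        "RTX 4090": (76e9, 450),
    }

    for name, (transistors, watts) in chips.items():
        print(f"{name}: ~{watts / transistors * 1e9:.1f} nW per transistor")

    # Prints ~10.1 nW for the 14900K and ~5.9 nW for the 4090: the GPU is
    # actually *more* efficient per transistor, it just has three times as
    # many of them -- which is exactly the zone described above.
    ```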