• 0 Posts
  • 27 Comments
Joined 2 years ago
Cake day: July 10th, 2023



  • This is one of many examples of a class of problem where the technology is the easy part. There’s room to improve the tech, certainly, but the technology sufficient to solve the problem is already well understood.

    The hard part is getting people to actually make the necessary changes: to consume less, get fewer gas cars on the road, increase the amount of nuclear, hydro, solar, geothermal, and wind in the grid, and minimize coal and gas use; to reduce land use by cows and increase land use by trees and native plants.

    But maybe AI is the secret here. We have tools in their hype moment whose training data already contains several reasonable solutions to climate change. Maybe if AI “finds” the solution to climate change, people will finally listen.



  • The thing is, business is booming more than it ever has, but making the line go up forever is a fool’s errand; at some point you’ll hit a peak. And hitting that peak is punished severely in our economic system.

    If you make a hammer that’ll last 100 years, you’ll sell one to every customer you can reach who needs one, and then hammer sales plummet. Instead of being rewarded for making a great product, you’re punished when sales fall, because you’ve solved the problem for most people.

    Advertising is kind of neutral in the abstract, in my head. Make a great product for a fair price and let people know about it, and that’s probably a benefit to both parties. Make a terrible product and tell a bunch of people it’s great, and you’ve spent resources doing them a disservice. But if you can convince them it’s good enough to spend money on, and keep your revenue per customer above the cost to acquire them, it’s profitable. And that’s all they care about. It’s basically the same pattern as a scam, but profit is the only thing they’re told they’re allowed to care about.



  • nfh@lemmy.world to Technology@lemmy.world · *Permanently Deleted* · 3 points · 5 months ago

    Critically, the people who build these machines don’t typically update drivers to port them to a new OS. You buy a piece of heavy equipment, investing tens of thousands or maybe even a hundred thousand dollars, and there’s one OS it works on, maybe two if you’re lucky. The equipment hopefully works for at least 20 years, and basically no OS is going to maintain that kind of compatibility for that long. Linux might get the closest, but I’ll bet you’re compiling/patching your own kernels before those 20 years are up.

    This kind of dynamic is unavoidable when equipment vendors sell hardware with a long usable life (which is good) but don’t invest in long-term software support (which is them being cheap, to an extent), and OSes change enough that those time horizons almost certainly include compatibility-breaking releases.


  • It’s a structural challenge more than a fallacy, but I don’t entirely disagree. This sort of thing works best when one of two things is true: there’s some way for people to organize, or the ask is relatively small and there are real alternatives.

    The former clearly isn’t true here, but I think the latter is. A lot of companies are trying things with AI, and some of those attempts are working out better than others. This particular use is relatively small, and I think the downside of opting out of it is also small in the short term. (This is a giant red flag, and avoiding a red flag isn’t a large cost.)






  • Speed cameras are a privacy problem that doesn’t actually solve the problem of speeding. People are most comfortable driving the speed a road is designed for, and if that speed is too high, the solution is to modify the road for a safer speed. The speeders in your example are right here, if for the wrong reason; speed cameras should be rare if they’re allowed to exist at all. They have, at most, a short-term benefit, and broad public surveillance is a very serious issue they contribute to.


  • I was one of the people who went to college to learn things, but the more I learn, the more I’m saddened by all the people I went to school with who studied things they didn’t enjoy and didn’t particularly care to get better at, all because they saw them as a way to make money. In optimizing for money, they miss out on learning and fulfillment.

    This wasn’t that long ago, but I can only imagine how much heavy GenAI use could intensify that effect.




  • The way Java is written in practice, most of the overhead (read: slowdown from inefficiency) happens at load time rather than in the middle of execution. The amount of hardware speedup since the early 2000s has also made programmers much less worried about smaller inefficiencies.

    Languages like Python or JavaScript have a lot more overhead while they’re running, and are less well suited to running a server that needs to respond quickly, but they can certainly do the job well enough, if a bit worse than something like Java/C++/Rust. I suspect this is basically what they meant by Java being well-suited.
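    To make the load-time vs. run-time point concrete, here’s a minimal sketch (my illustration, not something from the original comment) of the kind of long-lived server Java is well suited to, using only the JDK’s bundled com.sun.net.httpserver. Class loading happens up front and JIT warm-up early in the process’s life; once the process is warm, each request runs through compiled code, which is why Java’s startup overhead matters little for a server that stays up for days. The class name TinyServer and the port are arbitrary.

    ```java
    // Hypothetical sketch: a tiny long-running HTTP server built on the JDK's
    // bundled com.sun.net.httpserver (no external dependencies). Startup pays
    // the JVM's overhead; steady-state request handling afterwards is fast.
    import com.sun.net.httpserver.HttpServer;
    import java.io.IOException;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;

    public class TinyServer {
        public static void main(String[] args) throws IOException {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            server.createContext("/", exchange -> {
                byte[] body = "ok\n".getBytes(StandardCharsets.UTF_8);
                exchange.sendResponseHeaders(200, body.length);
                try (OutputStream out = exchange.getResponseBody()) {
                    out.write(body);
                }
            });
            server.start(); // slow-ish to reach this line; cheap per request afterwards
            System.out.println("Listening on :8080");
        }
    }
    ```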



  • nfh@lemmy.world to Technology@lemmy.world · *Permanently Deleted* · 17 points · 9 months ago

    There’s an important distinction here: “is a good idea” is not “is the right way to do it”. You can also keep kids off of dating apps by banning dating apps, banning children from the Internet, or even just banning children. All of those are horrible solutions, but they achieve the goal.

    The goal should be to balance protecting kids with minimizing collateral damage. Forcing adults to hand over significant amounts of private data to prove their identity has the same basic fault as those hyperbolic examples: it disregards the collateral-damage side of the equation.


  • nfh@lemmy.world to Technology@lemmy.world · *Permanently Deleted* · +16 / −1 · 9 months ago

    It’s all about the implementation. The Washington bill is treating diet products as similar to alcohol (check ID in-store and on delivery), which seems fine to me.

    The NY law seems to be suggesting that dating app services need to collect (and possibly retain) sensitive information about people, like identification and location data. That’s troubling to me.