

I was wrong to think you were trying to debate in good faith. Hell, you didn’t even back up your own claims.
Have a good day.


To be fair, LTT has had a very long history of shitting on Linux, while giving Windows a free pass.


Happens all the time. Also, nerds tend to overestimate the amount of resources, like time or money, that someone would put into something they care about.
Right here on Lemmy I had an interaction where someone argued that if you were to lose your photos because Google had an oopsie, it's kind of your fault for not having a backup plan.


No response? Too bad, I thought you could debate in good faith, I guess I was wrong.


I didn’t say anything that was wrong. In fact, I just asked you a perfectly reasonable question about this model bundled in Chrome, and then you went haywire.
Regardless, I’m no AI researcher, and I suspect you aren’t either, so I asked you to be specific because I could tell that “art” was doing some really heavy lifting here: you seem to think that AI can “create” art that is, in any way, shape, or form, important or innovative enough to justify its energy usage and the ramifications of that usage, like the increase in wholesale electricity prices, and thus in electricity bills. See https://www.eesi.org/articles/view/data-center-power-demands-are-contributing-to-higher-energy-bills and https://www.bloomberg.com/graphics/2025-ai-data-centers-electricity-prices/
So, is McDonald’s bad for the environment? Sure it is. But that food goes to feed people. How much do you think diffusion-generated images are worth, compared to a cow? And oil companies are definitely bad for the environment as well, and it turns out 40% of the energy consumed by data centers comes from natural gas. If we agree that demand drives production, then we should also agree that data centers should minimize their use of gas. By the way, 4% of total energy production in the US goes to powering these data centers; see https://www.pewresearch.org/short-reads/2025/10/24/what-we-know-about-energy-use-at-us-data-centers-amid-the-ai-boom/. I would say that if we could reduce that number by cutting off AI clip-art generation, it would be a net win for everyone.


What fields, specifically?
I’m trying to understand why there’s so much angry energy here. Some of my colleagues work in research, and none of them come across as AI evangelists, not even close. If anything, they raise a lot of valid concerns, from copyright infringement to privacy to accuracy. In engineering there’s all of that, plus security, and above all, cost. Training and running models is extremely expensive, and it may not even yield the desired results.
So what’s your angle here?


Is AI your field of work or research? Genuinely curious.


Data center operators can and will negotiate yearly bulk electricity rates upward. That’s how they guarantee supply: by paying more than the competition. Small local distributors will never have that kind of leverage, which is why consumers end up paying more.
So yes, you are correct in saying that corporations and billionaires are the problem, but in this particular case, it’s because of a particular subset of those.


I don’t hate AI. I work for an AI company. But I hate the uselessness of a lot of AI-derived products. So, for me, burning a single drop of oil to write an email in business-speak, post a video of a kitten in a Superman outfit, or make a Trump Jesus pic is a waste.
And in many ways AI is actively making the world worse too, from big tech stealing content from everyone to train their models, to deepfake content flooding social media. There’s no good coming out of that. So maybe you should chill a bit before going off the rails like you did.


Jeez, calm your tits, I think I asked politely enough not to deserve this kind of response.
My question was what you think this particular model does, not what is achievable with AI in general. And I’m asking because a model that weighs 4GB is not some trivial thing that every Chrome user wants or needs loaded in memory.


What do you think this nano model actually achieves?
Because I know why someone would want to eat a burger or fill up a tank, but why would anyone want this running on their computer?


Given that some tech bros are measuring AI productivity in lines of code, yes, yes it does.


Correlation doesn’t imply causation.
If you lay off 10% of your workforce, mostly juniors, and your KPIs are loose enough, there’s no reason to think companies won’t succeed in pushing senior employees to meet them, mostly through overwork.
Also, certain companies track AI usage as a KPI in itself, so there’s a circular logic component to it.


Good. Solar is critical infrastructure.


Most people don’t have a PC lying around just so it can be hooked up to a TV.
More to the point, most people don’t know how to do what you just described.


This looks more like senior staff being forced to pick up more work with the same pay.


They don’t need AI for that. I was handed the exact same deal in 2008.
I’m talking about the LTT staff.
Proof of this is that in ten years they published zero videos specifically about Windows’ shortcomings, but they did, I believe, two separate Linux “challenges”, whose main draw was Windows users complaining about Linux. And while I agree that some of the complaints were legitimate, many others, like UX differences from Windows, definitely were not.