

Yeah, I meant for AI stuff specifically. Their main products are…well I wouldn’t say “good” but they successfully choked out all competition in the 90s so…


Microsoft has nothing worth using. Microsoft hasn’t made anything that’s even worth talking about. Anyone with an OpenAI key and an afternoon to kill could make something every bit as good as what Microsoft has done. They put the absolute bare minimum of effort into everything they’ve done with AI.
The only advantage they have is customer lock-in. Historically, that’s usually enough for them. I hope it’s not this time.
Eventually Microsoft will probably buy a company with people who know what the fuck they’re doing. I think that’s their only way forward because it looks like the brain drain has finally caught up with them.


UN-confirmed
In case anyone misread that, they mean the United Nations (UN) confirmed it.
Yes, it’s a genocide. Genocide is bad. It’s not that fucking complicated.


There is certainly a lot of fuckery going on right now with Nvidia drivers.
“Right now” meaning every year for the past decade or two.
It’s always something with Nvidia drivers. Performance+stability is more the exception than the rule.
That said, AMD drivers have a bad rep too. Personally I’ve had zero issues since I switched to AMD, but experiences seem to vary a lot from what I’ve read.
Before that, I don’t think I ever got through a full year without at least one weekend lost to troubleshooting Nvidia bullshit. CUDA is a pain in the ass even on Windows.


Jesus Christ what a dumb take. But at least they didn’t say that millennials are killing the cell phone industry. I guess that doesn’t make for good clickbait anymore.
Reminds me of the parable of the broken window, in which French economist Frédéric Bastiat explains the painfully obvious truth that breaking windows is generally a bad thing, even though it drums up business for the glass maker.
But if, on the other hand, you come to the conclusion, as is too often the case, that it is a good thing to break windows, that it causes money to circulate, and that the encouragement of industry in general will be the result of it, you will oblige me to call out, “Stop there! Your theory is confined to that which is seen; it takes no account of that which is not seen.”
It is not seen that as our shopkeeper has spent six francs upon one thing, he cannot spend them upon another. It is not seen that if he had not had a window to replace, he would, perhaps, have replaced his old shoes, or added another book to his library. In short, he would have employed his six francs in some way, which this accident has prevented.


I think Debian offers a very good compromise. The primary repos follow the Debian Free Software Guidelines (DFSG).
Then they have separate “non-free” repos for “non-DFSG-compliant packages that are considered important enough to make available anyway”. If you want to be a hardline free software stalwart, you can do that, and Debian supports you. If you are comfortable making a few compromises for the sake of usability, like fonts and device drivers, Debian supports you on that as well.


Also, people’s goals change and “secure” means something different.
When I was making half as much as I am now, I felt fairly secure. I could pay my rent, I had no credit card debt, and I had a few months’ worth of savings. Money was not a day-to-day worry. Most of my peers were in debt and/or living paycheck-to-paycheck so I felt like I was living large.
Now I am objectively more secure but I feel less secure because I am thinking about retirement, childcare, college funds, and elder care. I have nowhere near enough savings to retire in the foreseeable future. I honestly don’t know if I’ll ever get there.


Canadian police seem pretty level-headed here.
“There was not a ding on the bus. He did a great job,” McKenna said. “It’s comical but at the same time it’s serious. We’re thankful nobody was hurt.”
“We didn’t want to spook him,” he said. “We didn’t want to make this a tragedy.”


The problem here is education.
And I’m not just talking about “average joes” who don’t know the first thing about statistics. It is mind-boggling how many people with advanced degrees do not understand the difference between correlation and causation, and will argue until they’re blue in the face that the distinction doesn’t affect their results.
AI is not helping. Modern machine learning is basically a correlation engine with no concept of causation. The idea of using it to predict the future is dead on arrival. The idea of using it in any prescriptive role in social sciences is grotesque; it will never be more than a violation of human dignity.
Billions upon billions of dollars are being invested in putting lipstick on that pig. At this point it is more lipstick than pig.


I think it’s just that Mastodon posts in Lemmy are weird.
Toots don’t have titles so it just duplicates the content, and then you have a mess of @ and # tags that don’t make sense in Lemmy.
Cross-ecosystem federation is cool but also leaves a lot to be desired.


Debian Stable staying true to form!
Thanks for the writeup. Good stuff.


I’m a software developer first and a gamer second. Being a “gaming” distro does not detract from anything else, really. It just means that getting proper GPU acceleration is easy, and you’re likely to want that for development too. That was actually why I chose Bazzite. I was tired of wrestling with CUDA and ROCm.
It’s not “gaming” vs “developing”. That’s a false dichotomy.
The real choice is immutable vs traditional. And I’ll admit, immutable distros have a big learning curve. But it forces you to learn techniques that will make your life easier no matter where you go. The time I spent wrestling with dependencies on Debian or Ubuntu or OpenSuse just because I didn’t know about Distrobox…
Unless your needs are very narrow and unchanging, you’re likely to run into something that’s a giant pain in the ass no matter which distro you choose. I used to use Ubuntu LTS so I could install a few big things in easy mode, but it made everything else harder because it was so outdated. Switched to OpenSuse Tumbleweed and everything was modern but those few vendors don’t support it so I had to wrestle with dependencies.
The answer to this problem is Distrobox. It’s the answer on Ubuntu, it’s the answer on OpenSuse, and it’s the answer on Bazzite. I’m never going back to dependency hell because I can just run everything in the environment it is specifically designed for.
If you’re wondering “should I use distro X, Y, or Z”, the answer is simply “yes”. :D


On bazzite, your search order for apps/packages should be something like:
- ujust — this is more for general configs than specific apps, but take a look at what it offers.
- rpm-ostree — a last resort because it compromises the “atomic” principle of the system, but in a pinch it will give you access to anything you could get with dnf on a regular Fedora install.
Don’t sleep on Distrobox. I have a Debian box so I can run Signal from its official repo and install Geany with both GUI and CLI support. Once you export applications from distrobox they behave like first-class citizens within your desktop.
I strongly recommend trying Distrobox. If you instead hop distros, you’re going to find yourself in a similar situation eventually, where something is unreasonably difficult. That’s why Distrobox exists; so you can get the best of all worlds.
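For reference, a minimal sketch of that workflow (the box name, image, and package are just examples, not a prescription):

```shell
# See what ujust recipes are available on your system
ujust

# Create a Debian box, install an app inside it, then export it
# so it shows up in your desktop menus like a native app
distrobox create --name deb --image debian:stable
distrobox enter deb -- sudo apt install -y geany
distrobox enter deb -- distrobox-export --app geany

# Last resort: layer a package onto the atomic base image
rpm-ostree install some-package
```

The export step is the key trick: the exported launcher runs the app inside the container transparently, which is what makes it feel first-class on the host.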


I use Wayland now but there are still apps I run in X mode. Notably mpv and Firefox, because I cannot for the life of me configure them sensibly in Wayland, and I don’t want to write arcane KWin scripts just to get window sizing/positioning to stay the way I want them on launch. I tried; it was extremely frustrating and still not quite functional.
Perhaps there are other window managers that would make my life easier. I haven’t tried many, but in principle, there is no way for the window manager to know the correct size and location of new windows for arbitrary applications, so I doubt it. I consider this a user-hostile design choice in Wayland and I pray it will change in the future.


In practice they’re cheap. I saw the Pixel 9 on sale for under $400 before the 9a was even released.
MSRP is an absolute joke, but most people either get it for much cheaper than that, or think they’re getting it for much cheaper through obfuscated costs with carrier deals.
Also, brand reputations tend to outlive reality by a decade or more, so people still think Pixels have great software and Samsung is bloated as hell. The reality is that Samsung and Google have met in the middle.
I can’t fucking wait for a non-Pixel GrapheneOS phone. So tired of Google’s shit.


Yeah, there is no consensus on quantum gravity. There are competing theories, none of which have any viable path to test.
Here’s the abstract from a 2006 paper at https://arxiv.org/pdf/gr-qc/0601043 (PDF, unfortunately):
Freeman Dyson has questioned whether any conceivable experiment in the real universe can detect a single graviton. If not, is it meaningful to talk about gravitons as physical entities? We attempt to answer Dyson’s question and find it is possible to concoct an idealized thought experiment capable of detecting one graviton; however, when anything remotely resembling realistic physics is taken into account, detection becomes impossible, indicating that Dyson’s conjecture is very likely true. We also point out several mistakes in the literature dealing with graviton detection and production.
Edit: That said, the paper does address this. They cover a variety of QG theories and try to address the fundamental requirements any theory must meet.
As we do not have a fully consistent theory of quantum gravity, several different axiomatic systems have been proposed to model quantum gravity [Witten:1985cc; Ziaeepour:2021ubo; Faizal2024; bombelli1987spacetime; Majid:2017bul; DAriano:2016njq; Arsiwalla:2021eao]. In all these programs, it is assumed a candidate theory of quantum gravity is encoded as a computational formal system
ℱ_QG = {ℒ_QG, Σ_QG, ℛ_alg}.
It’s over my head, personally.


Other terms that made the shortlist of finalists for this year’s Word of the Year included “agnetic,”
Surely they mean “agentic”, right? Right???
I searched for “agnetic” to see if I was out of the loop and it’s kind of funny, kind of sad. I found a lot of what I guess is AI slop that took a typo and just ran with it. Like this one: https://www.linkedin.com/pulse/clash-intelligences-agnetic-ai-vs-agent-explained-robin-biwre
Agnetic AI is a newer conceptual framework that extends beyond the traditional agent-based model. The term “Agnetic” is derived from the word “magnetic,” signifying its dynamic, adaptive nature.
https://www.agnetic.ai/ also looks like slop, but it’s realllllly hard to distinguish between AI bullshit and traditional tech marketing bullshit.


The actual paper presents the findings differently. To quote:
Our results clearly indicate that the resolution limit of the eye is higher than broadly assumed in the industry
They go on to use the iPhone 15 (461ppi) as an example, saying that at 35cm (1.15 feet) it has an effective “pixels per degree” of 65, compared to “individual values as high as 120 ppd” in their human perception measurements. You’d need the equivalent of an iPhone 15 at 850ppi to hit that, which would be a tiny bit over 2160p/UHD.
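Using only the numbers quoted above, the required density is a simple ratio, since pixels per degree scales linearly with pixels per inch at a fixed viewing distance:

```shell
# 461 ppi gives 65 ppd at 35 cm; scale up to hit 120 ppd.
# Integer math is close enough here (exact value is ~851.1).
echo $(( 461 * 120 / 65 ))   # prints 851, i.e. the ~850ppi figure
```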
Honestly, that seems reasonable to me. It matches my intuition and experience that for smartphones, 8K would be overkill, and 4K is a marginal but noticeable upgrade from 1440p.
If you’re sitting the average 2.5 meters away from a 44-inch set, a simple Quad HD (QHD) display already packs more detail than your eye can possibly distinguish
Three paragraphs in and they’ve moved the goalposts from HD (1080p) to 1440p. :/ Anyway, I agree that 2.5 meters is generally too far from a 44" 4K TV. At that distance you should think about stepping up a size or two. Especially if you’re a gamer. You don’t want to deal with tiny UI text.
It’s also worth noting that for film, contrast is typically not that high, so the difference between resolutions will be less noticeable — if you are comparing videos with similar bitrates. If we’re talking about Netflix or YouTube or whatever, they compress the hell out of their streams, so you will definitely notice the difference if only by virtue of the different bitrates. You’d be much harder-pressed to spot the difference between a 1080p Bluray and a 4K Bluray, because 1080p Blurays already use a sufficiently high bitrate.


Does it do that even if you set it to “use device MAC” for the wi-fi network you’re on?
The exact location might depend on brand/OS, but in stock Android it’s in Settings > Network & Internet > Internet > gear icon next to active wi-fi network > Privacy.


In all seriousness, this is very interesting, if only because the methods are easy to control and reproduce.
That said, I’d really like to see comparisons against a more typical warmup routine. I’m not sure the tendon vibration is doing anything more than simulating a warmup. Even that on its own is interesting, just because it opens the door for more targeted experimentation.