The 7950X3D or 9800X3D are both faster (besides the 7800X3D you mentioned).
GPU-wise this is obviously the best AMD has to offer, but an RTX 4090 is faster still. With the typical caveats for NVIDIA on Linux.
I have several components in my network that are at least 6 years old. Is that a problem…?
full mirrors of YT
Yeah…not going to happen.
Sounds cool, I just fail to understand how this takes Cinnamon “out to the real world”.
What’s Ullr?
Include adding kernel-level anti-cheat to that. That should just give us the option of a full refund.
Sounds about right. There are some valid and good use cases for “AI”, but the majority is just buzzword marketing.
The main thing (by far) degrading a battery is charging cycles. After 7 years with, say, 1,500 cycles, most batteries will have degraded far beyond “80%” (which is always just an estimate from the electronics anyway). Yes, you can help a bit by limiting charging rate and heat and restricting the min/max %, but it’s not going to be a night-and-day difference. After 7 years of daily use, you’re going to want to swap the battery, if not for capacity reduction then for safety reasons.
I have a simple function in my .zshrc file that updates flatpaks and runs dnf or zypper depending on what the system uses. This file is synced between machines as part of my dotfiles sync, so I don’t have to install anything separate. The interface of most package managers is stable, so I haven’t had to touch the function.
This way I don’t have to deal with a package that’s on a different version in different software repositories (depending on distribution) or manually install and update it.
But that’s just me, I tend to keep it as simple as possible for maximum portability. I also avoid having too many abstraction layers.
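A minimal sketch of such a function (the name `sysup` and the exact flags are my assumptions, not the original): it detects which native package manager is available, runs its update, then updates Flatpaks if Flatpak is installed.

```shell
# Hypothetical sketch for a .zshrc; adjust flags to taste.
sysup() {
  # Detect and run the native package manager's update.
  if command -v dnf >/dev/null 2>&1; then
    sudo dnf upgrade --refresh
  elif command -v zypper >/dev/null 2>&1; then
    sudo zypper refresh && sudo zypper update
  else
    echo "sysup: no supported package manager found" >&2
    return 1
  fi
  # Update Flatpaks too, if Flatpak exists on this machine.
  if command -v flatpak >/dev/null 2>&1; then
    flatpak update
  fi
}
```

Since it only relies on `command -v`, the same function works unchanged on both Fedora- and openSUSE-based machines.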
That’s mostly down to Teams though (being the bloated web app that it is), and not the underlying operating system.
Technically, wired charging degrades the battery less than wireless charging, mainly because of the excessive heat generated by the latter. The same way slower wired charging generates less heat. Lower and upper charging limits also help (the tighter the better).
But I personally don’t bother with it. In my experience, battery degradation and longevity mostly come down to the “battery lottery”, comparable to the “silicon lottery” where some CPUs overclock/undervolt better than others. I’ve had phone batteries charged mostly with a slow wired charger degrade earlier and more than others charged almost exclusively wirelessly. No battery is an exact copy of another. Heck, I once had a 2-month-old battery die on me after just ~20 cycles. It happens.
Sure, on average you might get a bit more life out of your batteries, but in my opinion it’s not worth it.
The way I see it with charging limits is that sure, your battery might degrade 5% more over the span of 2 years when always charging it to 100% (all numbers here are just wild estimates and, again, depend on your individual battery). But when you limit charging to 80% for example, you get 20% less capacity from the get go. Unless of course you know exactly on what days you need 100% charge and plan your charging ahead of time that way.
Something I personally could never be bothered with. I want to use my device without having to think about it. If that means having to swap out the battery one year earlier, then so be it.
When talking about the kernel, Windows actually skipped 3 major versions, iirc. Windows 8 was Windows (NT) 6.2, and Windows 10 jumped from that version number straight to, well, 10.
Why when a simple alias will do?
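For example (the package manager and flags here are purely illustrative):

```shell
# Illustrative alias combining the native update with Flatpak updates.
alias up='sudo dnf upgrade --refresh && flatpak update'
```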
I’ve also experienced fewer “hiccups” since switching to Linux with KDE, but I’d like to know on what combination of hardware and Windows you experienced anywhere close to an average of 1s response time to “any input”.
The article links an article from March '24 talking about the introduction of these devices that contains this part:
The scanner that Adams and police officials introduced during Thursday’s news conference in a lower Manhattan station came from Evolv, a publicly traded company that has been accused of doctoring the results of software testing to make its scanners appear more effective than they are.
So they could never be trusted but were still allowed to proceed.
iFixit rates it “Difficult” for the Steam Deck OLED and says the time required is 2-3 hours:
https://www.ifixit.com/Guide/Steam+Deck+OLED+Battery+Replacement/168676
This is a slight improvement from the original Deck’s estimated 2-4 hours:
https://www.ifixit.com/Guide/Steam+Deck+Battery+Replacement/149070
It requires removing quite a few parts but the most annoying part is getting rid of the adhesive. It doesn’t have easy-to-access pull tabs or whatever.
They can certainly improve this. Either add pull tabs to the adhesive strips, or better yet use the mechanism from the iPhone 16, where you apply a voltage to the adhesive to make it release/no longer stick. Or, even better, make it a screw-in battery without any glue whatsoever. Then update the routing of several cables so they aren’t in the way of removing the battery.
Understandable.
What I will say though is that I personally wouldn’t mind regular spec bumps at all. The Deck isn’t exactly a cheap device and to get the “latest and greatest” for your “investment” at any given point of purchase would help longevity.
But as I said, in this case it makes a lot of sense (for Valve). SteamOS is still under heavy development, even more basic stuff such as the update mechanism and also power management is something they’re still working to improve.
They also use a custom APU designed in collaboration with AMD, and these designs cost a lot of money. It’s not just a rebranded 7840U like the Z1 Extreme for example. This custom design makes a lot of sense in terms of focusing on gaming performance and efficiency, and it clearly shows in (very) power limited scenarios.
Either way, I wouldn’t be surprised if we see a new Steam Deck based on Zen 5 and RDNA 4 with another custom-designed APU sometime in 2025 or early 2026. Zen 2 is really starting to show its age, and Zen 5 is a solid leap even over Zen 4 (not talking about desktop CPUs here, but Ryzen AI 300). RDNA 4 will likely improve quite a bit over RDNA 3(.5) (with the current Deck having RDNA 2) and include some type of hardware-accelerated machine learning upscaling with FSR4, which could make a lot of sense on the Deck as long as enough games support it.
I’d also like to see a few other improvements. The OLED display is great in many aspects, but VRR would be a great feature to have. Internally I’d like to see an easier way to swap the battery, maybe using similar tech to what Apple does with the iPhone 16’s battery. Currently, swapping the battery is one of the most complex repairs on the Deck, but it’ll also be the most common a few years down the line when all these batteries really start to show their age.
I think we’ll get at least one more x86 Steam Deck generation before it moves to ARM (if it moves to ARM at all).
The Snapdragon X isn’t anything to write home about when it comes to efficiency under load, with the newest CPUs (with iGPUs) from AMD and Intel keeping up or maybe even exceeding it.
Happy cake day!