According to the Steam HW survey around 6% of users are still using Pascal (10xx) GPUs. That’s about 8.4 million GPUs losing proprietary driver support. What a waste.
| GPU | % |
|---|---|
| 1060 | 1.86 |
| 1050 Ti | 1.43 |
| 1070 | 0.78 |
| 1050 | 0.67 |
| 1080 | 0.5 |
| 1080 Ti | 0.38 |
| 1070 Ti | 0.24 |

Edit: fixed, the 1050 was originally noted as 1050 Ti.
Interesting, I'm about to move one more machine to Linux (the one that's been off for a while) and it's got exactly a 10xx GPU inside lol.
Doubly evil given that GPU prices are still ridiculous.
8.4 million GPUs losing proprietary driver support.
Are they all on Linux though?
Are they supported longer on the windows driver?
Apparently? The title only mentions dropping support on Linux. 🤷♂️
You don't have to update your drivers though, isn't this normal with older hardware?
You don't have to update your drivers though.
Not sure if you're on Windows or Linux but, on Linux, we have to take explicit action not to upgrade something when we're upgrading the rest of the system. Preventing a specific package from upgrading takes real effort (see the sketch below), especially when the change arrives in a sneaky way like this that's hard to judge from the version number alone.
On Windows you’d be in a situation like “oh, I forgot to update the drivers for three years, well that was lucky.”
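On Arch-based systems, for example, holding a package back means editing /etc/pacman.conf. A minimal sketch, assuming the stock package names (yours may differ, e.g. nvidia-dkms):

```ini
# /etc/pacman.conf, in the [options] section:
# pacman will skip (and warn about) these packages during -Syu
# until the line is removed again
IgnorePkg = nvidia nvidia-utils nvidia-settings
```

And then you have to remember you did it, because other updates (a new kernel, say) can still break a held-back driver.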
I believe the same SW version is packaged. Nvidia said they’d drop support in the 580 release, but they shifted it to 590 now.
The Arch issues are another layer of headache: the maintainers changed the package names, and people broke their systems on update when an incompatible version got pulled in, replacing the one that still had Pascal support.
Not really a problem of Arch, but of the driver release model, then, IMO. You'd have this issue on Windows too if you just upgraded blindly, right? It's Nvidia's fault for not versioning or naming their drivers in a way that indicates which architectures they support, instead of just a willy-nilly incrementing number.
It's 2025, can we not display a warning message in pacman? Or let it switch from nvidia-590 to nvidia-legacy?
I'm not an Arch user, I admit; I don't like footguns.
Those are the GPUs they were selling — and a whole lot of people were buying — until about five years ago. Not something you’d expect to suddenly be unsupported. I guess Nvidia must be going broke or something, they can’t even afford to maintain their driver software any more.
I don’t get what needs support, exactly. Maybe I’m not yet fully awake, which tends to make me stupid. But the graphics card doesn’t change. The driver translates OS commands to GPU commands, so if the target is not moving, changes can only be forced by changes to the OS, which puts the responsibility on the Kernel devs. What am I missing?
The driver needs to interface with the OS kernel which does change, so the driver needs updates. The old Nvidia driver is not open source or free software, so nobody other than Nvidia themselves can practically or legally do it. Nvidia could of course change that if they don’t want to do even the bare minimum of maintenance.
The driver needs to interface with the OS kernel which does change, so the driver needs updates.
That’s a false implication. The OS just needs to keep the interface to the kernel stable, just like it has to with every other piece of hardware or software. You don’t just double the current you send over USB and expect cable manufacturers to adapt. As the consumer of the API (which the driver is from the kernel’s point of view) you deal with what you get and don’t make demands to the API provider.
Device drivers are not like other software in at least one important way: They have access to and depend on kernel internals which are not visible to applications, and they need to be rebuilt when those change. Something as huge and complicated as a GPU driver depends on quite a lot of them. The kernel does not provide a stable binary interface for drivers so they will frequently need to be recompiled to work with new versions of linux, and then less frequently the source code also needs modification as things are changed, added to, and improved.
This is not unique to Linux, it’s pretty normal. But it is a deliberate choice that its developers made, and people generally seem to think it was a good one.
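Just to make the "rebuilt against kernel internals" part concrete, here's a minimal out-of-tree module sketch (nothing to do with Nvidia's actual code, purely illustrative). Even this toy is built with kbuild against the headers of one specific kernel version, and it's that dependency, multiplied across millions of lines of driver code, that the vendor has to keep maintaining:

```c
// Trivial out-of-tree kernel module (illustrative only).
// Built against /lib/modules/$(uname -r)/build for ONE kernel version;
// when internal kernel interfaces change, real drivers must be rebuilt
// and sometimes rewritten, which is the maintenance burden discussed above.
#include <linux/module.h>
#include <linux/init.h>

static int __init demo_init(void)
{
	pr_info("demo: module loaded\n");
	return 0;
}

static void __exit demo_exit(void)
{
	pr_info("demo: module unloaded\n");
}

module_init(demo_init);
module_exit(demo_exit);
MODULE_LICENSE("GPL");
```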
They have access to and depend on kernel internals
That sounds like a stupid idea to me. But what do I know? I live in the ivory tower of application development where APIs are well-defined and stable.
Thanks for explaining.
You're re-opening the microkernel vs monolithic kernel debate with that. For fun you can read how Andrew S. Tanenbaum and Linus Torvalds debated the question in 1992 here: https://groups.google.com/g/comp.os.minix/c/wlhw16QWltI
I don’t generally disagree, but
You don’t just double the current you send over USB and expect cable manufacturers to adapt
That’s pretty much how we got to the point where USB is the universal charging standard: by progressively pushing the allowed current from the initially standardized 100 mA all the way to 5 A of today. A few of those pushes were just manufacturers winging it and pushing/pulling significantly more current than what was standardized, assuming the other side will adapt.
The default standard power limit is still the same as it ever was on each USB version. Negotiation has to happen to tell the device how much power is allowed, and if you go over, I think overcurrent protection is part of the USB spec for safety reasons. There are a bunch of different protocols, but USB always starts at 5 V, and 0.1 A for USB 2.0, and devices need to negotiate for more (0.15 A, I think, for USB 3.0, which has more conductors).
As an example, USB 2.0 can signal a charging port (5V / 1.5A max) by putting a 200 ohm resistor across the data pins.
The default standard power limit is still the same as it ever was on each USB version
Nah, the default power limit started with 100 mA or 500 mA for “high power devices”. There are very few devices out there today that limit the current to that amount.
It all began with non-spec host ports that just pushed however much current the circuitry could muster, rather than just the spec's 500 mA. Some had a proprietary way to signal just how much they were willing to push (this is why iPhones used to be very fussy about which charger you plugged them into), but most cheap ones didn't. Then all the device manufacturers started pulling as much current as the host would provide, rather than limiting themselves to 500 mA. USB-BC was mostly an attempt to standardize some of the existing usage, and USB-PD came much later.
A USB host providing more current than the device supports isn’t an issue though. A USB device simply won’t draw more than it needs. There’s no danger of dumping 5A into your 20 year old mouse because it defaults to a low power 100mA device. Even if the port can supply 10A / 5V or something silly, the current is limited by the voltage and load (the mouse).
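To put rough numbers on that, it's just Ohm's law. Assuming, purely for illustration, that the mouse presents an effective load of about 50 Ω at 5 V:

$$I = \frac{V}{R} = \frac{5\,\mathrm{V}}{50\,\Omega} = 0.1\,\mathrm{A}$$

The port's 5 A or 10 A rating is only a ceiling; the device never pulls more than its own load draws.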
People love to say Linux is great for old hardware. But not 10 series Nvidia cards apparently?
Using 10-year-old hardware with 10-year-old drivers on a 10-year-old OS requires no further work.
The hardware doesn't change, but the OS does.
Well, it still worked until this update, so a few-weeks-old OS and driver were also fine. It's Arch, so expect it to break. It will probably be fixable; we are Linux users.
Pascal is coming up on 10 years old. You can’t expect companies to support things forever.
They started 9 years ago, but they remained popular into 2020, and according to Wikipedia the last new Pascal model was released in 2022. The 1080 and the 1060 are both still pretty high up on the Steam list of the most common GPUs.
What model came out in 2022? The newest I could find was the GT 1010 from 2021 (which is more of a video adapter than an actual graphics card) but that’s the exception. The bulk of them came out in 2016 and 2017 https://www.techpowerup.com/gpu-specs/?f=architecture_Pascal
Hate to break it to ya, but 2020 was 5 years ago. More than half of these GPUs' lifespan ago. Nvidia is a for-profit company, not your friend. You can't expect them to support every single product they've ever released forever. And they're still doing better than AMD in that regard.
You can’t expect them to support every single product they’ve ever released forever. And they’re still doing better than AMD in that regard.
If Nvidia had open-sourced the pre-GSP cards' drivers, at least there would be a chance of maintaining support. But Nvidia pulled the plug.
Intel’s and AMD’s drivers in the Mesa project will continue to receive support.
For example, just this week on Phoronix: "Linux 6.19's Significant ~30% Performance Boost For Old AMD Radeon GPUs". These are GCN1 GPUs from 13 years ago.
thanks to work by Valve
AMD did nothing to make their drivers better, Valve did.
Making them open to contributions was the first step, but ok I won’t engage in this petty tribalism.
The topic was about Nvidia's closed-source drivers.
Valve couldn't do the same for Pascal GPUs. Nobody but Nvidia has the reclocking firmware, so even the reverse-engineered Nouveau/NVK drivers are stuck at boot clock speeds.
AMD did nothing to make their drivers better, Valve did.
That's literally the point of open source though: both AMD and Intel rely on open-source drivers, so anybody can fix a flaw they encounter without having to wait for the company to "consider allocating resources towards a fix for legacy hardware".
If they’re going to release things under a proprietary license and send lawyers after individuals just trying to get their hardware to work, then yes, yes I can.
Don’t want to support it anymore? Fine. Open source it and let the community take over.
That’s why I don’t like closed source proprietary. They decide to stop the support.
Make them open source the drivers so the community can do it then.
fuck. I just realised I have a pascal NVIDIA card on my laptop.
I’m running debian 13, wtf do I do?
EDIT : seems ok?
It was expected that Linux driver support would end with GPU driver branch 580 as well, but NVIDIA extended this to branch 590 (it jumped straight from branch 580 to 590, and a single v580 Linux GPU driver exists). So, if you own any of these GPUs, you won't be getting Game Ready drivers that offer day-one game support and optimizations for upcoming titles. However, there should be no issue using them for as long as you wish. Still, users should keep an eye on the quarterly updates, as these are essential.
580 is LTS tho, you’ll get security patches for a few years at least
edit: where are my manners? here’s the source
https://docs.nvidia.com/datacenter/tesla/drivers/supported-drivers-and-cuda-toolkit-versions.html
Nothing. You run Debian. It’ll keep working at least until Debian 14. Possibly even after.
😎🤓
Take that, bleeding edge. 🤪
I’ve had so many problems with Nvidia GPUs on Linux over the years that I now refuse to buy anything Nvidia. AMD cards work flawlessly and get very long-term support.
I had an old Nvidia GTX 970 in my previous machine when I switched to Linux, and it was the source of 95% of my problems.
It died earlier this year, so I finally upgraded to a new machine and put an Intel Arc B580 in it as a stopgap, in hopes that video card prices would regain some sanity in a year or two. No problems whatsoever with it since then.
Now that AI is about to ruin the GPU market again I decided to bite the bullet and get myself an AMD RX 9070 XT before the prices go through the roof. I ain’t touching NVidia’s cards with a 10 foot pole. I might be able to sell my B580 for the same price I originally bought it for in a few months.
I’m with you, I know we’ve had a lot of recent Linux converts, but I don’t get why so many who’ve used Linux for years still buy Nvidia.
Like yeah, there’s going to be some cool stuff, but it’s going to be clunky and temporary.
When people switch to Linux they don’t do a lot of research beforehand. I, for one, didn’t know that Nvidia doesn’t work well with it until I had been using it for years.
Similar for me. For all the talk about what software Linux couldn't handle, I didn't learn that Linux is incompatible with Nvidia until AFTER I upgraded my GPU. I don't want to buy another GPU after less than a year, but Windows makes me want to do a sudoku in protest… but also my work and design software won't run properly on Linux, and all anybody can talk about is browsers and games.
I’m damned whether I switch or not.
Linux hates Nvidia
got that backwards
Linus openly hated Nvidia, but I suspect Nvidia started it
If you only suspect then you never heard the entire quote and only know the memes.
My point is they don't work together. I can believe Nvidia "started" it, but that doesn't matter or help me solve my problem. I've decided I want to try Linux, but I can't afford another card, so I'm doing what I can.
You somehow still learned it wrong, and I don't understand how any of that happened. Nvidia not working well with Linux is so widely known and talked about that I knew about it, and the actual reason (which is the reverse of what you think), for several years before switching. I feel like you must never have tried to look anything up, or spent any time somewhere like Lemmy or any forum with a Linux focus, and basically must have decided to keep yourself in some bubble of ignorance with no way to learn anything.
This is an uncharitable interpretation of what I said.
Nvidia doesn't tell me it doesn't work. Linux users do. When I first used Linux for coding all those years ago, my GPU wasn't relevant, nobody mentioned it during my coding bootcamp or computer science certification several years ago, and Ubuntu and Kubuntu both booted fine.
When I upgraded my GPU, I got Nvidia. It was available and I knew what to expect. Simple as.
Then, as W10 (and W11) got increasingly intolerable, I came to Linux communities to learn about using Linux as a Windows replacement, looking into distros like Mint and Garuda, and behold: I came across users saying Linux has compatibility issues with Nvidia. Perhaps because it is "so well known", most don't think to mention it; I learned about it from a random comment on a meme about gaming.
I also looked into tutorials on getting Affinity design software to work on various distros, and the best I could find was shit like "I finally got it to run, so long as I don't [do these three basic functions]."
I don't care who started it; I can already believe it's the for-profit company sucking up to genAI. But right now that doesn't help me. I care that it's true and that's the card I have, and I'm still searching for a distro that will let me switch and meet my work needs, not just browsing or games.
I'm here now, aware that they don't work well together, still looking for the best solution I can afford, because I did look up Linux.
Nvidia’s poor Linux support has been a thing for decades.
If anything, the situation has only recently improved. And that only after high-profile Linux developers told Nvidia to get their shit together.
It's a good way for people to learn which companies are fully hostile to the Linux ecosystem.
To be fair, Nvidia supports their newer GPUs well enough, so you may not have any problems for a while. But once they decide to end support for a product line, it’s basically a death sentence for that hardware. That’s what happened to me recently with the 470 driver. Older GPU worked fine until a kernel update broke the driver. There’s nobody fixing it anymore, and they won’t open-source even obsolete drivers.
I JUST ran into this issue myself. I'm running Proxmox on an old laptop and wanted to use its 750M… which is one of those legacy cards now, which I guess means I'd need to downgrade the kernel to use it?
I’m not knowledgeable enough to know the risks or work I’d be looking at to get it working so for now, it’s on hiatus.
You might be able to use the Nouveau driver with the 750M. Performance won’t be great, but might be sufficient if it’s just for server admin.
Even now, CUDA is the gold standard for data science / ML / AI research and development. AMD is slowly bringing around their ROCm platform, and Vulkan is gaining steam in that area. I'd love to ditch my Nvidia cards and go exclusively AMD, but Nvidia supporting CUDA on consumer cards was a seriously smart move that AMD needs to catch up with.
Sorry for prying for details, but why exactly do you need NVIDIA?
CUDA is an Nvidia technology and they’ve gone out of their way to make it difficult for a competitor to come up with a compatible implementation. With cross-vendor alternatives like OpenCL and compute shaders, they’ve not put resources into achieving performance parity, so if you write something in both CUDA and OpenCL, and run them both on an Nvidia card, the CUDA-based implementation will go way faster. Most projects prioritise the need to go fast above the need to work on hardware from more than one vendor. Fifteen years ago, an OpenCL-based compute application would run faster on an AMD card than a CUDA-based one would run on an Nvidia card, even if the Nvidia card was a chunk faster in gaming, so it’s not that CUDA’s inherently loads faster. That didn’t give AMD a huge advantage in market share as not very much was going on that cared significantly about GPU compute.
Also, Nvidia have put a lot of resources over the last fifteen years into adding CUDA support to other people’s projects, so when things did start springing up that needed GPU compute, a lot of them already worked on Nvidia cards.
People buy Nvidia for different reasons, but not everyone faces any issues with it in Linux, and so they see no reason to change what they’re already familiar with.
I just replaced my old 1060 with a Radeon RX 6600 myself.
Same. Refuse to use NVIDIA going forward for anything.
Yeah, I stopped using Nvidia like 20 years ago. I think my last Nvidia card may have been a GeForce MX, then I switched to a Matrox card for a time before landing on ATI/AMD.
Back then AMD was only just starting their open source driver efforts so the “good” driver was still proprietary, but I stuck with them to support their efforts with my wallet. I’m glad I did because it’s been well over a decade since I had any GPU issues, and I no longer stress about whether the hardware I buy is going to work or not (so long as the Kernel is up to date).
That's fine and dandy until you need to do ML; then there is no other option.
I successfully ran local Llama with llama.cpp and an old AMD GPU. I’m not sure why you think there’s no other option.
AMD had approximately one consumer GPU with ROCm support, so unless your framework supports OpenCL, or you want to fuck around with unsupported ROCm drivers, you're out of luck. They've completely failed to meet the market.
I mean… my 6700 XT doesn't have official ROCm support, but the ROCm driver works perfectly fine on it. The difference is AMD hasn't put the effort into testing ROCm on their consumer cards, so they can't claim support for it.
The fact that it's still off-label like that is kinda nuts to me. ML and AI have been huge money makers for a decade and a half, and AMD seemingly doesn't care about GPUs. I wish they would invest more in testing and packaging drivers for the hardware that's out there. In the year of our lord 2025 I shouldn't have to compile from source or use AUR packages for drivers 😭
Llama.cpp now supports Vulkan, so it doesn’t matter what card you’re using.
Devs need actual support in tensor libraries and other frameworks.
ML?
Machine learning
Is that a huge deal for the average user?
Average user no, power user/enthusiast/techie absolutely critical
Sadly GPU passthrough only worked on Nvidia cards when I was setting up my server, so I had to get one of them :(
He tried to warn y’all…

Who's this, and what talk/event was it, do you remember?
Here is old man me trying to figure out what PASCAL code there is in the Linux codebase, and how NVIDIA gets to drop it.
Anything that starts with a 10 according to the very first sentence in the article.
As a proud owner of a 10xx I can tell you I bought it in 2016 and I think it was about a year old at that point.
Same- Pascal was the first coding language I learned in high school. I was confused here.
Man that was my first thought and I didn’t even use it 😂
Nvidia was awful before the LLM craze, now they’re awful AND evil.
I wasted days of my life getting nVidia to work on Linux. Too much stress. Screw that. Better ways to spend time. If I can’t game, that’s OK too.
I switched from a 3080 to a 7900 XT. It's one of the better decisions I've made, even though on paper the performance isn't too far apart.
I’m told AMD works better with Linux, but I haven’t tried it myself.
AMD is, and has been, much more friendly towards Linux than Nvidia. I run mine in Proxmox, passing through to Linux and Windows gaming VMs. AMD has invested in open-source drivers.
https://thetechylife.com/does-amd-support-linux/
https://arstechnica.com/information-technology/2012/06/linus-torvalds-says-f-k-you-to-nvidia/

AMD is plug and play on Linux. With my 7800XT there isn’t a driver to install. Only issue is that AMD doesn’t make anything that competes with the 5080/5090.
Only “issue” is that AMD doesn’t make anything that competes with the 5080/5090.
And do you really need the performance of a 5080? Certainly not that of a 5090.
My 9070 XT runs everything I need at perfectly acceptable rates on maximum settings. AAA games among them.
That’s such a bad way to look at it. I would’ve bought a 5090 if I could afford it because I want to hold onto the 5090 for almost a decade like I did with my 1080. Depending on prices, it doesn’t make sense to upgrade twice in 10 years because you bought a budget option, and then be stuck trying to sell a budget card. 5090s will hold their value for years to come. Good luck playing AAA titles maxed out in 5 years on a 7800XT.
Generally, you’ll get better results by spending half as much on GPUs twice as often. Games generally aren’t made expecting all their players to have a current-gen top-of-the-line card, so you don’t benefit much from having a top-of-the-line card at first, and then a couple of generations later, usually there’s a card that outperforms the previous top-of-the-line card that costs half as much as it did, so you end up with a better card in the long run.
My 7800XT can’t play Hogwarts Legacy without stuttering (on Linux). I’m really regretting not getting a 5080 at this point.
Yeah, I am looking at spending less than I did before though. But when will an under £200 card give like double the performance of a 2070? I don’t want to spend that much for +20%. Unless my current card dies there is little reason to upgrade.
Good luck playing AAA titles maxed out in 5 years on a 5080 too… 5090 isn’t even considered a consumer card anyway, it’s more like an enthusiast, collector’s item. It’s so expensive compared to its performance value.
You have to look at performance-to-price ratio. That’s the only metric that matters, and should determine how much you can sell it for when upgrading, and how often you upgrade.
I don't want to play AAA games now, why would I want to after 5 more years of enshittification?
Open source drivers are a major plus, I’ve had a much easier time than my partner on NVIDIA. I mean I make both machines work but the NVIDIA has been a real pig at times.
Fuck, what do I do when they inevitably discontinue support for 20xx? Just cry and accept that I no longer have a computer, as every component costs as much as a house? D:
this bubble will pop sooner or later, 2nd hand market about to get flooded
Start watching the second hand market. Most of my PC components are bought second hand, and at much cheaper than buying any of those components new.
None of these components are of course bleeding edge, but still sufficient to play any game I want.
I bought an AMD Radeon RX 5700 XT this summer for 1000 DKK (~€133 or ~$157).
Keep using it, you don’t need them to support it to keep using it. All old driver versions still exist.
“Brodie” mentioned. To be fair on the Arch side, they are clear the system could break with an update and you should always read the Arch news in case of manual intervention. You can’t fault Archlinux for users not following the instructions. This is pretty much what Arch stands for.
And IMO, if anything, this is Nvidia's doing; Arch is just being Arch. Like, it sucks, but I also don't see a problem with Arch in this instance.
Brodie
Thinking Forth was a great book! I’m surprised it came up here though.
My son was going to switch to Linux this week. He has a GTX 1060.
Nouveau might be good enough by now for most games that will run on a 1060, maybe worth a try.
AFAIK they still don’t support reclocking on anything older than Turing, meaning the GPU is stuck at the lowest clock frequency and therefore runs very slowly.
Am I the only one that can’t manage to make their Nvidia GTX 1060 run correctly on Linux? It has way worse performance than on Windows, even with the proprietary drivers.
I’ve tried both Kubuntu and Linux Mint.
I've got my 1060 running OK on Kubuntu, though it was my wife's when it was running on Windows, so I can't compare the performance. But I'm able to stream Cyberpunk in 1080p via Sunshine to Moonlight on my Apple TV, and it runs just fine.
He just needs to stay on the 580 driver. Bazzite is handling that transparently and won't update you to the 590 driver if you have an unsupported GPU.
Then next time round, buy an AMD or Intel GPU. They tend to treat their customers better.
Nice, maybe bazzite it is then.
I guess he can’t say he uses arch btw
He wants to use Mint. This is what is called planned obsolescence. I say what Linus Torvalds says.
Might be able to use Mint Debian Edition.
I have a 1050 Ti running the 580 driver under Linux Mint; it works fine.
It’s not that bad as I understand it. If you are using arch with a Pascal GPU you can switch to the legacy proprietary branch. https://archlinux.org/news/nvidia-590-driver-drops-pascal-support-main-packages-switch-to-open-kernel-modules/
deleted by creator
I can’t believe they would do this to poor Borland. I guess I’ll just need to use an AMD GPU for my Turbo Pascal fun.
Sounds like it’s time to switch out the 1080ti for a 9070xt. Been almost 10 years, probably due for an upgrade.
I will miss having that CUDA compatibility on hand for matlab tinkering. I wonder if any translation layers are working yet?
https://github.com/vosen/ZLUDA is doing pretty well, I've heard.
I really dodged a bullet upgrading from my 1070 Ti to the latest AMD card (9070 XT, or am I misremembering?) on Black Friday. Lowest price of the generation, just before RAM prices skyrocketed.
My SO is not so lucky…
Maybe we should use this card in an under-the-TV computer with Windows… sadly?
The last time I updated my driver, BG3 didn't start anymore. So I really could not care less about driver updates for my 8-year-old card.
But still, fuck nvidia.