To play devil's advocate, it can be really hard to maintain an ever-increasing list of hardware to support, especially when new hardware has new features that require special handling between devices and further segment the feature set.
Supporting every device that came before also requires devs to have said hardware to test with, lest they release a buggy build and get more complaints. If devs are already strapped for time on the new and currently supported devices, spending more time to ensure compatibility with a 10-15 year old piece of hardware that has a couple dozen active users is probably off the table.
See 32-bit support on Linux. I get why they're dropping it, but I also don't like it. On Linux there's basically a parallel set of libraries for i386, and those libraries need maintenance and testing to make sure they aren't introducing vulnerabilities. As the software stack grows, it essentially doubles the work of ensuring things build and behave correctly on old CPUs, CPUs that may lack a lot of hardware features that have come to be expected today.
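To make the "parallel set of libraries" point concrete, here's a minimal sketch that tells a 32-bit library from its 64-bit twin by reading the ELF header. The paths assume a Debian-style multilib layout and are purely illustrative:

```python
# Minimal sketch: classify shared libraries as 32- or 64-bit by their
# ELF header. The paths below assume a Debian-style multilib layout
# and are illustrative only; adjust them for your distro.

def elf_class(path):
    with open(path, "rb") as f:
        header = f.read(5)
    if header[:4] != b"\x7fELF":
        raise ValueError(f"{path} is not an ELF file")
    # Byte 4 (EI_CLASS) is 1 for 32-bit, 2 for 64-bit.
    return {1: "32-bit", 2: "64-bit"}[header[4]]

for lib in (
    "/lib/x86_64-linux-gnu/libc.so.6",  # 64-bit C library
    "/lib/i386-linux-gnu/libc.so.6",    # 32-bit twin, if multilib is installed
):
    try:
        print(lib, "->", elf_class(lib))
    except FileNotFoundError:
        print(lib, "-> not installed")
```

Every one of those i386 twins has to be built, tested, and patched alongside its 64-bit sibling, which is where the doubled work comes from.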
People also like to say "oh but xyz runs just as good/bad on my 15-20 year old computer, why do we need NEW???". Power, it's power use. The amount of compute you get per watt on modern chipsets is incomparable. Using old computers for menial tasks is essentially leaving a toaster oven running to heat your room.
It takes a real long time for the inefficiency of an old computer to add up to the embodied energy cost of a new computer, though.
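For a rough sense of the break-even point, here's a back-of-the-envelope sketch; every number in it is an assumed placeholder for illustration, not a measurement:

```python
# Back-of-the-envelope break-even estimate. Every number here is an
# assumed placeholder for illustration, not a measurement.

embodied_energy_kwh = 1200.0  # assumed embodied energy of a new desktop
old_draw_w = 120.0            # assumed average draw of the old machine
new_draw_w = 40.0             # assumed average draw of a modern replacement
hours_per_day = 8.0           # assumed daily usage

extra_kw = (old_draw_w - new_draw_w) / 1000.0  # extra power, kW
extra_kwh_per_day = extra_kw * hours_per_day   # extra energy per day
breakeven_days = embodied_energy_kwh / extra_kwh_per_day

print(f"Extra energy per day: {extra_kwh_per_day:.2f} kWh")
print(f"Break-even after ~{breakeven_days:.0f} days "
      f"(~{breakeven_days / 365:.1f} years)")
```

With those made-up numbers it's on the order of five years before the new machine's embodied energy is paid back by its lower draw, which supports the point above.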
also true
Yeah, this really seems like a problem inherent to capitalist production. New hardware doesn't need to release every year. We could cut back on a lot of waste if R&D took longer and was geared toward longevity and repairing rather than replacing. Unfortunately, all the systems of production were set up when Moore's Law was still in full swing, so instead we're left with an overcapacity of production while at the same time approaching the floor of component miniaturization as Newton gives way to Schrödinger.
I agree with you completely.
I do want to add that I'd wager this overall scenario would be present regardless of how long the release schedules are, especially since people want to keep using their 10-15 year old hardware.
It's somewhat frustrating that many useful features are also in opposition to repairability. Most performance improvements we've seen in the last 10 years basically require components to get closer and closer together. Soldered RAM isn't just the musings of a madman at Apple; it's the only way to control noise on the wires to get the speeds the hardware is capable of. Interchangeable parts are nice, but every modular component adds a failure point.
Add in things like, say, a level of water resistance: it makes the device far more durable to daily life at the expense of difficult repairs. Or security: it's great that grandma won't get her phone hacked, but now we can't run our own code.
There's also a bit of an internet-commie brainworm that I'm still trying to pin down. Like you said, "We could cut back on a lot of waste if R&D took longer and was geared toward longevity and repairing rather than replacing", but what does this actually look like? I think it's at odds with how most of us use technology. Do we want somewhat delicate, fully modular, bulky devices? What does it mean to be repairable if the entire mainboard is a unit? If you make each component of the mainboard modular, the device will quadruple in size, making it overall worse to use than the small disposable device (and more expensive, too). The interconnects will wear out, making modules behave in unexpected ways. The effort required to support a dozen interconnected modules for years and years would be substantial. Not only that, the expertise needed to repair things below the plug-and-play module level is far higher.
I had a small wireless dongle die on me after about a year of use. It basically stopped connecting to my phone. I noticed that it would connect for a short period before disconnecting, and when the device hadn't been used in a while, that period was longer. From my own repair experience, I knew this was the product of a cracked solder joint expanding with heat. I brought it into work and held a reflow gun to it for five or so minutes, heating the PCB enough to reflow whatever bad joint was causing the issue. Looking it up after the fact, I found another person had arrived at the same solution and basically told people to put the whole device in the oven for a while and hope it fixed things. People commented in disbelief that this seemingly magical ritual could revive such a black box of a device; they couldn't comprehend how someone could come across this solution. Of course it wasn't magic. It's fairly simple if you've encountered thermal expansion, cracked solder joints, and misbehaving electronics before. The point of this story is that, had this not been the solution, the device would have been e-waste, even more so than it already is, because even the simplest repair was magic to people unconcerned with technology. I've dedicated my life to technology and engineering, and even then I'm basically an idiot when it comes to a lot of things. Most people are idiots at most things and good at a couple of things.
I understand people are upset when their technology stops working. It stops working because the people who are experts in it don't have the time or funding to keep it working, and the people who want it to work don't understand how it works in the first place, because if they did and really wanted their old hardware working, they'd develop a driver for it. People do this all the time, and it takes months if not years of effort from a handful of contributors to even begin to match what a well-funded team of experts can get done when it's their day job.
There's a fundamental disconnect between those who use and purchase technology and those who make it. The ones who make it are experts in tech and idiots at most things, and the ones who use it are likely experts in many things but idiots with tech, or simply have a day job they need to survive and don't have the time to reverse engineer these systems.
Even in an ideal communist state, resources need allocation, and the allocation would likely still trend toward supporting the latest version of things that currently have the tooling and resources, whose power consumption is lower and speed higher. We might get longer lifespans out of things if we don't require constant improvement to serve more ads, collect more data, and serve HD videos of people falling off things, but new technology will always come around, making the old obsolete.
It's just the nature of technology that the more advanced it is, the harder it is to actually repair.
Repairing pre-19th century tech (i.e. a shovel or a blanket or a wooden chair) is trivial because devices made with pre-19th century tech don't have some crazy demand for precision. A shovel will still be useful as a shovel even if the handle is an inch off from the actual specs. It doesn't matter that the replacement wooden leg isn't exactly the same as the other three legs.
Repairing some 19th century tech like mechanical alarm clocks isn't hard either, and it's more than doable for hobbyists if they have access to machinist equipment like lathes and drill presses. You could use a lathe to make your own screws, for example. CNC machines open up a lot of possibilities.
Repairing 1950s-1980s commercial electronics like ham radios becomes harder in the sense that you can't just make your own electrical components; you have to buy them from a store. Repair is reduced to merely swapping parts instead of making your own to replace the defective ones. But as far as swapping out defective components goes, it's not particularly hard; you basically just need a soldering iron. And as far as precision goes, plenty of resistors had 20% tolerance. A commercial ham radio isn't built with parts that have <0.1% tolerance.
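To put those tolerances in perspective, here's a quick sketch of the acceptable range for a nominal 1 kΩ resistor at each tolerance (the nominal value is chosen arbitrarily; the tolerances are the ones mentioned above):

```python
# Acceptable value range for a resistor at different tolerances.
# The 1 kOhm nominal value is arbitrary.

nominal_ohms = 1_000.0

for tolerance in (0.20, 0.001):  # 20% vintage part vs. 0.1% precision part
    low = nominal_ohms * (1 - tolerance)
    high = nominal_ohms * (1 + tolerance)
    print(f"{tolerance:.1%} tolerance: {low:.0f} - {high:.0f} ohms")
```

A part that's allowed to drift between 800 and 1200 ohms is a lot easier to source, substitute, and solder in than one that must sit between 999 and 1001.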
By the time you get to modern PCs, you mostly don't have the ability to truly repair them. You can swap out parts, but it's not like 1980s electronics where "parts" means an individual capacitor or an individual transistor. Now "parts" means the PSU or the motherboard or the CPU. People with defective radios can troubleshoot and pinpoint the components that failed, while people with a defective motherboard at best sniff at it to check whether parts of it smell burnt and look for bulging capacitors.
The only parts of a modern PC that you can still repair are the PSU, the chassis, and various fans. Everything else is just "it stopped working, so I'm going to order new parts on Amazon and throw the old part away." It's a far cry from a wooden chair where everything from the seat to the legs to the upholstery to the nails can be replaced.
I think people who are into computers don't really understand the extent to which computers aren't repairable relative to purely mechanical devices. The "don't panic, we can always make our own parts" attitude present among hobbyist machinists is completely absent among computer enthusiasts.
Your post is great & I love it in its entirety, but I think this part kinda boils down to people thinking everything is capitalism's fault, when sometimes things are just exacerbated by capitalism. As you acknowledge later, there are real challenges with developing and implementing technology. Not that it couldn't be done in a more responsible way, but I think many people not in technical fields get jaded and stop believing that.
I have (for the last 24 hours) heard so many people say Linux is dropping 32-bit support.
Some (most) distros have dropped support for 32-bit, and Firefox stopped providing a 32-bit version for Linux, but the kernel still very much supports 32-bit. I believe there were recently some talks of cutting out niche functionality for certain 32-bit processors, but I've not heard anything about actually gutting 32-bit support from the kernel.
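If you want to check whether your own 64-bit x86 kernel was built to run 32-bit userspace, here's a minimal sketch; it assumes your distro ships the kernel config under /boot (common on Debian/Fedora), with /proc/config.gz as a fallback that only exists when the kernel was built with CONFIG_IKCONFIG_PROC:

```python
# Minimal sketch: look for 32-bit userspace support in the running
# kernel's build config. Assumes the config is shipped under /boot;
# /proc/config.gz is a fallback that requires CONFIG_IKCONFIG_PROC.

import gzip
import os

def read_kernel_config():
    boot_config = f"/boot/config-{os.uname().release}"
    if os.path.exists(boot_config):
        with open(boot_config) as f:
            return f.read()
    with gzip.open("/proc/config.gz", "rt") as f:
        return f.read()

config = read_kernel_config()
# CONFIG_IA32_EMULATION lets a 64-bit x86 kernel run 32-bit binaries.
if "CONFIG_IA32_EMULATION=y" in config:
    print("This kernel can run 32-bit (i386) userspace.")
else:
    print("No 32-bit userspace support in this kernel build.")
```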
Idk, I'm probably too invested in this. Internet's got me going nuts. I should prolly touch grass.
FWIW this isn't happening right now. There is a gradual reorganization where 32-bit i686 packages are being isolated. Wine's WoW64 mode allows running 32-bit applications using only 64-bit libraries, for example. Fedora will probably be the first to do this, but the change is contingent on Steam (which is a 32-bit application on GNU and Windows).
There will always be specialized distributions that cater to 32-bit systems, though. Also, Debian is probably never going to drop i686 until it physically stops compiling.
Well, that's Debian for you. The whole point of it is that it'll run in situ forever.
I think there was some news recently about maintainers wanting to wind down 32-bit support. That might be why a lot of folks are talking about it.
Yep, it's never just "upgrade your GPU". That's one bit. But for the new GPU to work, I've gotta get a new mobo, and the new mobo needs new RAM and a new CPU. Basically replacing everything but the damn case, and sometimes that's gotta go too.
It's honestly a miracle any of this shit worked in the first place.
I recently got a deal on RAM, which I later realized required a new motherboard, which I later realized had Wi-Fi 7, which required forcefully updating my perfectly fine Windows 10 install to the yucky Windows 11. Awful all around. Linux was fine, but goddamn, I bought the RAM to improve some games.
Windows sucks. It's always sucked. But it definitely used to suck way less.
And, like, Linux is annoying in some ways too. I just personally hate it less.
I feel like the GPU slot on a mobo (PCIe x16) hasn't changed since I started building PCs in the early 2000s? I got the sense AGP was already mostly phased out by then.
Obviously you couldn't take advantage of the full speed of current cards with something that old, but it should work, no?
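Right, the slot has stayed physically compatible and a link just negotiates down to the older generation's per-lane rate. For a rough sense of what that costs, here's a sketch of raw x16 bandwidth per generation, computed from the per-lane transfer rate and line encoding (protocol overhead beyond the line code is ignored, so real-world figures are a bit lower):

```python
# Rough x16 link bandwidth per PCIe generation, from the per-lane
# transfer rate and line encoding. Overhead beyond the line code is
# ignored, so real-world throughput is somewhat lower.

generations = {
    # name: (GT/s per lane, encoding efficiency)
    "PCIe 1.0": (2.5, 8 / 10),     # 8b/10b encoding
    "PCIe 2.0": (5.0, 8 / 10),
    "PCIe 3.0": (8.0, 128 / 130),  # 128b/130b encoding
    "PCIe 4.0": (16.0, 128 / 130),
    "PCIe 5.0": (32.0, 128 / 130),
}

LANES = 16
for name, (gt_per_s, efficiency) in generations.items():
    gb_per_s = gt_per_s * efficiency * LANES / 8  # GT/s -> GB/s across x16
    print(f"{name}: ~{gb_per_s:.1f} GB/s at x16")
```

So a modern card in an early-2000s board would run, just pinned at a small fraction of the bandwidth it was designed for.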
The physical slot hasn't, but depending on the age, size, and layout, getting a big new beefy triple-slot GPU in there is gonna be tight, to say the least.