Short disclosure, I work as a Software Developer in the US, and often have to keep my negative opinions about the tech industry to myself. I often post podcasts and articles critical of the tech industry here in order to vent and, in a way, commiserate over the current state of tech and its negative effects on our environment and the Global/American sociopolitical landscape.
I’m generally reluctant to express these opinions IRL as I’m afraid of burning certain bridges in the tech industry that could one day lead to further employment opportunities. I also don’t want to get into these kinds of discussions except with my closest friends and family, as I could foresee them getting quite heated and lengthy with certain people in my social circles.
Some of these negative opinions include:
- I think that the industries based around cryptocurrencies and other blockchain technologies have always been, and have repeatedly proven themselves to be, nothing more or less than scams run and perpetuated by scam artists.
- I think that the AI industry is particularly harmful to writers, journalists, actors, artists, and others. This is not because AI produces better work, but because of the misanthropic viewpoints of particularly toxic and powerful individuals at the top of the tech industry hierarchy, who push AI as the next big thing out of a general misunderstanding of, or outright dislike for, the general public.
- I think that capitalism will ultimately doom the tech industry, as it reinforces poor system design that deemphasizes maintenance and maintainability in favor of the “move fast and break things” mentality that still pervades many parts of tech.
- I think we’ve squeezed as much capital out of advertising as is possible without completely alienating the modern user, and we risk creating strong anti-tech sentiment among the general population if we don’t figure out a less intrusive way of monetizing software.
You can agree or disagree with me, but in this thread I’d prefer not to get into arguments over the particular details of why any one of our opinions is wrong or right. Rather, I’d hope you could list what opinions on the tech industry you hold that you feel comfortable expressing here, but are, for whatever reason, reluctant to express in public or at work. I’d also welcome an elaboration of said reason, should you feel comfortable giving it.
I doubt we can completely avoid disagreements, but I’ll humbly ask that we all attempt to keep this as civil as possible. Thanks in advance for all thoughtful responses.
The proliferators of theftbox technology and everyone who ups it/demands it for my career’s advancement deserves to get put on an upturned pike, chest-first. To me it’s like being a battle rapper: like a battle rapper better not EVER be relying on ghostwriters for their bars, if you need CoPilot to code, you don’t deserve to call yourself a programmer; and I was an artist first-- so I don’t see any of this LLM bullshit as anything more than tricknology that robbed me and everybody I consider my actual peers (which is to say, not the theftbox touchers).
I’d rather see a journeyman programmer cracking open the books they taught themselves out of than see them turning to CoPilot.
I’ve introduced my coworkers to the concept of the “copilot pause” where you stop typing and your brain turns off while you wait for copilot to make a suggestion. Several of them can’t unsee it now and have stopped using copilot.
Several of them can’t unsee it now and have stopped using copilot.
Gigabased; you’re doing God’s honest work with that
I’m personally very conflicted between my love of computers and the seeming necessity of conflict minerals in their construction. How much coltan is dug up every year just to be shoved into an IoT device whose company will be defunct in six months, effectively bricking the thing? Even if the mining practices were made humane, they wouldn’t be sustainable. My coworkers are very cool for tech workers. Vague anticapitalist sentiments. Hate Elon. But I don’t think they’re ready for this conversation.
If the person I will report to can’t code, I pass on the contract.
Too many management types are the classic middle management who knows people, but not the tech they manage.
Also related - I will NEVER take a contract if my report to drives a Mercedes. 101% I will pass on that opportunity. Life’s too short to deal with that type of entitlement. After 30 years in the industry, that single vehicle type is by far, to me, the largest of red flags.
My secret sexist opinion is: Fill your DBA team with women, lead by a woman, and then just stand back and turn them loose. I absolutely love all female DBA teams because they kick fucking ass always. [edit I’m a cis wm 50s for context]
[Dbl edit - I will also never hire anyone who was ‘educated’ in a Florida University. They are fucking worthless.]
If the person I will report to can’t code, I pass on the contract
I get this, it’s really frustrating to have a clueless manager. But to me, a bigger problem is the reverse.
I’d rather have a manager with no technical ability and excellent people skills, than a manager with excellent technical ability but no people skills. The latter is all too common in my experience.
Yeah it is a mixed bag of shit isn’t it?
If the person I will report to can’t code, I pass on the contract.
I feel like that’s just a preference regarding jobs.
Part of the job of being the chief coder is having to translate back and forth between the people doing the coding and the people paying them to do so. You need a lot of high level technical knowledge to do the job well, but you aren’t going to be technical in application.
That’s not been my experience. I have more of a ‘hospitality’ mindset, ie: If the GM isn’t willing to hit the line and dishdog in a crunch, he’s a shitty GM and you’ll end up with a poorly performing restaurant.
The ‘chief coder’ might not be the best coder, but when I or the team have to give a presentation to explain heap spraying to the boss, then that’s not a boss I want.
That’s only if the company specializes in one type of software.
It is common in larger companies, or companies that need software but aren’t software companies, to end up with a manager who has little technical talent in general, let alone in the specific area you’re working on.
Fortunately, I’m in a position where I can choose to pass on that type of scenario.
My current employer was founded on the basis of the first two statements. They said they would never hire anyone who didn’t have a background in tech. Even the HR manager who processed my onboarding had a history of coding; I’d never before met an individual who had worked in both industries.
Unfortunately, since I started, my company was bought by a bigger company who was then themselves bought by a bigger company. Though my employer still has one of the best workforces I’ve ever seen, it seems we no longer hold the “tech background only” policy.
I think that the industries based around cryptocurrencies and other blockchain technologies have always been, and have repeatedly proven themselves to be, nothing more or less than scams run and perpetuated by scam artists.
Can you please expand on this and help me out here?
I’m coming across people who are true believers in crypto, and while I insist it’s a scam and it’s destroying the fucking planet, they go down the rabbit hole into places I can’t follow, because I’ve literally had neither the interest nor the desire to read up on crypto.
They keep saying that what’s really destroying the planet is the existing financial system, with all of the logistics involved in keeping it up, as opposed to the crypto farms adding to the demand on the electric grid. They say the goal is to replace the existing financial system’s energy demand with crypto, but again, it’s only added to it. Another talking point is that, in the case of a global climate catastrophe, there will be pockets of electricity and crypto servers somewhere on the planet, so that while crypto will remain, all the other financial systems will disappear.
They also seem to think it’s somehow the fix for workplace bureaucracy and everything else in sight.
Please impart some knowledge.
Bitcoin and all similar crypto were intentionally designed to be self-deflating; it won’t replace finance, it’s speedrunning the same problems. The reason almost every country on earth switched to fiat/self-inflating currencies is that the best way to invest a deflating currency is to stash it and forget about it.
Please explain like I’m a bean
Why deflation is bad: deflation means that as time goes on, the same amount of money is worth more. This means that a viable way to invest the money is simply to hold onto it. Say there is yearly deflation of 4%; that means any investment with a return lower than 4% is losing you money. Additionally, intelligent consumers will cut down on purchases, since they can buy more for less later. This leads to economic slowdowns and can self-compound if suppliers decide to lower prices.
This is one reason why countries like inflation, it encourages spending and investment.
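The arithmetic behind the argument above can be sketched in a few lines (a toy illustration only; the 4% deflation rate and the 3% nominal return are the hypothetical numbers from the comment, not real data):

```python
# Under 4% annual deflation, idle cash gains purchasing power every year,
# so any investment returning less than 4% loses money in real terms.
DEFLATION = 0.04

def purchasing_power(cash: float, years: int) -> float:
    """Real value of idle cash after `years` of steady 4% deflation."""
    return cash * (1 + DEFLATION) ** years

idle = purchasing_power(100.0, 10)        # just stashing the money
invested = 100.0 * (1 + 0.03) ** 10       # a 3% nominal-return investment
print(f"idle cash:      {idle:.2f}")      # ~148.02
print(f"3% investment:  {invested:.2f}")  # ~134.39, worse than doing nothing
```

The investment beats holding cash only when its nominal return exceeds the deflation rate, which is exactly why deflation rewards hoarding.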
Bitcoin and similar crypto also cap issuance by design: mining validates all previous transactions, and the reward for minting new coins halves at fixed intervals, so each new coin is effectively more expensive to produce than the last. Bitcoin wealth is therefore extremely stratified toward early adopters who built up a collection before the value became this obscene.
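The stratification described above falls straight out of Bitcoin’s published issuance schedule (the constants here, a 50-coin initial reward halving every 210,000 blocks, are the protocol’s; the helper function is just a sketch):

```python
# Bitcoin's geometric issuance: the block reward starts at 50 coins and
# halves every 210,000 blocks, capping total supply near 21 million.
# Most coins were therefore minted cheaply in the earliest eras.
BLOCKS_PER_ERA = 210_000
INITIAL_REWARD = 50.0

def coins_issued(eras: int) -> float:
    """Total coins minted after `eras` complete halving eras."""
    return sum(BLOCKS_PER_ERA * INITIAL_REWARD / 2**e for e in range(eras))

for eras in (1, 2, 4, 33):
    print(f"after {eras:2d} eras: {coins_issued(eras):>12,.0f} coins")
```

Half of all coins that will ever exist were issued in the very first era, which is the mathematical root of the early-adopter advantage.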
What about the newer sentiment pushing a switch back to the gold standard; is this a pipe dream? Aren’t there some major backers who hold the idea to be viable?
Complete pipe dream; with a commodity-backed currency, the issuer loses control of inflation/deflation to the production of said commodity. For a commodity-backed currency to maintain value, the commodity stores owned by the issuer have to grow in proportion to monetary demand (usually GDP growth).
When I was in undergrad I did debate, and a term that was used to describe the debate topics was “a solution in need of a problem”. I think that that very often characterizes the tech industry as a whole.
There is legitimately interesting math going on behind the scenes with AI, and it has a number of legitimate, if specialized, use-cases - sifting through large amounts of data, etc. However, if you’re an AI company, there’s more money to be made marketing to the general public and trying to sell AI to everyone on everything, rather than keeping it within its lane and letting it do the thing that it does well, well.
Even something like blockchain and cryptocurrency is built on top of somewhat novel and interesting math. What makes it a scam isn’t the underlying technology, but rather the speculation bubbles that pop up around it, and the fact that the technology isn’t being used for applications other than pushing a ponzi scheme.
For my own opinions - I don’t really have anything I don’t say out loud, but I definitely have some unorthodox opinions.
-
I think that the ultra-convenient mobile telephone, always on your person, has been a net detriment, societally speaking. That is to say, the average iPhone user would be living a happier, more fulfilling, more authentic life if iPhones had not become massively popular. Modern tech too often substitutes online interactions that only approximate genuine in-person interaction for the real thing. The instant gratification of always having access to all these opinions has created addictions to social media that are harder to quit than cocaine (source: I have a friend who successfully quit cocaine, and she said that she could never quit Instagram). Constantly-on GPS results in people not knowing how to navigate their own towns; if you automate something before ever learning how to do it, you never will learn it. While that’s fine most of the time, there are emergency situations where it just leaves people less competent than they otherwise would have been.
-
For the same reason, I don’t like using IDEs. For example when I code in java, the ritual of typing “import javafx.application.Application;” or whatever helps make me consciously aware that I’m using that specific package, and gets me in the headspace. Plus, being constantly reminded of what every single little thing does makes it much easier for me at least to read and parse code quickly. (But I also haven’t done extensive coding since I was in undergrad).
-
Microsoft Excel needs to remove February 29th, 1900. I get that they keep it so the date system stays backwards compatible with Lotus 1-2-3, archaic software from the 1980s; it’s still an annoying pet peeve.
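For anyone unfamiliar with this pet peeve: 1900 is divisible by 4 but not by 400, so the Gregorian calendar skips its leap day, as Python’s standard library will happily confirm:

```python
import calendar
from datetime import date

# Century years must be divisible by 400 to be leap years, so 1900 had no
# Feb 29; Excel's 1900 date system nonetheless counts that fictitious day
# to stay bug-compatible with Lotus 1-2-3.
print(calendar.isleap(1900))  # False
print(calendar.isleap(2000))  # True: divisible by 400

try:
    date(1900, 2, 29)
except ValueError:
    print("Feb 29, 1900 does not exist")
```

One practical consequence: Excel serial dates before March 1, 1900 are off by one relative to real calendar dates.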
-
Technology is not the solution to every problem, and it can make things worse as much as it can make things better. Society seems to have a cult around technological progress, where any new tech is intrinsically a net good and where, given any problem, the first attempted solution should be a technological one. But things like the Hyperloop and Tesla self-driving cars don’t come anywhere near as close to solving transportation problems as simply implementing a robust public transit network with tech that’s existed for 200 years (trains, trolleys, buses) would.
For the same reason, I don’t like using CLIs.
IDEs?
Yes, my bad, I get all the TLAs mixed up.
I’m interested in reading more about coding java without an IDE, what’s your usual workflow? Do you use maven or gradle or something else? Are there solutions or scripts you use to make up for some functionality of an IDE?
-
A very large portion (maybe not quite a majority) of software developers are not very good at their jobs. Just good enough to get by.
And that is entirely okay! Applies to most jobs, honestly. But there is really NO appropriate way to express that to a coworker.
I’ve seen way too much “just keep trying random things without really knowing what you’re doing, and hope you eventually stumble into something that works” attitude from coworkers.
I actually would go further and say that collectively, we are terrible at what we do. Not every individual, but the combination of individuals, teams, management, and business requirements mean that collectively we produce terrible results. If bridges failed at anywhere near the rate that software does, processes would be changed to fix the problem. But bugs, glitches, vulnerabilities etc. are rife in the software industry. And it just gets accepted as normal.
It is possible to do better. We know this, from things like the stuff that sent us to the moon. But we’ve collectively decided not to do better.
Main difference is, a bridge that fails physically breaks, takes months to repair, and risks killing people. Your average CRUD app… maybe a dev loses a couple of hours figuring out how to fix live data for the affected client, the bug gets fixed, and everybody goes on with their day.
Remember that we almost all code to make products that will make a company money. There’s just no financial upside to doing better in most cases, so we don’t. The financial consequences of most bugs just aren’t great enough to make the industry care. It’s always about maximizing revenue.
maybe a dev loses a couple of hours figuring out how to fix live data for the affected client, the bug gets fixed, and everybody goes on with their day.
Or thousands of people get stranded at airports as the ticketing system goes down or there is a data breach that exposes millions of people’s private data.
Some companies have been able to implement robust systems that can take major attacks, but that is generally because they are more sensitive to revenue loss when these systems go down.
I’m not sure if you’re agreeing or trying to disprove my previous comment - IMHO, we are saying the exact same thing. As long as those stranded travelers or data breaches cost less than the missed business from not getting the product out in the first place, from a purely financial point of view, it makes no sense to withhold the product’s release.
Let’s be real here, most developers are not working on airport ticketing systems or handling millions of users’ private data, and the cost of those systems failing isn’t nearly as dramatic. Those rigid procedures civil engineers have to follow come from somewhere, and it’s usually not from any individual engineer’s good will, but from regulations and procedures written from the blood of previous failures. If companies really had to feel the cost of data breaches, I’d be willing to wager we’d suddenly see a lot more traction over good development practices.
… If companies really had to feel the cost of data breaches, I’d be willing to wager we’d suddenly see a lot more traction over good development practices.
that’s probably why downtime clauses are a thing in contracts between corporations; it sets a cap at the amount of losses a corporation can suffer and it’s always significantly less than getting slapped by the gov’t if it ever went to court.
I’m just trying to highlight that there is a fuzzier middle ground than a lot of programmers want to admit. Also, a lot of regulations for that middle ground haven’t been written; the only attention it has gotten has been when companies have seen failures hit their bottom line.
I’m not saying the middle ground doesn’t exist, but that said middle ground visibly doesn’t cause enough damage to businesses’ bottom line, leading to companies having zero incentive to “fix” it. It just becomes part of the cost of doing business. I sure as hell won’t blame programmers for business decisions.
It just becomes part of the cost of doing business.
I agree with everything you said except for this. Often times, it isn’t the companies that have to bear the costs, but their customers or third parties.
Yup, this is exactly it. There are very few software systems whose failure does not impact people. Sure, it’s rare for it to kill them, but they cause people to lose large amounts of money, valuable time, or sensitive information. That money loss is always, ultimately, paid by end consumers. Even in B2B software, there are human customers of the company that bought/uses the software.
That’s why I don’t work on mission critical stuff.
If my apps fail, some Business Person doesn’t get to move some bits around.
A friend of mine worked in software at NASA. If her apps failed, some astronaut was careening through space 😬
Managers decided that by forcing people to deliver before it’s ready: it’s better for the company to have something that works but with bugs than to delay projects until they’re actually ready.
In most fields where people write code, writing code is just about gluing stuff together, and code quality doesn’t matter (simplicity does though).
Game programmers and other serious large app programmers are probably the only ones where it matters a lot how you write the code.
Kind of the opposite actually.
The Business™️ used to make all decisions about what to build and how to build it, shove those requirements down and hope for the best.
Then the industry moved towards Agile development where you put part of the product out and get feedback on it before you build the next part.
There’s a fine art to deciding which bugs to fix and when. Most companies I’ve worked with aren’t very good at it to begin with. It’s a special skill to learn and practice.
Agile is horrible though. It sounds good in theory but oh my god its so bad.
I read somewhere that everyone is bad at their job. When you’re good at your job you get promoted until you stop being good at your job. When you get good again, you get promoted.
I know it’s not exactly true but I like the idea.
They call that the Peter Principle, and there’s at least one Ig Nobel Prize winning study which found that it’s better to randomly promote people rather than promote based on job performance.
I don’t want to get promoted… Once my job isn’t mainly about programming anymore (in a pretty wide sense though), I took a wrong turn in life 😅
maybe not quite a majority
VAST majority. This is 80-90% of devs.
I think it’s definitely the majority. The problem is that a lot of tech developments, new language features, and frameworks then pander to this lack of skill, and those new things become buzzwords that are required at most new jobs.
So many things could be got rid of if people would just write decent code in the first place!
Most of the high visibility “tech bros” aren’t technical. They are finance bros who invest in tech.
No class consciousness. Too many tech workers think they’re rugged individuals that can negotiate their own contracts into wealth.
Working for free on nights and weekends to “hit that deadline” is not good. You’re just making the owners rich, and devaluing labor. Even if you own a lot of equity, it’s not as much as the owners.
And then there’s bullshit like return to office mandates and people are like “oh no none of us want to do this but there’s no organized mechanism to resist”
Join Tech Workers Coalition
The whole “tech industry” naming is bullshit; there is more technology in, say, the composites used to build an aircraft wing, or in a surgical robot, than in yet another mobile app showing you ads.
The whole tech sector also tends to be overvalued on the stock market. In no world is Apple worth 3 trillion while Coca-Cola or Airbus are worth around 200 billion.
More people own an iPhone than an Airbus plane.
I think most people who actually work in software development will agree with you on those things. The problem is that it’s the marketing people and investors who disagree with you, but it’s also them who get to make the decisions.
Not a software dev, but for me it’s the constant leap from today’s “next best thing” to tomorrow’s. Behind the Bastards did an episode on AI, and the host’s take resonated with me, particularly his Q&A session with some AI leaders at, I think, CES not long ago. When the new hotness gets popular, an obscene amount of money is paired with the “move fast and break things” attitude in a rush to profit. This often creates massive opportunities for grifters, as legislators are mind-numbingly slow to react to these new technologies. And when regulations are finally passed (or more recently, allowed by the oligarchs), they’re often written to protect the billionaires (read: “job creators”) more than the common customer. Everyone’s bought into the idea that slow and methodical stifles innovation. At least the people funding and regulating these things have.
I think companies that use unethically trained AI (read: basically all gen AI) should be subject to massive litigation, or at least severely damaging boycotts.
Have mentioned it to a lawyer at work, and he was like “I get it, but uh… fat chance, lol”. Would not dare mention it to the AI-hungry folks in leadership.
You can’t litigate against owner class as working class. Federal government is sold out their asses so they won’t do it.
Litigation is a dispute resolution tool for the owners, between owners.
There is NOT a viable way forward within the courts or political processes.
Things will get worse before anything changes.
Source: a dead CEO, and how they’re treating Luigi
All software should be open source
For the sake of humanity
All software should be released as a common good that cannot be captured by corporations. Otherwise it’s just free labor for Amazon, Google and Facebook
Please stop with the AI pushing. It’s a solution looking for a problem, it’s a waste in 90% of the cases.
‘Using cloud software will lead to lower costs and a better overall service quality’