Short disclosure, I work as a Software Developer in the US, and often have to keep my negative opinions about the tech industry to myself. I often post podcasts and articles critical of the tech industry here in order to vent and, in a way, commiserate over the current state of tech and its negative effects on our environment and the Global/American sociopolitical landscape.
I’m generally reluctant to express these opinions IRL as I’m afraid of burning certain bridges in the tech industry that could one day lead to further employment opportunities. I also don’t want to get into these kinds of discussions except with my closest friends and family, as I could foresee them getting quite heated and lengthy with certain people in my social circles.
Some of these negative opinions include:
- I think that the industries based around cryptocurrencies and other blockchain technologies have always been, and have repeatedly proven themselves to be, nothing more or less than scams run and perpetuated by scam artists.
- I think that the AI industry is particularly harmful to writers, journalists, actors, artists, and others. This is not because AI produces better work, but because of the misanthropic viewpoints of particularly toxic and powerful individuals at the top of the tech industry hierarchy, who push AI as the next big thing out of a general misunderstanding of, or outright dislike for, the general public.
- I think that capitalism will ultimately doom the tech industry, as it reinforces poor system design that deemphasizes maintenance and maintainability in favor of the "move fast and break things" mentality that still pervades many parts of tech.
- I think we’ve squeezed as much capital out of advertising as is possible without completely alienating the modern user, and we risk creating strong anti-tech sentiments among the general population if we don’t figure out a less intrusive way of monetizing software.
You can agree or disagree with me, but in this thread I’d prefer not to get into arguments over the particular details of why any one of our opinions is wrong or right. Rather, I’d hope you could list what opinions on the tech industry you hold that you feel comfortable expressing here, but are, for whatever reason, reluctant to express in public or at work. I’d also welcome an elaboration of said reason, should you feel comfortable giving it.
I doubt we can completely avoid disagreements, but I’ll humbly ask that we all attempt to keep this as civil as possible. Thanks in advance for all thoughtful responses.
I think that the industries based around cryptocurrencies and other blockchain technologies have always been, and have repeatedly proven themselves to be, nothing more or less than scams run and perpetuated by scam artists.
Can you please expand on this and help me out here?
I’m coming across people who are true believers in crypto and while I insist it’s a scam and it’s destroying the fucking planet, they go down the rabbit hole into places I can’t follow because I’ve literally not had the interest nor desire to read up on crypto.
They keep saying that what’s really destroying the planet is the existing financial system, with all of the logistics involved in keeping it up, as opposed to the crypto farms adding to the demand on the electric grid. They say that is the goal, to replace the existing financial system’s energy demand with crypto, but again, so far it has only added to it. Another talking point is that in the case of a global climate catastrophe there will be pockets of electricity and crypto servers somewhere on the planet, so while crypto will remain, all the other financial systems will disappear.
They also seem to think it’s somehow the fix for workplace bureaucracy and everything else in sight.
Please impart some knowledge.
Bitcoin and all similar crypto were intentionally designed to be self-deflating. It won’t replace finance; it’s speedrunning the same problems. The reason almost every country on earth switched to fiat/self-inflating currencies is that the best way to invest a deflating currency is to stash it and forget about it.
Please explain like I’m a bean
Why deflation is bad: deflation means that as time goes on, the same amount of money is worth more. This means that a viable way to invest the money is simply to hold onto it. Say there is yearly deflation of 4%: just sitting on cash gains you about 4% in purchasing power per year, so any investment that can’t clear that risk-free hurdle isn’t worth the risk of making. Additionally, intelligent consumers will cut down on purchases, since they can buy more for less later. This leads to economic slowdowns and can self-compound if suppliers decide to lower prices.
This is one reason why countries like inflation: it encourages spending and investment.
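A back-of-the-envelope sketch of that hurdle, with made-up numbers (4% deflation and a couple of hypothetical nominal returns), just to show how "do nothing" becomes the benchmark:

```python
# Back-of-the-envelope: what 4% yearly deflation does to the incentive to invest.
# All numbers are illustrative assumptions, not real-world data.

deflation = 0.04                        # prices fall 4% per year
nominal_returns = [0.00, 0.02, 0.04]    # 0% = cash stuffed under the mattress

for r in nominal_returns:
    # Real (purchasing-power) return after one year.
    real = (1 + r) / (1 - deflation) - 1
    label = "holding cash" if r == 0 else f"investing at {r:.0%} nominal"
    print(f"{label:>24}: {real:.2%} real return")

# Approximate output:
#   holding cash:             4.17% real return
#   investing at 2% nominal:  6.25% real return
#   investing at 4% nominal:  8.33% real return
#
# Doing literally nothing already gains ~4% a year in purchasing power, so a
# risky venture has to clear that guaranteed hurdle before anyone will fund it,
# which is why spending and risky investment dry up under deflation.
```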
Bitcoin and similar cryptocurrencies require mining new coins to validate all previous coins and transactions, and each new coin is exponentially more expensive to produce than the last. Therefore Bitcoin wealth is extremely stratified toward early adopters who built up a collection before the value became this obscene.
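To put rough numbers on that stratification, here’s a simplified sketch of Bitcoin’s issuance schedule (the block subsidy started at 50 BTC and halves every 210,000 blocks, roughly every four years); it ignores transaction fees and lost coins:

```python
# Simplified sketch of Bitcoin issuance: the block subsidy starts at 50 BTC and
# halves every 210,000 blocks (~4 years). Fees, lost coins, and satoshi rounding
# are ignored; this is only meant to show how front-loaded the supply is.

blocks_per_halving = 210_000
subsidy = 50.0
cap = 21_000_000
issued = 0.0

for era in range(6):                    # first ~24 years of issuance
    issued += blocks_per_halving * subsidy
    print(f"after halving era {era}: ~{issued / 1e6:5.2f}M BTC "
          f"({issued / cap:.1%} of the ~21M cap)")
    subsidy /= 2

# Roughly half of all coins that will ever exist were issued in the first four
# years, and ~97% within the first twenty -- hence the extreme skew toward
# early adopters.
```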
What about the new sentiment that pushes for a switch back to the gold standard? Is this a pipe dream? Aren’t there some major backers of this idea who hold it to be viable?
Complete pipe dream. A commodity-backed currency means the currency issuer loses control of inflation/deflation to the production of said commodity. For a commodity-backed currency to maintain its value, the commodity stores owned by the issuer have to grow in proportion to monetary demand (usually GDP growth).
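A stylized way to see why, with made-up growth rates and the textbook quantity-theory shorthand (M*V = P*Q, velocity held constant): if the gold stock backing the money grows more slowly than real output, the price level has to fall.

```python
# Stylized quantity-theory sketch (M * V = P * Q with V held constant).
# Growth rates are made-up illustrations, not historical figures.

gold_growth = 0.015    # mined gold stock (and thus money supply) grows 1.5%/yr
gdp_growth = 0.03      # real output grows 3%/yr
years = 20

money = 1.0
output = 1.0
for _ in range(years):
    money *= 1 + gold_growth
    output *= 1 + gdp_growth

price_level = money / output    # P = M * V / Q, with V normalized to 1
print(f"price level after {years} years: {price_level:.2f} (started at 1.00)")
# Prints roughly 0.75, i.e. persistent deflation. The issuer can't do anything
# about it short of digging up more gold or breaking the peg, which is the
# "loses control of inflation/deflation" problem above.
```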
I think most people who actually work in software development will agree with you on those things. The problem is that it’s the marketing people and investors who disagree with you, and they’re also the ones who get to make the decisions.
I took some VC money to build some bullshit and I’ll do it again!
Please stop with the AI pushing. It’s a solution looking for a problem, it’s a waste in 90% of the cases.
Like pretty much all industries, there are holding companies buying up anything profitable that is not too big to acquire, consolidating their hold on the industry. This one, https://en.wikipedia.org/wiki/Vista_Equity_Partners, bought out my company. I was let go, and I don’t think that came from Vista, but the separation agreement they put in front of me I’m pretty sure was. Needless to say, I did not sign it, as it was crazy.
Blockchain is a joke
CI/CD and a lot of container fuckery is entirely unnecessary for like 80-90% of orgs
The jobs AI will eliminate are managerial, and their hustle to implement it will be their own death sentence.
All software should be open source
All software should be released as a common good that cannot be captured by corporations. Otherwise it’s just free labor for Amazon, Google and Facebook
For the sake of humanity
You’re becoming an old man yelling at clouds. People said all the same shit about websites back in the 90s. They said the same shit about personal computers in offices in general over the mainframe systems. Unless your software is going to be responsible for actual lives, it’s better to get something buggy out on time than to drag things out like Star Citizen, soaking up money for no returns.
IT is slowly starting to get regulated like a real engineering field, and that’s a good development.
I’m sad that I missed my opportunity to take a PE exam in software engineering.
Most of the high visibility “tech bros” aren’t technical. They are finance bros who invest in tech.
Much of what we do and have built is overpriced and useless bullshit that doesn’t make anybody better off.
We are inventing solutions and products to manage other solutions and products to manage other solutions and products to…etc etc.
Websites used to be static HTML pages with some simple graphics, images, and some embedded stuff. Now you need to know AWS for your IaaS, Kubernetes to manage your scaling and container orchestration for the thousands of Docker containers that compose your app, written in some horrific pile of JavaScript-related web stacks like Node.js, TypeScript, React, blah blah blah…
Then you need a ton of other 3rd party components that handle authentication, databasing, backups, monitoring, signaling, account creation/management, logging, billing, etc etc.
It’s circles within circles within circles, and all that to make a buggy, overpriced, clunky web app.
Similar is true for IT: massive software suites where most people in the company use 10% of the functionality, for stupid shit.
I’m all for advancing technology, I love technology, it’s my job and my hobby.
But the longer I work in this industry, the more I get this sick feeling that we lost the train a long time ago. Buying brand new $1,500 laptops every 3 years so that most of our users can send emails, browse the web, and type up occasional memos.
An inability to understand that ‘e-mail’ doesn’t get an S is not how I guessed you work in a lot of Azure.
Few things would make me happier than to never log into an Azure instance ever again lol.
A very large portion (maybe not quite a majority) of software developers are not very good at their jobs. Just good enough to get by.
And that is entirely okay! Applies to most jobs, honestly. But there is really NO appropriate way to express that to a coworker.
I’ve seen way too much “just keep trying random things without really knowing what you’re doing, and hope you eventually stumble into something that works” attitude from coworkers.
maybe not quite a majority
VAST majority. This is 80-90% of devs.
I read somewhere that everyone is bad at their job. When you’re good at your job you get promoted until you stop being good at your job. When you get good again, you get promoted.
I know it’s not exactly true but I like the idea.
They call that the Peter Principle, and there’s at least one Ig Nobel Prize-winning study which found that it’s better to randomly promote people rather than promote based on job performance.
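For fun, here’s a toy simulation of that idea (not the model from the actual study; just a sketch that assumes competence in the new role is independent of competence in the old one, which is the Peter hypothesis in its crudest form):

```python
# Toy two-level org: when a manager retires, a worker is promoted, and the
# promoted person's managerial competence is drawn fresh at random (i.e. skill
# in the old job says nothing about skill in the new one).

import random

def simulate(promote, n_workers=50, n_managers=10, rounds=500, seed=1):
    rng = random.Random(seed)
    workers = [rng.random() for _ in range(n_workers)]
    managers = [rng.random() for _ in range(n_managers)]
    for _ in range(rounds):
        managers.pop(rng.randrange(len(managers)))   # a manager retires
        workers.pop(promote(workers, rng))           # someone gets promoted...
        managers.append(rng.random())                # ...with fresh, random competence
        workers.append(rng.random())                 # a new hire backfills the bottom
    return sum(managers) / len(managers)

promote_best = lambda ws, rng: max(range(len(ws)), key=lambda i: ws[i])
promote_random = lambda ws, rng: rng.randrange(len(ws))

print("avg manager competence, promoting the best worker:", round(simulate(promote_best), 3))
print("avg manager competence, promoting at random:      ", round(simulate(promote_random), 3))
# Both hover around 0.5: under this assumption, merit-based promotion buys the
# upper level nothing, while (in the full study's model) it also strips the
# lower levels of their best people, which is why random promotion came out ahead.
```

Obviously a caricature, but it captures why the result isn’t as absurd as it sounds.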
I don’t want to get promoted… Once my job isn’t mainly about programming anymore (in a pretty wide sense though), I took a wrong turn in life 😅
I think it’s definitely the majority. The problem is that a lot of tech developments, new language features, and frameworks pander to this lack of skill, and then those new things become buzzwords that are required at most new jobs.
So many things could be got rid of if people would just write decent code in the first place!
I actually would go further and say that collectively, we are terrible at what we do. Not every individual, but the combination of individuals, teams, management, and business requirements mean that collectively we produce terrible results. If bridges failed at anywhere near the rate that software does, processes would be changed to fix the problem. But bugs, glitches, vulnerabilities etc. are rife in the software industry. And it just gets accepted as normal.
It is possible to do better. We know this, from things like the stuff that sent us to the moon. But we’ve collectively decided not to do better.
Main difference is, a bridge that fails physically breaks, takes months to repair, and risks killing people. Your average CRUD app… maybe a dev loses a couple of hours figuring out how to fix live data for the affected client, the bug gets fixed, and everybody goes on with their day.
Remember that we almost all code to make products that will make a company money. There’s just no financial upside to doing better in most cases, so we don’t. The financial consequences of most bugs just aren’t great enough to make the industry care. It’s always about maximizing revenue.
maybe a dev loses a couple of hours figuring out how to fix live data for the affected client, the bug gets fixed, and everybody goes on with their day.
Or thousands of people get stranded at airports as the ticketing system goes down or there is a data breach that exposes millions of people’s private data.
Some companies have been able to implement robust systems that can take major attacks, but that is generally because they are more sensitive to revenue loss when these systems go down.
That’s why I don’t work on mission critical stuff.
If my apps fail, some Business Person doesn’t get to move some bits around.
A friend of mine worked in software at NASA. If her apps failed, some astronaut was careening through space 😬
Yup, this is exactly it. There are very few software systems whose failure does not impact people. Sure, it’s rare for it to kill them, but they cause people to lose large amounts of money, valuable time, or sensitive information. That money loss is always, ultimately, paid by end consumers. Even in B2B software, there are human customers of the company that bought/uses the software.
I’m not sure if you’re agreeing or trying to disprove my previous comment - IMHO, we are saying the exact same thing. As long as those stranded travelers or data breaches cost less than the missed business from not getting the product out in the first place, from a purely financial point of view, it makes no sense to withhold the product’s release.
Let’s be real here, most developers are not working on airport ticketing systems or handling millions of users’ private data, and the cost of those systems failing isn’t nearly as dramatic. Those rigid procedures civil engineers have to follow come from somewhere, and it’s usually not from any individual engineer’s good will, but from regulations and procedures written from the blood of previous failures. If companies really had to feel the cost of data breaches, I’d be willing to wager we’d suddenly see a lot more traction over good development practices.
… If companies really had to feel the cost of data breaches, I’d be willing to wager we’d suddenly see a lot more traction over good development practices.
that’s probably why downtime clauses are a thing in contracts between corporations; it sets a cap on the amount of losses a corporation can suffer, and it’s always significantly less than getting slapped by the gov’t if it ever went to court.
I’m just trying to highlight that there is a fuzzier middle ground than a lot of programmers want to admit. Also, a lot of regulations for that middle ground haven’t been written; the only attention that middle ground has gotten has been when some companies have seen failures hit their bottom line.
I’m not saying the middle ground doesn’t exist, but that said middle ground visibly doesn’t cause enough damage to businesses’ bottom line, leading to companies having zero incentive to “fix” it. It just becomes part of the cost of doing business. I sure as hell won’t blame programmers for business decisions.
It just becomes part of the cost of doing business.
I agree with everything you said except for this. Often times, it isn’t the companies that have to bear the costs, but their customers or third parties.
Managers decided that by forcing people to deliver before it’s ready. It’s better for the company to have something that works but with bugs, rather than delaying projects until they are actually ready.
In most fields where people write code, writing code is just about gluing stuff together, and code quality doesn’t matter (simplicity does though).
Game programmers and other serious large app programmers are probably the only ones where it matters a lot how you write the code.
Kind of the opposite actually.
The Business™️ used to make all decisions about what to build and how to build it, shove those requirements down and hope for the best.
Then the industry moved towards Agile development where you put part of the product out and get feedback on it before you build the next part.
There’s a fine art to deciding which bugs to fix and when. Most companies I’ve worked with aren’t very good at it to begin with. It’s a special skill to learn and practice.
Agile is horrible though. It sounds good in theory but oh my god it’s so bad.
It’s usually the implementation of Agile that’s bad.
The Manifesto’s organizing principles are quite succinct and don’t include a lot of the things that teams dislike.
We follow these principles:
- Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
- Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage.
- Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.
- Business people and developers must work together daily throughout the project.
- Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.
- The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.
- Working software is the primary measure of progress.
- Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.
- Continuous attention to technical excellence and good design enhances agility.
- Simplicity--the art of maximizing the amount of work not done--is essential.
- The best architectures, requirements, and designs emerge from self-organizing teams.
- At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.
I don’t feel comfortable sharing any personal opinions at work. The workplace is somewhere one should arrive, work, go home, not somewhere to share opinions and in doing so make potential enemies or risk your position.
Why do I care if my colleagues know my opinion on X or Y? It changes nothing about my life or theirs, we’re not even friends, just colleagues, or work friends at best.
Anyway yeah, that’s just my thoughts :-)
My opinion on tech is that there are cool things being done that do one shiny thing, but everyone disregards the shit it produces behind the scenes. Blockchain is an awesome concept, the whole chain depends on all the other parts of it, but the fact is that in order to use it, you have to download the whole thing on several systems. The size of a single chain will grow so large that only a few companies will be able to analyze it at scale. And AI is a huge joke. Nobody should be celebrating generative AI. A ton of computing power that is dangerous to our ecosystem, and it’s all trained on shady material. Nobody is doing anything significant about the power consumption, just coming up with agencies to help companies use AI properly. It’s all a joke. Most of our most influential technologies are just someone asking how to make big bucks off something someone else created for free.
It’s one of the reasons I enjoy working on open source. Sure the companies that pay the bills for that maintenance might not be the ones you would work for directly but I satisfy myself that we are improving a commons that everyone can take advantage of.
I told my lib colleague about how many software creators provide their stuff and its source code for free and he could barely get why; I also told him historically many nations just left their research and findings available publicly for people to learn from and he can’t grasp why that was either.
He does truly believe the profit motive is the only (best?) way to advance science.
Yes and no. A lot of the projects I work on the majority of the engineers are funded by companies which have very real commercial drivers to do so. However the fact the code itself is free (as in freedom) means that everyone benefits from the commons and as a result interesting contributions come up which aren’t on the commercial roadmap. Look at git, a source control system Linus built because he needed something to maintain Linux in and he didn’t like any of the alternatives. It solved his itch but is now the basis for a large industry of code forges with git at their heart.
While we have roadmaps for features we want they still don’t get merged until they are ready and acceptable to the upstream which makes for much more sustainable projects in the long run.
Interestingly, while we have had academic contributions, there are a lot more research projects that use the public code as a base but whose work is never upstreamed, because the focus is on getting the paper/thesis done. The code can work and prove the thing they’re investigating, but it still needs significant effort to get merged.
Software dev tools and processes are so convoluted and unnecessary. We need to find a happy medium between sites being published via FTP uploads like before and the CI/CD madness of today. And there are too many tooling options available, which has caused a huge amount of fragmentation. Look at the JavaScript ecosystem for example.
The abortion known as Terraform is a great example of what you mean.
Meh, I don’t think Terraform was so bad. I mean, I wouldn’t vote to use it all the time. But I didn’t mind it.