This discussion grapples with the evolving role of software developers in the age of AI, exploring themes of job displacement, skill adaptation, the changing nature of "grunt work," and the broader economic and societal implications of AI adoption.
The Optimistic View: AI as an Augmentation and Opportunity
A significant portion of the discussion centers on the idea that AI, rather than being purely a threat, represents an opportunity to elevate human creativity and efficiency. This perspective suggests that AI will automate repetitive tasks, freeing up developers to focus on more complex, strategic, and innovative aspects of their work.
- "What I see is a future where AI handles the grunt work, freeing us up to focus on the truly human part of creation: the next step, the novel idea, the new invention. If we donāt have to spend five years of our early careers doing repetitive tasks, that isnāt a threat, itās a massive opportunity. Itās an acceleration of our potential," stated billy99k.
The Pessimistic View: Job Displacement and the "Two-Tiered" Developer Workforce
Countering the optimistic outlook is a strong current of concern regarding widespread job displacement, particularly for developers whose work primarily consists of "grunt work" or less complex tasks like CRUD applications. This view posits that a large segment of the developer population may struggle to adapt or find new roles.
- "The problem is that only a fraction of software developer have the ability/skills to work on the hard problems. A much larger percentage will only be able to work on things like CRUD apps and grunt work. When these jobs are eliminated, many developers will be out of work," warned billy99k.
- chii echoed this sentiment, stating, "which is lower valued, and thus it is economically 'correct' to have them be replaced when an appropriate automation method is found."
- csomar provided a stark outlook: "There are millions of software developers. There are hundreds (thousands?) that are working on the cutting edge of things. Think of popular open source projects used by the masses, usually there is one or a handful of developers doing most of the work. If the other side of the puzzle (integration) becomes automated, 95% or more of software developers are redundant."
Skill Adaptation and the Evolving Role of Seniority
Several participants discussed the necessity for developers to adapt their skillsets and anticipate a shift in career progression. The idea of "moving up the value chain" or developing new, more specialized skills to remain relevant emerged as a key theme. Some also suggested a move towards more formalized, licensed engineering roles.
- "Key human engineer skills will be to take liabilty for the output produced by agents. You will be responsible for the signoff, and any good/bad that comes from it," predicted anilgulecha.
- anilgulecha also ventured, "Some engineering roles/areas will become a 'licensed' play - the way Canada is for other engineering disciplines."
- Further, anilgulecha offered, "Careers will meaningfully start only at the senior level. At the junior level, your focus is to learn enough of the fundamentals, patterns and design principles so you reach the senior level and be a net positive in the team."
- However, chii expressed skepticism about this progression, noting, "I suspect that juniors will not want to do this, because the end result of becoming a senior is not lucrative enough given the pace of LLM advancement."
- AbstractH24 offered a more nuanced view on this, suggesting, "There's a sweet spot right now to be in. Early enough in career to have gotten in the door, but young enough to be malleable and open to new ways."
The Nature of "Grunt Work" and AI's Limitations
A contentious point within the discussion is the actual capability of current AI models to handle what is considered "grunt work." Some argue that the most difficult and nuanced "grunt work," like debugging complex, undocumented issues, remains beyond AI's current grasp.
- "90% of real grunt work is 'stitch an extra appendage to this unholy abomination of God' or 'points at the screen look at this shit, figure out why is it happening and fix it'. LLMs are more or less useless for either of those things," asserted 123yawaworht456.
- zeta0134 elaborated on this, stating, "It's not really possible for an LLM to pick up on the hidden complexities of the app that real users and developers internalize through practice. Almost by definition, they're not documented! Users 'just know' and thus there is no training data to ingest."
- Conversely, csomar countered that "LLMs are very good at fixing bugs. They do lack broader contexts and tools to navigate the codebase/interface." danielbln likewise pushed back on the skeptics, stating, "I disagree, agentic LLMs are incredibly useful for both."
The Role of Management, Hype, and "Vibe Coding"
A significant portion of the conversation focused on the influence of executive decision-making, the tendency to follow trends ("hype"), and the adoption of a less rigorous coding methodology dubbed "vibe coding," often driven by AI. There's a strong sentiment that business leaders might be making decisions based on enthusiasm for AI rather than solid technical feasibility, leading to potentially poor outcomes.
- "I agree that we're not about to be all replaced with AGI, that there is still a need for junior eng, and with several of these points. None of those arguments matter if the C suite doesn't care and keeps doing crazy layoffs so they can buy more GPUs, and intelligence and rationality are way less common than following the cargo cult among that group," argued Arainach.
- simonw offered a counterpoint, suggesting, "The C suite may make some dumb short-term mistakes - just like twenty years ago when they tried to outsource all software development to cheaper countries - but they'll either course-correct when they spot those mistakes or will be out-competed by other, smarter companies."
- scarface_74 described a personal experience: "Honestly this is true for most problems and has been forever for most developers. ... I didn't write a single line of code, I started by giving ChatGPT the diagram and very much did 'vibe coding' between the code, the database design and the IAC."
- lloeki expressed frustration with the "high velocity hype train," observing, "I keep seeing and talking to people that are completely high guzzling Kool-Aid straight from a pressure tap. The level of confirmation bias is absolutely unhinged 'I told $AGENTICFOOLLM to do that and WOW this PR is 90% AI!', ignoring any previous failed attempt..."
- Adding to this, bluefirebrand commented, "Execs are convinced LLMs produce something that no programmer ever could. Or at least the same thing, faster than any programmer ever could. So they will drive us off a cliff chasing it."
- zombot suspected a more deliberate propagation of the narrative: "I bet there are also a good number of paid shills among those. If you look at how much money goes into the tech it's not too far-fetched to invest a tiny fraction of that in human 'agents' to push the narrative."
- guappa suggested another motive: "Not necessarily paid shills, but it's a good way to get a promotions, and then when it will be revealed that it doesn't actually work, they got the promotion already and will jump on the next hype to get the next promotion."
Concerns about AI Quality and Degradation of Knowledge
A persistent concern is the potential for AI-generated code and content to be of lower quality ("slop") or inaccurate, leading to a decline in overall code quality and a disincentive for humans to produce original content.
- "My guess is that the humans are better at this, and I mostly speak from experience on two separate support floors that tried to add AI to that problem. It fails, miserably," shared zeta0134.
- 999900000999 lamented, "The problem is with more and more AI slop, less humans will be motivated to write. AGI at least the first generation is going to be an extremely confident entity that refuses to be wrong. Eventually someone is going to lose a billion dollars trusting it, and it'll set back AI by 20 years. The biggest issue with AI is it must be right. It's impossible for anything to always be right since it's impossible to know everything."
- erentz echoed this practicality: "The problem with AI is if it's right 90% of the time but I have to do all the work anyway to make sure it's not one of the 10% of times it's extremely confidently wrong, what use is it to me?"
- mirsadm expressed a broader fear: "However now I am much more concerned about what will happen in 5 or 10 years of widespread AI slop. When humans lose motivation to produce content and all we're left is AI continually regenerating the same rubbish over and over again. I suspect there'll be a shortage of programmers in the future as people are hesitant to start a career in programming."
- lawgimenez shared a direct negative experience: "I just recently inherited a vibe coded project in iOS (fully created with ChatGPT), and not even close to working."
Capitalism, Rationality, and Market Forces
The discussion touched upon the nature of capitalism and market forces in driving these changes. Some participants expressed cynicism regarding the "rationality" of markets and corporate behavior, especially in the face of potential job losses.
- "Well thatās the nice thing about capitalism. If it doesnāt work, it eventually dies," offered paulddraper.
- Arainach provided a harsh rejoinder: "And the not nice thing about capitalism is that it can keep not working longer than most of us can pay for rent and food."
- zombot cynically stated, "The belief in a rational market approaches religion."
- hnthrow90348765 articulated strong frustration: "Cannot understate how absolutely enraging it is every time 'basic economics', 'supply and demand', or 'basic capitalism' comes up as a thought-terminating response despite everything government does to keep failing stuff going."
The "Shared Brain" and Ownership Concerns
A philosophical point was raised about the collective effort in building AI knowledge bases, questioning the ownership and distribution of benefits from this "shared brain."
- disambiguation queried, "'Shared' as in shareholder? 'Shared' as in piracy? 'Shared' as in monthly subscription? 'Shared' as in sharing the wealth when you lose your job to AI?"
- 3D30497420 expressed distrust: "I would be more excited about this concept if this shared brain wasn't owned by rich, powerful people who will most certainly deploy this power in ways that benefit themselves to the detriment of everyone else."
The Need for Professional Standards and Unionization
In light of the uncertainties and potential disruption, a call for professional standards and even unionization emerged as a way to bring credibility and stability to the profession.
- "Software engineers should unionize. Weāre not real engineers until we have professional standards that are enforced (as well as liability for what we make). Virtually every other profession has some mandatory license or other mechanism to bring credibility and standards to their fields. AI just further emphasizes the need for such an institution," argued asimpletune.
Shifting Value and the Future of Online Presence
The conversation also touched upon the decline of manual website coding and the broader shift in how personal and business online presences are managed, with platforms like WordPress and Shopify significantly changing the landscape.
- "It's hard to see manual coding of web sites remaining a thing for much longer," mused Animats.
- sublinear noted, "If you're talking about personal websites I think that ship sailed almost 20 years ago with the rise of social media. If you mean business websites, they are just about the most volatile code out there..."
- bluefirebrand agreed, stating, "Well forget social media even. Wordpress, and now Shopify have definitely eaten the personal website."