General Discussion
The end of coding as we know it: ChatGPT has come for software developers (Insider)
https://www.businessinsider.com/chatgpt-ai-technology-end-of-coding-software-developers-jobs-2023-4
Archive page at https://archive.ph/7XceZ
Please read all of this article, published this morning. I ran across a link to it in an Insider update on tech news at https://www.businessinsider.com/google-amazon-apple-chevy-microsoft-united-philadelphia-maine-2023-4 .
I don't even know where to begin to excerpt this, but maybe here...
Coding, as an occupation, has long been considered a haven from the relentless advance of technology. Even as new gizmos replaced other jobs, the people who wrote the instructions for the machines felt untouchable. Universities rushed to expand their computer-science programs. Policymakers scrambling to futureproof the workforce stuck to one unwavering message: Learn to code! But in recent weeks, behind closed doors, I've heard many coders confess to a growing anxiety over the sudden advent of generative AI. Those who have been doing the automating fear they will soon be automated themselves. And if programmers aren't safe, who is?
Much has been written about how AI is coming for white-collar jobs. Researchers at OpenAI, which created ChatGPT, recently examined the degree to which large language models could perform the 19,000 tasks that make up the 1,000 occupations across the US economy. Their conclusion: 19% of workers hold jobs in which at least half their tasks could be completed by AI. The researchers also noted two patterns among the most vulnerable jobs: They require more education and come with big salaries. "We didn't think that would be the case," says Ethan Mollick, a professor of management at Wharton who studies innovation. "AI was always supposed to automate dangerous, dirty tasks, not the things we want to do."
But one white-collar skill set, the study found, is especially at risk for being automated: computer programming. The reason? Large language models like the one powering ChatGPT have been trained on huge repositories of code. Researchers at Microsoft and its subsidiary GitHub recently divided software developers into two groups: one with access to an AI coding assistant, and another without. Those assisted by AI were able to complete tasks 56% faster than the unassisted ones. "That's a big number," Mollick says. By comparison, the introduction of the steam engine in the mid-1800s boosted productivity at large factories by only 15%.
Tech companies have rushed to embrace generative AI, recognizing its ability to turbocharge programming. Amazon has built its own AI coding assistant, CodeWhisperer, and is encouraging its engineers to use it. Google is also asking its developers to try out new coding features in Bard, its ChatGPT competitor. Given the tech industry's rush to deploy AI, it's not hard to envision a near future in which we'll need half as many engineers as we have today, or, down the line, one-tenth or one-hundredth (Emad Mostaque, the CEO of Stability AI, has gone as far as predicting "there's no programmers in five years"). For better or worse, the rise of AI effectively marks the end of coding as we know it.
-snip-
That's less than 1/4 of the article. All well worth reading.
Torchlight
(6,274 posts)
My dad would bring me into the press-room and show me the hundreds of drawers the individual characters were stored in. Weird thing I learned: upper case type was called as such because it was kept in the upper drawers.
Now I work in a publishing company half a world away, and can accomplish five weeks of his work, plus that of his layout crew, in one morning. In '89, WordPerfect didn't end the publishing industry, though; we simply adapted (we have four times as many on payroll today as when I started 32 years ago).
I usually look forward to new tech and the concomitant art of human adaptability.
dalton99a
(91,880 posts)
Mollick adds that he's watched people try ChatGPT for a minute, find themselves underwhelmed by its abilities, and then move on, comforted by their superiority over AI. But he thinks that's dangerously shortsighted, given how quickly the technology is improving. When ChatGPT, powered by the 3.5 model of GPT, took the bar exam, for instance, it scored in the 10th percentile. But less than a year later, when GPT 4 took the test, it scored in the 90th percentile. "Assuming that this is as good as it gets strikes me as a risky assumption," Mollick says.
Hughes has seen the same head-in-the-sand reaction from his fellow coders. After ChatGPT aced his tic-tac-toe challenge, he was scared to look at his phone, for fear of seeing yet another headline about the tool's human-like capabilities. Then, as an act of catharsis, he wrote a long post on his Medium blog: a step-by-step, worst-case scenario of how he thought AI could replace programmers over the next decade. The response was telling: Developers flooded the comments section with impassioned critiques, some of them so aggressive and toxic that Hughes felt forced to delete them. In post after post, they listed all the ways they thought they were still better coders than ChatGPT. "You are a really bad software developer if you don't understand the number of AI limitations," one seethed. AI, they were confident, won't replace what they bring to the job anytime soon.
Reading the comments, I found myself thinking the critics were missing the point. AI is still in its infancy. Which means, much as with a newborn human, we need to start thinking about how it will affect our lives and our livelihoods now, before its needs outstrip our ability to keep up. For the moment, we still have time to shape the future we actually want. Sooner or later, there may come a day when we no longer do.
getagrip_already
(17,802 posts)
For years now, "coders" have been doing exactly what chatgpt is doing.
They have been "re-using" code they find on-line in places like github. If you can google, you can code.
ChatGPT just makes it so you don't have to be a programmer to search for and assemble existing code. ChatGPT can't build complex solutions, or add layers of security (it actually sucks at generating secure code), or incorporate user-test feedback.
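The security point is easy to illustrate. Here's a generic sketch (my own example, not from the post, using a hypothetical `users` table) of the SQL-injection pattern that generated code is often criticized for, next to the parameterized fix:

```python
import sqlite3

# In-memory demo database with a made-up schema, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

def find_user_unsafe(name):
    # The kind of code an assistant may happily emit: SQL built by string
    # concatenation, i.e. an injection waiting to happen.
    return conn.execute(
        "SELECT name FROM users WHERE name = '" + name + "'"
    ).fetchall()

def find_user_safe(name):
    # Parameterized query: the driver treats the value as data, not SQL.
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
print(find_user_unsafe(payload))  # matches every row: [('alice',)]
print(find_user_safe(payload))    # matches nothing: []
```

The unsafe version turns the attacker's input into part of the query itself; the safe version never does, which is why "it compiles and returns the right answer on my test" is not the same as "it's secure."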
It can code a framework for simple models. But it doesn't "create" new ideas.
Maybe it will get there someday.
But in the meantime, good software engineers (not coders) will still have a place. The job shoppers and spec contractors, not so much.
speak easy
(12,595 posts)
Software engineers are designers.
HariSeldon
(536 posts)
1) LLMs "hallucinate" with startling frequency. Do you trust the code they generate? Programming languages exist, at least in part, because natural languages aren't specific enough to provide unambiguous instruction to machines. Additionally, most programming languages have subtle nuances... without a solid understanding of these, how can the output of the LLM be evaluated for correctness?
2) To date, no one has spent much effort accounting for the cost of LLM assistance. Such costs include the direct monetary costs of providing or accessing the AI, the cost of maintaining/updating the model, and the environmental impact of the energy used to host the AI, where the latter is, for the most part, currently absorbed by the commons (i.e., the public). If those costs are factored in, is it still economically beneficial to use LLMs for programming (or programming assistance)?
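Point 1 is easy to illustrate with a classic Python nuance that looks fine at a glance (a generic example of my own, not from the post): a mutable default argument, which generated code sometimes gets wrong and which you need language knowledge, not just a plausible-looking diff, to catch.

```python
def append_item(item, bucket=[]):
    # Looks harmless, but the default list is created once, at function
    # definition time, and then shared across every call that omits `bucket`.
    bucket.append(item)
    return bucket

print(append_item(1))  # [1]
print(append_item(2))  # [1, 2]  <- surprise: state leaked between calls

def append_item_fixed(item, bucket=None):
    # The idiomatic fix: use None as a sentinel and build a fresh list
    # inside the call, so each invocation starts clean.
    if bucket is None:
        bucket = []
    bucket.append(item)
    return bucket

print(append_item_fixed(1))  # [1]
print(append_item_fixed(2))  # [2]
```

Both versions type-check, run, and pass a one-call smoke test; only someone who knows when Python evaluates default arguments can see why the first is a bug.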
Also, as a practitioner of over two decades, I'd say that writing the code becomes mental background noise. On the highest-productivity teams in which I've participated, we've frequently used source code as part of our team dialog, especially when the details are critical -- programming languages intentionally leave little-to-no room for ambiguity.
Learning to code may still be good advice, even if the coding itself ends up mostly handled by AIs; coding involves a certain mental paradigm that is generally useful in problem solving, but also useful in simply understanding how the computer systems we nowadays inevitably interact with are organized. I've often been able to coax balky software into desired behavior because I could intuit the solution model it used, which seemed impenetrable to my non-coding friend/coworker/significant-other.
Bernardo de La Paz
(60,320 posts)
Happy Hoosier
(9,385 posts)
Dedicated autocoding tools have been around a while. They are mediocre at best.
It's hard for me to imagine that a General Purpose AI is gonna do better than a dedicated autocoder.
Bernardo de La Paz
(60,320 posts)
Not yet, not until there is a general super-human AI. Not specialty super-human like a Go-playing AI or a construction excavator. Collaboration will be needed for a long time.
I worked for a startup 25 years ago on assisted coding. It has come some distance since then (no AI at the time) but still has far to go.
There is more to coding than mere coding: there is programming. And there is more than just programming, there is software engineering. And there is more to software engineering: knowledge of what things mean to humans, how they use stuff, and what economic and societal effects pieces of "code" have.
Emad Mostaque is either an idiot or a shameless huckster.
(Sorry, but this isn't the half-hour response I wrote earlier, which the posting software wiped out in the go-back step after the 403 server error.)