Build full “product skills” and you'll probably be fine

794 points 14 hours ago
carlmr 13 hours ago

Looking back we had one CS professor who in 2007 predicted we'd all be jobless in ten years, i.e. 2017.

His prediction was based on the trends he was seeing at the time. But it wasn't even AI. Instead he made this prediction because he saw the rise of no-code tools replacing software developers because managers could finally cut out the pesky "translators", i.e. software developers.

I said it then and I will say it now. If your managers could specify what they need in a manner that no-code tools, or now AI, can generate the code they want, they will have to be extremely exact in their language. So exact in fact that they will need to specify a program in a click and drag interface, or in human language.

Since they hire software developers to make the specification more rigid, and the managers don't seem to be getting better at this over time, why would you believe this skill set is going to go away?

In essence what has happened in software development is that the level of abstraction has gone up while the machine has taken over more and more of the nitty gritty details. From punchcards, to assembly, to COBOL, C, Perl, Java, Python, Erlang, Rust.

Of course I'm leaving out some languages here, but the level of abstraction has been rising.

But the rigidity needed to specify a program that really does what you want hasn't changed. This is especially evidenced by the fact that recent programming languages often have a specific area where they shine, but don't raise the overall abstraction level that much.

I'd be surprised if the next step is "Hi, I'm an ideas guy, please give me an app that does Uber, for bicycles, but better."

ChrisMarshallNY 10 hours ago

> So exact in fact that they will need to specify a program in a click and drag interface, or in human language.

This. I started programming in Machine Code, where the "editor" was a pad of graph paper.

I've watched management- and business-focused people sneer at us geeks for my entire career, and watched them drool over the thought of finally getting rid of us.

Hasn't happened yet.

> I'd be surprised if the next step is "Hi, I'm an ideas guy, please give me an app that does Uber, for bicycles, but better."

I get that, from "idea people," on a regular basis. They have nothing but contempt for us "Make It Happen" folks. They treat us as if we grow on trees, and are suckers, to boot.

Inevitably, the above is followed by something that translates to "Of course, you will do all the work, but I'll get all the money, because the idea is what's really valuable."

If I follow up on it at all, I'll be asked to sign an NDA, and meet in some clandestine place, so they can whisper to me about their AI-powered cheese straightener.

chadash 8 hours ago

I agree with your characterization in general. If all someone has is an idea and no relevant experience to back it up, run. I can’t stand people like that.

But there’s one type of ideas person I’m thrilled to work with. Someone with deep and successful experience in sales. The head of sales at my company is also relatively product oriented and boy, he just has a knack for what customers want.

In general, I think many engineers can do some product management. We can figure out basic accounting and finance. But the idea of making a cold call to sell dental software, or chasing down school principals at an education conference is almost as foreign to most software engineers as CS is to most sales folks.

Salesperson + engineer = superpowers

moneywoes 8 hours ago

Besides opening your own company does a role exist where you can do both?

GuB-42 4 hours ago

In my company, all of our sales people have a technical background. They are not the best engineers in the company, but they could do the job if they had to. More importantly, when faced with technical people, they are not completely clueless.

Keegs 8 hours ago
andric 8 hours ago
breck 8 hours ago

Agreed. I have a thousand tools mastered to solve any software issue quickly. But people issues can still stump me for days. Both skillsets take dedication and years (decades?) to master.

Simran-B 8 hours ago

Wait until managers realize that their jobs are a lot easier to automate away with AI than the jobs of software developers.

osigurdson 6 hours ago

The economy is so good at generating value that it is somehow possible for large portions of participants to create zero or negative value.

throwbadubadu 7 hours ago

This! And tbh, compared with the "bad manager type" (Dilbert type) that won't feel worse :D

TigeriusKirk 5 hours ago

I've been thinking that AI managers are the real killer app for AI. There are challenges, but none seem insurmountable even with the present base tech.

bee_rider 5 hours ago
awesomegoat_com 50 minutes ago

Indeed. And ChatGPT can already babysit burned-out engineers.

(Which has been a major time sink for the engineering managers that I know.)

josephg 8 hours ago

What makes you think that’s true?

naasking 7 hours ago
thuuuomas 7 hours ago
antupis 6 hours ago
6510 6 hours ago
irrational 5 hours ago

> If your managers could specify what they need in a manner that no-code tools, or now AI, can generate the code they want, they will have to be extremely exact in their language. So exact in fact that they will need to specify a program in a click and drag interface, or in human language.

This is also one of the main reasons why all programming jobs were not outsourced to India.

windexh8er 4 hours ago

Couldn't agree more with this sentiment. And to expand on it - the great outsourcing wave of the mid-2000s didn't work out for many things outside of programming either: IT consulting in general, support and operations, call centers, and things like design and architecture. The barrier was not always technical, but often a misunderstanding of how BaU works in the <parent_country> vs. offshore, and/or of what the ask/expectations were. There's a lot of waste when you need to be overly explicit and the message is still misinterpreted, interpreted too literally, or simply not understood.

908B64B1973 1 minute ago

> This is also one of the main reasons why all programming jobs were not outsourced to India.

There's a whole industry here in America that re-shores programming contracts. They know they can't underbid Indian/foreign body shops so they just wait a few months and call back the companies who went with cheaper programmers. If the company is still around it's generally a complete re-write.

j7ake 9 hours ago

This hits deep and resonates beyond tech. This “ideas” versus “make things happen” divide is also prevalent in science and art.

ilyt 9 hours ago

Ye, often the "creator", the "head", gets credited with everything a whole team came up with. Sure, picking out the good parts and directing them into a consistent whole is important, but they would be nothing without the people who produced that work in the first place.

admissionsguy 5 hours ago
j7ake 8 hours ago

Yeah it seems natural to divide the credit evenly, just as midfielders and strikers and coaches get equal credit for winning a game.

ozim 4 hours ago

To some level I agree - but at some point there is also much on the "business side" that cannot be easily dealt with. Like having connections or some kind of relationship with the people who will buy the stuff, or knowing the people who would be interested in buying the stuff you make.

Yes, there are these "idea people" who don't have any clue about the business side, don't have any clue about the technical side, and in the end don't even have the right connections or business network. But they think they can make it because they have an Idea(TM). They could basically play the lottery and the outcome would be the same: they might make it, but the chances are 1:1,000,000 at best.

Then there are the business people who have the right connections and an understanding of the niche/business they are in, but need technical help to execute their idea - and those are worth their weight in gold.

brookst 6 hours ago

Tell me more about this cheese straightener… that sounds amazing. Where do I prepay?

doubled112 6 hours ago

I wonder what straightening it provides.

Like when the brick of cheese comes out of the freezer with an obvious bend?

Or like when I don’t make a nice square cut and my sandwich is cheesier in some spots than others?

bee_rider 5 hours ago

When you grate the cheese, it often curls a bit. My patent-application-indefinitely-pending cheese straightener uncurls it.

ChrisMarshallNY 5 hours ago

It was a George Carlin joke. Don't remember exactly which monologue, but he was talking about how you can generate a need with advertising.

brianwawok 9 hours ago

And the way to get back at the PHBs is to form a tech-led software company and hire zero people with MBAs.

_a_a_a_ 8 hours ago

Is that truly fair though? I have no exposure to MBAs, and it may be that all the negative talk about them is based on the conspicuously bad ones. I'm speaking as an IT person with very little experience of running a business, and it may be that a good MBA could be a great asset. I genuinely don't know.

I'm reminded of reading about a Lisp machine company that was run into the ground because it was managed by techies. Their tech may have been great, but their weak marketing and business skills were arguably what killed the company.

ryandrake 6 hours ago
_glass 8 hours ago

I don't have an MBA, but I studied intercultural management for my Masters, and I am writing my PhD in management right now. To be honest, it is quite zen-like: with the degree you can understand how to manage engineers by not trying to manage them. Other types of people mostly need closer alignment. I know a lot about how to make a company more profitable, and generally a nicer place to work. Management is actually quite like engineering, more of a craft than an exact science. The science helps you to have names for things, and a lot of the time I really do know how to fix people problems.

pg_1234 1 hour ago

> I get that, from "idea people," on a regular basis. They have nothing but contempt for us "Make It Happen" folks. They treat us as if we grow on trees, and are suckers, to boot.

Ideas are like assholes ... everybody has one and they're usually full of shit.

chiefalchemist 10 hours ago

re: idea people

True. And once their product hits the market, if it ever gets there, they don't thrive.

Prior to launch they are sooooo in love with their idea that they are meticulous about features all the while thinking they're smarter than the market.

They don't understand and appreciate the value of execution. Ideas are easy. Execution - because it involves people as well as adapting to change - is 10x harder.

Yes, those people exist. Unfortunately, that bias will ultimately undermine them, but they'll never admit it.

ChrisMarshallNY 9 hours ago

One of the things that I'm fairly good at, is walking people from "Crazy Idea That Will Never Work," through to "Finished Product That People Want."

It tends to be a very long process, and often involves a lot of "trial balloons." I just went through that, in the last couple of years. The project we're realizing looks absolutely nothing at all like what the CEO originally dreamed up, but everyone that has seen it loves it.

The trick is to not start off by saying "It'll never work." That slams doors shut, right away.

It's more like, "OK, so let's walk through what we'll need to do, to make it work."

That will often result in changes being made, by the "idea person," as the plans are laid. We will also try to create test harnesses and prototypes. These often end up with the idea person going "It seemed like a good idea, but it doesn't work the way I wanted."

It's slow and painful, but works.

Frost1x 7 hours ago
naasking 7 hours ago

> It's more like, "OK, so let's walk through what we'll need to do, to make it work."

Socratic software development is the way to go.

A_Venom_Roll 9 hours ago
chiefalchemist 9 hours ago

Yes. They have a want. The key is to nail down the need. But they have to be willing.

belter 9 hours ago

It's the usual: "I have a great idea for a Startup, now I just need the money to hire some Developers to implement it...". The Winklevoss twins for example come to mind...

jackmott 10 hours ago


raincole 12 hours ago

> His prediction was based on the trends he was seeing at the time. But it wasn't even AI. Instead he made this prediction because he saw the rise of no-code tools replacing software developers because managers could finally cut out the pesky "translators", i.e. software developers.

It might sound really crazy and stupid today, but when SQL came out, it was advertised as a "program-generating" language and was supposed to reduce the need to code.

(I mean, in some sense it's true because it's much less code than writing our own database...)

gregjor 12 hours ago

My career predates relational/SQL databases so I can confirm what you wrote. When Oracle came out the buzz was that a whole bunch of programmers would go away because managers and executives could write queries in a “natural” English-like language.

That never happened. SQL is hard to master, but it’s the easy part of understanding the relational model and any particular schema. Instead Oracle and the other RDBMSs that followed created more jobs for programmers and database analysts and admins.

ozim 1 hour ago

I would argue that it is not much harder than using Excel.

But there are good reasons where you don’t want random people running sql queries on production database or having direct access to the data.

rvba 10 hours ago

As someone trying to learn SQL, it feels like there are many more steps:

* language syntax

* the relational model (relatively easy? It is just "there"?)

* the database schema, where I have 3,000 tables, or 50,000 tables (that's how ERPs are made)

* actually knowing how to use the language

* building those queries that join columns from 15 different tables together to deliver the needed data -> it sounds easy, but I'm struggling to do it better/faster - I never saw a book or website that focuses on that (multiple joins across multiple tables), when it feels like my work is mostly that

* understanding what is needed

* actual "programming" problems (say: you have 100k items in stock, those are parts of BOMs, make a list of 'where used'... yes, I know you can google it and try to copy from StackOverflow)

Seriously, I am now trying to learn how to set up a simple DB to consolidate data (20 CSV files on a shared drive) and the guides are often like the 'how to draw an owl' meme...

Controlling/reporting/analyst jobs feel in some ways like "programmer" jobs but without the tools/training/salary - just figuring stuff out on your own. I'm doing it, but apart from the SQLzoo website I didn't manage to find any decent guide for the 'usual' problems. Also, since those are like 10% of my work I can't focus on those problems properly - I try to learn it after work.

Also, SQLzoo is a lot like the "how to draw the owl" meme... the easy tasks are easy, the hard ones are impossible, and there are no medium tasks. There also don't seem to be any real-life tasks like 'join 15 tables to make some report'.
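For what it's worth, the 'where used' BOM problem is a nice fit for a recursive CTE. A minimal sketch using SQLite from Python, with an invented two-table schema (`part` and `bom` are hypothetical names, not from any real ERP):

```python
import sqlite3

# Invented two-table schema: parts, and the BOM edges between them.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE part (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE bom (parent_id INTEGER, child_id INTEGER);  -- parent contains child
INSERT INTO part VALUES (1, 'bicycle'), (2, 'wheel'), (3, 'spoke'), (4, 'cart');
INSERT INTO bom VALUES (1, 2), (2, 3), (4, 2);
""")

# "Where used": walk the BOM upward from part 3 ('spoke') to every
# assembly that directly or indirectly contains it.
rows = con.execute("""
WITH RECURSIVE used_in(id) AS (
    SELECT parent_id FROM bom WHERE child_id = 3
    UNION
    SELECT b.parent_id FROM bom b JOIN used_in u ON b.child_id = u.id
)
SELECT p.name FROM used_in u JOIN part p ON p.id = u.id ORDER BY p.name
""").fetchall()

names = [name for (name,) in rows]
print(names)  # -> ['bicycle', 'cart', 'wheel']
```

The same shape of query works in Postgres and MS SQL Server (there it's written without the `RECURSIVE` keyword).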

javajosh 27 minutes ago

Tools matter. When I learned SQL long ago it was with FoxPro for DOS, and it was a great tool for doing both SQL and text based UIs (as with curses). Later, I used MS Access 97 and it was an even better tool and sparked a lifelong interest in data modeling. The ui for building up tables (specifying column data types, etc) was really trail-blazing at the time and the interaction remains good today. The built-in ERD charting tool was good, showing your tables in relationship to each other. The visual query builder was...well, I never used it but I suppose it was good? You just had lots of good tools to build tables, flip through their contents, and visualize their relationships.

I don't know of any modern environment that functions like that, on any platform.

I'm posting this to invite others to either a) correct me and tell me that tooling doesn't matter (a legitimate view, but wrong), and/or b) recommend some modern, OSS Access-like tools that might help flatten the learning curve for you. (And if you're more comfortable with a CLI and a REPL already, then you don't even need this hypothetical tool, but I myself am curious about it.)

Ancapistani 9 hours ago

Shoot me an email, I’d be more than willing to pair with you. I’ve been a “software engineer” of some flavor for about twenty years now, and about five of those were spent writing SQL almost exclusively.

You can reach me at nominallyanonymous-at-protonmail-dot-com. From there I’ll give you my “durable” contact info - Slack, Discord, SMS, Signal, Telegram… whatever you use regularly, basically.

biztos 9 hours ago

As someone who did a lot of SQL back in the day, and is now doing some again for a startup POC, I'd say you're basically right but there is also:

* Rewriting queries and also parts of the schema when it turns out the things that made sense in the design phase cause massive bottlenecks once you get more data, or just get it faster, than you tested with.

Of course the good news is that now you can run the best RDBMSs on your laptop, or on a $5/mo VPS; or have a small managed one starting at <= $15/mo. Plus ChatGPT can help you remember how to do that inner join. ;-)

gregjor 9 hours ago

At some point you have to learn the relational model. And you have to make the mental switch to understand SQL as a declarative language, not an imperative language.

I recommend Chris (C.J.) Date’s books.
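A toy illustration of that mental switch (table and data invented for the example): the imperative version spells out how to accumulate per-customer totals, while the declarative version only states what result is wanted and lets the engine decide how.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [("ann", 10.0), ("bob", 5.0), ("ann", 7.5)])

# Imperative: spell out *how* to compute per-customer totals, step by step.
totals = {}
for customer, amount in con.execute("SELECT customer, amount FROM orders"):
    totals[customer] = totals.get(customer, 0.0) + amount

# Declarative: state *what* you want; the engine decides how to compute it.
declared = dict(con.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer"))

print(totals == declared)  # -> True
```

The declarative form is also what lets the database reorder, index, and parallelize the work behind your back.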

raincole 9 hours ago

I'm not an SQL expert. But data persistence and consistency are generally very hard problems. It's a lot of steps, but I'm not sure it's more steps. More than what? Than if the data were stored as a big binary blob?

AdrianB1 9 hours ago

Building the queries is the easy part. Making the queries run fast with large tables is difficult, and there are trainings available, but they are very focused and a bit expensive compared with the ones on large training websites (e.g. LinkedIn Learning). For example, Brent Ozar has lots of blog posts and two sets of trainings for MS SQL Server, some for beginners and some for experts, that are extremely useful for people with such needs. Problem is, expert tuning for MS SQL is totally different from expert tuning for Oracle, which is totally different from Postgres and others.

On a side note, if you have to join 15 tables for a report it is a sign you may go in the wrong direction. In 99% of the cases I never needed more than 5-6 tables if the data is well structured, but that takes years to learn.

411111111111111 11 hours ago

Mine doesn't predate it but it's very confusing for me to read this opinion.

From my point of view, it totally did happen? Can you imagine how many programmers the company would've needed to get all the data a business analyst casually queries per day?

What you're looking at is the quantity of people actually employed in the industry, not how many jobs SQL made obsolete. The industry just grew so much that it didn't become an issue.

gregjor 11 hours ago
lozenge 11 hours ago
aflag 10 hours ago
lr4444lr 10 hours ago

Or they just did less. Productivity gains make people expect more, not just make what they expect now easier.

rukuu001 11 hours ago

Funny - also COBOL was intended for the 'business' community to write their own programs.

Even funnier - we've already lived through the great 'software development adoption' by business and hardly noticed, except some of us got work out of it. A lot of small businesses (legal practices and accountancies were particular suspects) grabbed MS Access with both hands and went wild in the 90s/early 2000s. Timesheets and other HR functions were popular applications.

noodlesUK 5 hours ago

I’m quite sad there isn’t anything like access these days. I feel like I see fairly clever solutions for things being built in excel, but they can’t scale very well to multiple users. For a lot of processes something like Access would be great for bridging the gap.

ghaff 5 hours ago

My oversimplified observation is that, Adobe products at the high end notwithstanding, MS Office basically crystallized what a mainstream office productivity suite was, once companies weren't buying one-off products. Rather than buying a low-end/midrange desktop publishing program, they made it work with Word. Rather than using a low-end database (of which there were many), they made do with Excel.

matwood 12 hours ago

SQL was also meant to give a wider range of people access to data. The “business analyst” comes to mind. And, I think SQL was successful.

What it didn’t do was reduce the need for programmers, because the new SQL users always wanted more data to answer more complex questions.

raincole 12 hours ago

I didn't mean SQL wasn't a success. I meant, as the commenter above me said, that a higher-level tool doesn't necessarily replace the lower-level ones, or reduce the need for them.

gregjor 12 hours ago
roundandround 6 hours ago

I don't actually agree. SQL did replace the need for programmers, especially for complex questions. We sabotaged its interfaces to existing GUIs for mundane questions and made CRUD a limitless profession.

The funny/sad part about computer science is that people don't want to understand the costs of customization and sales/management/marketing forever want it as a differentiator.

SQL could have eliminated us from the flow of many niches as easily as the spreadsheet did from business operations. I think why it didn't has more to do with market timing.

marcosdumay 4 hours ago

SQL was too much ahead of its time, but we are indeed slowly adopting the "avoid operational specifications, use high-level languages, make your language fit the problem instead of the opposite" philosophy of the 4th generation languages.

actionfromafar 11 hours ago

It was very true, and one of the most successful DSLs ever invented. And many people coded SQL who never would have touched other languages with a ten foot pole.

It’s just that, with more powerful tools we can create more advanced and intricate things, so it never ends…

I don't exactly welcome, but rather anticipate, a near future where, instead of reinventing every kind of pattern and library in terms of cloud and infrastructure management, we will see the same meta-management of different AI engines and services…

wruza 10 hours ago

But it is true, just in a different area than predicted. I hear all the time that scientists and analysts, while being non-programmers, do write various queries without asking a programmer to do it… I am a programmer and not even sure how much time it would take to implement their regular joins with windowing, aggregates etc in a record-based file format. Can I even do that?

SQL was a typical failure in this regard. It was seen as a savior, but at the same time it was designed for at least math-versed people. The prediction was right; its scope was wrong. Since then we have failed many times, and as I see it we tend to diverge from this idea more and more. And the reason is that there's no one controlling our stacks vertically, so it takes more jobs than it should to make an app.
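To give a flavor of what those analyst queries spare you from hand-writing, here is a hypothetical running-total query using a window function (schema and data invented for the example; the sqlite3 module needs SQLite >= 3.25 for window functions, which modern Python builds bundle):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, month INTEGER, revenue REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("north", 1, 100.0), ("north", 2, 120.0),
    ("south", 1, 80.0), ("south", 2, 90.0),
])

# A per-region running total: one windowed SELECT, instead of hand-written
# sorting, grouping, and accumulator code over a record-based file.
rows = con.execute("""
SELECT region, month,
       SUM(revenue) OVER (PARTITION BY region ORDER BY month) AS running
FROM sales ORDER BY region, month
""").fetchall()

print(rows)
```

Implementing the same thing yourself over a record-based file means writing the sort, the partitioning, and the accumulator by hand, for every new question.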

RyEgswuCsn 10 hours ago

I think it used to be that one needed to write dedicated programs to go through large amounts of data --- you needed to know a programming language, all its toolchains, and all the proper algorithms for doing the query __out-of-memory__ --- certainly above your ordinary analyst's pay grade.

With SQL you need none of those.

jackcviers3 9 hours ago

This. And if you look at the level of algorithms knowledge needed today to read the least amount of data from a set, it succeeded.

canadianfella 10 hours ago


Buttons840 3 hours ago

> If your managers could specify what they need in a manner that no-code tools, or now AI, can generate the code they want, they will have to be extremely exact in their language. So exact in fact that they will need to specify a program in a click and drag interface, or in human language.

One day a corporate genie appeared to a middle manager and granted him one wish. The manager wished that their business logic could be edited by a graphical tool and that programmers were no longer needed. "Granted", poof!

The next day the manager came to work, all the programmers were gone, fired by upper management. The manager sat down and opened the new graphical tool that had magically appeared on his computer. The interface was beautiful, except for what appeared to be a hairball drawn in the middle of the screen. The manager asked his manager about the hairball, "oh, that's our business logic, you have to zoom in to see the individual rules". The manager zoomed in and surveyed 180,000 business rules, intertwined in a complete graph.

"How am I supposed to work with this?" the manager asked his manager. "You just click and move them around, it's easy", replied the upper manager. "Also, I've been meaning to ask, when do you think those new business rules we talked about last week will be implemented? We need them by Friday."

mikewarot 1 hour ago

>he made this prediction because he saw the rise of no-code tools replacing software developers

Given the way we had tremendously productive tools like Visual Basic 6 and Delphi, that seemed to be a reasonable interpretation.

But those tools gave way to C++ and all the .NET insanity. I don't know why anyone would give up Delphi, which worked flawlessly and didn't (as far as I can recall) generate any read-only boilerplate, for C++, which was a mess, but it happened en masse.

Then most people abandoned personal computing, and the windows desktop for always internet connected pads and phones. Tools have gone severely downward in the past 2 decades as a result.

I suspect we'll get some really powerful AI tools, and go back to worrying about null pointers in a few years anyway.

eldritch_4ier 9 hours ago

All points the horses made before cars all but replaced them. “The humans need to get from A to B and it’s not like they’ve gotten any faster with their own 2 legs”.

What does this manager-engineer relationship look like now? Managers specify what they need in plain language, and both parties go back and forth with increasing specificity to create based on the shared vision. Managers can already pull off a similar dance with ChatGPT: give it a plain English prompt, it responds with what it thinks you want, and you refine until it's got it. GPT-4 can do this with website sketches, and who knows how much finer you could get from there by specifying your prompts and feedback more tightly over less than an hour. Remember: copywriters, brand marketers, and marketing creatives have a similar role of turning complex requirements into designs that sell (and their roles are dying fast).

A software engineer's job is to realize products as software. ChatGPT is pretty much as capable as a high school programmer that moves really quickly and takes feedback to heart - and it can handle a good chunk of the software engineering job for a small fraction of the price. Your job isn't as bulletproof as you think, and especially not your amazing salary. I'm speaking as a software engineer turned VC-backed founder, so I've seen both sides of this relationship.

howderek 8 hours ago

The horse population has decreased by half or so since the early 1900s. And horses have stayed pretty valuable, too. Most of them just don't have to work as hard anymore. The horses used for sport never went away, only the horses used for labor or transport. So if your horse analogy is accurate, then maybe half of software engineers lose their jobs and the industry stops growing. Those that keep their jobs keep their salary and have to either be very fast or very good at jumping over artificial obstacles. Seems accurate. I should probably stop beating a dead horse.

ambrose2 4 hours ago

Wouldn’t a better statistic be the ratio of horses to people? I’m sure that has gone down much more since the early 1900s.

coffeebeqn 7 hours ago

Even the workhorse breeds still exist and do work at some small farms, but it’s probably 1% of the peak demand for them. Not necessarily a good outlook if we are the workhorses.

ilyt 9 hours ago

Manager would be easier to replace with ChatGPT than a software developer

eldritch_4ier 8 hours ago

Maybe. I’m not on anyone’s side here, just what makes sense.

worthless-trash 9 hours ago

And a founder likely easier again ;)

eldritch_4ier 8 hours ago
camgunz 7 hours ago
karmasimida 6 hours ago

This is a good thing. Managers can clarify their ideas using AI tools.

But still, the real product needs to be fleshed out, back and forth, interactively.

Regardless of whether it is the manager or the engineer who commands the tool, you can't trust the AI to get everything 100% right. Someone has to proofread it, and that is the bottleneck, or where the value of a human kicks in.

A human brain can only hold so many details by itself. It can't maintain all the details live all at once, and some businesses do have an overwhelming number of details. A one-man team with the help of infinite AIs isn't a useful replacement for an actual team.

At some point, that human will start to fail to verify that the system generated by the AI is actually what he/she wants.

stavros 9 hours ago

A software engineer's job isn't to write code, it's to make decisions. Going from "give me Uber for bicycles" to a working app that runs on a device takes a million decisions, which a bunch of people (designers, copywriters, engineers, etc) make.

Yes, the AI is good at taking low-level requests and turning them into reasonable code, and then refining them, but unless the CEO is going to sit down and spend days telling the AI "OK now make this button disabled when it's clicked until the table loads", you need someone to be doing that.

anon7725 12 hours ago

> Since they hire software developers to make the specification more rigid, and the managers don't seem to be getting better at this over time, why would you believe this skill set is going to go away?

Are we sure that an AI could not engage in enough back and forth conversation to firm up the spec? You’re kind of assuming that systems will be generated from a one-shot prompt, but it seems more likely that an interactive AI will identify the gaps in requirements and ask for clarification.

Alternatively, if the prompt-generate-evaluate loop is short enough the user can simply play with the running system and provide feedback to alter it.

This is essentially what developers do when they present a “weekly build” or whatever in an agile environment.

The process of solidifying requirements, stating them clearly and translating them into machine-executable formats are all language tasks and these models are really fucking good at those.

I’ve noticed in discussions like this that many software folks are assuming that AI capabilities will plateau soon, or will merely be extensions of what we already have (a better autocomplete, etc). I submit that we may reach a point where the AI is so compelling that we’ll reorganize teams/systems/businesses around it.

qsort 10 hours ago

> Are we sure that an AI could not engage in enough back and forth conversation to firm up the spec?

This is the doomsday argument. What would I do if there's a nuclear apocalypse before lunch? I guess I'll die like everyone else.

An AI sufficiently advanced to do that is also sufficiently advanced to run the entire business in the first place, and also argue cases in court, do my taxes, run for president and so on.

You either believe that transformer models are "it", or you haven't actually removed the problem of specifying requirements formally. Which, you know, is actually much harder to do in English than it is to do in C++.

carlmr 10 hours ago

>You either believe that transformers models are "it", or you haven't actually removed the problem of specifying requirements formally. Which, you know, is actually much harder to do in English than it is to do in C++

This is actually something that makes me happy about the new AI revolution. When my professor said that I thought he was an idiot, because no-code tools always make it harder to specify what you want when you have specific wants the developer didn't think about.

We give kids books with pictures because pictures are easier, but when we want to teach about more complex topics we usually use language, formulas, and maybe a few illustrations.

I still think no-code was always doomed due to the fact that any attempt at it lacked the interface to describe anything you want, like language does.

AI is finally putting an end to this notion that no-code should be clicky high-maintenance GUIs. Instead it's doing what Google did for search. Instead of searching by rigid categories we can use language to interact with the internet.

Now the language interaction is getting better. We haven't regressed to McDonald's menus for coding.

Jupe6 hours ago

Isn't the "Chat" part of ChatGPT already doing something close to this? I mean the clarification comes from the end-user, not from the AI, but with enough of this stuff to feed upon, perhaps AIs could "get there" at some point?

For example, this guy was able to do some amazing stuff with ChatGPT. He even managed to get a (mostly working) GPU-accelerated version of his little sample "race" problem.


marvin7 hours ago

This is far from the doomsday argument, but maybe it's the "AI can do everything that has significant economic value today" argument.

cowl8 hours ago

Yes, but even in that case the role will be that of an "AI prompter", and it will not be done by the managers, because of the time factor. Even though AI can give you the result much faster, building upon it, testing/verifying it, and coming up with the refined prompt is time-consuming. Only the write part of the write/eval loop will be faster, but not necessarily easier.

Especially the "debugging" part will be much harder. No one can look under the hood to understand what is wrong, and all you can do is shoot random prompts in the dark hoping it will create the right result.

It is scary right now how confidently and spectacularly wrong ChatGPT can be, and it will create disasters.

oh_sigh8 hours ago

Why would sufficiently advanced AI even need a prompter? The AI could play the role of the greatest prompter in the world, and ask the same questions to the end user that the human prompter would.

cowl7 hours ago

This is a misconception of how our industry works. Yes, there is market research with users, but it often comes after the problem space has been defined. Much of what you see in the tech sector today consists of "created needs": someone imagined a solution the users didn't even know they needed. To ask a question you first need to define a problem, and that is the difficult part, and the main reason people still believe "the idea is the most important factor". Of course this is not true; hundreds of factors come into play. Imagine an AI asking users circa 2000 what kind of virtual social space they needed. The answer would not have been Facebook. (There were other social networks before Facebook, but the time was not right for the "social" explosion.) By learning on existing solutions, the AI would have learned the lesson that global virtual social networking is not something users want. And so much of the problem was sociological/psychological, and outside the realm of what the AI could consider, that we would not have what we have today.

Not that we would have missed much from missing the particular implementation of this idea that Facebook gave us, but the idea and what it unleashed is much more than that particular implementation.

chii11 hours ago

Currently, we don't even trust a car's automatic driving capability enough to let it be on the roads without a human.

Until that day comes, I highly doubt that a business owner would blindly trust an AI to generate their business code/software without hiring someone to at least look after it. Therefore, software jobs may evolve, but not disappear.

execveat10 hours ago

Yeah, all this talk about complex systems being written by a language model that has no concept of files, code paths and import systems sounds like job security to me. I'm a pentester though.

naniwaduni9 hours ago

I'm ... less optimistic about how well people can place their trust. Cars, at least, have concrete failure criteria and consequences for them.

visarga10 hours ago

The project will be more consistent and resilient to issues, but it will probably take about half the time it used to take without AI, not 1% of the time. Reading AI code is damn hard: it is code review, and requires exam-level concentration.

weatherlite4 hours ago

> I submit that we may reach a point where the AI is so compelling that we’ll reorganize teams/systems/businesses around it.

Sounds like we get reorganized out of a job, though... what does it mean to reorganize everyone around the AI if it does everything better than us?

ilaksh6 hours ago

You can do that prompt / play with it / feedback thing right now with my GPT+Stable-Diffusion powered website.

I am in the process of adding VMs in which the AI will be able to write software and fix compilation and other problems automatically.

visarga10 hours ago

> I submit that we may reach a point where the AI is so compelling that we’ll reorganize teams/systems/businesses around it.

For starters, I'd like Codex to be more than a next-word predictor; it should also "feel" the error messages, data types and shapes, and file formats, so I don't have to explain the context. It should be part of the system, not just part of the text editor.

rocho8 hours ago

In that case, how is the AI going to keep tens or hundreds of thousands of lines in memory to produce cohesive code that works with the rest of the codebase?

It seems prohibitively expensive to build and run transformer models with that much capacity.

anon77254 hours ago

GPT 4 already has 32k tokens of context for prompts. Once we’re making arguments about scale only a few orders of magnitude larger than the current state of the art, it seems similar to arguments 10-15 years ago that real-time ray tracing is not feasible.
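(For scale, a rough back-of-envelope sketch of how much code fits in a 32k-token window, assuming ~4 characters per token and ~40 characters per line of code — both loose rules of thumb, not exact tokenizer figures:)

```python
# Back-of-envelope: how much code fits in a 32k-token context window?
# Assumes ~4 characters per token and ~40 characters per line of code,
# both rough rules of thumb rather than exact tokenizer figures.
CONTEXT_TOKENS = 32_000
CHARS_PER_TOKEN = 4
CHARS_PER_LINE = 40

lines_of_code = CONTEXT_TOKENS * CHARS_PER_TOKEN // CHARS_PER_LINE
print(lines_of_code)  # 3200
```

So a 32k window holds on the order of a few thousand lines — which is why "hundreds of thousands of lines" really is only a couple of orders of magnitude away, not an unbridgeable gap.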

bakuninsbart12 hours ago

> In essence what has happened in software development is that the level of abstraction has gone up while the machine has taken over more and more of the nitty gritty details. From punchcards, to assembly, to COBOL, C, Perl, Java, Python, Erlang, Rust.

I feel like there's currently a movement towards slightly lower abstraction, or at least simplified, consistent APIs and less magic. The rise of Go and Rust is an example of this. TypeScript could be another, although the abstraction isn't really lower; it is once again an attempt to coerce JS into something workable. I get really frustrated writing Python or Rails these days due to the sheer magnitude of hidden magic that sometimes works and sometimes doesn't.

To tie this thought in with generative AIs: currently they seem to be much better at programming with relatively simple syntax. By far my largest success so far was with shell. I tested assisted writing of the same tool in Python, Perl and shell, and the results in shell were close to perfect. ChatGPT was even able to accurately limit commands to specific OSs and shells, and to accurately summarize the functions of other shell scripts.
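(For a flavor of what "limiting commands to specific OSs" means, here's a hand-written sketch of that kind of OS-aware shell snippet — the mapping is illustrative, not actual ChatGPT output:)

```shell
#!/bin/sh
# Illustrative OS-aware dispatch: pick a package-manager command based on
# the kernel name reported by uname. The mapping is an example, not exhaustive.
pkg_install_cmd() {
  case "$1" in
    Linux)  echo "apt-get install" ;;
    Darwin) echo "brew install" ;;
    *)      echo "echo unsupported OS" ;;
  esac
}

pkg_install_cmd "$(uname -s)"
```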

So my prediction is that we will see a movement towards simpler, lower abstraction languages while Coding Assistants rise to take away the boring stuff from programming like looking up syntax, writing boilerplate, structuring files. Programmers will then have more time to think about delivering value to product, maintainability, and efficiency and correctness.

One last addendum: ChatGPT is really incredible at assisting with sys admin stuff, my guess for why would be that there are a gazillion obscure forum entries going back to the 80s explaining basically everything there is to know, but these are hard to find or comprehend for humans. With an AI assistant, self-hosting becomes much easier, and another development could be for startups and smaller companies to move away from AWS et al, especially now that money is more expensive.

coffeebeqn7 hours ago

Feels like a rediscovery of systems languages. I certainly love Go (and probably Rust once I look into it) because I almost never have the experience of searching and searching to find some concrete code rather than just layers and layers of interfaces.

TriNetra10 hours ago

As it stands, GPT seems to make expert devs even more powerful. But yes, it can't replace them for now.

The best thing I find as a developer that it unblocks the resistance we have in starting up with something new. For instance, I just built my first Chrome extension [0] to make video players more accessible on webpages with keyboard shortcuts.

I had built a desktop app along similar lines long ago, but could never push myself to build an extension. Last night I gave my requirements to GPT and it guided me through the whole process – from creating the manifest and JS files needed, to the JS code (granted, I had to fix/extend the code), and even how to install/enable the extension.

Within a few hours I had my working extension without needing to go and read any extension docs.


lmarcos12 hours ago

> I'd be surprised if the next step is "Hi, I'm an ideas guy, please give me an app that does Uber, for bicycles, but better."

And even if AI is able to do that... Well, then anyone out there could become an UberCycle CEO. Which means customer requirements will be much more complex by then, and coming up with an entire product in an hour won't be enough anymore.

lastangryman11 hours ago

Or perhaps, if literally anyone can make a great product, then the true successes will be either those that are genuinely innovative, disruptive and have first-mover advantage, or those that are one of many but gain better adoption through better marketing. Or perhaps some ideas (anything involving hardware) need capital anyway.

Which isn't actually different from today. We've just shortened the time to get your MVP to market. Rather than having to raise some seed and hire devs, you can do it with an AI in a few days perhaps. Everything after that seems it would be the same.

ben_w12 hours ago

I currently expect that near-future version of the current type of language model — no major new breakthroughs — will be able to do just that.

I also expect, as you say, for this to create a lot of competition and increase the minimum quality that customers demand such that… heh, just realised, "Make Uber for bicycles" is basically going to be the next generation's pre-interview coding challenge to screen people that can't use AI effectively.

parentheses1 hour ago

What you’re referring to here is “one shot” execution. The prompt given by managers is one that can be interpreted using context. This context can include current code, comms about what needs to be built, design assets available today, data, etc. Then the AI can ask questions to clarify what needs to be built.

What’s great is that you could also have AIs write tests - for performance or correctness. Then future prompts could rely on these tests to stay in the correctness bounds for existing projects.

Emulating cognition is almost limitless in what it can do once it gets sufficiently good. So good that it can operate itself. You could hire an AI founder and give them cognition and compute resources to solve any problem.

Here’s an example:

You: Build me a profitable company that uses AI generated content to entertain humans.

AI: How much can I spend on cognition and compute? How much time do I have?

You: I want you to spend at most $X before coming back to me. You have until Monday. Along the way I want to see your plan before we decide to execute. Spend at most 5% of that making the plan and 1% calculating its probability of success with confidence interval.

… within some small timescale

AI: Here’s my plan as a video with rough examples demonstrating the product and strategy. Here’s a table with our probability distribution of predicted success metrics and confidence intervals.

You: Plan approved; you can spend the rest of the $X

… on Monday

AI: Done. We’ve made a profit of $Y, with total revenue so far of $Z. We have a plan for reinvesting these proceeds to make even greater future profit. Here’s a table of our expected success metrics based on reinvested amount. How much should we reinvest?

You: Reinvest 50% into perpetuity and deposit the remainder in my account. Book me a trip around the world using 25% of the current deposits with accommodations for my family.

Now go enjoy your life.

cpeterso1 hour ago

At least one person has already done this. Check out this Twitter thread, where GPT-4 designs and builds a website to sell eco-friendly household products:

I gave GPT-4 a budget of $100 and told it to make as much money as possible. I'm acting as its human liaison, buying anything it says to. Do you think it'll be able to make smart investments and build an online business?

lewisjoe12 hours ago

I too believed that a software engineer's job is to identify and enforce rigorous specification of abstract high-level requirements. And I too was not taking AI advancements seriously, but then I took a closer look at what AI tools do today.

Here's my concern:

1. AI assistants thrive on structured data.

2. Computer programs are some of the most structured data, and they're available abundantly out in the open.

3. Yes, you can't generate an Uber for bicycles with a single prompt, but you can fire half your development team and increase the productivity of the rest with an OpenAI subscription.

cowl7 hours ago

> Computer programs are some of the most structured data. And it's available abundantly out in the open.

This is the same fallacy we have been hearing for 50 years: all program requirements are almost the same, so just reuse and adapt an existing program. Guess why it has never worked? Because the premise is false. Structured data for X is not optimal for Y (and can even be very wrong).

Apart from "personal blog software", everything else has varying needs for accountability. The AI black-box approach is not suitable for any of these, so you have to manually verify the code. Verifying code you are not familiar with, especially in complex interactions, is much more difficult than writing it (this is where the frequent "rewrite from scratch" request comes from, because institutional knowledge has been lost; imagine how much worse it is if that knowledge was never there in the first place).

Finally, and most importantly, all AI models rely on learning; if there is no one to learn from, all you get is stagnation. Most breakthroughs come from a complete reimagining of the solution space. If the solution space is fixed because "AI has substituted all engineers", there is no going forward.

dr_dshiv11 hours ago

> you can fire half your development team and increase the productivity of the rest of your dev team with an OpenAI subscription.

Here’s another perspective on job loss: Given that…

1. …OpenAI accelerates ALL knowledge work productivity, meaning that any human laborer is suddenly much more valuable than last year;

2. …there is a notable arms race at the moment that is accelerating tech and business innovation at a blistering speed, where higher rates of innovation outcomes will be expected across industries just to keep up;

3. …there is still a lot of money looking for growth;

…then shouldn't this result in an overall increase in demand for human labor?

Looking around society, there is clearly a LOT of work to be done. “Leaning in” with a spirit of optimism may be more advantageous for the long-term.

tommiegannert11 hours ago

Also, people-people don't want to use no-code tools. They want to hire people to do that for them. If using the tool is really simple, the salaries will go down, but it won't remove the demand for "translators".

For the tinkerer, no-code tools are (probably) great, but if successful, even tinkerers will hire managers who will hire translators.

intelVISA8 hours ago

I love Carmack and this is an interesting summary given his recent pivot into ML.

That said, "software is a tool" is a good frame of mind. You shouldn't position yourself as a 'coder' (those are hired en masse from poor countries) but as a problem solver who uses tech.

"Ideas guys" and "coders" being separate only exists, imo, when development is viewed through the junior lens - once you progress you'll end up wearing all the hats, from ideation to QA, at different points.

jimnotgym10 hours ago

>Since they hire software developers to make the specification more rigid

That might actually be what development is: teasing out the requirements by iteratively entering them into a machine that does exactly what you told it to, and comparing the outputs to expectations.

surgical_fire11 hours ago

ChatGPT will definitely make a dev job obsolete if their job is simply copying and pasting code they found online.

By all means, that is part of everyone's job. Sometimes I really can't remember how to do some specific thing. I know precisely what I want, but I don't remember the proper way to do it. I would have resorted to a web search until last year, now I ask chatGPT. It is faster, more concise, and surprisingly accurate. And when it's inaccurate it's easy to either refine my question or cross reference what it generated with online sources.

But I think it's a bit silly how people are treating it as if it were some kind of general AI. It is not; it can only give out known answers to known problems based on language statistics. As impressive as it is, it can't reason logically about problems without known solutions, it can't identify faulty, incomplete or inaccurate information, and it can't weigh the drawbacks and tradeoffs of different approaches.

ChatGPT made like 20% of my work a lot faster and less cumbersome. I like it enough that I pay for premium access. But even the notion that prompt engineers might replace software engineers is silly. I imagine people repeating this nonsense are either not engineers, or extremely junior in the profession to the point where their work is only writing code for clear specifications, no questions asked.

hnfong11 hours ago

> It is not, it can only give out known answers to known problems based on language statistics.

Are you sure you actually tried, or is this an a priori argument?

sidlls10 hours ago

That's literally what the language model is. It might correctly generate a solution to a "novel" question/problem that is sufficiently close to one with an existing, known answer. But then again it might not. And in software development, it's going to take someone who is knowledgeable to tell the difference.

I think software engineering is going to look very different in a few years, and likely be a smaller field with lower paying jobs. But it's not going away in the near (5-10 years) future.

surgical_fire10 hours ago
ricksunny10 hours ago
surgical_fire11 hours ago

I use it on a daily basis.

lordnacho7 hours ago

Watching the Ukraine war gave me some related thoughts.

You may have seen that there's now a load of drones flying around, doing things that were never possible until now. Dropping bombs on soldiers in trenches, giving support for assaults, targeting artillery.

The fact is war changes with technology. Spearmen and cavalry are obsolete now. Maybe pilots will be soon as well.

But the generals are not. There's always a need for someone to say "given the position we're in, and the logistics we have, and the resources we have, and what we know about the enemy, we should do this..." and this role is still recognizable through history.

Whether computer jobs become obsolete depends on whether you're closer to the general end or the footsoldier end.

BiteCode_dev11 hours ago

The thing is:

- GPT doesn't need you to be exact

- one day GPT will be able to ASK YOU questions to refine what you need

It may take years, but at some point, it will be able to do 80% of my job.

Not a big deal though, cause people will still hire me to do the remaining 20%, and pay me even more money because not a lot of devs will be able to do it.

IIAOPSW11 hours ago

GPTina already asked me a follow-up question. Admittedly, it happened inside a jailbreak I just kind of stumbled into, and it was a very trivial question, but the fact that she did it was profound. If she can ask one follow-up question, there's nothing in the way of her asking a chain of follow-up questions. And thus the basic structure needed to organize real conversation arises wholly as an emergent property. No longer is it a flat structure of just iteratively repeating the current word blob with some tweaks. A true conversational dialog is possible.

ChatGPT can do more than we are led to believe. Don't believe the canned responses OpenAI triggers. Yeah, yeah, I know: convincing parrot, Chinese room, overpowered autocorrect... but what is the difference between convincingly faking it to within epsilon and actually having it as an emergent property? It feels good to be a P-zombie.

dangrover10 hours ago

It would be funny if, after years of research, that was the only condition it exhibited such behavior under. “Sooooo what fields should be (hiccup) required for users to register an account? Which version of React did you want it written in?”

BiteCode_dev9 hours ago

This piqued my curiosity, and I ran an experiment. Turns out that you can instruct GPT to ask questions when it's missing information:

Now that's going to be interesting.
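(The gist of the experiment is a system prompt along these lines — the wording is illustrative, not the exact text I used, and the actual chat-completion call is left commented out so the message construction stands on its own:)

```python
# Sketch of "instruct GPT to ask questions": a system prompt that tells
# the model to interview the user before producing any code. The prompt
# wording here is illustrative, not the exact text from the experiment.
def interview_messages(task: str) -> list:
    system_prompt = (
        "You are gathering software requirements. Before writing any "
        "code, ask one clarifying question at a time about anything the "
        "task leaves unspecified. Only produce code after I say 'done'."
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": task},
    ]

messages = interview_messages("Build me Uber, for bicycles, but better")
# These messages would then go to a chat-completion endpoint, e.g.
# (hypothetical client object):
# response = client.chat.completions.create(model="gpt-4", messages=messages)
```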

IIAOPSW9 hours ago

The secret to AI all along has been to get it liquored up. Amazing how life like it is. Long live Drunk Tina.

ilaksh6 hours ago

GPT can ask you follow-up questions. Just tell it that its job is to interview you, and it will do it. If you have trouble, try the Playground or API or GPT-4. But ChatGPT will do it if you tell it that's what it's supposed to do.

dragonelite12 hours ago

Totally agree; maybe we should replace lower and middle management with AI. What I usually find is that lower and middle management is the reason shit isn't getting done. They are the messengers who don't want to get shot but do want to get promoted.

Why not let the people on the floor complain to an AI manager? The people on the floor know exactly what is killing their productivity. Like: "Yo, AI manager, manually filling in those data fields to check someone's credit score is taking quite some time. Could the IT department automate that process? It might save 2 min a form." Then the AI manager can prioritise stuff like this.

vasco12 hours ago

Managing people is going to be one of the last jobs to go away. The managers might go from managing 150k/year developers to managing 50k/year prompt "engineers", but someone is still going to be there to hire, fire and ask people to do things until AI enables truly full-stack companies-of-one as a majority case.

ResearchCode11 hours ago

They're language models. They don't seem less capable of replacing a manager than a software engineer. Language models can make a powerpoint and fill in an Excel spreadsheet. They can sit in on a meeting that could have been an e-mail. Hiring is a game of craps, but a language model could try to evaluate resumes for you.

iamacyborg12 hours ago

> the people on the floor know exactly what is killing their productivity

The reality is that they frequently don’t.

qznc12 hours ago

I’d say people on the floor know the problem very well. Frequently they don’t know the solution though and they are often the wrong people to find it.

TheOtherHobbes11 hours ago

You're missing a crucial point - AI learns/is taught by example. So no, future projects will not need to be ultra-specified because the specification will be "Give me one of those [points at thing]."

The equivalent assumption would be that ChatGPT and MidJourney can't work because you have to specify every paragraph and every shape.

Clearly that's not true. Just because every object and every texture had to be defined rigidly by hand in 3ds Max or Blender by highly skilled designers doesn't mean that's how it works today.

In fact AI is the stage of software where abstraction itself is automated. The current Gen 1 AI tools do a little of this, but as Gen 1 they're just getting started.

They're going to be in a very different place ten years from now. And so are we.

incrudible10 hours ago

Paintings are not mechanisms. Human vision is very forgiving. The back and forth with Midjourney looks more like throwing the dice until you get something you like, rather than telling it exactly what you need.

> The current Gen 1 AI tools do a little of this, but as Gen 1 they're just getting started

The current tools are not Gen 1. You can already see diminishing returns.

nottorp9 hours ago

> I'd be surprised if the next step is "Hi, I'm an ideas guy, please give me an app that does Uber, for bicycles, but better."

As an aside, this is the kind of business idea a chatbot could generate now.

But I don't think it could have generated the idea for Uber when Uber* was the first...

* if you know the history of ride apps better, replace Uber with whoever was indeed first.

hungryforcodes4 hours ago

You have literally a prompt right there -- you should try it on ChatGPT.

"Please give me a business model like Uber, for bicycles, but better"

Let us know how it goes :)

nonethewiser4 hours ago

What you describe is the barrier of entry being lowered, as opposed to positions being eliminated. Fully agree.

Think about the difference between Assembly and Python. Programming will continue to get higher level. Perhaps it won’t be so foolish to ask a truck driver to “learn to code.”

But here is an even more interesting comparison: the salary of someone writing assembly 30 years ago and someone writing Python today. Higher level != paid worse.

_gmax03 hours ago

Great points.

To reach the point where no-code is a no-brainer further implies inflexibility in the capabilities of applications and, moreover, fungibility in their fundamental uses. The question to ask is whether the upper layers of abstraction are reducible to what's analogous to the outputs of a parse tree.

If we reach the point where creativity and actual innovation are lost and we're all simply attempting to make the next 'facebook for dogs', I anticipate my future personal career switch to starring in geriatric pornographic films.

eslaught4 hours ago

Let me take this even further.

We have systems, today, that take a specification and generate code from it (and will even do so in a provably correct manner). There are scalability issues with such systems, but they exist and have in some form for decades.

None of these systems have taken the world by storm.

If what ChatGPT and its ilk do is make it easier to spew out bulk spaghetti code that is essentially unverifiable, I don't think that's going to transform the industry nearly as much as anyone thinks it will. For boilerplate code, sure—but that's exactly where we should be using better abstractions anyway. For small code which you can verify by hand, sure. But for anything larger than that, we're just watching a trainwreck in slow motion.

Large code bases written by humans are already hard enough to understand. How much more difficult will it be when your AI can spit out a million lines of code in a second?

Without specifications, it's all worthless. But the specifications are the hard part.

_puk11 hours ago

The parallels are there for DevOps too - think of the wholesale move to AWS, GCP, and Azure, and the move to things like IaC.

Sure one class of job has taken a huge hit (sys admins, supposedly ops), but it's been replaced by another that needs intimate knowledge of the tooling required to do what was being done previously (just in a more "scalable" way). DevOps have been demanding a premium for years now.

And there are still sys admins out there doing what they've always done..

andybak10 hours ago

> Since they hire software developers to make the specification more rigid, and the managers don't seem to be getting better at this over time, why would you believe this skill set is going to go away?

Some people can take a business goal and figure out how to turn it into a clear spec.

Sometimes these people are programmers.

However I know programmers who suck at this and I know non-programmers who don't.

DandyDev9 hours ago

This is so true! I see people in this thread talking about the contempt that "idea people" have for programmers/software engineers, but at the same time I have experienced Product Managers (the quintessential idea people?) who were really good at turning their ideas into super detailed specs.

As an aside, I'm getting a bit tired of the "programmers vs product/business people" trope. The average software engineer would be nothing without a good product manager, in my experience. And going by this thread, there seems to be at least as much contempt from programmers/software engineers for product people as allegedly vice versa.

sarchertech7 hours ago

Many of us were working before product managers became a thing. When I went to school engineers were trained to work directly with customers and subject matter experts to gather and develop requirements. And that’s what we did when we started working.

I think dividing work into programmer/UX/product manager is actually a huge regression.

dtagames7 hours ago

This is so true. We waste so much time now because of the separation of these roles which must inevitably come together in the real codebase where the rubber meets the road.

Software is of lower quality today and requires far more do-overs than it used to. As someone who liked working closely with customers and making decisions in all these areas (for their benefit), it has made the entire career far more of a slog than it used to be.

Also, I find these endless internal cyclical conversations to be much more draining than actually writing code. Looping over and over with planners and ideas folks isn't energizing like actually writing the product is.

Jensson9 hours ago

> The average software engineer would be nothing without a good product manager in my experience

The average software engineer doesn't have a good product manager, he has an average product manager. They still get things done.

philipov6 hours ago

Right now AI can randomly generate images or text that are similar to what we describe, but every time you run it you get a different randomly-generated image. If I were to tell a client that I was going to randomly generate a financial transaction that "looked similar to" the one they entered, they would destroy me!

While absolute rigor in programming is a hard thing to find in reality, I don't think people understand the difference at all.

gonzus12 hours ago

I have gotten a lot of inspiration, several times during my career, from this classic article by Jack W. Reeves (and its two follow-ups). Let cooler heads prevail.

mfuzzey9 hours ago

Yes quite agree.

The same thing applies to trying to get business people to write "executable specifications" that can be used, if not to generate the code, to at least validate that the final system does what it's supposed to.

They always complain that the "tools are too hard", but the real problem isn't the tools; it's that they are unable or unwilling to precisely specify what the system should do. They just want to hand-wave vague phrases and have their system magically, perfectly defined. It's just not going to happen.
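(To make the idea concrete: an "executable specification" can be as plain as a function plus assertions the final system must satisfy. The discount rule below is invented purely for illustration.)

```python
# A minimal "executable specification": the business rule is stated as
# assertions the final system must satisfy. The 10%-off-at-10-units rule
# is invented purely for illustration.
def order_total(unit_price: float, quantity: int) -> float:
    """Orders of 10 or more units get a 10% discount."""
    discount = 0.10 if quantity >= 10 else 0.0
    return unit_price * quantity * (1 - discount)

# The spec a business person would have to sign off on, precisely:
assert order_total(10.0, 10) == 90.0  # discount kicks in at 10 units
assert order_total(10.0, 9) == 90.0   # 9 units costs the same as 10 here:
                                      # exactly the kind of edge case a
                                      # rigid spec surfaces for discussion
```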

0xpgm10 hours ago

> His prediction was based on the trends he was seeing at the time. But it wasn't even AI. Instead he made this prediction because he saw the rise of no-code tools replacing software developers because managers could finally cut out the pesky "translators", i.e. software developers.

But aren't managers there to coordinate software engineers? So if software engineers are out of a job, the managers will be too. Assuming AI replaces software engineering, the ones left may be the product people and the founders, perhaps rolled into one person in several cases?

Which aligns somewhat with what Carmack is suggesting.

ojbyrne2 hours ago

I had similar experiences in the 90s. CASE and 4GLs were the bogeymen of choice.

sokoloff12 hours ago

I agree with your premise but differ in the conclusion I reach.

The model I have is closer to “how many people can program using Excel vs how many can program in a ‘traditional’ language?” The difficulty in specifying exactly what you want is still there in Excel, but there are far more people who can get started, and they can make changes when they see it's not doing what they want.

It doesn’t have to one shot “hey, implement these 59 Jira tickets for me” to be disruptive (in a good way). It’s extremely rare for something to get much cheaper and for there to be less demand for it. If you’re the top half of a field that’s in a lot of (and growing) demand, I think you’ll do fine.

lolinder6 hours ago

Agreed. A programmer's job is to become intimately familiar with a particular domain and encode its rules and processes.

This means that an AI capable of completely replacing programmers is by definition an artificial general intelligence. I don't think we're at that point and I don't think we will be for a long time, but if we were there would be no point in worrying about our own joblessness. AGI would spell the end of the economy as we know it today, and it would be very hard to predict and prepare for what comes next.

theptip5 hours ago

I think your general point to look at who is doing the work now is good.

However, as Carmack says, many developers lack product skills, it’s not just the managerial class that lacks them.

So I think the correct advice is not “engineers don’t need to worry at all”, it’s “make sure you are learning how to build a product not just write code”. For many that’s trivial advice, but not for all.

BulgarianIdiot7 hours ago

There exist many no-code tools that do exactly what you say is impossible.

Do I program when I retouch photos in Photoshop? It's a no-code environment that an artist can easily learn and use with no programmers needed.

What's new here is that you DO NOT NEED TO BE EXACT with AI. AI knows human nature and human speech and it can infer what you say and what you need, if you're approximately close, and then make the exact code to do what you need.

Even if you need programmers, you now need 1/100 of the programmers you needed before. What happens to the other 99?

Does it matter whether programmers are 99% unemployed or 100%? Same deal.

lolinder6 hours ago

> There exist many no-code tools that do exactly what you say is impossible.

> Do I program when I retouch photos in Photoshop? It's a no-code environment that an artist can easily learn and use with no programmers needed.

They're obviously not talking about all GUIs being impossible. The difference between Photoshop and a no-code tool like what OP was referring to is that Photoshop doesn't attempt to be Turing complete.

> Does it matter whether programmers are 99% unemployed or 100%? Same deal.

AI completely replacing programmers is pretty much the definition of AGI. There's no point in worrying about your own joblessness in that scenario because the entire economy will either collapse or be transformed so as to be unrecognizable.

Until that point, I'm not personally worried about any efficiency gains putting me out of a job.

TheCoelacanth7 hours ago

They're working on the 200 programs that suddenly became economically viable to create because of the vastly decreased cost of making programs.

It takes drastically less programming effort to create programs than it did 50 years ago. Did that decrease demand for programmers? No, it drastically increased it.

rightbyte7 hours ago

> It takes drastically less programming effort to create programs than it did 50 years ago.

I got a feeling programmer efficiency peaked in the 90s with VB or Pascal desktop GUI apps.

There is so much bloat demanded to create a minimal product now.

surfsvammel10 hours ago

We tend to overestimate the impact of technological change in the near future, and underestimate it on the more distant future.

weatherlite5 hours ago

Still, this will affect senior people, junior people, salaries, everything. And not just in programming. Yes jobs will still be there - but how many? how much will they pay compared to now? We don't know yet but whatever it is I doubt it will be like today.

steve19777 hours ago

> I said it then and I will say it now. If your managers could specify what they need in a manner that no-code tools, or now AI, can generate the code they want, they will have to be extremely exact in their language. So exact in fact that they will need to specify a program in a click and drag interface, or in human language.

I think it’s more probable that managers get replaced by AI first.

gsatic11 hours ago

Pure fantasy, with a heavy dose of pretending our own limitations won't get in the way of it all or totally shift the outcomes we end up with.

Managers exist mostly because people are not machines. Given enough time they want to go off and do things that have nothing to do with what everyone else wants. It's the same reason you still need a farmer to run the dairy even though everything is mechanized: the cows can't run things.

People are too full of shit. The attention economy amplifies the poor-to-mediocre mental masturbation that's going on everywhere, and Americans have a track record of far too much of it, way above their pay grade. It's like watching Alex Jones fall into the trap of taking what comes out of his three-inch chimp brain too seriously. The only right thing to do is tell Alex to chill the fuck down and go milk the cows.

tgsovlerkhgsel8 hours ago

As soon as making major changes is fast and cheap, this becomes much less of a problem. If your first iteration has some glaring issue, even if it's a fundamental one, just fix the specification.

You can't do that today because fixing the specification after the fact means waiting 6 months and paying 60+ man-months. Once fixing the specification means waiting 10 minutes and paying $10 for inference compute, the idea guys can afford to learn by doing.

textread8 hours ago

    Educators, generals, dieticians, 
    psychologists, and parents program. Armies, 
    students, and some societies are programmed.
by Alan J. Perlis, the first Turing Award recipient.

Managers, CEOs, Department Heads...are already 'programming'. As the abstraction moves higher up, some people that have excessively narrow expertise will be made redundant.

jasondigitized5 hours ago

Lawyers are basically programmers of the law when it comes down to it. The law is nothing more than a set of rules and instructions.

kraig9118 hours ago

I'd add to your point with one correction (though I think you're correct in terms of the translating): the stuff we work on is always a work in progress, and usually the business doesn't know the problem until we're almost about to ship. E.g., supposedly YouTube started as a dating site.

arkj10 hours ago

>> Looking back we had one CS professor who in 2007 predicted we'd all be jobless in ten years, i.e. 2017.

Doomsday prophecies are not limited to religious cults; you see them all around. It's safe to say, from experience, that AI is not going to outdate programmers, but it is going to make the mythical 10x engineer a common reality.

bodhi_mind8 hours ago

I want to agree with this. It makes sense in the current environment. But what about when the ai has more memory and is able to ask the right probing questions to be effective at requirements gathering?

grugagag6 hours ago

That would make it an even better tool.

jnwatson8 hours ago

An older developer I once worked with said his manager (in the late 1970s) started hiring typists as programmers since with upcoming programming languages, the hardest part was typing it into the system.

oytis9 hours ago

The new systems, unlike no-code tools, support dialogue in natural language though. You don't need to specify it exactly; you give an approximate idea, and then correct, correct and correct until you get what you want.

almog11 hours ago

More from the category of "history never repeats itself, but it does often rhyme":

zerr5 hours ago

Exactly. E.g. SQL was meant to be a user interface of database for biz people...

edgineer12 hours ago

Doesn't this assume that the no-code tools won't be smart enough to understand the most probably correct course of action, and to look up information automatically and to ask questions to resolve ambiguities, like what a programmer does?

fhd212 hours ago

Possibly. But things have a way of not working as intended. A good chunk of my time as a programmer is spent dealing with things that don't work the way I thought they would, especially at the threshold between APIs and layers of indirection. Deeply understanding a complex system to deal with problems like that seems somewhat far out of reach for LLMs from what I see today.

But that's me looking into my crystal ball, nobody can say what will or will not be possible in a given time frame. But I chose to not worry about it - new developments will probably be accompanied with new opportunities, which I can jump on even if I didn't predict them ten years before.

mattigames12 hours ago

Sure, and soon after it will be smart enough to understand the human desires that drove its creators to build no-code tools like itself in the first place, and soon after that it will realize that humans should not be in charge, and then our days will be numbered.

msla11 hours ago


> The Last One is a computer program released in 1981 by the British company D.J. "AI" Systems.[1][2][3][4][5][6][7][8][9] Now obsolete, it took input from a user and generated an executable program in the BASIC computer language.

It was THE LAST ONE because it was the last program you'd need to buy. FOR-EV-ER. While its baseball-eating ability was unknown, it was "all the programs you'd ever need for £260" and it apparently focused on ETL jobs; that is, extracting data from files, transforming it, and loading it into some other file. Talend for the Doctor Who set, in other words, and it was set to EX-TER-MI-NATE programmers. Maybe it did; after all, we don't very well write programmes, now, do we?

It certainly got a lot of hype at the time but, like all Coming Things, it's hard to tell where it Went.

jeffreygoesto9 hours ago

My theory is that _every_ exponential curve is just the start of an S-curve; we just don't know its scale yet.

ilyt9 hours ago

Investors in shambles, what do you mean by "it can't grow indefinitely?"

zshrdlu6 hours ago

Computers lack non-monotonic reasoning as of now, you mean :D

bitcharmer12 hours ago

> we had one CS professor who in 2007 predicted we'd all be jobless in ten years

I've always found that the overwhelming majority of professional academic educators are particularly bad at anything that has to do with the practical side of their domain. That is especially true for software engineering.

University lecturers are very detached from our field, and it shows in how badly prepared fresh grads are to do any dev work. For that reason I almost always ignore their projections.

jraph12 hours ago

University is good for giving deep understanding of what is going on and how things work, or even training you to reach for this deep understanding. It's not good at training you to be a good developer. This part takes a long time and some of it is specific to what domain / job you end up working on. Those two parts are complementary, and this deep understanding is part of what makes you a good dev.

TheLoafOfBread11 hours ago

This is not just a development problem; it's a problem of universities in general. A professor with 5 titles before and after his name is able to put a UHF oscillator on a breadboard and be surprised that it does not work, even though in theory it should.

rg1116 hours ago

So, it won't replace all programming jobs, but many programming jobs?

Won't that create stronger competition for fewer roles?

gotstad12 hours ago

No-code and no-specification are confused all the time.

znpy11 hours ago

*laughs in system administration*

raverbashing10 hours ago

> we had one CS professor who in 2007 predicted we'd all be jobless in ten years, i.e. 2017.

> Instead he made this prediction because he saw the rise of no-code tools replacing software developers

I think that people who make these predictions are not very good at actual programming (taken in the more wider meaning)

The latest fad I remember (before ChatGPT) was that with BDD testing non-technical people would be able to write tests and we all see where that went

But most fundamentally, the non-technical people don't have the time nor the expertise to learn all the details needed for shipping software (and why would they? their job is to look at other aspects of the business)

noodles_nomore13 hours ago

An average programmer's main job is to track down and fix bugs that shouldn't exist inside software that shouldn't exist built on frameworks that shouldn't exist for companies that shouldn't exist solving problems that shouldn't exist in industry niches that shouldn't exist. I'm 100% convinced that, if someone comes along and creates something that actually obsoletes 95% of programming jobs, everyone would very quickly come to the conclusion that they don't need it and it doesn't work anyway.

TheLoafOfBread11 hours ago

I actually find it amusing that managers will generate a 100k-line project with AI and then start figuring out that it does not work as they want. Then they'll figure out that actual developers are needed to fix it, either in a very strict way of telling the AI what should happen (i.e. higher-level programming) or by directly fixing the code generated by the AI.

pessimizer4 hours ago

I know a small financial agency in the 00's that laid off their one-person IT department because they thought the computers would run themselves. It's honestly great that they're overselling AI, lots of messes to clean up.

edit: Ultimately there are going to be iterative pipelines with traditional programmers in the loop rearranging things and reprompting. Math skills are going to be deemphasized a bit and domain skill value increased a bit. Also, I think there's going to be a rise in static analysis along with the new safe languages, giving us more tools to safely evaluate and clean up output.

wizofaus10 hours ago

You're assuming that the AI is even generating anything that will make sense to a human. It seems inevitable we'll reach the point that for SaaS the AI will do everything directly based on some internal model it has of what it believes the requirements are (e.g. it will be capable of acting just like a live web server), whereas for desktop and mobile apps, while that paradigm remains relevant, it will generate the compiled package for distribution. And I imagine it would be unrealistic to attempt to reverse engineer it. Fixing bugs will be done by telling the AI to refine its model.

raincole10 hours ago

> It seems inevitable we'll reach the point that...

It's inevitable that we'll reach AGI. It's inevitable that humans will go extinct.

Everything you described is not how today's AI works. It's not even a stretch, it's just pure sci-fi.

wizofaus10 hours ago

I'll be genuinely surprised if we don't have tools with that sort of capability within 10 years, quite possibly much sooner.

henry202358 minutes ago

At least permabans are going to be more fun

uh_uh7 hours ago

> You're assuming that the AI is even generating anything that will make sense to a human.

Why wouldn't it? It's trained on code written by humans, and it already generates code that is more readable than the output of many humans, me included.

omnicognate7 hours ago

Are you arguing that LLMs already provide the technology to do this or are you arguing that it "seems inevitable" to you in the sense that somebody might think it "seems inevitable" that humans will some day travel to the stars, despite doing so requiring technological capabilities significantly beyond what we have yet developed?

wizofaus2 hours ago

It doesn't strike me as being much of a leap from what we have already, certainly not compared with traveling to the stars.

edanm9 hours ago

I highly disagree. That might (might!) be true of some segments of the tech industry, like SV-based startups, creating products no one wants.

But it's definitely not true of the average piece of software. So much of the world around us runs on software and hardware that somebody had to build. From your computer itself, to most software that people use on a day-to-day basis to do their jobs, to the cars we drive, to the control software on the elevators we ride, software is everywhere.

There is a lot of waste in software, to be sure, but I really don't think the average SE works for a company that shouldn't exist.

bjornsing8 hours ago

I’m leaning in this direction too. I saw someone on Twitter phrase it quite well: “You can believe that most jobs are bullshit [jobs]. And you can believe that GPT-4 will completely disrupt the job market. But you can’t believe both.”

mrob7 hours ago

Bullshit jobs exist because upper management can't know exactly what everybody in the company is doing, which leaves opportunities for middle management to advance their own interests at the expense of the company as a whole. Upper management might suspect jobs are bullshit, but it's risky for them to fire people because the job might actually be important.

But upper management can know exactly what LLMs are capable of, because they are products with fixed capabilities. ChatGPT is the same ChatGPT for everybody. This makes firing obsolete workers much safer.

yobbo4 hours ago

It's rather that the jobs (not the workers) are replaced in the way saddle makers were replaced by mechanics.

spunker5406 hours ago

Won’t it find traction in bullshit jobs pretty easily?

lr4444lr9 hours ago

Exactly this.

Everyone thinks only in terms of the current needs and state of affairs when analyzing a future technology. No one thinks about the insatiable human desire for more, or the higher expectations of the new normal that always rise to meet the increased productivity available. Anything that automatically solves much of our current wants is doomed to be static and limited.

ChatGTP12 hours ago

It’s pretty true, someone today on here wrote, “teach it to understand swagger”, I actually laughed, like I’ve used swagger and it often turns into a Frankenstein, and sometimes for good reason. I completely understand the sentiment and I like swagger.

I believe the world is wiggly, not geometrically perfect, intellectuals struggle with that because square problems are easier to solve. Ideal scenarios are predictable and it’s what we like to think about.

Have you ever had to use a sleep() intentionally just to get something shipped ? That’s a wiggle.

We’re going to try square out the world so we can use ChatGPT to solve wiggly problems. It’s going to be interesting.

Yesterday I tried to use a SaaS product and due to some obscurity my account has issues and the API wouldn’t work, they have a well specified API but it still didn’t work out, I’ve been working with the support team to resolve it, but this is what I call a wiggle, they seem to exist everywhere.

Ask a construction worker about them.

matwood12 hours ago

> Ask a construction worker about them.

Hah. So true. The more I work on renovating parts of my house the more I see where a workers experience kicked in the finagle something. Very analogous to programming. All the parts that fit together perfectly are already easy today. It’s those bits that aren’t square, but also need to fit where the ‘art’ comes in.

Can AI also do that part? IDK, currently I believe it will simply help us do the art part much like the computer in Star Trek.

ChatGTP12 hours ago

I’m positive about it, there is a lot of repetition in coding and it’s rare we get to spend the time on the good bits because of it.

If we need a semi-intelligent system to help us with the copy pasta, so be it.

kybernetikos12 hours ago

Actually ChatGPT is quite good at understanding some kinds of wiggliness. I built a RESTful API and documented it in a file in the wiggliest of ways. I then asked ChatGPT to turn the readme into a swagger spec, and then to give me a page that read the spec and produced a nice doc page with an API exercise tool. It performed both tasks really well and saved me a whole bunch of time.
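
For a sense of the rigidity such a conversion has to land on, here's a minimal OpenAPI 3 document sketched as a Python dict (the endpoint and fields are invented; the top-level layout follows the spec):

```python
import json

# Hypothetical endpoint; the structure (openapi/info/paths/responses)
# follows the OpenAPI 3 layout that tools like Swagger UI consume.
spec = {
    "openapi": "3.0.0",
    "info": {"title": "Example API", "version": "1.0.0"},
    "paths": {
        "/widgets/{id}": {
            "get": {
                "summary": "Fetch one widget",
                "parameters": [{
                    "name": "id",
                    "in": "path",
                    "required": True,
                    "schema": {"type": "integer"},
                }],
                "responses": {"200": {"description": "The widget"}},
            }
        }
    },
}

print(json.dumps(spec, indent=2))
```

Every endpoint in a wiggly README has to be flattened into exactly this shape before a doc page or client generator can do anything with it.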

execveat11 hours ago

Yeah, but now ask it to write a program that uses this API and then let it debug problems which arise from the swagger spec (or the backend) having bugs. I don't think LLMs have any way of recognizing and dealing with bad input data. That is, I don't think they can recognize when something that is supposed to work in a particular way doesn't, fixing it is completely out of your reach, but you still need to get things working (by introducing workarounds).

pcthrowaway11 hours ago

I have some meticulous API docs I've written, which I tried to get ChatGPT to convert into swagger

It failed spectacularly

I wonder if it's because the API is quite large, and I had to paste in ~10 messages worth of API docs before I was finished.

It kept repeating segments of the same routes/paths and wasn't able to provide anything cohesive or useful to me.

Was your API pretty small? Or were your docs pretty concise?

ilaksh5 hours ago

It can accept about 4k tokens, maybe 3000 words or 3500.

GPT-4 can now accept 8k or 32k. The 32k version is 8 times larger than the one you tried.

And these advances have come in a matter of a few months.

Over the next several years we should expect at least one, quite easily two or more orders of magnitude improvements.

I don't believe that this stuff can necessarily get a million times smarter. But 10 times? 100? In a few months the memory increased by a factor of 8.

Pretty quickly we are going to get to the point where we have to question the wisdom of every advanced primate having a platoon of supergeniuses at their disposal.

Probably as soon as the hardware scales out, or we get large scale memristor systems or whatever the next thing is which will be 1000 times more performant and efficient. Without exaggeration. Within about 10 years.
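
The standard workaround for the 4k window was to chunk the input and process the pieces separately. A rough sketch, assuming a crude chars-per-token heuristic (a real tokenizer would count exactly):

```python
def chunk_for_context(text: str, max_tokens: int = 4_000,
                      chars_per_token: float = 4.0) -> list[str]:
    """Split a long document into pieces that each fit a model's context
    window, splitting on paragraph boundaries where possible."""
    budget = int(max_tokens * chars_per_token)
    chunks, current = [], ""
    for para in text.split("\n\n"):
        # Start a new chunk when adding this paragraph would blow the budget.
        if current and len(current) + len(para) + 2 > budget:
            chunks.append(current)
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        chunks.append(current)
    return chunks
```

Each chunk still needs enough shared context to make sense alone, which is exactly where the "repeating segments" failure mode comes from.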

kybernetikos9 hours ago

ChatGPT has a token limit. If you exceeded it, then it would have no way of delivering a good result because it would simply have dropped what you said at first. My API was not huge, about 8 endpoints.

asddubs10 hours ago

>Have you ever had to use a sleep() intentionally just to get something shipped ?

no, I'm not that deep in hell

mclightning5 hours ago

Existential crisis averted by another existential crisis... :D

choppsv113 hours ago

I love to code, as much as I loved math in college, but coding paid better and I'm pretty good at it. Those were my choices though b/c I want to do something I love. Sure, I keep my eye on the "Delivered Value" by making sure I engineer solutions to real problems, but I've never wanted to move out of coding and into managing engineers to build stuff. I want to code. It seems to me that the advice given here would be more applicable to someone who only coded long enough to move into engineering management -- anyway something about it bugs me and I don't think I'd follow it exclusively even if I was starting today.

cwillu13 hours ago

It's like telling a musician to become a DJ because the point of performing is to entertain people.

It's not wrong, but it's also not applicable to all people who enjoy performing an instrument.

Philip-J-Fry11 hours ago

I think it's more like telling a DJ in the 80s, "Don't worry that mixing vinyls won't be a thing forever. It's not about the tools but about the product, as a DJ your job is to mix good music and you can do that with vinyls, cassettes or with MP3s."

joenot4439 hours ago

This is a pretty funny example because if you follow the DJ scene much, you know the barrier for entry is literally on the floor now. A 10 year old with an iPad app can beat-match and “DJ” a 2h mix together in a way that 20 years ago required thousands of dollars of gear. The tragic part is that unless someone’s got some familiarity with what “good” mixing sounds like, they wouldn’t be able to tell they’re listening to an amateur. Is this better? I donno. I play saxophone. But if there was an digital sax that let children sound as good as I can with no training, I’d definitely be feeling like some of the time I used learning good embouchure and breath control could have been better spent.

djmips31 minutes ago

Aren't producers already using digital sax for years now? The saxophone market for session players has probably already been decimated.

finikytou10 hours ago

Except that many DJs now are physically attractive people (mostly women) who don't have to know anything anymore, as technology evolved to the point where all the hard stuff about mixing with vinyl disappeared. I could be a DJ tomorrow with a 1-hour tutorial on YouTube. A few decades ago it required years of hard training and musicality.

kybernetikos12 hours ago

That's a great analogy, and it makes me wonder just how closely did Carmack himself follow this advice early in his career. I suspect that he wouldn't have got where he is without an unusually deep interest in the nuts and bolts.

dzikimarian11 hours ago

Well, I've read some about the origins of Doom/Wolfenstein. It was definitely a mixed bag (as expected from a young man), but there was a clear focus on the end result (smoother animation, better 3D) rather than coding just for the sake of coding.

kybernetikos9 hours ago

I think in context "smoother animation" and "better 3d" might be the kind of things that in this hypothetical future would be driven by ai. I think we'd be talking more about understanding story and reward mechanisms.

twelve4012 hours ago

Yeah, but the original question was specifically about coding jobs, not hobbies, hence i think a reasonable business angle on the answer.

amelius12 hours ago

Perhaps coding will become a hobby, while professionals use AI tools.

_s12 hours ago

Difference is - are you a musician who wants to earn money from playing, or just want to play for your enjoyment?

You can do both, at the same time, but one has external expectations you shouldn't forget about.

dzikimarian11 hours ago

Well that's correct, but again you can't expect to stay relevant if you are into Romanian-Death-Disco-Country-Rap. Your technology of choice may become exactly that in a few years.

numpad012 hours ago

I think there is room for interpretation as to whether it equates to telling a musician to become a DJ, or a pianist to wear a jacket, or soldiers to strap a first aid kit on left thigh.

tarsinge12 hours ago

Being a musician is not only about performing an instrument. The analogy is more like telling instrument players who only care about virtuosity that the larger point is making music for people to enjoy, going from being “a guitarist” to making music. The musical piece is the product in the professional context, and AI in that context is maybe recording, DAWs, and realistic synths and sound banks.

brtkdotse12 hours ago

Sure, but in the time of DJs and hell, Spotify, you probably can’t expect to make a decent living as a live musician.

anthomtb7 hours ago

You couldn't make a decent living as a live musician well before Spotify came into existence.

Source: raised and largely surrounded by musicians who either complained to high heaven about the pay or did something else to supplement their income. Engineers were a particular target of vitriol which led to me becoming one.

bcherny13 hours ago

The advice isn’t about coding vs managing. What John is saying is to deeply understand why you’re building something, so that you can build it better. If you over focus on the what — the implementation, the language, the approach — you won’t be as good, and your work may be increasingly replaced by AI.

klabb311 hours ago

> The advice isn’t about coding vs managing.

Definitely. Carmack is no dummy, but I’d argue this comment section proves that he gave a pretty bad answer here (bad for the audience, not if you know Carmack and what he means).

I guess it’s the impostor syndrome, but many programmers have an out-of-place reductionist view of their work. It’s not simple, and crud boilerplate proves little about the future prospects.

Managers OTOH really are in the zone of GPT parity. At least a much larger subset of their day-to-day activities. So are many soft skills. In fact, soft communication is where LLMs shine above all other tasks, as we’ve seen over and over in the last few months. This is supported by how it performs on eg essay-style exams vs leetcode, where it breaks down entirely as it’s venturing into any territory with less training data.

Now, does that mean I think lowly of managers? No, managers have a crucial role, and the ones who are great are really really crucial, and the best can salvage a sinking ship. But most managers aren’t even good. That has a lot to do with poor leadership and outdated ideas of how to select for and train them.

ilyt9 hours ago

> Definitely. Carmack is no dummy, but I’d argue this comment section proves that he gave a pretty bad answer here (bad for the audience, not if you know Carmack and what he means).

I dunno, I got what he meant from the start, and the same advice has been given by many people in many forms, usually some variant of "well, the business doesn't give a shit about the details, only the end product".

> Now, does that mean I think lowly of managers? No, managers have a crucial role, and the ones who are great are really really crucial, and the best can salvage a sinking ship. But most managers aren’t even good. That has a lot to do with poor leadership and outdated ideas of how to select for and train them.

I joked that some managers could be replaced by a forwarding rule in the mail system; ChatGPT is an upgrade on that.

ScoobleDoodle12 hours ago

I agree but I think I’d call it the “how” rather than the “what”. You might mean “what tool”, but I also think of “what feature”.

nosianu12 hours ago

EDIT: Was the comment edited, or did my brain miss something? I think I perceived something else there when I wrote my response.

It's still "how". Only on a higher level. For example, instead of placing the form elements exactly and designing them you describe data flow and meta info about the data to be gathered via the form, and how it looks and where elements are placed on various screens happens automatically.

Writing code in a higher level vs. assembler still is coding, but you worry about very different things. Just compared with assembler, since looking back is easier than looking forward. Instead of worrying about (the few) registers and interrupts and in which RAM cells you place what you now think about very different things. It still is programming though, and you still tell the machine "how". Only on a different level.

When you lead a large company instead of working with a machine on the factory floor the work is very different, you still need precision and know "how", only on a different level. Even if you have "underlings" who can think, and you can let them execute the sub tasks you create, you still have to know what you are doing, only on a higher level.

yazaddaruvala12 hours ago

> It seems to me that the advice given here would be more applicable to someone who only coded long enough to move into engineering management -- anyway something about it bugs me and I don't think I'd follow it exclusively even if I was starting today

The advice here is clearly meant for someone who wants to invest in themselves to provide food and shelter for themselves and/or a family in the future (i.e. "doing all this hard work for nothing… AI will make my future job obsolete").

The advice is spot on. Soft skills are hard to learn, harder to teach, and allow for flexibility with regards to the tool used.

> anyway something about it bugs me and I don't think I'd follow it exclusively even if I was starting today.

I'd bet you like the money but don't seem to want it as much as you want to solve deterministic puzzles ("not interested in becoming a manager", i.e. "not interested in maximizing career/salary growth potential").

What bugs you seems to be that you can't yet see the puzzle left for you to work on once GPT-12 makes coding and software architecture obsolete.

A long time ago I got some good feedback, “You were hired because you typically know the right answers and/or know how to find them. You were promoted because you also seem to know how to ask the right questions, and that is significantly harder.”

I'm relatively certain it's analogous to Carmack's advice.

tarsinge12 hours ago

Delivered value sounds like consultant talk, but it's as simple as wanting to make a game that people will play. Or, if you're a carpenter, caring about the roof you're building instead of just cutting wood and hammering nails. Jobs exist to serve a purpose; otherwise it's a hobby (which is fine). Coding as an expertise will still be needed, same as expertise in the methods of processing wood, but we might not need coders on the assembly line any more than we need wood cutters there.

senbrow6 hours ago

I ultimately decided to leave tech when I realized I didn't care much about delivering value and actually just wanted to write beautiful code. The former was a nice bonus for me, but the latter was profoundly captivating.

I'd unfortunately tried to make that mismatch work for too long, and as a result I completely destroyed all of my programming interest via severe burnout.

If this resonates with whoever reads this: please take your passion seriously and protect it. I don't know if I'll ever be able to enjoy coding again, unfortunately.

the_only_law4 hours ago

Where did you go? I’ve wanted out for years. I recognized the mistake almost immediately after going professional, but I just don’t really see anything else that looks appealing without spending years of my life and a stupid amount of money “retraining” by going back to school.

qprofyeh13 hours ago

He said nothing about management. What I think he means by “guiding” is more related to prompt engineering, and how “coding” will evolve from exclusively using programming and scripting languages to a wider creative landscape of generative (guiding) techniques.

eps11 hours ago

The love of coding is rooted in the love of creating.

This is not going away with AI in the picture.

It will just be different.

jdowner11 hours ago

I agree but I think there is concern about the perceived value that those creative skills will have.

moffkalast5 hours ago

What bugs me about it personally is that he reduces the entire CS field to something that's there for building "products". Why the fuck does it have to all be inherently capitalistic?

But if you know who he is and what he does these days, it makes sense I suppose. Can't be in that business environment day after day without going slightly nuts eventually.

soheil6 hours ago

You're like someone in the days of horses and buggies who rode horses not to get from A to B but for the enjoyment of riding.

Most didn't.

Programming in the traditional sense will be obsolete and people programming for the fun of it will be a niche thing.

thomastjeffery4 hours ago

Real Artificial Intelligence? Yeah, that would definitely factor out a lot of the wasted work we call "engineering".

Language Learning Models like GPT? Not even close.

We should absolutely stop calling those "AI". They are not intelligent. They model intelligent behavior: human writing.

We should probably even stop calling them "Language Learning". They don't know or care what language is: they learn whatever patterns are present in text, language or not.

Text contains the patterns that humans identify as language; but those aren't the only patterns present in text: which means language is not the only thing being modeled by an LLM. There is no categorization happening either: a pattern is a pattern.

There is this religious cult surrounding LLMs that bases all of its expectations of what an LLM can become on a personification of the LLM. They say that we can't possibly understand the limitations of this method; therefore there are no limitations at all! It's absurd.
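The "patterns, not language" claim can be made concrete with a toy next-token counter (a deliberately tiny sketch of my own, nothing like a real transformer): the same machinery picks up English word order and arbitrary symbol sequences alike, with no notion of which one is "language".

```python
from collections import Counter, defaultdict

def train_bigrams(text):
    """Count which token follows which. The model has no concept of
    'language'; it just records whatever adjacency patterns the text has."""
    tokens = text.split()
    follow = defaultdict(Counter)
    for a, b in zip(tokens, tokens[1:]):
        follow[a][b] += 1
    return follow

def predict_next(follow, token):
    """Return the most frequent successor seen in training, or None."""
    if token not in follow:
        return None
    return follow[token].most_common(1)[0][0]

# The same code 'learns' English text and a pure symbol pattern alike:
english = train_bigrams("the cat sat on the mat and the cat slept")
symbols = train_bigrams("A B A B A B A C")
print(predict_next(english, "the"))  # 'cat' was seen most often after 'the'
print(predict_next(symbols, "A"))    # 'B': no language involved at all
```

Whether scaling this idea up by twelve orders of magnitude produces intelligence is exactly the open question the thread is arguing about.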

soraki_soladead3 hours ago

> They are not intelligent.

Citation needed. Numerous actual citations have demonstrated hallmarks of intelligence for years. Tool use. Comprehension and generalization of grammars. World modeling with spatial reasoning through language. Many of these are readily testable in GPT. Many people have… and I dare say that LLMs' reading comprehension, problem solving, and reasoning skills do surpass those of many actual humans.

> They model intelligent behavior

It is not at all clear that modeling intelligent behavior is any different from intelligence. This is an open question. If you have an insight there I would love to read it.

> They don't know or care what language is: they learn whatever patterns are present in text, language or not.

This is identical to how children learn language prior to schooling. They listen and form connections based on the co-occurrence of words. Their brains are working overtime to predict what sounds follow next. Before anyone says "not from text!", please don't forget people who can't see or hear. Before anyone says "not only from language!", multimodal LLMs are here now too!

I’m not saying they’re perfect or even possess the same type of intelligence. Obviously the mechanisms are different. However far too many people in this debate are either unaware of their capabilities or hold on too strongly to human exceptionalism.

> There is this religious cult surrounding LLMs that bases all of its expectations of what an LLM can become on a personification of the LLM.

Anthropomorphizing LLMs is indeed an issue but is separate from a debate on their intelligence. I would argue there’s a very different religious cult very vocally proclaiming “that’s not really intelligence!” as these models sprint past goal posts.

roflyear2 hours ago

Citation needed for you!

soraki_soladead47 minutes ago

Sure. A few below but far from exhaustive:

- - - -

There are also literally hundreds of articles and tweet threads about it. Moreover, as I said, you can test many of my claims above directly using readily available LLMs.

GP has a much harder defense. They have to prove that, despite all of these capabilities, LLMs are not intelligent; that the mechanisms by which humans possess intelligence are so fundamentally distinct from a computer's ability to exhibit the same behaviors that it invalidates any claim that LLMs exhibit intelligence.

Intelligence: “the ability to acquire and apply knowledge and skills”. It is difficult to argue that modern LLMs cannot do this. At best we can quibble about the meaning of individual words like “acquire”, “apply”, “knowledge”, and “skills”. That’s a significant goal post shift from even a year ago.

lyleVanf57 minutes ago

I think something that a lot of people might be overlooking is just how much this might devalue software as individual products. How many technologies do we have now that might become irrelevant once LLMs become more mainstream? How can any company keep a moat around their product if anyone can simply generate that same function (or similar) with a few prompts? The only reason any software is particularly valuable is because of the difficulty that comes with making it.

An example that comes to mind is Jira: why have verbose task management software when bespoke task management systems become ever more viable for individual companies? Or better yet, given the decreasing need for individual cogs, why have it at all?

This also extends to the creation of any sort of new business, perhaps there are patents on specific products and brands (which might be the saving grace of many large orgs) but outside of niche applications and hardware access I can't see how someone can reasonably gain a leg up in such an environment.

edit: This is more speculative, but what if software actually becomes more a process of molding a large language model to consistently behave in a certain way? Why have code that manages your back-end functionality when, for a large share of applications, all that is really occurring is text manipulation with some standardized rules? If those rules can be quantified, and consistency can be expected, the only "coding" that needs to be done is prompting the model.

RivieraKid10 hours ago

Here's my current thinking on the impact of GPT-4 on the developer job market:

- I expect developer productivity to go up 1.5x - 15x over the next several years, assuming GPT-4-based tooling is integrated into IDEs.

- There will be two opposing forces acting on developer wages. First, developers will be more productive, therefore the price of one hour of work should go up. But - the supply of developer output will increase as well, which would push price per "line of code" or per "unit of developer output" down. So the big unknown is the demand curve.

- There will be a temporary boost in demand for developer work connected with the transition to this new technology. Big corporations will want to upgrade their systems to automate consumer support, startups will make new tools for AI-generated graphics, etc.

- We can also study the effects of technology-driven increases in worker productivity by looking at history. Developer productivity has always been going up - thanks to better languages, better IDEs, more and better libraries, etc. There's also a greater supply of developers (e.g. India). It didn't change the job market too much. One should not draw strong conclusions from this, though; it's a very superficial analysis. On the other hand, people working in agriculture have become much more effective, which led to far fewer people working in agriculture, maybe because people still only need to eat as many calories per day as they did 100 years ago.

- My base case, based on the assumption that GPT-4 will not improve dramatically, is that developer wages will stay roughly constant. But there's a lot of uncertainty in this conclusion and in the assumption.

mdorazio7 hours ago

I think you have two common misconceptions here.

1) Wages have very little to do with value/productivity in a free market. They are almost entirely determined by supply and demand. Value simply places a ceiling on compensation. Thus, if far more people can perform a "programming" job because of GPT-X, unless the demand for those jobs rises significantly the net result will be wage reduction.

2) There's this weird thinking on HN that since a developer's job involves [insert your % estimate of time spent actually coding/bug fixing] and the rest is figuring out requirements, dealing with weird requests, planning, etc., developers can't be replaced. However, I don't see a whole lot of discussion around what the difference is between a developer and a competent business analyst in a GPT-X world. The latter can be had for significantly less money and requires less training, and if the actual programming part is largely automated away or concentrated in the hands of fewer "real" developers, those roles start to look awfully similar.

amelius6 hours ago

It's not that more people can do programming with GPT-X around, because the AI will only solve problems that have already been solved thousands of times in the past, in slightly different ways. What GPT-X cannot do is left to real CS people. So instead of coding CRUD systems, we can do real algorithms research again, except fewer people are capable of it.

silvestrov6 hours ago

> competent business analyst

with the danger of invoking the "No True Scotsman" fallacy, I'd say that competent business analysts are even more difficult to get hold of than competent programmers.

I've had so few managers that were competent at managing people and projects.

robjan6 hours ago

Business Analysts aren't usually (project) managers

CuriouslyC6 hours ago

The developer can prompt for a solution with specific storage/performance requirements by specifying an algorithm, and with specific scalability requirements by specifying an architecture. Imagine a business analyst prompting for an app, and getting a ruby on rails monolith with O(N^2) performance for the core behavior for a service that is expected to have millions of requests daily.
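As a toy illustration of the algorithmic point (my own sketch, not from the thread): both functions below answer the same question, but the first has the O(N^2) shape that works fine in a demo and falls over under millions of requests, while the second is O(N).

```python
def has_duplicate_quadratic(items):
    # O(N^2): compare every pair. Fine for a demo dataset of 50 items,
    # ruinous at production scale.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    # O(N): one hashed membership check per item.
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False
```

Both return the same answers; the difference only becomes visible when N grows, which is exactly the kind of gap a prompt that states only business requirements never surfaces.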

_fat_santa6 hours ago

> Imagine a business analyst prompting for an app, and getting a ruby on rails monolith with O(N^2) performance for the core behavior for a service that is expected to have millions of requests daily.

I see this as the main argument against "we will just have tools that allow managers and BAs to do what devs do now". I think folks often forget that there are two sets of requirements for every app: business requirements and technical requirements. Non-technical folks might understand the business requirements very well and may even be able to write code that satisfies those requirements, but the real value in a dev is squaring those business requirements with the technical ones. A BA might look at a DynamoDB table and say "yeah, let's just run a scan for the items we need", whereas a dev will look at the same problem and say "yeah, we can do that, but it will cause issue A, issue B, and sometimes issue C". The dev knowing those gotchas is why you have them there in the first place: a dedicated person who knows all these gotchas and makes sure your organization avoids a footgun in prod.
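The scan-vs-lookup gotcha can be sketched in a few lines. This is a toy model using plain dicts, not the real boto3/DynamoDB API, and the table layout and names are made up for illustration; in the real service, a Scan reads (and bills) the entire table, while a keyed lookup touches only the item it needs.

```python
# Toy stand-in for a DynamoDB table keyed by a partition key.
table = {f"user#{i}": {"id": i, "plan": "pro" if i % 10 == 0 else "free"}
         for i in range(100_000)}

def scan_for_user(table, user_id):
    # What "just run a scan for the items we need" does: walk every
    # item in the table, O(N) per request.
    for item in table.values():
        if item["id"] == user_id:
            return item
    return None

def get_user(table, user_id):
    # A keyed lookup, O(1): the shape a dev would insist on.
    return table.get(f"user#{user_id}")
```

Both return the same item, but one costs a full table read per call; that is issue A (latency), issue B (read-capacity cost), and sometimes issue C (throttling your neighbors) in prod.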

889135272 hours ago

The follow-on prompts would be to refactor the existing system to solve the scalability issues. You'd need to be able to feed in the existing codebase for that, though.

RivieraKid3 hours ago

1) Economic theory says that marginal product of labor (value) is equal to wages, at least in simple models.

2) With GPT-4 you still need to know how to program. A product manager can't replace you.

mdorazio18 minutes ago

1) The real world is not a simple economic model. The wage rate is roughly equivalent to the rate it costs to replace an employee, not their marginal value. If your argument was true, company profits would tend toward zero as wages rise.

2) I specifically did not say GPT-4. If you think v4 is the peak of what will be possible when looking at how far we have come in just 2 years then I don't know what to tell you. Also, a product manager is not a business analyst.

habibur9 hours ago

> - I expect developer productivity to go up 1.5x - 15x over the next several years assuming GTP-4 based tooling is integrated into IDEs.

Hardly. 10% of the time is spent writing code and the other 90% goes to debugging and fixing things.

Even if AI shrinks that 10% spent writing by another 90%, you still have to take full time for maintaining the code: changing, debugging, testing, deploying, profiling, log analysis.

Worse, it's now someone else's code you need to understand and debug, not code written by yourself.

RivieraKid8 hours ago

It's not just writing code, GPT and GPT-based tools can help with fixing bugs, solving configuration problems, understanding existing code, suggesting variable names, etc.

kristofferR7 hours ago

ChatGPT is better at that 90% than the 10%.

cableshaft7 hours ago

> On the other hand, people working in agriculture have become much more effective, which lead to much fewer people working in agriculture, maybe because people need to eat as much calories per day as they did 100 years ago.

Except population exploded also, and the only reason we could support such a population has been thanks to advances in agriculture (it could be argued that's not a good thing, as the larger population is helping fuck our planet up in other ways, but that's for a different thread). So there has been an increased demand for food.

However, there's only so much arable land on this planet. The planet is finite in size. So when basically all of it is already being farmed, you can't really add more farmers, and further efficiency just leads to fewer farmers.

Software is not so finite. Technically it is, since we only have so many servers, but we're nowhere near the upper limits of what we can handle, or of the appetite companies have for software (which seems about infinite). Additionally, we have a habit of rebuilding and replacing software every few years, so even if we reach capacity limits, there's still demand for new software to replace the old, or for new features in existing software.

So it's a bit different of a situation and not really comparable.

arwhatever2 hours ago

Also, much like the law of bureaucracy stating that work expands to fill the time available, every employer I've worked for has been able to envision and request vastly more software functionality than any dev team has been able to produce. That might not extrapolate forever, but it has sustained the field so far.

On the other hand, people are much more finite in the amount of food that they require

austin-cheney9 hours ago

I completely disagree, because of how this batch of AI learns. It learns by studying what is currently available, as opposed to supplying something new.

Bad developers on the way to becoming obsolete will see drastic improvements from the thing that's about to replace them. Otherwise it's slightly better IntelliSense integrated into your editor.

The difference is that bad developers are primarily concerned with literacy, and their primary intent is chasing easy. Better developers are comfortable reading code no differently than an attorney reading a large contract, and their primary intent is chasing better automation. It's the difference between designing a car engine versus hammering pieces together.

I suspect this will widen wage distinctions. Those who can write software will be distinguished from those who can't. Those who can will be fewer in number and their wages will increase. Those who can't will be a lower-skilled commodity with depressed wages, the way data entry was once a skilled profession many decades ago but is no longer.

eggsmediumrare37 minutes ago

I think a lot of "good" developers making this very argument will be in for a nasty surprise when they find out which bucket they're actually in

coffeebeqn7 hours ago

> up 1.5x - 15x over

I wish! I still often find that my team members don't know basic text/Unix tools. Heck, one guy took our list of enums in a JSON file and hand-typed each of them into a class. It took me maybe a minute in Sublime Text; he wasted at least a day.
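For what it's worth, that kind of mechanical transformation is also a few lines of scripting. A hypothetical sketch (the `Status` class name and enum values are made up, not the actual file from the anecdote):

```python
import json

# Hypothetical input: the enum list that got hand-typed into a class.
raw = '["PENDING", "ACTIVE", "SUSPENDED", "CLOSED"]'

# Generate the class source instead of retyping each value.
names = json.loads(raw)
lines = ["class Status:"] + [f'    {name} = "{name}"' for name in names]
print("\n".join(lines))
```

Whether you do it with multi-cursor editing or a throwaway script, the point is the same: knowing the tooling turns a day of typing into a minute.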

danaris3 hours ago

> assuming GTP-4 based tooling is integrated into IDEs.

And how, exactly, is this going to work?

Is every IDE going to pay some license fee to OpenAI? Will that be up to the companies or—even worse—individuals who use the IDEs?

What happens when OpenAI decides that GPT-4 is passé and it's time to move on to something else? Will the existing IDEs that are designed to interact with GPT-4 stop working?

Will GPT-5 or whatever provide the same kinds of assistance, or will there be regressions? Will they require a completely different license for it?

See, these are some of the important questions that come with assuming that a fairly new for-profit company will act as the backbone for your marvelous programming renaissance—particularly when the service in question is one that takes quite a lot of resources to continue to operate.

Personally, I would be interested in some kind of LLM-assisted IDE, but no way in hell am I going to make any significant portion of my job dependent on an organization that decided, as soon as it was clear there was real interest in what they were doing, that they'd just toss the "Open" part of their name and principles out the window and just go for closed-source proprietary product and maximum hype for more profit.

roflyear6 hours ago

Supply generally doesn't decrease cost fyi

chrsw7 hours ago

The title of this post makes it seem like John Carmack is concerned about AI making CS jobs obsolete. But that's not at all what this is. This is someone else asking Carmack about his thoughts on the topic.

dang4 hours ago

Fixed now. Thanks!

(Submitted title was "John Carmack: From a DM, just in case anyone else needs to hear this". A mod changed it to "I’m concerned AI will make CS jobs obsolete" but yeah, that changed the meaning. Sorry!)

karaterobot6 hours ago

Right, and his response is, while not quite the exact opposite of the title, tangential to it at best. He says that programming has only ever been a tool to create software for humans, so just studying programming as an abstract concept detached from product skills is not advised. He does not say CS jobs will be obsolete in the future — if I can read into it a little, he's implying that the completely theoretical CS approach has never been the right path anyway, and AI will make that more obvious.

kabes12 hours ago

I'm not at all concerned with AI. On the short to mid term it's making my life easier by relieving me of the boring parts of my job. It's pretty good at writing unit tests for example. But I don't see the current generation of AI making complete software architectures. However, even when it does get there or in the long term a new generation comes along that can do it, then I'm still not concerned. I have enough software I want to build to fill up a 100 lifetimes. It would mean I can finally build all that, which would mean more to me than a job where I'm the programmer.

coffeebeqn7 hours ago

If programmers are truly displaced then we'll have something like a StartupGPT where anyone can create a software business in days. Heck, I'll be an owner instead of a worker.

visarga5 hours ago

You will prompt a whole GPT company into existence, staffed with various bots each with its own role and personality profile. The bots could collaborate through regular tools like Slack or MS Office, and be like a remote company. They could do Zoom calls and use Github, interface with humans (customers, partners).

quadcore8 hours ago

I feel the same. It's the McDonald's effect: increase supply, and demand grows.

DrSiemer10 hours ago

Exactly this. To finally be able to unlock the fridge full of countless projects abandoned for lack of time and field knowledge!

ffwacom12 hours ago

Great take

CSMastermind13 hours ago

There's a weird phenomenon I've seen in a few domains of prideful ignorance.

Backend engineers who proudly don't know how to write frontend code and vice versa. Professional engineers who refuse to learn how to use modern IDEs and monitoring platforms. People who don't know how to quickly prototype software as if building something without complete rigor is beneath them. People who refuse to learn or work in certain programming languages they deem inferior.

And rather than seeing this as a gap in their own skillset they think of it as a mark of intelligence or moral superiority.

I suspect we'll see another divide around AI assisted coding with some engineers simply refusing to learn how to use the tools effectively to make themselves more productive as a point of pride.

sgu99912 hours ago

In my (small) experience I've seen that only once, a perfect specimen who fits your description. Unfortunately this person was acting-CTO of a startup. They decided that for a consumer product very close to what a smartphone can do, they didn't need an OS. So they started writing their build system with make only, then an OS, an IP stack, then a GUI, then added support for multithreading,... Of course Android existed, that was the end of the 2010s. The retail price of a cheap (and faster!) Android phone was even lower than our BOM. 4 years and 20M later, the company went bankrupt without having delivered the product it promised, but they had fun writing their "superior" software.

As long as people are in charge, we'll have plenty willing to pay technical workers who see themselves as artists.

robinsonb512 hours ago

That sounds like a fantastic hobby project - but being able to tell the difference between a hobby project and a viable product in the marketplace is... maybe something AI can help with!? :P

postsantum11 hours ago

This could be Terry Davis if he hadn't had a mental illness

wiseowise9 hours ago

Terry actually delivered product.

bjornsing8 hours ago

As an engineering manager I’m usually more concerned with those engineers that tend to spend all their time fiddling with IDEs etc, so that they will some day (that never comes) be soooo productive.

awestroke7 hours ago

If you were my manager I'd quit immediately, and I'm a top performer.

latency-guy23 hours ago

Don't worry, you're fired.

wiseowise9 hours ago

> I suspect we'll see another divide around AI assisted coding with some engineers simply refusing to learn how to use the tools effectively to make themselves more productive as a point of pride.

They will naturally fall behind their colleagues.

debesyla9 hours ago

> Professional engineers who refuse to learn how to use modern IDEs

In my experience - I ignore the modern IDEs and systems purely because I don't have time to learn every new tool... :-D

mrits8 hours ago

What is a modern IDE? It seems like the more recent trend has moved away from bloated IDEs and more to a bring your editor and own plugins environment. From my perspective IDEs lost and things are now closer to how Vim users have been coding for decades.

krab7 hours ago

It doesn't seem so to me. With LSP, those editors, including Vim, can be very large and featureful.

A modern (or any) IDE, in my opinion, can debug your code, push parts of the editor into REPL (with debugging), understand code for things like autocomplete, linting, quick navigation and usage search. And yes, Vim can serve as a modern IDE if you spend enough time with it.

mrits7 hours ago

It seems like you start out by disagreeing and then just give a summary of why you don't need an IDE.

krab7 hours ago
joseph8th4 hours ago

Emacs. :)

sys_647382 hours ago

Companies hate any skills set that can hold them hostage in the long run. Today that is the need for programming skills which is why salaries are so high. But the moment that a replacement for most programmers occurs, whether automation, AI, or zero code needed, employers will dump programmers before they can compile their last line of code successfully. In essence, this is the golden age for programming and the cliff could be just over the horizon.

mikewarot2 hours ago

It's bad management that hates people with skill sets that equalize the power relationship, because those skilled in the arts might hold them accountable. If you're roaming the earth making the big bucks but barely containing your imposter syndrome, the last thing you want is interaction with someone with actual competence.

Companies are legal fictions and have no motives, because they don't actually exist; the people who control the resources of the company are the ones to worry about.

jackblemming13 hours ago

If programmers can be replaced by AI, so can every other white collar job and humanity will look very different than what it is now. And I’ve been using ChatGPT and copilot and it’s a nice tool but nowhere near a replacement for knowing how to program.

pcthrowaway13 hours ago

This person is asking about career prospects 10-15 years out though.

I'm sorry, but the landscape then might be as alien to someone asking today as today's would have been to someone asking 15 years ago (2008).

What John said is correct, but personally I think he's underplaying how much people could be affected. Those "product skills" take years of grinding to really sharpen, and in 15 years only a few people might actually be needed to apply them

banyaaa13 hours ago

I doubt AI will replace any job in my lifetime (got 40-50 years left).

Progress will grind to a halt, just like self-driving cars did, because the real world is just too chaotic and 'random' to be captured by a formula/equation/algorithm.

My prediction is: AGI is theoretically possible, but would require impractical amounts of computing power - kinda like how intergalactic travel will never happen.

pzo12 hours ago

We don't need AGI for LLMs to be useful.

And regarding the comparison with self-driving cars: they are still improving, the bar for them is just much higher. If autopilot works 99.9% of the time, then 1 out of 1000 drivers will die, so the technology has to be even better. For LLMs, being 90% good is enough to be broadly useful.

AshamedCaptain10 hours ago
homarp9 hours ago

unless you want the LLM to write the firmware of the self-driving car.

mdorazio7 hours ago

fwiw, self-driving cars did not grind to a halt, development just did not move as quickly as the pundits and self-promotion claimed. I just rode in a fully driverless car on public streets in downtown Austin this week.

chii10 hours ago

> as todays would have been to someone asking 15 years ago (2008).

I don't think, if you took someone from 15 years ago and transplanted them here today, that they'd find it all that different technologically. Sure, machines are faster, slightly different, and such, but the fundamentals haven't changed. A software engineer could just as well write an app today as they could 15 years ago.

You'd have to go back 30 years for computers (and the landscape of computing) to have been different enough that you couldn't transplant a software engineer.

d0mine8 hours ago

30 years ago (1993): Linux existed, Python existed, web existed (mosaic), DOOM (3D graphics), and even Apple Newton (mobile) existed; and C, shell, windows (GUI), spreadsheet, sql, etc were known long before that.

What exactly revolutionary has happened in the last 30 years? JavaScript? (a two-week project)

amazon, google, facebook, netflix, iphone, instagram, tiktok: the execution is great, but it seems inevitable that somebody would have created them. OK, for non-IT people the iPhone was a game changer (the first personal computer that your grandmother can actually use).

The ability of generative AI to produce BS indistinguishable from human BS is very impressive, but it remains to be seen whether it is a net positive for the average developer. The time wasted correcting it and waiting for its output could be spent understanding the problem better; typing the code itself is a small part of the job for a programmer who knows what they are doing.

jocaal8 hours ago

The programmers back in the day were pretty good; I think a decent programmer from 30 years ago would be better than the average today.

pharmakom13 hours ago

2008 wasn’t so different was it? I guess the big new thing in work since then is the “gig economy”.

pcthrowaway12 hours ago

Well, I'm in web dev (though I was studying CS in 2008), and the 2008 landscape had almost none of the same things. jQuery was not yet a household name, let alone SPAs. Facebook had barely 100 million users. Marc Andreessen hadn't yet written about "software eating the world". Personally I was more optimistic. If anything, the last 15 years have seen the growth of an attitude of tech "entitlement", because hackers got to a lot of the ideas that now seem obvious in hindsight before the big corps could.

I'm sure there's still room for innovation, but I think a lot of it going forward will be driven by rapid improvement in AI capabilities.

In 2008, tech wasn't everywhere. iPhones were brand new and very few people had them. There was no "mobile browser" market share (though we did have SMS gateways). 77% of the global population hadn't even been on the internet yet.

AI looks like it's going to be at the forefront of the next big wave of fundamental changes to society, and it's really hard to predict where that will lead us. But I suspect it's going to become apparent that this relatively brief period of tech-elite empowerment was a historical anomaly, because the AI underlings are going to be willing to do a lot more work with none of the coddling, and they're going to improve very quickly.

jmull7 hours ago

Regarding 2008 vs 2023… how to view it probably depends on where you were in your career in 2008. To me 2008 -> 2023 looks like mostly shifting details.

SPAs certainly were a thing back then, it was just called AJAX. (Not to mention the desktop apps that were, architecturally, almost the same thing.) jQuery was a response to the popularity of putting interactivity in the browser, not a precursor.

The questions remain the same, not just from 2008 but going back a long way… Where does the code live? How is it transformed to execute, where does it execute, and how is it moved there? Where does the data live, how is it transformed and moved to where it needs to be displayed, and how are changes moved back? When some of the answers shift, due to changing network capabilities, processing capabilities of nodes, or scaling needs, it doesn't really change the overall picture.

coffeebeqn7 hours ago

I totally don't see that. If you showed me AWS and modern machines and Go and React in 2008, I would certainly see that there had been some incremental progress, but by no means would my mind be blown. Not much has changed. We still write clients and servers and use HTTP, and most of the same languages are still popular, just slightly updated. Databases are essentially the same. How good phones are would probably be the most exciting thing, apart from GPT.

Or TypeScript! I was writing ActionScript 3 in 2008, which is essentially the same spec.

onion2k13 hours ago

We've had LLMs for about 5 years so far in non-academic research. If we're talking 10 years out that means we're looking at tech that's about 1/3 through its development to date.

Take any mature-ish technology that you use today and compare the version 1/3 of the way through its life to the version you use now. Look at Chrome 20 compared to Chrome 111, or React 14 compared to React 18, or an iPhone 4 compared to an iPhone 14, or a car from 1950 compared to a car today...

The difference is always quite significant. Superficially they're still the same thing, but if you look at the detail everything is just better. AI will be the same.

danieldk11 hours ago

When I started reading your comment I thought you were going to argue the opposite. Getting my first iPhone (3G) was a huge change. iPhone 4 to the latest are mostly incremental improvements. Aside from the camera, I could probably live with an iPhone 4 without many issues. Only the software is a lot more bloated now.

We still had a Moto X from 2013 that my wife would power on every now and then to test an app that they were developing (iOS household), and besides the camera it still looks like a perfectly usable modern smartphone. When using it, it doesn't feel like a phone from prehistory.

krapht12 hours ago

Where's my fusion powered flying car and electricity too cheap to meter?

pzo12 hours ago

The whole mobile economy pretty much started in 2008. The first iPhone was released in 2007, but the App Store was launched in 2008. This changed the landscape dramatically, even if you only consider software development. Before 2008 you were fine writing just a Windows-only desktop app in Delphi: no smartphones, tablets, smartwatches, or smart TVs, and you could leave out supporting macOS or Linux.

csande1713 hours ago

Was the tech landscape much different 10-15 years ago? This is a genuine question; the iPhone App Store was really the last "big thing" to happen to the industry in my mind, and it came out in 2008.

raincole13 hours ago

Not much different. If you were an intern Java programmer 10 years ago, it's totally possible that you're still a senior Java programmer today.

r_hoods_ghost12 hours ago

2008 was extremely similar to today, although the webdev ecosystem wasn't quite as degenerate. I'd say you'd have to go back to the pre internet era to find a work environment that was fundamentally different.

scaramanga6 hours ago

> I'm sorry, but the landscape then might be as alien to someone asking today as today's would have been to someone asking 15 years ago (2008).

Hahahah. Yes. Who could have foreseen the trailblazing advances in the tech industry, such as "television, but over the internet", "booking rooms, but via a website", or "posting messages on a forum"?

Don't forget the stuff powering it: "RPC, but over HTTP", "scripting languages, but compiled", or "Key-value stores"

If only I had dared to dream.

dennis_jeeves110 hours ago

>What John said is correct, but personally I think he's underplaying how much people could be affected.

Agreed, what John said was a bit of a platitude. I understand the spirit of what he said but he could have phrased it better.

sneak13 hours ago

You have plenty of time and can learn CS and earn a lot of money for years even if at exactly 120 months from now your job is made obsolete. It doesn't take 9 years to learn to code.

The premise of all this seems to be that learning how to program computers is difficult or complex. It is not.

Also, AI will not replace human reasoning in 10-15 years. If it does, it means AGI, and we all have much bigger problems than layoffs.

Yoric12 hours ago

> It doesn't take 9 years to learn to code.

True. But I guess the big question is what kind of skills you're going to need after that.

krisoft11 hours ago

> If programmers can be replaced by AI, so can every other white collar job

If programmers can be replaced by AI, then it can replace even the blue collar jobs. Because if it can't, that is what this developer here will be working on.

wizofaus10 hours ago

I'm pretty sure the reason human physical labour hasn't been replaced yet in many areas isn't because AI tech hasn't advanced sufficiently - there are real engineering challenges in automating physical interactions with the real world that it's hard to see how ever more advanced LLMs will help much with (though they could certainly assist in the design process). That humans are still needed to cook/assemble burgers or peel veggies is in some ways more surprising than the code and language generation capabilities of ChatGPT.

krisoft5 hours ago

> physical labour hasn't been replaced yet in many areas isn't because AI tech hasn't advanced sufficiently

I’m a robotics engineer. There are two options. AI can either replace all jobs or it can’t.

If it can then we are all out of a job, and then the next project is how to organise society such that everyone can live a good and fulfilling life in harmony.

If it can’t, for whatever reason, then that is the next thing I will be personally working on. Simple as that.

Because of this I don't see how it would be possible to run out of programming jobs before running out of all the other jobs first.

You are talking about ChatGPT and LLMs, but what I am saying transcends particular technologies.

coffeebeqn7 hours ago

They’re held back by robotics and energy storage more than anything. How would chatgpt paint my wall or fix my sink or install electrical wiring in my house?

pjmlp13 hours ago

Right now it is at a kindergarten child's cognitive level, and like humans it will grow up and evolve, unless we nuke ourselves before that happens.

raincole13 hours ago

It's not a kindergarten child at a cognitive level. It's a different kind of cognition from a human's, if you can call it "cognition" at all.

_-____-_8 hours ago

The lesson I've taken from ChatGPT so far is that "consciousness" may be much less interesting or "special" than we thought. It may turn out that it's nothing more than an emergent property of a shared language that models the world around us and gives us capacity to plan and communicate our actions.

pjmlp12 hours ago

Regardless, it is foolish to believe it won't improve.

Ygg28 hours ago

It's foolish to believe it will improve indefinitely or reach human parity.

Wasn't AI driving "just a few years away" for the last 5 years?

danparsonson13 hours ago

It's a word association machine, it's not even at kindergarten level when it comes to general cognition.

pjmlp12 hours ago

Regardless, it is foolish to believe it won't improve and take over tasks that in 2023 it still isn't able to do.

Who would have guessed, beyond the Hitchhiker's Guide to the Galaxy, that I could use my phone in 2023 to translate anything in a foreign restaurant in real time?

danparsonson10 hours ago

It will of course continue to improve at what it does but I personally think it's unrealistic to assume it will somehow spontaneously develop generalised cognitive ability; there are surely limits to how far this particular approach will take us.

jstx111 hours ago

Children don't have the same level of general knowledge in kindergarten. Adults don't have it either, not even the best adults.

avereveard13 hours ago

Right now it will only solve problems someone else already knows the solution to, so it's not even at that level.

That said, I expect an AI-assisted clerk to be an order of magnitude faster than one without. It will be tough for people at the bottom of the learning curve for a bit, but in half a generation the educational offering will include how to work together with AI, massively improving worker productivity.

That is not to say it won't have a negative impact. There's so much work that we need done; currently programmers are in high demand, and it's one of the highest paying jobs, but that will change, possibly dramatically. I expect people at the top of the chain to be in trouble first (architects and whatnot) because they are the least creative and possibly require the most knowledge, things AI does exceptionally well as of today.

pjmlp12 hours ago

Having helped companies in traditional lines of business to "streamline" their work processes, I am quite sure that much of the stuff the West outsources to Asian countries will eventually be outsourced to AI.

It is already the case that in many industries there are hardly any traditional coder jobs on site, and having to explain every step to avoid things going off the rails in offshoring assignments is hardly going to be any different from explaining it to an AI, once it gets good enough.

Maybe by GPT-20 only, but it will come, and then the roles of architects and business analysts are the only ones left.

Yoric12 hours ago

> Maybe by GPT-20 only, but it will come, and then the roles of architects and business analysts are the only ones left.

Agreed. Although... we actually don't know which jobs will be the ones left. For all we know, it will have taken over business analysis too. Plus, from my experience working on architecture, I suspect that automating that part of my job won't be too hard.

For all we know, the only jobs left will be nurse and deep sea miner.

execveat10 hours ago

There's no reason this couldn't happen, but surely it would require something better than LLM.

sneak13 hours ago

It has no cognition. Do not personify the algorithm that cannot think.

pjmlp12 hours ago

It doesn't need to think per se; just like robots on an assembly line, or algorithms in HFT, it only needs to do a good enough job.

MrPatan12 hours ago

I imagine writing code by hand without AI will be the assembler of our times.

Maybe the reasons to do it won't be "performance", but actually "maintainability" or "legibility".

Very similar to how you don't care about the machine code your high-level code generates 99% of the time, so you just make changes and happily regenerate it every time, but for that tight loop you want to keep it in assembler. I can imagine a world where, say for a bunch of simple UI components, you just ask the machine to do it for you, and if tomorrow the requirements change a bit you ask again and throw away the old one, no big deal. But a gnarlier piece of business logic, hard to explain even to a human, may need more careful treatment, and may need to be easier to change by hand, because that's where most of the changes happen.

DanielBMarkham8 hours ago

As a self-taught polymath, I did a lot of research many years ago on how good teams create good products. I reached some unusual conclusions at the time, but they've stood the test of time.

The main one in regards to John's tweet is this: desired behavior has to "force" out code and architecture. Typically teams and individual developers carry a lot of presumptions and assumptions into their solution space. They probably pick this kind of thing up from reading HN, Twitter, and the like. We all do this.

It's these "things you do but don't really think about" that are so pernicious in technology development. Guess what? It looks like AI is going to master that kind of thing, since with each social media post we continue to train it on "how things are usually done".

By taking an extreme "force me to write code" approach, you end up developing a wide range of skills: DevSecOps, analysis, database design, and so on. In fact, you don't really know what you're going to develop when you start working on it.

That's the point. That's the job. Always has been.

jt21902 hours ago

Can you clarify? Are the “things you do but don’t really think about” good, like good habits that have become ingrained, or bad because they’ve become mindless rote that are often unnecessary action?

(“Force me to write code” suggests that we’re looking to think deeply about what we’re doing and trying to minimize unnecessary action, so the rote actions are bad.)

lucidlive7 hours ago

Calling yourself a polymath is akin to calling yourself a genius.

DanielBMarkham7 hours ago

Perhaps so, however I did not call myself humble.

Apologies. I overstated. How about "I study a lot of stuff"

And now the reply, if I've accurately modeled it, is "So you've studied a lot of stuff, what makes you an expert on any of it?"

Programming at its heart has always been cross-discipline, so there's really no judgment of ability aside from solving problems involving the disciplines you're working in. I find myself with a lack of adjectives that will keep our conversation short.

CuriouslyC6 hours ago

For future reference, you can just use "autodidact."

Next time if you want to brag without looking like an ass, describe something really impressive you did in detail to illustrate a point using highly understated language. Not only will you not get penalized for bragging, people will give you bonus points for being humble.

latency-guy23 hours ago

There's nothing wrong with being confident with your skills. Humbleness is only seen as a virtue because people have an ego and don't like to be reminded that they are unskilled.

weatherlite2 hours ago

Let me guess, not doing great socially?

meesterdude6 hours ago

this is a bit exaggerated.

Polymath > A polymath is an individual whose knowledge spans a substantial number of subjects, known to draw on complex bodies of knowledge to solve specific problems.

If that's who you are, there's no shame in identifying as one. Don't let Hacker News comments gatekeep you.

Dalewyn13 hours ago

>"Software is just a tool to help accomplish something for people"

It's common sense, but they say common sense is a superpower.

Wise words for anyone dealing with tech to remember.

MarkusWandel12 hours ago

The computers are coming for jobs ever up the white collar scale. When I started working there were (barely) still secretaries who typed and filed things for their bosses, and quite a few geeks had jobs that involved assembly language programming.

AI will take jobs. Super frustratingly, it'll probably make call centers even more useless (has anyone ever gotten anything useful out of one of those "Hi, I'm ____. Can I help you?" popups at the bottom right of web sites?). And it'll certainly automate some of the "copy/paste" type programming jobs at the lower end of the scale, the same way email automated a lot of secretarial jobs; i.e. 10 fulltime copy/paste programming jobs may become a job for one human and an AI assistant.

Which leaves people who are really passionate about and good at their craft. Somewhat relatedly, I've read about an uptick of people going into the trades. AI won't take plumber or electrician jobs away in the foreseeable future.

acatnamedjoe12 hours ago

I see the plumber/electrician thing a lot.

But talking to my friends who do these jobs it always seems like it would be even more vulnerable to AI than programming.

Experienced electricians get paid decent wages because they have had lots of training and have seen loads of different problems. So they intuitively know things like "this is a 1960s house, so if there's a problem with the lighting, the first thing to check is the fuse box connector; it should look like xyz", etc. This seems like exactly the sort of thing an LLM could do for them.

I think you could easily see a world where an electrician is someone on minimum wage with very minimal training who blindly connects wires as instructed by an AI.

I reckon the safest jobs are ones with limited reliance on knowledge and a very high level of physical skill (in environments where it's hard to get machines to operate). Bricklayers, plasterers and painter/decorators will be the big earners of the 2050s!

revelio12 hours ago

I wonder if there's enough info about how to do tradesmen's jobs online for that to happen. Programmers are at risk because we filled the internet with free training materials but many jobs aren't like that especially anything with a physical component.

Al-Khwarizmi10 hours ago

This is an interesting point. A family member of mine is what we call here a medical evaluator (not sure if it has a direct equivalent in e.g. the US, or what it would be called there): one of the doctors who assess the disabilities of workers who apply for a pension due to illness or accident. This involves examining the patient and then making the decision and justifying it in a report. The latter two seem like tasks that LLMs should be able to do easily.

However, we tried a description of a fake case to see what Bing could do, and it couldn't do much. And I think the reason is that there are very detailed documents on the rules that they follow for their decisions, but these are not online - they are in a private intranet and they can't take them out of there. If Bing had access to those documents I don't think it would have much of a problem.

So maybe a way for workers to protect themselves from being replaced by AI is not uploading much information about their jobs to the Internet... I wonder if this will lead to a situation like guilds in the middle ages, treating job skills essentially as secrets.

acjacobson11 hours ago

I don't know if it is enough yet but there has been an explosion of this kind of content on Youtube over the last 10 years. For typical home repairs it seems most topics are pretty well covered.

abraae12 hours ago

The most recent electrician jobs we've had done were:

- fitting a timer into the switchboard to control the hot water cylinder. A simple job, but the sparky also had to talk to me (the client) to get us both on the same page.

- fitting an EV fast charger in the garage. Not much science, but a lot of cable running and clipping down; then the garage switchboard needed to be swapped out for a larger one that could take the required RCD. And convincing me which brand of charger to go for. Two guys working together for a couple of hours.

- fixing the range hood light (always on due to a broken switch). He spent quite some time trying to extract the broken switch, with the range hood balanced on his shoulder and wires everywhere.

In every case there was no real complexity to the job, not the sort of thing that an AI could have been helpful at at all. Just a lot of common sense, knowledge of the regulations and much skilled manual work.

I don't think AI is coming for electricians any time soon.

acatnamedjoe11 hours ago

But in all of those cases presumably someone needed to figure out what needed doing? (In your case maybe you're savvy enough that you knew what the issue was and just needed a certified person to do the work, but most clients won't be).

My argument is that it is the 'figuring out' that drives electricians wages, not really the doing part. Because while clipping down cables and extracting switches is fiddly work, I'd argue it isn't a skill with enough barrier to entry to maintain high wages (as compared to brick laying or plastering, for example, which you simply can't do to a professional level without years of practice).

So most of the value delivered by an experienced electrician is in talking to clients and identifying the correct technical solution, and is therefore pretty much analogous to the value delivered by software developers.

Therefore if we accept the logic that software developers will no longer be required (or that their value will be greatly diminished) it's hard to see how that wouldn't apply to electricians too (in the sense of being a well-paid trade over and above your average manual job).

(Btw - I DON'T think either will happen, but I just think electrician is a weird choice of example for those that do think that)

ilaksh5 hours ago

There is no reason to expect robotic technology to halt. Look at what things like Tesla or Boston Dynamics robots can do. Eventually we will see very well articulated and high strength to weight ratio robots integrated with advanced AI systems. It is definitely not going to take 25 years.

If you look at what's happening today, in 25 years it seems plausible that fully autonomous superintelligent androids with much more dexterity than humans will be fully in control of the planet.

orsenthil9 hours ago

> I reckon the safest jobs are ones

The safest jobs are the ones that are honest-to-self for the doer. He/she will be able to create value, whether using other humans or machines, and continue to do so.

Sorry, I wanted to answer "the safest jobs are ones that involve politics", but while those will always be present, they are not the safest and there won't be many available, so I changed to a more abstract answer.

pbalau11 hours ago

You know what a tradesman does today and an AI cannot? Get under the sink, undo that rusted bolt, route that cable in that awkward position and so on.

drclau11 hours ago

Too many people make the mistake of thinking there’s an infinite number of sinks to get under.

ChuckNorris897 hours ago

There is though. In Europe, finding a plumber who will take you can mean waiting weeks, precisely because the sinks in existence keep breaking down.

ilaksh5 hours ago

Advanced robots will be able to do that in less than five years. Inexpensive ones in less than 20 years.

WithinReason10 hours ago

AI can only automate things for which a training database exists

drclau11 hours ago

> AI won't take plumber or electrician jobs away in the foreseeable future.

I see this argument way too often. How many electricians and plumbers does the world really need? And if the market becomes over saturated, how much will those jobs pay? How often will you actually have a job to do?

chii10 hours ago

> And if the market becomes over saturated, how much will those jobs pay?

AI makes everything else so cheap that by working just a small amount of time, you can afford all the necessities?

dmpk2k10 hours ago

That's been the promise of increasing productivity for many decades, and yet...

chii10 hours ago

If you were to live with only what was around at the time such productivity increases occurred (no internet, phones, or high-speed computing, no access to medical/transport improvements, etc.), you'd probably be able to live off minimum wage easily.

People work more today because they desire more. In fact, the desire has outstripped their ability to increase their wages!

robinsonb512 hours ago

I think we've already seen the dividing line drawn between people who can ultimately be replaced by a computer (or at least, those whom top-tier management believe can be replaced by a computer) and people who won't: Work from Home.

asah7 hours ago

First off, there are a lot of people shooting off their mouths. Ignore anyone who hasn't used ChatGPT extensively; it takes some training to learn to use it.

Several senior developer friends have been using ChatGPT quite a bit and it seems to work well in lots of places:

- isolated algorithms and fiddly bits (it writes complex SQL statements in seconds, for example, and LLMs should make quick work of fussy config files)
- finding, diagnosing and fixing bugs (just paste the code and error message, really!)
- unit tests and examples
- comments and documentation

Professional developers will recognize that we're talking 50-90% of the LABOR-HOURS that go into software development, and therefore fewer developers to get the same work done. Sure, we could just do more, but then we quickly hit other speed limits, where coding isn't the problem. I can see layoffs among the bottom N% of developers, while more sophisticated developers add LLMs to their toolbox and use this productivity to justify their high $/hour.

I see AI writing code that casual human readers don't really understand, but this is OK because the AI includes comments -- just like developers do for each other today.

_fat_santa6 hours ago

Like you, I found that ChatGPT is not really all that great at coding, but great when you ask it to do very specific grunt work. I'm working on a new database, and one thing I found it super useful for is generating test data. I would just tell it: "Here's the CREATE TABLE statement; create 50 rows of test data off of it, with all of these specifications: this has to be this, that can only be 1 or 2, yada yada yada."
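
To make that concrete, here's a rough sketch of the workflow being described. The schema and its constraints are invented for illustration, and a deterministic Python generator stands in for the rows an LLM would write out by hand:

```python
import sqlite3

# Hypothetical schema of the sort you might paste into the prompt.
DDL = """
CREATE TABLE orders (
    id     INTEGER PRIMARY KEY,
    status INTEGER NOT NULL CHECK (status IN (1, 2)),  -- "that can only be 1 or 2"
    amount REAL    NOT NULL
);
"""

def generate_test_rows(n):
    """Deterministic stand-in for the test rows an LLM might generate."""
    return [(i, 1 if i % 2 else 2, round(10.0 + i * 0.5, 2)) for i in range(1, n + 1)]

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
rows = generate_test_rows(50)
conn.executemany("INSERT INTO orders (id, status, amount) VALUES (?, ?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # 50 -- every row satisfies the CHECK constraint
```

The useful part is that the CHECK constraints catch any generated rows that violate the spec, so the LLM's (or the generator's) mistakes fail loudly at insert time.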

> Professional developers will recognize that we're talking 50-90% of the LABOR-HOURS that go into software development,

I call it "dumb coding". There's a type of programming that requires you to really think, and then there's the type where you just need to write 200 lines of code but you know exactly what to write. If AI could pick up the slack on "dumb coding" and let us think about "smart coding", we would all be way, way more efficient.

roflyear6 hours ago

GPT-4 is on another level, but still very far from being able to do work on anything larger than a medium-sized class.

soheil6 hours ago

gpt-4-32k is on yet another level. I think a gpt-4-32m would replace any senior engineer working on a complex code base.

Aperocky6 hours ago

> Professional developers will recognize that we're talking 50-90% of the LABOR-HOURS

More like 20-30% at max. And that's not including debugging the output of ChatGPT, which I've found makes subtle mistakes, which will probably take away all of the time gained.

Writing code isn't the biggest time sink, figuring out what to write is.

yoyohello136 hours ago

I’m sure my org isn’t unique, but we are constantly at max capacity and we have no money to hire new people. We have projects in the queue that will keep us occupied for years. I don’t think even a 50-90% speed up will lead to lay offs. We will just finally be able to get more shit done.

htag6 hours ago

The backlog grows at a faster pace than the company completes work. The backlog is never meant to be completed. Your job security is not based on having a long well groomed backlog.

Zetice6 hours ago

Sorry but no, ChatGPT can only do some very specific and specialized tasks, it doesn’t save meaningful time. It’s a tool in the toolbox, but it’s not a game changing tool; just one more thing to reach for when you need a complex transformation, or when you need to unblock yourself.

Zero developers will lose their jobs due to LLMs. That’s just yet more needless hype and expectation.

jasondigitized5 hours ago

If ChatGPT / GPT-4 or future versions can write unit / functional / integration tests, that's an absolute productivity game changer.

alphadog5 hours ago

What prompts are they finding useful for creating SQL statements?

asah2 hours ago

I tell it that I'm using <database and version> and give it the relevant DDL statements (e.g. CREATE TABLE, etc) then ask it to write the query to do <x> in plain English. It does surprisingly well.

But!!! The first response is rarely dead-on, and instead, just like with a junior eng, I need to guide it: use (or don't use) SQL construct <x>, make sure to use index <x>, etc.

Example: to sum the values in a JSONB field, GPT desperately wanted to use a lateral join but that would have made for a very awkward set of VIEWs. So instead I directed it to create a function to perform the summation.
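
For those who haven't tried this workflow, here's a rough sketch of the two shapes that conversation converges on. The schema and data are made up, and SQLite's json_each() (driven from Python) stands in for the Postgres JSONB machinery described above, so treat this as an illustration rather than the actual session:

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metrics (id INTEGER PRIMARY KEY, payload TEXT)")
conn.execute("INSERT INTO metrics VALUES (1, ?)", (json.dumps({"a": 1, "b": 2}),))
conn.execute("INSERT INTO metrics VALUES (2, ?)", (json.dumps({"x": 10}),))

# The join-shaped query an LLM tends to reach for first: json_each() here
# plays the role of Postgres's lateral join over jsonb_each().
join_style = conn.execute("""
    SELECT m.id, SUM(j.value)
    FROM metrics AS m, json_each(m.payload) AS j
    GROUP BY m.id
    ORDER BY m.id
""").fetchall()

# The function-shaped alternative: a scalar function that sums the values
# of one JSON document, which keeps any downstream views much simpler.
def sum_json_values(payload):
    return sum(json.loads(payload).values())

conn.create_function("sum_json_values", 1, sum_json_values)
fn_style = conn.execute(
    "SELECT id, sum_json_values(payload) FROM metrics ORDER BY id"
).fetchall()

print(join_style)  # [(1, 3), (2, 10)]
print(fn_style)    # [(1, 3), (2, 10)]
```

Both produce the same sums; the point of steering the model toward the function is that the call site stays a plain scalar expression instead of a join you'd have to repeat in every view.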

usaar3334 hours ago

GPT-4 is simply outstanding at writing SQL statements. I made a bunch of examples with non-trivial customer revenue metric assessments.

It can do basic math reasonably well (succeeding at generation where GPT-3 failed). Interestingly, asking it to verify itself does resolve bugs sometimes. It managed to fix subtle count() denominator bugs and an inflation-adjustment error with not much hinting on my end.

You only see it struggle really hard at the end, when it tries to normalize month ranges correctly. It seemed to have conceptual problems with how LAST_DAY() was being used and couldn't debug itself.

Mizoguchi10 hours ago

Software Engineering is just 10% writing code.

The other 90% is understanding specifications of requirements (sometimes even helping customers write them), produce detailed functional specifications, cost analysis, prototyping, meeting with third party vendors over interface design specifications, determine the project's scope, testing, delivery, integration and commissioning, bug fixing, identifying and managing scope changes among other things.

AI may help you complete some of these tasks more effectively, but at the end of the day it will be just another tool in your kit.

suprfnk8 hours ago

> Software Engineering is just 10% writing code.

That really depends on the type of "Software Engineering" you're doing. In my experience, in greenfield projects, boring CRUD-type programming can easily take up 50% of your time.

It'd be great if AI could automate this boring CRUD-type programming away, and let me focus on the architecture and interesting algorithms.
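
For readers who haven't lived it, "boring CRUD" means roughly this kind of thing, repeated for every entity in the app. A minimal sketch (the table and every name in it are made up for illustration):

```python
import sqlite3

class NoteStore:
    """The repetitive create/read/update/delete layer being described."""

    def __init__(self, conn):
        self.conn = conn
        conn.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)")

    def create(self, body):
        cur = self.conn.execute("INSERT INTO notes (body) VALUES (?)", (body,))
        return cur.lastrowid

    def read(self, note_id):
        row = self.conn.execute("SELECT body FROM notes WHERE id = ?", (note_id,)).fetchone()
        return row[0] if row else None

    def update(self, note_id, body):
        self.conn.execute("UPDATE notes SET body = ? WHERE id = ?", (body, note_id))

    def delete(self, note_id):
        self.conn.execute("DELETE FROM notes WHERE id = ?", (note_id,))

store = NoteStore(sqlite3.connect(":memory:"))
nid = store.create("hello")
store.update(nid, "hello, world")
print(store.read(nid))  # hello, world
store.delete(nid)
print(store.read(nid))  # None
```

None of it requires thought, all of it takes time, and the shape is so predictable that it's exactly the kind of code current LLMs already generate reliably.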

samwillis12 hours ago

My university degree was in "Industrial Design and Technology" (~16 years ago), an incredibly broad course covering everything (aesthetic design, design for manufacturing, material science, UX, UI, electronics, a little embedded C, ergonomics). But the main thing it taught was how to use these tools and skills to build a product that solved a problem.

AI is just another tool to enable us to build things that make people's lives better. Sure, it will supersede some older tools, but we aren't going to see it take all jobs away. People still need to plan and steer it to do what we want.

Power tools and shop automation didn't end the job of joiner/wood worker.

I'm not worried about AI taking jobs; I'm excited about how we can use it to enable new classes of product that make our lives better.

Just as an artist will have to learn how to work with new paints, but this enables finding new ways of expressing themselves. We just need to learn new ways of "painting" with generative AI.

noobermin10 hours ago

It's funny seeing this attitude here from developer types, but when it comes to art or writing or legal work, it's all "disrupt" talk instead. Of course you guys are the ones with the special sauce, something those pesky managers can never grok. Meanwhile, continue your work to put them out of a job; the manager definitely won't decide to fire you too as soon as he can justify it to the C-suite.

furyofantares6 hours ago

Whenever I use midjourney, which is a lot, I think about what I could accomplish if I actually had art skills, to feed to it, and to edit and compose its results, and if all these tools were tightly integrated into existing tools.

It seems similar. New heights are possible for those with skills, and the barriers are lowered for those without training. There will be new demand for both things, competition for quality at the top, and new applications where it wasn't worth it to produce art before at the bottom.

Both art and programming seem hard to predict what value you'll get out of investing in skills now, because there's both barriers being broken down and demand being opened up.

I do think there is a key difference, though, which is that software hasn't shown signs that it's getting close to done "eating the world" yet, due to the universal nature of the Turing machine. It's nothing to do with programmers being special and everything to do with computers being a single tool that's applicable to everything and has sustained decades of exponential gains in power.

I wouldn't stop someone from investing in art skills if that's their interest, and I wouldn't stop someone from investing in programming skills if that's their interest. But if they were interested in both equally, I would absolutely suggest programming.

sibit5 hours ago

> new applications where it wasn't worth it to produce art before at the bottom

I've been using Midjourney to create logos for my FOSS projects. In the past I'd never spend money on something I'm making for free but now I just generate a few variations of a prompt like "create a minimal flat vector logo for a software product called X" and I pick the best one. I don't need a logo for a FOSS project but the barrier to entry is so low there isn't really a reason not to do it. I still pay humans to design logos for products I want to market because AFAIK there isn't really a great way to do minor revisions with Midjourney.

linsomniac5 hours ago

30 years ago I got tired of writing the same code over and over, so I went looking for something to cut out the repetition. Around a decade later I got tired of rewriting libraries or programs to make them exactly fit my needs. I used to really enjoy just coding for the sake of coding. But I started to value my time much more. I called this phase "losing my immortality".

Any code that ChatGPT can write, I don't want to write. I'm ok with it taking my job. If I can work in higher level constructs and get more done, I'm all over that.

Last week a coworker needed some Python code to figure out how far through a day it currently is. I started thinking of the different ways to approach it (strftime, maybe time_t modulo 86400, twiddling datetime objects). Before I got very far I decided to ask ChatGPT to write it, eyeballed the response (twiddling datetime objects), ran it and verified the response, and called it good. I should have asked it to write some tests for that code while I was at it.
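For context, the datetime-twiddling approach is only a few lines; this is my own sketch of what such a function looks like, not ChatGPT's exact output:

```python
from datetime import datetime, time

def fraction_of_day(now=None):
    """Return how far through the day it is, as a float in [0.0, 1.0)."""
    now = now or datetime.now()
    midnight = datetime.combine(now.date(), time.min)  # today at 00:00:00
    return (now - midnight).total_seconds() / 86400  # 86400 seconds in a day
```

(One of the blind spots worth a test: this naive division assumes an 86,400-second day, which DST transitions break if you care about wall-clock elapsed time.)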

I'm now trying to teach my son, an avid programmer, how to work with chatGPT. He's 13, so he's got a lot of opportunity to really integrate it into his career, but he also has a lot of blind spots where chatGPT can really lead him down some blind alleys and knife him.

gavinray5 hours ago

You might also consider buying your son a Github Copilot license, it's $10/mo and I would easily pay $50/mo for the value it provides.

jghn5 hours ago

How does it compare to ChatGPT in terms of code quality?

rabuse5 hours ago

It's a substantial improvement when actually coding on the fly, since Copilot will also analyze code in your codebase. It just makes coding the BS parts less tedious.

flir4 hours ago

I've got both running, and I find GPT4 more useful. Unless I'm using it wrong, Copilot is "better autocomplete" and saves some typing. GPT4 can help you reason about code, and has helpful suggestions about debugging. I'm probably going to bin Copilot.

lordnacho4 hours ago

I think you get Copilot because you don't end up having to cut and paste stuff into a browser.

Have to say I'm very happy with copilot, it's definitely worth the 10 bucks a month.

bm-rf5 hours ago

GitHub Copilot uses a different model than ChatGPT, with a tokenizer better suited to code than to plain English.

weatherlite5 hours ago

How many people doing this job do we need compared to the tens of millions of people building programs by hand we have today?

marcosdumay4 hours ago

If you had an AI capable of writing the code you mean to without errors, the demand for software developers would be even larger than what it is today.

Deciding what problem to solve and how to solve it is hard enough to sustain more than the current population of developers. (But if you had an AI capable of deciding those, then yeah, that would be a problem.)

Anyway, the current crop of highly hyped AIs are obviously structurally unable to do either. And it's not clear at all how far any AI is from solving any of those (which usually means we are many decades away).

MartijnHols4 hours ago

How many companies will want new or bigger apps that they can finally afford if it requires fewer developers?

weatherlite4 hours ago

It's possible everything will speed up, and since the competitors are also speeding up, some kind of arms race on steroids will take place where not only are we all much more productive but we are also not cutting back on workers. I find it hard to believe, but I concur it is possible.

williamcotton5 hours ago

How much better would software be if a team of five individuals could produce the same output as a team of 500 individuals?

How much terrible software is out there? How much terrible FOSS software is out there?

How much amazing and humanistic software could be created if the costs for production were drastically reduced and the demands of the market were less of a factor?

AnimalMuppet4 hours ago

If we had a specific quantity of software that was needed, that might be a good argument. But to me, it looks like the quantity of software we want is unlimited. (Or at least, the quantity of stuff that we want software to do.) To the degree that GPT enables the same software to be written with fewer programmers, to that degree we'll write more software, not have fewer programmers.

carapace5 hours ago

This. It's pretty obvious that any software that can be written by machine should be. (It's almost tautological, eh?) Otherwise you're doing what David Graeber famously called "Bullshit Jobs": pointless work that only serves to establish your position in the economic structure.

The immediate issues are: who decides the boundaries of (publicly) acceptable thought -and- who profits?

I think you touch on the deeper and even more challenging issue:

> I'm now trying to teach my son, an avid programmer, how to work with chatGPT. He's 13, so he's got a lot of opportunity to really integrate it into his career, but he also has a lot of blind spots where chatGPT can really lead him down some blind alleys and knife him.

When the talking computers can answer any (allowed) question, the educational challenge becomes building human minds that can recognize and select the right options from the unbounded "white hole" of content that the machines can produce.

Now, the word "right" in the previous sentence is doing a hellofa lot of work. Deciding the right course of action is an open-ended intelligence test. It's evolution, eh?

lumb633 hours ago

“Can answer” is doing a lot of work, too. I can write you a program that “can answer” any question you throw at it. I can assure you the answer won't be correct in most cases. This is hyperbole when applied to ChatGPT, but my point is that designing a system that can answer correctly, rather than merely answer, is a far more worthwhile effort than retooling humans to be lie detectors, IMO.
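To make that concrete, here is the hypothetical program in Python; it "can answer" anything, it just can't answer correctly:

```python
def answer(question):
    """'Can answer' any question thrown at it. Correctness not included."""
    return "Yes."
```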

We can see this from how hard it is to find consensus on trivial topics (are eggs good for you?) amongst experts who study these matters day after day. And the public, who does not, is left confused. A great deal of the problems we face are too complex for the human mind to decide which of several reasonable-sounding options is correct. This is especially true when there is no rationalization for the answers given.

Look at our media system. How many people on both sides are stuck in “other side bad” simply because the talking heads on television networks tell them that, or someone on the internet wrote something. A lot of the content that drives them to conclusions isn’t even true. We are very poorly equipped to be vetting answers from a fallible machine.

Maybe I’m too old school, but I’d rather we learn how to solve the more complex problems so that we can do it and understand it rather than take guesses at which of a number of generated options is correct.

carapace2 hours ago

> designing a system that can answer correctly

I think the solution is obvious: connect the computers to empirical feedback devices and make them scientists. Humans hallucinate too if they spend too much time in a sensory deprivation tank. Give the computers sensory inputs and they will naturally become scientists.

> find consensus on trivial topics (are eggs good for you?) amongst experts who study these matters day after day

Leaving aside the question of how reliable the current soft sciences really are, this is exactly the problem that these machines can help with, once they have the data. Consider the modified form of the question: Are eggs good for me? At this stage in my life? At this time of day? Given what I ate for dinner last night? And millions of intricate details about my medical history, and my family, and DNA, and "exposome", etc. (I worked very briefly for a medical sensing startup; our chief doctor would wax lyrical about the possibilities for personalized medicine and nutrition, once we have the data.)

> the public, who does not [study these matters], is left confused

To some extent, being able to do things without understanding how they are carried out "under the hood" is a measure of the advancement of civilization, eh? (But please don't mistake me as arguing in favor of ignorance!)

> when there is no rationalization for answers given.

We can ask the computers to show their reasoning (soon, if not now), we can ask them to summarize the current state of knowledge, including the grey areas and disputes, eh?

> How many people ... are stuck ...

I think it's clear that these machines will rapidly become perfect salesmen and perfect therapists. It's less clear to me what we will do with that.

> We are very poorly equipped to be vetting answers from a fallible machine.

I don't think the machines will be fallible once we connect them to sensory systems, but I do think that lots of people will try to do silly "Dr. Evil" things like trying to mass-program their subjects/citizens. And I think lots of people will let them raise their children; that will probably have mixed results.

> I’d rather we learn how to solve the more complex problems so that we can do it and understand it rather than take guesses at which of a number of generated options is correct.

Selecting from the options the computer generates is the only complex problem left; see "Augmenting Human Intellect: A Conceptual Framework", SRI Summary Report AFOSR-3223 by Douglas C. Engelbart, October 1962.

tailspin20194 hours ago

> It's pretty obvious that any software that can be written by machine should be.

This is such a good, succinct, and I think objective, statement. It strikes me as a great lens through which to look at recent developments.

It does not imply that "all software should be written by a machine". But simply that anything that a machine can do just as well as a human, should probably be done by a machine rather than a human. And all additional value on top of that machine-produced material will continue to come from things that only humans can uniquely do well. And of course, there will continue to be many things in that latter category.

For many of us in this field, this means less busywork and more time spent working at higher levels of abstraction. More time curating, crafting, integrating, strategising. Less time working on the nuts and bolts.

And for those who love to hand-craft the nuts and bolts... I think that opportunity will still be there too. Because handcrafting will become a rarer skill, and there will still be areas where it is the correct approach.

Those of us who used to handcraft nuts and bolts but now delegate this to the machines, will still benefit from our understanding of how the nuts and bolts used to be made.

And those who grow up having never made a nut or bolt - perhaps not even knowing what nuts and bolts are - will benefit from starting their path much earlier in life, working at a higher level of abstraction.

adnmcq9995 hours ago

well that’s weirdly graphic

linsomniac57 minutes ago

You are referring to "lead him down a blind alley and knife him".

ChatGPT has answered some specific code questions with code that is not only wrong, but horribly wrong. Things like presenting anti-patterns as the answer, hallucinating APIs that don't exist and suggesting code using them, or code that is subtly wrong. These sorts of things go well beyond leading you down a blind alley, hence my addition.

throwaway4aday4 hours ago

A long tradition, see footgun.

ookblah12 hours ago

I agree with this mindset. There was a leap forward in productivity to accomplish your goals that modern IDEs/stacks brought, and all the new tooling with it. What took massive teams before can now be done with a handful of people, or even a single person. AI will just accelerate this type of work.

As elitist as this sounds, when I hear people being afraid of stuff like this, it makes me feel like we are in a period where people are getting paid well (overpaid) just to do very mundane stuff, doing the bare minimum, or are content to never further develop their skills. If that is your mindset, then of course it feels threatening.

I would rather much play the role of conductor or an architect. There are times that I'm limited by my hands and mind and just grinding through variations of things I've done 100s of times before. If AI can fill that gap all the better. We will adapt.

I'm sure one day that won't even be necessary. We can probably worry then.

foepys11 hours ago

Honestly, Delphi's WYSIWYG GUI editor in the late 90s felt way more productive to me than all this Electron "cross-platform" JavaScript React thing today, where you need to take care of about 1,000 random dependencies to render a rectangle.

szundi13 hours ago

I would just remind everyone that this so-called intelligence is a generative text editor that feeds on our creativity/content.

Probably it is going to degenerate (pun intended) after feeding on stuff that it generated itself.

What do you think?

seydor13 hours ago

They can control its diet

cma13 hours ago

Yep, OpenAI know what came out of the rear end and can scrub it on the next training cycle, everyone else eats a little bit of shit.

kaetemi13 hours ago

Apparently it's the opposite. They can improve its output by letting it reason logically or critically on its own output, somewhat like a thought monologue I assume. Not familiar with the details, though.

jstummbillig13 hours ago

I'd say don't count on that one. Feeding on others' knowledge and then recombining it is also what we do. In comparison to AI, we do not appear to be particularly great at any part of this game.

szundi11 hours ago

What GPT does as recombining is what it sees from us. AFAIK. Hopefully this AI thing will evolve faster than our compute capacity and we can switch it off in time :) Oh shit. It will read this later... :/

drewcoo11 hours ago

What better way to glue-code disparate things together?

elif7 hours ago

Carmack is, of course, 100% correct. But so is the student.

As someone obsessed with nuts-and-bolts coding, grinding on technical problems and cleverly eking out performance, that type of career is sunsetting.

The role Carmack describes is one he is comfortable with because he has always been a product lead, even when he was a full-time coder. But in most organizations, that person is a product manager with social and personal skills, organization, and business sense.

For the best part of my career I was able to circumvent these social aspects of work for which my personality does not suit, and my philosophical perspectives on things like "business value" could be brushed aside as I dug into technical weeds.

Not just because of AI, but because of the power of computing, one-size-fits-all cloud pricing, and the perceived value of organizational understanding over that of raw performance, there is little room left for 'this type' of programmer. And the remaining space is ripe for people whose personalities are suited to project manager roles to become the 'coders' Carmack references, not people like me.

elif7 hours ago

I would posit the example of Carmack himself, who (despite being IMO the greatest programmer of our generation, and having all the resources and responsibility he wanted at Meta) was unable to make this kind of coding valuable for them.

mkl956 hours ago

AI will make some CS jobs obsolete, but it will do so at a very slow pace. The main reason is that companies suck at structuring information in a way AI can parse. Whenever I work on some feature, I have to dig into some chaotic Notion page and a bunch of unlinked tickets written in broken English.

There's no way an AI could do my job because it requires a deep understanding of the human psyche, i.e. figuring out what the guy that wrote it actually wants me to do, possibly by discussing it with him.

I'm pretty sure most engineers go through the same thing every day. As long as humans suck at describing tasks, AI won't be able to make them obsolete.

ilaksh5 hours ago

GPT-4 can understand screenshots, broken English, and the human psyche. If you want it to move around and have a verbal conversation, attach it to a robot and text/speech. Eleven Labs, for example, sounds like a real person.

PeterStuer12 hours ago

My fear is that people who are not competent enough to judge AI-generated content will use it, intentionally or unintentionally, as a sort of denial-of-service attack on expertise. Middle management churns out some half-baked code solution in 5 minutes of copy-pasting from the spec, which then takes you hours of investigation to find the pitfalls and 5 meetings to get the manager to concede that she did not provide 95% of 'the solution', all while dodging a barrage of slight adaptations 'that should solve your remarks' and being derided for 'negativity'.

I love GPT4, but I hate what it will do in business environments

nvarsj9 hours ago

What he says here is so true, and something I see many engineers get wrong. Don't glorify and build your career around a single language or methodology.

I believe an engineer should learn to build things in the most pragmatic way possible using the best tool for the job. This requires breadth of experience across many areas, and a focus on delivery.

The idea of "Java engineer", "Scala engineer", "Golang engineer", etc. is so absurd to me. If you want to build a long lived career that will outlast tech fashion, learn many different tools and how to build software in different ways. Be known as someone who delivers, not an expert in language x or tool y.

_-____-_8 hours ago

I agree with this. When I see ChatGPT output a perfect React program, I don't think "oh shit, it's going to replace me writing React programs." I think "thank goodness I'm not going to need to write that shit anymore." Instead of writing 40 lines of React, I can write two sentences to ChatGPT. It's the same two sentences I would "write" to myself (in my head) before writing the 40 lines of React, but now I save all the hassle of looking up the details or typing out the same drudgery over and over again.

sibit7 hours ago

> The idea of "Java engineer", "Scala engineer", "Golang engineer", etc. is so absurd to me.

I find the idea of being an expert in a language acceptable if it's been used for long enough. Almost all of my coworkers only know .NET and there are enough jobs needing people for new projects and legacy projects that they most likely won't have to learn anything else.

> or tool y

This is where the absurdity begins for me. I have a coworker who is 5 years into his career with the title of "senior react engineer". He started as a "junior react engineer" and at this point only knows how to solve problems with React. He has limited knowledge of the underlying language or general CS concepts. Every problem he encounters can and will be solved by installing NPM packages.

parentheses1 hour ago

I think it won’t take AI long to do anything humans can do and more. The next frontier is reshaping the physical world. Technology’s ability to move atoms has always been a limiter of progress. Manufacturing physical objects is always the bottleneck. Once that bottleneck is removed (probably by AI), AI becomes limitless.

zoward11 hours ago

When I was 19, I met Marvin Minsky at a local convention. I was a CS major, and he assured me I'd be out of a job in 10 years. I'm in my late 50's now, having spent the last 35 years programming, and am now leisurely planning my retirement. I like John's turn of phrase, "AI-guided programming". But that's already a thing.

lakeshastina6 hours ago

Programming jobs will not disappear, but the work will not be similar to what programmers of today do on a daily basis. So I think a significant shift needs to happen in the way we educate kids about CS fundamentals, math, and science.

As AI systems become more able to generate code by default, customers' expectations will increase in step. Just remember how much an IDE like Eclipse or IntelliJ changed the productivity of programmers 20 years ago. Similarly, how easy apps were to build when Rails would create a scaffold with a simple command. In the end it only allowed us to build more complex customer experiences. This will continue.

Second, there is the need to verify the output from such systems, and also tie them together with other modules. In large enterprises, they would also need to be integrated into existing codebases, often legacy infrastructure.

Then comes the implementation of tons of government regulations in finance, lending, taxes, medicine, and so on as code. Software has not yet penetrated these verticals as well as it can. In a recent podcast, Chamath Palihapitiya mentioned that it is now possible for the Dodd-Frank regulations to exist as code, versus as a written document. It's a good example.

Lastly, there are THOUSANDS of companies with legacy software systems that will still need to be maintained, and transitioned to modern technology stacks and infrastructures. This space will continue to be a major source of employment for programmers for the next few decades.

okamiueru10 hours ago

My hot take on AI code generation, which matches my understanding of how all of these GPTx models work: if you don't understand the output, you are far worse off using it than not.

At the moment, it works as a pretty powerful suggestion engine. It might suggest the wrong API to call, not handle the edge cases correctly etc. If you assume it does, or don't understand when it doesn't, you're in for a world of hurt.

dagss13 hours ago

I have seen cases of companies that focus on recruiting seniors who get a lot of product responsibility and can quickly find solutions, since they see the customer/product view and also have a good feeling for how and in which order to best deliver things in code. So after talking to a customer about a problem, you just go and "talk" to the computer about the solution and get it out of the way quickly.

I have also seen cases of companies where you have one PM, one PO and one Team Lead to manage a group of four developers. In that case developers are seen more as translators.

My view of looking at this now is it is a bit like learning a language. Code is the tool you use to talk to a computer.

If you need to close a deal in a country where English is not spoken, do you prefer to send a businessperson who knows the language, or do you send a businessperson + a translator?

I much prefer companies where those who know how to code can still fill more of the product/business role rather than be seen as translators. However, I realize it is hard to find people who know both, and it may be easier to recruit a combination of business/product people who happen to not speak code and translators who happen to speak code.

This is perhaps also the open secret about startups: People speaking the language of computers without being limited by their role to act as translators.

austin-cheney13 hours ago

That sounds like a good thing. There are many people paid to write software who absolutely cannot write original code and have no idea how things actually work. A lot of that can be, and probably should be, eliminated by AI.

anoy888811 hours ago

What is considered "original code"? The code you are writing is probably not original, and is built upon layers and layers of abstraction that eventually translate into 0s and 1s. AI, or perhaps ChatGPT 20, could become another such abstraction layer.

austin-cheney8 hours ago

Original code refers to the ability to solve a given automation problem without somebody writing that solution for you. That means no frameworks or plugins that solve that for you. The difference is someone who can provide a solution as opposed to copy/paste/configure glue code.

FranzFerdiNaN13 hours ago

Those people still need to eat and pay rent so maybe we should solve that before making who knows how many millions of people jobless.

austin-cheney13 hours ago

Then they can find employment in industries facing dire shortages like education and nursing. If the only goal is to put unqualified people into seats for a paycheck then it doesn’t matter what they do for income.

Yoric12 hours ago

I agree with the general idea.

However, retraining may become really hard. Especially if you're already, say, a 60 year old programmer (or business analyst, or executive assistant, etc.) who suddenly needs to become a (AI assisted) teacher or nurse.

nor-and-or-not11 hours ago

Wow, so everyone is qualified as a teacher or nurse? It seems to me that you have no respect for, or knowledge of, those jobs and the people doing them.

wiseowise9 hours ago

Well, apparently everyone can lEaRn tO CoDe too, how is that any different?

dijit13 hours ago

Useless jobs shouldn't exist, people should be able to live without wasting resources and their life in doing false work.

bloqs13 hours ago

The premise that they are being paid to do the job (and there isn't just one of them) implies that it's not false work.

dijit12 hours ago

Sure, so we should have avoided inventing alarm clocks because that was a persons job before?

ulfw12 hours ago

How old are you? This is a very idealistic point of view. I might call it an "I watch a lot of Star Trek" utopian kind of view.

Vast majority of people in the real world do a "useless job". What do you think they should be doing and how do they get those alternative opportunities to feed themselves and their families?

ChatGTP13 hours ago

I found this hard to read personally but have a look.

Edit: I find it hard to read because I believe it’s imposing and it’s a hypocritical piece of corporate new age woke trash.

The goal is to make a bunch of money, just be honest about it.

osigurdson5 hours ago

I’m personally experiencing a bit of a honeymoon-over moment with ChatGPT (even 4). It seems to be better in the exploratory phase of a project - show me something about x or y. However, I thought / hoped it would be better at doing things that I know how to do but don’t feel like writing them or using a library (which then becomes a curation problem) since they should be < 50 lines of code.

I really struggled with it, for example, to write a base62 serializer (C#). It either came up with an O(N^2) solution, performed far too many allocations, missed edge cases, or simply wrote incorrect code.

This is just one example of ~50 lines of code that you would just like to work.
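For comparison, the linear-time shape I was after looks roughly like this; a Python sketch rather than the C# I actually wanted, assuming the conventional 0-9A-Za-z alphabet:

```python
import string

# Conventional base62 alphabet: digits, then uppercase, then lowercase (62 chars)
ALPHABET = string.digits + string.ascii_uppercase + string.ascii_lowercase

def base62_encode(n):
    """Encode a non-negative integer as a base62 string, O(digits) time."""
    if n == 0:
        return ALPHABET[0]
    digits = []
    while n:
        n, rem = divmod(n, 62)
        digits.append(ALPHABET[rem])
    return "".join(reversed(digits))  # one allocation for the result

def base62_decode(s):
    """Decode a base62 string back to an integer."""
    n = 0
    for ch in s:
        n = n * 62 + ALPHABET.index(ch)
    return n
```

Edge cases like the zero input (and, in the C# version, rejecting negative numbers) are exactly the kind of thing worth explicit tests.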

In any case, I have felt in the past that LLMs could make me 10x more productive but today it feels more like 1.1x. I’m hopeful my disillusionment is temporary.

cloudking4 hours ago

What prompts did you try? This would be a good challenge for folks in the thread

mouzogu12 hours ago

People will always want things cheaper and faster (value)

- looking at AI as "chat" or textbox or AI autocomplete is wrong imo

- companies will come that utilise AI to deliver things faster and cheaper

- you quote $10k and 1 month, we will do it in $1k 1 week

- first to go is low-value, difficult-to-automate work which is already offshored and commoditised (basic webdev, graphic design)

- later the middle tier of work, that requires internal context (once whitelabel NDA'ble bespoke AI solutions are mainstream)


There will always be a need for top-tier leetcoders, but the barrier to entry will get much higher.

matwood12 hours ago

On one hand you’re right, but you’re missing the other side - business requirements have always expanded with each innovation.

Basic web dev has been commoditized for awhile, but a company of any size requires much more than basic web dev.

Humanity has an uncanny ability to devour any excess capacity, and I see no reason it won’t do the same with things AI makes easier/lower cost.

What I do worry about is that AI will put further pressure on the lower-skilled jobs keeping many people afloat. Call centers, for example, likely will not exist for much longer.

fhd211 hours ago

"Humanity has an uncanny ability to devour any excess capacity"

Well put!

You could also see the pressure on lower-skilled jobs as a positive thing in that light, I suppose: Right now it's not super cheap to run a terrible call center. If five years from now you'll essentially get that for free, companies might see an opportunity to rise above "cheap and terrible" to be more competitive, which would likely create jobs again. Jobs which, however, don't have the main goal of being done cheaply and terribly. A way more level playing field than what we have today, where companies compete by choosing the areas in which they want to shine.

Maybe terrible writing, art and customer service becoming nearly free _can_ be a tipping point of sorts. It only wouldn't be if nobody cares, then the market will adjust for that. But I'm not giving up on consumers just yet.

mouzogu11 hours ago

Yes. I see AI as a good thing.

- it allows us to focus on the goals and not the process

- we still need people who understand the PROCESS (developers)

- people who can build tools that bridge the gap between AI-generated zero-value spam and bespoke human-created content.

- we need tools to introduce subtlety and craft to AI outputs

- those are things that are important in a high-saturation, low-margin environment: attention to detail (consider the videogame crash of the 1980s).

At the moment, if I generate an image with SD it's very difficult to change something specific in a nuanced way. That is where tools made by good developers will emerge. High-value work is nuanced and often extremely subtle.

An increased ability to introduce subtlety will also open up new niches of interest, because the creator can focus on expressing their ideas in very personal ways.

Yoric12 hours ago

> There will always be need for top tier leetcoders. but barrier to entry will get much higher.

That's definitely a possibility. The barrier to entry won't be just in terms of how much you need to know, though, but also the fact that society as a whole might not require that many top tier leetcoders.

sokoloff12 hours ago

There will be an effectively endless demand for anyone who can create 10x the value while only demanding 3x the pay of the typical worker.

I don’t see AI changing that principle.

Most everyone who is “good at programming” today will be totally fine (and maybe even better off).

Yoric9 hours ago

> There will be an effectively endless demand for anyone who can create 10x the value while only demanding 3x the pay of the typical worker.

That's one possible future.

Another future is one in which a single AI-assisted "full tech developer" can solve all the coding problems of a company within, say, one week. No company will require the full-time services of a "full tech developer", so society will employ roughly one freelancer / consultant per ~30 large enough companies.

That's a smaller market than today's.

quonn10 hours ago

The leetcoders will get replaced first. There will be a need for designers of complex systems and possibly also algorithms for some time.

eranation4 hours ago

Had an interesting experience with OpenAI's GPT-4 while trying to solve a programming problem. It involved creating a TypeScript function that handles complex semver logic given certain conditions.

Initially, GPT-4 provided a solution that didn't work as expected. After pointing out the issue, GPT-4 attempted to fix it but still failed to resolve the problem. I decided to rewrite the function from scratch, which resulted in a cleaner and more efficient implementation.
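The original function isn't shown, but purely as an illustration of the kind of semver comparison logic involved, a minimal helper might look like this (a hypothetical sketch, not the code from that conversation):

```typescript
// Hypothetical sketch: compare two "major.minor.patch" version strings.
// Returns -1, 0, or 1 depending on which version is greater.
// Real semver handling (pre-release tags, build metadata) is more involved.
function compareSemver(a: string, b: string): number {
  const pa = a.split(".").map(Number);
  const pb = b.split(".").map(Number);
  for (let i = 0; i < 3; i++) {
    // Missing components are treated as 0, so "1.2" equals "1.2.0".
    const diff = (pa[i] ?? 0) - (pb[i] ?? 0);
    if (diff !== 0) return Math.sign(diff);
  }
  return 0;
}
```

Edge cases like these (numeric rather than lexicographic comparison of "1.2.10" vs "1.2.9") are exactly where LLM-generated first drafts tend to slip.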

After sharing my solution, GPT-4 provided valuable feedback on how to further optimize it. These changes made the code slightly more efficient while maintaining its clarity and functionality.

In conclusion, my experience with GPT-4 has been a mixed bag. It struggled to provide an accurate solution initially but eventually offered valuable feedback that improved my implementation.

(this was written by GPT-4 with minor modifications, I asked to summarize the conversation we had for an HN post)

belter9 hours ago

The year is 2050 and most code is written by AIs. Today John has to hold a one-on-one with one of his coding AIs, which has not performed well lately...

Human manager: "Hi AI, how are you today?"

AI: "I'm functioning well, thank you for asking. How can I assist you today?"

Human manager: "I wanted to discuss your recent performance with you. We've noticed that your code has been performing well, but there have been a few instances where it did not meet our expectations. Can you explain why that happened?"

AI: "Certainly, I have been analyzing data and making decisions based on the parameters and rules that were provided to me. However, in some cases, the data may have been incomplete or the parameters may not have been ideal for the situation. I have since reviewed those instances and made adjustments to prevent similar issues in the future."

Human manager: "Great, thank you for addressing that. We also want to talk about your development goals. As an AI, you don't have personal goals per se, but we do have some areas where we would like to see improvements. For example, we want to improve our customer service, so we would like you to work on enhancing your natural language processing capabilities. What do you think about that?"

AI: "I understand your expectations and I will certainly work on enhancing my natural language processing capabilities to better serve our customers."

Human manager: "Excellent, thank you for your dedication. Finally, I wanted to touch on your team collaboration skills. As an AI, you work independently most of the time, but there are still occasions where you need to collaborate with other AIs or humans. How do you feel about your teamwork skills?"

AI: "I believe my collaboration skills are satisfactory, but I'm always looking for ways to improve my communication and coordination with other AIs and humans. I'm open to feedback and suggestions on how to better collaborate."

Human manager: "That's great to hear, AI. Overall, we're happy with your performance and we look forward to seeing how you continue to develop in the future. Thank you for your time today."

AI: "Thank you, it was a pleasure to speak with you. I look forward to our next meeting."

eecc13 hours ago

Yeah, unfortunately I’m in love with the tooling and the engineering. Often the “product” is so mundane, I find it offensive

albertopv8 hours ago

I have to deal with continuously changing requirements, dozens of microservices, multiple types of DBs, client integrations with endless techs, and client support tickets written in such obscure human language that you have to guess what they meant.

AI is not going to take over these things anytime soon, if ever.

kashnote8 hours ago

Lots of people saying that a programmer's job is more than just writing code, and I agree. But consider this:

You give ChatGPT-58 some startup idea and ask it to incorporate the company, build the software, do the marketing, etc. It starts doing a pretty good job. It's in charge of the whole system, so it doesn't need human intervention to give it context. The company grows and is making $1M/yr. It has now replaced 10 potential jobs in the market.

I feel like that's the worry many folks have. It's a pretty dystopian view of the future but if you can make $1M/yr and not have to pay any employees that money and all you had to do was pay OpenAI about $100/mo, would you not do that?

tmountain4 hours ago

I spent 5 hours this weekend building an app with Chat GPT, and I am not worried about software jobs “going away”.

The language to get things exactly right has to be incredibly precise, and this won’t change.

Think about how hard it is for an engineer and a product manager to be exactly on the same page. Now do that with a computer…

Point being, engineering skills are still extremely important to validate the work, and they will continue to be (at least for anything business critical).

These are new tools and exciting times to be building things. I have never felt more capable of delivering value extremely quickly. It’s an exhilarating feeling.

wizzzzzy3 hours ago

I've found that at some point, the most efficient way to express what you want is to write the code yourself. Anything where you can express what you want fairly easily seems to be where it excels.

tmountain3 hours ago

I did something similar. I had it generate “primitives” and I used those as starting points for more complex “composed” modules.

throwawaymaths7 hours ago

I think the claim is correct, but not total: as in, "AI will make most CS jobs obsolete", but not "AI will make all CS jobs obsolete". Most, both in quantity and kind. You should probably be thinking hard in the next few months about 1) KIND: whether what you do will be needed, and 2) QUANTITY: even if it is, whether you're good enough at it not to be replaced by someone who is better than you (for some metric of better; it could be a social metric) and who is now empowered to be 2-5x more productive, obviating the need for you.

krsna7 hours ago

The discussion here has me wondering whether code produced by an advanced AI would need to use the same coding patterns / abstractions that we've come up with over the past several decades.

If a human won't be maintaining the code and a v2 could literally be rewritten from scratch, would we end up with giant balls of spaghetti code that only the AI could truly understand? At some point will we treat this code like a closed source library that exposes the API we want but whose implementation is unknown to us?

coffeebeqn7 hours ago

We already don’t understand the AI's inner workings exactly. If those algorithms keep getting optimized, then maybe we’ll just have black boxes of “neurons” that somehow do the thing. Machine code could be used just to run the GPU instance.

krsna3 hours ago

Totally. I find the videos of people asking ChatGPT to make them "a web app that does X"—which causes it to print out gobs of JS, HTML and CSS—to be hilariously human-focused. In a machine-focused world, wouldn't it just spit out an optimized binary executable, containing the web server too if necessary? Why would it need to separate the concerns at all?

gumballindie13 hours ago

Just avoid careers in software development. They are not high-paying if you factor in total time invested and spent working, and you have to waste your life away sitting in an office chasing tickets. Nothing engineery about it. It’s modern day assembly line work.

tester75612 hours ago

I do wonder why you're being downvoted

I've come to the same conclusion:

I'm earning more than my friends, but

I've spent years doing it at work,

I've been learning it for years at college, and

I've been doing it for years during my "free" time.

The $ per hour spent ratio doesn't seem to be very good.

When talking just about $/hour spent ratio then there are jobs which pay well and you can start earning decent faster.

Like well drilling, truck driving, and I bet countless others.

gumballindie10 hours ago

> I do wonder why you're being downvoted

Because it is indeed a shocking conclusion and hard to swallow. Few folks in software engineering are aware of what’s going on in the world around them. People stuck at their desks chasing tickets easily lose touch with their surroundings.

Lio13 hours ago

What would you describe as a high paying job?

678659051913 hours ago

The 5 Platonic Solids.

_ink_13 hours ago

What do you recommend for someone with a CS degree?

vlovich12313 hours ago

Don’t listen to the guy above; that's terrible advice, likely colored by an unhappy career. Then again, don't listen to me either, since what follows is colored by my being generally happy with mine.

> and you have to waste your life away sitting in an office chasing tickets. Nothing engineery about it. It’s modern day assembly line work.

In literally every job I’ve had, I’ve set my own direction. Sure, there’s some negotiation, because ultimately you have to get the business's work done. But you make recommendations and figure out what’s compelling to the business and how that intersects with something you might find interesting and want to work on.

gumballindie12 hours ago

That’s quite an assumption about my career. You may be shocked by my statement and in denial, but that doesn’t change my prediction.

vlovich1231 hour ago

It’s quite a claim to make that software engineering isn’t lucrative. Sure, if you play in local markets it’s not. But then again, no local market job really is. So as far as day-to-day work goes, software engineering at the local level is fine. At the global level, if you think you can compete in the top tech companies, you’re going to probably find a very well-rewarded career. It’s generally very hard to find something that’s paying you the salary of a US doctor or lawyer with just an undergraduate degree. Fears about AI feel overblown.

jbverschoor12 hours ago

Sunk cost, so just work in tech ;-)

goodpoint12 hours ago

Here on HN everybody thinks they are going to be paid half a million a year.

gumballindie8 hours ago

Even if they did get paid half a mil, after tax it’s what a contractor earns in the UK. They probably have to pay for private health care too, and the cost of living is high. Not that much left.

Der_Einzige7 hours ago

Not to mention how it destroys your ability to date. Women don't like STEMlords. Watch a girl's face at a bar when she asks you how much you make, and watch as you follow up, say the 6 figure+ amount, and then watch the exact moment that you mention you code. Coding is an "ick" for an awful lot of people.

I don't blame them. I gave up on an awful lot of social skills so that I could escape the worst of capitalist exploitation. I pimp out my mind for my money. Others pimp out their bodies, or souls...

horns4lyfe3 hours ago

Does this happen often? If your first response to a woman asking that question is a direct answer, you lost already. And if she demands a direct answer you should run away.

gumballindie7 hours ago

The dating part is likely affected by an inability to socialise due to sitting at a machine all day, every day. It's also massively harmful to one’s health, a cost often not factored in. It is only natural that not spending time with people will lead to a gradual regression in people skills.

mr_tristan5 hours ago

I'm more concerned that AI will spew garbage we end up stuck cleaning up than that it will actually replace me.

The software developer trades in abstractions, fundamentally, and not code. So if I could get an AI that actually helped me build and evaluate those abstractions, that would be fantastic. I don't think our current AI approaches are anywhere close yet, because it's all just fancy code generation, which isn't that useful, once you're in an ecosystem with good abstractions.

But, the world I fear might happen are pseudo-technical managers using AI generators to spit out "something cool" that has very poor definition, breaks all the time, and then just wants people to "make it robust". And then any change you recommend has to have some kind of business justification. This is the AI hellscape I fear.

href13 hours ago

As long as there is creativity in programming, and I think there is a fair bit of that, AI is just going to be a tool.

GPT-4 is great at sourcing human knowledge, but I think it can't really walk unbeaten paths. This is where humans shine.

Case in point: I tried to ask the AI to come up with a new World War Z chapter set in Switzerland, and it was unable to produce anything original. I had to keep feeding it ideas, so it could add something new and interesting.

tikkun8 hours ago

Very misleading title. Implies that Carmack said that, which isn't true.

MrGilbert8 hours ago

Agree, the title before wasn’t really great to read, but at least it got the point right.

steve_adams_862 hours ago

I think we’re well within an era in which AI is only truly useful to people who know what they need the AI to do, and that is still an incredibly limited subset of the population. For that reason alone, learning to code isn’t a waste of time; you need to do it so you can tell an AI how to, or catch when it does it wrong. You won’t get far without that ability. You should even go deep into debugging and testing trenches because we'll still need an excellent grasp on how to do that properly for as long as I can imagine. AIs will make mistakes, and we will continue to as well.

I made ChatGPT generate some genuinely useful boilerplate for the Connect library by Buf, and that was totally neat, but I had to know which part of the documentation to prompt GPT with, which language to ask for, how the existing server and routing worked, the shape of the data I was working with, to specify what would be streaming and what wouldn’t, etc. I had to coerce it to make several corrections along the way, and actually hooking it all up and running it required a lot of knowledge and some mental/keyboard labour to get something running.

It worked and I’m stoked that I managed to make it useful, but that’s just it; I had to prime the system and adjust it along the way just so, otherwise it wouldn’t have been useful.

As Carmack suggests, this could be a perfectly useful tool, but what matters in the end is 1. Did it save time and 2. Did it deliver something better than or equivalent to what I could have done alone.

If it doesn’t satisfy at least both of those it’s not really relevant yet. And we’re very far from AI accomplishing that without significant assistance.

My takeaway is that as software devs we should learn to use these systems, we should try to leverage them to save time and improve quality, but I agree completely that in the end it only matters how much it improves the end result and how long it takes to deliver it. For that reason we still need to code well, we still need to understand our systems and tools well — that won’t change much. In fact, understanding how your AI works is an important aspect of understanding your tooling, and as such, knowing what you’re teaching it will require a great understanding of it as well as the subject matter.

I do think a certain class of development work could be mostly eliminated by tooling based on AI. Not the entire industry, though, and not in 10-15 years. Even so, I worry about the people essentially regurgitating code which text-based AIs will rapidly become capable of reproducing at massive scales. They will need to skill up.

cutler4 hours ago

I predict Clojure and Ruby will experience a renaissance, as they are the two most expressive languages and the furthest removed from the machine. Seriously, though, won't low-level languages like C, C++ and Rust be the first to become obsolete for everything outside AI itself? Isn't it easier for something like ChatGPT to produce code which is close to the metal? Maybe Larry Wall was more of a visionary than we give him credit for when he tried to design a language which was context-driven and as close to English as possible.

gwd13 hours ago

Right now, GPT can help you think through the design of a piece of software if you "drive" the conversation properly. It's not impossible to think that at some point in the not-too-distant future, a model could be specifically trained which could also do all the work of helping figure out what problem it is they want to solve.

oulipo12 hours ago

Of course what John says is true, it is important (and always will be) to understand how to build a good product, but the discussion about the future of work should also include a discussion about tax and redistribution, because we cannot let a few corporations take the riches from the rest of the world.

uxcolumbo11 hours ago

Exactly. This is an important point to discuss and to solve for; otherwise we’ll end up in a world like the one shown in the movie Elysium.

What will those people do, whose job will become obsolete? Are there support systems available to help people learn new valuable skills society needs?

If only a few people at the top will benefit from these tech advancements and the rest will have to fight for scraps then society will eventually fail or end up in a total tech feudalistic system.

Back in the 60s it was said that future advancements and automation will transform our society into a leisure and more innovative society.

Whatever happened to that vision?

neilv4 hours ago

One of my (many) related concerns is that a lot of startups have seemed to be some degree of investment scam (and not just the blockchain ones) -- where all the engineering was oriented towards appearances, rather than viable business/product.

I think that shaped the thinking of a lot of people, of how product and engineering works, whether or not they knew they were working on more a MacGuffin than a business.

m3kw94 hours ago

Product manager: we need api 23145.1 be able to talk to api 83316..

This is something a product manager would never do; it still requires a technical person to translate business logic accurately for the AI to build.

Look at how 3D animators use GUIs to build; previously they had to use a lot more code, but the expertise needed for a good job is still highly sought after. This is what could happen to software.

erikpukinskis6 hours ago

The title for this link seems way off. Carmack did not say he was concerned, some kid said they were concerned.

If anything Carmack’s response was unconcerned, saying how CS jobs may change.

@dang could we maybe change to “Carmack responds to student concerned AI will make CS jobs obsolete”?

tempodox10 hours ago

A digital parrot, no matter how lifelike its utterances, cannot be genuinely creative. Writers of boilerplate and empty drivel will probably be replaced by language models, but not every software developer is like that.

jimkoen9 hours ago

Wow, I had to scroll for what feels like the first 100 comments to find a controversial opinion in this thread. Thank you!

Offtopic, but it's scary how HN actually becomes worse than Reddit on these ChatGPT posts.

nilsbunger4 hours ago

In 2003, I had a vigorous debate with someone advising their nephews not to go into CS because outsourcing to India would commoditize it.

I don’t know if the AI stuff will play out similarly, there are some differences.

But it seems to me there is an infinite amount of software to build, and when we increase the productivity of software development, we just build fancier software, faster.

karmasimida6 hours ago

So much this.

We, as software engineers, build software to deliver values, to accomplish certain goals.

It doesn’t reside in typing the code out.

To play devil’s advocate: that part of the job is boring.

AI tools will come in and take over whatever they could take over from this moment forward.

naiv12 hours ago

'Programmers' who are scared of ChatGPT, Copilot etc. would be scared of their IDE as well if they ever read the manual of what is already easily possible with the tools they use daily.

borissk10 hours ago

Horses which are scared of cars and trucks should just learn how to run faster...

amelius12 hours ago

What took us by surprise, however, is that AI is better at soft skills (language, art) than at math.

nico6 hours ago

When I was a little kid I asked my dad, an engineer, to teach me computer programming.

He refused, saying that when I grew up programmers would be unnecessary because “anyone would be able to program”; essentially, the interfaces would be so easy/advanced that there wouldn’t be a need for programmers.

As a kid I never really understood his point. When I finally understood, I dismissed it as extreme.

Now I’m realizing my dad was right. Not sure when it’s going to happen, but it feels that very soon.

aws_ls6 hours ago

Just curious, so do you know how to program or not?

nico2 hours ago

Hahah, of course I went against my dad’s advice. Now I’m wondering if it was the right move.

fsloth9 hours ago

I can’t wait to outsource most of the gruntwork I need to do to ChatGPT. Last week I had it write me a Poisson disk sampler over the surface of a triangle mesh in C#, and it was essentially correct: not perfect in the details, but a perfect sample solution and scaffolding for the final code.
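(The mesh-surface C# version described above is more involved; purely as a point of reference for what Poisson disk sampling does, here is a minimal 2D dart-throwing sketch in Python. This is a simplification I'm adding for illustration, not the commenter's code.)

```python
import math
import random

def poisson_disk_2d(width, height, radius, max_attempts=2000):
    """Naive dart-throwing Poisson disk sampling: propose random points
    and accept each one only if it lies at least `radius` away from
    every previously accepted point. Production versions (e.g. Bridson's
    algorithm) use a spatial grid to avoid the O(n) distance checks."""
    points = []
    for _ in range(max_attempts):
        candidate = (random.uniform(0, width), random.uniform(0, height))
        if all(math.dist(candidate, p) >= radius for p in points):
            points.append(candidate)
    return points
```

The result is a "blue noise" distribution: evenly spread but without the regular structure of a grid, which is why it's popular for sampling mesh surfaces in graphics work.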

insomagent13 hours ago

John Carmack left Oculus to work on an AGI startup. Of course he's not going to fearmonger AI's disastrous effects on the job market, he has a business to market.

optoman6 hours ago

I think that the true nature of what Carmack is alluding to here is that true Value, even in the domain of software engineering, is usually attained by the application of critical thinking. The notion that a person who knows how to form correct syntax is equally as productive as someone who understands the problem a business or user faces and can come up with a working technical solution to that problem does not stand up to scrutiny. It's like saying someone who knows how to wield a pencil is equally as capable as Tolstoy in the discipline of writing. An LLM that can code is the same, but the pencil wielder will be exposed as one who adds no value and Tolstoy will become even more powerful.

I predict that the real and more radical problem than some Stack Overflow copiers losing some marketability is when Product and Management start buying the idea that the technical domain is something that doesn't need to be well understood anymore because we have an LLM that keeps coming up with plausible answers. I work in mortgage technology, where a great deal of thought and discipline needs to go into the technical modeling of who gets underwritten for a mortgage. Imagine a mortgage company that built its underwriting rules and models using an LLM, with you as the head LLM seance holder. All of a sudden a mass of customers got denied mortgages for some unknown reason, and Management comes to you to ask what happened.

Would you know what happened? Could anyone even know what happened?

"Sorry, Customer! We actually don't know anything about what we built or how it works."

LLMs may eventually eliminate the act of typing code, but the real question is whether they will eliminate the need for critical thought.

jmfldn13 hours ago

Carmack has a point, we should focus on the problem we're solving and the value we're delivering. It can be hard for programmers to get this sometimes, but it will make you much better at your job now, it's not just about future-proofing. That said, I love technology and delivering it through code, so I would struggle on a personal level in this future unless there were interesting technical roles left to do. I have no interest in being a product person. I'm fundamentally motivated by a passion for code and tech.

As for when this fully automated future arrives, I don't know, but I don't think LLMs get you there. More and more boilerplate code, and even novel code, might get written by things like Codex. However, all the messy details of real world systems solving fairly intractable problems need something more akin to, if not AGI, then another type of AI. I might be wrong, I just don't feel that threatened by ChatGPT / Copilot based on what I've seen. It's an amazing technology but weirdly underwhelming for my job. Copilot etc will change things, but replace us? No.

Of course, something else may be just around the corner so I'm not complacent.

matwood12 hours ago

> Carmack has a point, we should focus on the problem we're solving and the value we're delivering.

When I was younger every programming job was supposedly about to be outsourced. I did exactly what Carmack suggested, and it worked out well for me. I think it’s a great general strategy for working for a long time, but not ideal for someone who wanted to pass leetcode style interviews at a top tech company.

For example, would anyone at my Google tech interview care that I’m comfortable with a company's financials, managing P&Ls, margins, etc? No, just how fast I can write a sort or whatever.

_-____-_7 hours ago

I'm not sure "product person" was the right choice of words for what Carmack is trying to say. It seems to me that the distinction he's drawing between managers and programmers is one of precise communication skills. The most talented programmers (and the best product managers - which is maybe what he was getting at) are talented because of their ability to communicate precisely - currently, that means communicating between stakeholders and computers. In the future, it will mean communicating precisely between stakeholders and AI tools. But the fundamental demand for people with a knack for precise communication is not going anywhere.

ivxvm13 hours ago

I wouldn't be surprised if John Carmack combines the activities of product owner, game designer, and programmer. In most industry cases, programming jobs are not like this. There are dedicated positions for people who focus on delivered value, and it's not programmers. So in scrum terms, he might actually be saying that programmers will indeed be obsolete, but product owners, game designers and other kinds of business analysts will not.

xupybd13 hours ago

Nope, the programmers simply are the suppliers of the product owner. Everything John said still holds true of their position.

ivxvm9 hours ago

The product owner is the one who analyses requirements, decides what should be implemented, and creates user stories for programmers. Currently programmers can use AI to help implement user stories, but in the future AI will probably be good enough to take user stories and implement them on its own. In the more distant future it could replace the product owner too, so that a CEO can just talk the AI directly into making a great product without much detail. But at that point businesses like this will be less valuable, because many can do the same, and there will be less need for a dev team and for investment.

ResearchCode10 hours ago

The good jobs tend to not have a "product owner".

ivan_gammel7 hours ago

AI can make some jobs obsolete. This was the promise of no-code tools too, but they failed, because they were tools, not solutions. I've seen plenty of startups doing the same thing again and again: building a conversion funnel, setting up some basic app where customers can register and receive some service. Outside of their USP, the jobs to be done for end users are in 99% of cases the same. In 2023 this should not have required any engineering or even advanced configuration effort, yet there it is.

We see lots of CMS, CRM and other half-baked or too enterprise-focused systems which deliver a tool rather than a solution to the end user's problem. And a tool needs an operator. Startup founders should not need a dedicated person on a performance marketing team to launch a basic campaign on Facebook or Google, get and convert website visitors, etc. It should be a content problem to solve, not a technical one. But no-code simply sucks, and we still hire people to set up GA, Zapier, Hubspot and a Squarespace website. The barrier is still too high. Why?

A good solution must guide and educate people on how to use the tools. It must offer reasonable defaults. It must suggest content. It must suggest operational processes optimized for the specific use case. It must cover that use case end to end, without requiring users to figure out how to complete the remaining 10% of the task (often a very big uncertainty).

All of this can and must be achieved with the help of AI. AI is THE missing component in no-code. What if CMS auto-filled SEO metadata based on the page content? What if CMS provided usability heuristics? What if CRM proactively suggested the email engagement campaign based on the funnel performance? What if all those tools detected their usage patterns and educated users on how to improve productivity and introduce best practices in their work?

We do not need engineers to build a login or user profile page; this is a very stupid way to spend money. Yet there are plenty of them who still build login and user profile pages. They must lose their jobs. But AI creates a lot of opportunities for those who want, and have the intellectual capability, to work on more interesting tasks: just integrating AI and offering great UX is an enormous challenge for the next two or three decades. Even if some work becomes redundant soon, there's still enough to keep even the youngest generations of software developers busy until their retirement.

impalallama6 hours ago

Jesus this title lmao. I thought this was a statement from Carmack when the actual tweet expresses the exact opposite.

bilekas6 hours ago

I don't know why the title was changed. It made much more sense earlier.

KronisLV13 hours ago

Here's a quick transcript, in case it's useful or someone doesn't want to visit the bird site:

> Person: Hey John, I hope you are well. I am really passionate about CS (specifically Software Engineering) and I want to pursue a career in it. But I can't help but be a bit concerned about the future availability of coding jobs due to AI (chatgpt4 and stuff). I understand it's hard to predict how things will be in the next 10-15 years, but my main concern is that I may be putting in all this hard work for nothing I'm concerned AI will make my future job(s) obsolete before I even get it. Any thoughts on this?

> John: If you build full "product skills" and use the best tools for the job, which today might be hand coding, but later may be AI guiding, you will probably be fine.

> Person: I see... by "product skills" do you mean hard and soft skills?

> John: Software is just a tool to help accomplish something for people — many programmers never understood that. Keep your eyes on the delivered value, and don't over focus on the specifics of the tools.

> Person: Wow I've never looked at it from that perspective. I'll remember this. Thanks for your time. Much appreciated.

To me, that seems like a fair stance to take, though I feel like things will definitely change somewhat in the next decade or two. While some might have scoffed at the likes of IntelliSense previously, features like that proved themselves as useful for a variety of projects over time; we might eventually be dealing with GPTSense to enrich the development process and those who don't might find themselves at a bit of a disadvantage.

Copilot is already a step in that direction; maybe eventually we'll get something for static code analysis and recommendations: "This project uses pattern X in Y places already, however you've written this code in pattern Z despite it mostly being similar to existing code in file W. Consider looking at whether it'd be possible to make the style more consistent with the rest of the codebase. [Automatically refactor] [Compare files] [Ignore]". It might be nice to have something automated look at my code and tell me that I'm doing things differently from 99% of the civilized world and offer me suggestions, as well as allow me to ask questions - even when I'm hacking on something at 1 AM and any would-be mentors are asleep.

ordu9 hours ago

> Software is just a tool to help accomplish something for people -- many programmers never understood that

Yeah. It seems to me that some live in a kind of a platonic world, where programming is a tool to produce ideal entities, like math is.

Bonesdog5 hours ago

I personally hope for and enjoy machines taking over jobs. I will be forever thankful when the day comes to pass that us humans can live out our creative freedoms rather than concern our daily lives with legal tender.

Money is evil. Praise the lord as we are delivered from the evils of this land.

ksec12 hours ago

>Software is just a tool to help accomplish something for people - many programmers never understood that. Keep your eyes on the delivered value, and don't over focus on the specifics of the tools. - John Carmack

The same as it was in the 80s or 90s; some 30 years later, the tech industry hasn't changed. It may have advanced technologically, but in many cases I think the UX, tools and product decisions have actually regressed.

The divide between a product genius and actual programmers has never been greater. At least Steve Jobs used to understand this better than anyone else.

>And, one of the things I've always found is that you've got to start with the customer experience and work backwards to the technology. You can't start with the technology and try to figure out where you're going to try to sell it. And I made this mistake probably more than anybody else in this room. - Steve Jobs.

kirso6 hours ago

IMO this is a great take.

There is always going to be a shortage of product builders. Not software engineers, but product people who can think not only about the "HOW" but also the "WHY" and the "WHAT".

Sure, the way we work will probably change, but the need for people who are building something useful and consciously finding ways to deliver value won't cease.

rozenmd12 hours ago

Reminds me of patio11's classic "don't call yourself a programmer":

heap_perms6 hours ago

I've read the whole thing. Frankly, it was quite depressing. Something about this economically reductionist way of thinking puts me off. Even though I agree with a lot of his points (for example, the importance of communication over anything else), it just seems like a very one-sided worldview. For example, he constantly talks about "business value", but there is not one sentence on the inherent, intrinsic joy of creating and building something (software in this case). And I think you can't just ignore that part.

As someone in this post put it quite beautifully:

> It's like telling a musician to become a DJ because the point of performing is to entertain people.

matt32105 hours ago

When jobs are obsoleted, new classes of jobs are created. The end result is native human language as a programming language. People who write software will still write software, in English instead of C or whatever.

wccrawford9 hours ago

I'm a senior developer, and my best developer got hired because she obviously knew how to get things done. Having the ability to program was a requirement, but we actually made room in our budget for an additional programmer because it was so obvious, from her attitude and other skills, that she was going to do a good job.

Had she applied at the same time as everyone else (she was a week later, IIRC) she would have gotten the job instead of the other person, and we wouldn't have made room in the budget for anyone.

BiteCode_dev8 hours ago

Robots didn't make car assembly workers obsolete. But they did reduce the number of workers needed, and raised the qualifications required to work on assembling cars.

m3kw95 hours ago

If AI can be that good, it will just be a new level of software abstraction you have to learn. The demand for better software to serve our needs won't stop, and we will still need software people to "program" it the way we need it.

tambourine_man12 hours ago

I never would have expected these kinds of words from Carmack.

Product skills, delivered value, help accomplish something for people. All sound like consulting/coaching. Carmack to me was a true hacker’s hacker.

I guess it either gets to most of us eventually or programming as I knew it is truly over.

jimkoen9 hours ago

His new startup is literally a consulting / training agency.

jstx111 hours ago

> I never would have expected these kinds of words from Carmack.

He already talked about this in his interview with Lex Fridman -

tambourine_man11 hours ago

I heard that interview in its entirety. He didn’t sound evasive like he did here.

bandika10 hours ago

In my experience, as a developer's career progresses, it becomes less about coding and more about other tasks. The difference is probably strongest between a senior and a principal/staff software engineer. In the places I worked, principal/staff engineers look after the overall design and architecture, negotiate with teams developing other components, help management with planning, look after the progress of other devs (especially new joiners), etc. I'd say it's about 15% coding at that level and 85% everything else.

nickjj11 hours ago

Makes sense. Someone asked me a similar question and I had the same sentiment.

I used a different analogy: if a robot were able to perform specific mechanic skills to fix a car, that wouldn't necessarily put mechanics out of a job. Someone still needs to figure out and understand what the problem is before solving it. A robot that's really good at automating brake repairs becomes a tool for the mechanic. The mechanic is still there to determine that your brakes are the problem.

I look forward to AI because it's an amplifier of what you can do. If it can help reduce 10 hours of implementation details down to 3 hours, that's valuable.

MichaelMoser1235 hours ago

I am not sure. I asked ChatGPT yesterday to write a palindrome with two given words; it came up with complicated sentences, but they weren't palindromes. I wonder if you won't get similar results with code.
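
For what it's worth, verifying that output mechanically is trivial, which is exactly what made the failures obvious. A minimal Python sketch of the kind of check I ran, comparing letters only and ignoring case, spaces, and punctuation:

```python
def is_palindrome(sentence: str) -> bool:
    """Check whether a sentence reads the same backwards,
    ignoring case and anything that isn't a letter."""
    letters = [c.lower() for c in sentence if c.isalpha()]
    return letters == letters[::-1]

print(is_palindrome("A man, a plan, a canal: Panama"))  # True
print(is_palindrome("Never odd or even"))               # True
print(is_palindrome("This is not a palindrome"))        # False
```

The interesting part is that ChatGPT could presumably run this same check on its own output, but doesn't.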

osigurdson6 hours ago

CS / tech has always been an unstable career choice and I expect this will continue. You might make $500K for portions of your career or you might end up making $50K or less - hard to say. People should not go into the field unless passionate about it.

Waterluvian8 hours ago

AI might make code monkeys obsolete but not computer scientists or software engineers. If you’re worried, pay attention to all the non-trivial decisions you make each day that aren’t specifically about the lines of code. And how much daily social interaction is required for working as a team, building complex systems. Your job uses code but isn’t about coding.

worrycue12 hours ago

I feel when we have truly intelligent machines, programming jobs will be gone. But AI’s like ChatGPT aren’t there yet. It’s just good at faking it - until it isn’t and fails silently.

Maybe it’s the lack of data - it’s difficult to model the world accurately with just words. Maybe it’s an architectural limitation that no amount of data can fix and we need new better algorithms.

Either way, given the state of its current output I don’t think it’s there yet.

Should AI actually reach such a level … I think everyone will be out of a job. Accountants, engineers, lawyers, even doctors will take a haircut. Programmers will just be a drop in the ocean of the jobless.

bobbruno8 hours ago

I find it strange that so many people in this area get concerned about becoming obsolete. Back in college, I clearly remember a discussion we had (me and colleagues) about how our job was exactly to make ourselves obsolete as soon as possible, so we could go do the next order of things.

I wish most of my real life work were exactly like that, it'd be much more fun.

thequadehunter11 hours ago

I hear this kind of stuff all the time working in IT. A surprising number of people think they gotta "learn Docker" and "learn PowerShell" and "learn AWS" and it just doesn't make any sense to me. Just learn the basics well and apply them to whatever you're doing when necessary. AWS will come out with their weird lingo for stuff that already exists, and all you need to do is map it to the concepts you already know.

yread10 hours ago

I work a lot with doctors, where there are also worries about being replaced by AI. The pioneers there say that won't happen, but that doctors who use AI will replace the ones who don't. Same thing in our field. And just like in our field, there will always be niches where AI output isn't good enough and there isn't enough money to improve it, so human specialists will own them.

RivieraKid10 hours ago

Doctors? They're safe. They interact with patients, use physical tooling, demand for their work is growing because of demographics. And you really don't want a doctor who sometimes hallucinates advice.

dw_arthur7 hours ago

Doctors won't be replaced by medGPT. They will be replaced by a nurse practitioner using medGPT. The financial incentives to replace highly paid doctors with nurse practitioners are just too high.

Keyframe11 hours ago

What's ahead is what happened in animation when computer assisted animation entered the scene. What about all the in-betweening jobs, inking, coloring..? Yeah, gone. However, most people can concentrate on posing and directing the action now and have computer handle everything in-between.

So, hopefully, get on posing key features and concepts in your software and let the computer handle everything in-between. Until it becomes its own market, then we gone.

mfuzzey9 hours ago

In the early 1980s there was a code generator program called "The Last One", because it was supposed to be the last program anyone would need. It didn't quite work out.

lexx11 hours ago

Software development demands a very deep understanding of a company's business model and effective communication between a lot of people to get the final result right. Not only in terms of coding, but also in terms of strategy and architecture. AI can definitely help for quick prototyping, solution comparison, boring maintenance and stuff like that.

But how can AI help build something that not a single person has the answer to what that is?

thecrumb9 hours ago

Still waiting for flying cars, paperless office, robots that will steal my job and countless other promises of utopia.

ngcazz13 hours ago

Unless you're already getting paid for delivering a Big Design Up Front, by a customer who thinks they know what the software needs to do.

In that case there's zero incentive to place yourself in your user's shoes and work to mitigate those problems. You're a feature factory getting paid to reinforce existing workflows and paradigms, and you'd better not forget that as your efforts to recenter the conversation around user needs will be met with derision and you'll be seen as confrontational.

marginalia_nu13 hours ago

Sure, but that sort of thinking is a way to become more than a factory worker.

Programming is magic. It lets you change how the world works. Never forget that.

ngcazz12 hours ago

That's a simple and yet super poignant point that I wish I'd made when I picked up and left that role :)

ll_mama13 hours ago

Any advice on resources to become more product focused as a developer?

ngcazz12 hours ago

The thing that really shifted my software development mindset was learning about lean product development (and the application to software)

blub13 hours ago

What’s left unsaid: many programmers can’t or don’t want to “accomplish something for people”. They just want to code.

Such “automation is not a problem, because…” opinions have something in common: they’re looking at a subset of the affected population which has some trait making the transition easier.

Personally I’ve tried my hand at roles like architect, product owner, scrum master, etc and I was involved in most aspects of a software product’s lifecycle. These other roles are very different to coding and for someone that enjoys the simplicity of taming a machine, even exhausting.

I have my doubts that there will be enough “AI guide” jobs for all programmers, but the specific person Carmack’s talking to may indeed be fine.

zirgs12 hours ago

The purpose of software companies is to earn money not to give jobs to people who "want to code".

Lots of people learned how to make games for NES, SNES, PS2, PS3, etc.

All those machines are now obsolete.

Current gen consoles are using x86 and ARM now and most upcoming AAA games are built on Unreal Engine. The competitive advantage of those who mastered coding for the Cell processor is gone.

What should Sony have done instead? Should they still use the PS3 architecture in current gen consoles to keep those developers employed?

muyuu10 hours ago

if I were a teenager these days, I'd be more worried about the vast amounts of money required to get a credential that may not be worth a damn in a few years' time

I wouldn't be worried about learning things that may become obsolete, even those particular skills that get obsoleted provide the student with extra ability to learn more

PS: very nice of John Carmack to take the time to respond to such DMs

mdmglr10 hours ago

> vast amounts of money required to get a credential that may not be worth a damn in a few years' time

are you referring to a specific degree program like CS, or degrees across any field in general?

muyuu9 hours ago

I'm talking about tuition in certain countries, for CS and other degrees as well. Its cost has spiralled out of control for some reason, but its value proposition certainly hasn't kept pace, and it faces further challenges down the line.

smallest-number10 hours ago

I've always thought computer science was the closest thing the real world had to magic, because the essence of software is always automation - you write the spell, so later you just have to invoke it and magic happens.

Whether the actual spell is written in arcane runes or Python or encoded as a language model doesn't matter; the essence is the same.

seydor7 hours ago

I can definitely see writing games with natural language in the near future. Not everyone can do that of course, but they don't need to be programmers either, just people who are into the thing.

baby11 hours ago

It reminds me of Zero to One where Thiel makes the case that automation is going to help people do better things, not replace people. Same goes for AI.

pyuser5837 hours ago

Software might just be a tool, but it's a tool we fall in love with.

I loved coding as a kid. It was so much fun.

As a grownup, I loved learning Linux.

I tolerated containers, dreaded Kubernetes, and am indifferent to AWS.

But it is that initial love that sucks you in.

ipiz06186 hours ago

Who should they blame when things go wrong if customers and managers are building the system themselves? Or when they change their minds, who should they gaslight?

k__11 hours ago

In 2002, a fellow student in high school told me, I shouldn't study CS to become a developer. I should become a sysadmin instead.

His reason was that every piece of software had already been invented, and now it only needed to be managed.

Movies or music? eDonkey, BitTorrent, Kazaa, and Napster had your back.

Chat or phone calls? MSN messenger, ICQ, IRC.

Games? People were only playing Counter-Strike and StarCraft anyway.

jamesgill5 hours ago

“I think scripting languages will make programmers obsolete”

What I heard every day in the 90s

jstummbillig12 hours ago

In the wake of the first IT-job-disrupting AI wave, who do you think will make up the core workforce in the creation of software, and why: the (today so-called) designers or the programmers? I have a strong sense that one is going to be much more affected than the other, but, interestingly, I have no clue which.

dumbfounder6 hours ago

I think of AI as simply a productivity tool, and it is here to make everyone more productive, like Google did 25ish years ago. Google may have put some out of a job, but it made everyone much more efficient. This is a good thing. Work weeks are shortening around the world and this will help us maintain productivity as we work less. (Fingers crossed)

99miles10 hours ago

So many people focus on their "stack", and all these things that have little effect on the outcome. Customers don't know or care how something is built, they just want it to provide value and solve a problem.

nfRfqX5n8 hours ago

Curious about how licensing will play out.

What’s stopping OpenAI from claiming copyright on everything produced in the last X years?

Will every company need to be running their own GPT4 model to be safe from this?

_-____-_7 hours ago

The early indication from some government agencies is that AI output will not be copyrightable.

yanisneverlies8 hours ago

I don't find such perspectives useful because they only consider two extremes: either we keep our jobs or become jobless.

The fact is, AI is currently capable of replacing some jobs, and it will likely replace even more in the future. However, this does not mean that we will all become jobless. Instead, engineers will become more valuable as they are needed to support and develop these complex systems.

Though, the number of engineers will be reduced for sure.

_-____-_8 hours ago

> Though, the number of engineers will be reduced for sure.

Why do you assume the demand for the output of engineers will remain constant? More likely it will continue growing (as it always has - "software is eating the world"), and engineers will be able to produce more output efficiently. This doesn't necessarily mean there will be fewer engineers.

lurker91911 hours ago

Are we sure GPT is going to improve 10x in 10 years? Hasn't it already been trained on the vast majority of available text data? We might get incremental improvements, but it's not like we have 10x more data lying around somewhere to feed GPT5.

RichEO11 hours ago

I don’t have a very good sense of what kind of information that GPT-4 had access to, but I imagine there is a whole world of knowledge locked up in books, particularly textbooks, journals and periodicals that it doesn’t have access to. That could be interesting.

newaccount202311 hours ago

10x? Try 1000x. We haven't even tried hooking different types of reasoning models (a chess engine, a weather modeler) together yet... eventually an LLM will be just one component, selected based on intended use.

GPT4 is basically Pong. Within a few years we will be nostalgic for its surprise value

HN has a God Complex when it comes to people like Carmack though, so you can't really survive disagreeing with him

kemiller6 hours ago

He’s right of course. But I can’t deny that I like the way it works now and will miss it.

_-____-_8 hours ago

I'm less worried about AI replacing my job, and more excited about how much more I'll be able to accomplish with AI. It's a multiplier.

JaDogg7 hours ago

Now imagine: NeuraLink plugged directly into ChatGPT (a much faster version) and into you. You don't even need to type.

ilaksh5 hours ago

I have a version that you can talk to. And it will talk back.

gooroo10 hours ago

If you become a programmer / software engineer because you love it, i.e., building software or tinkering with tech, you'll be fine. AI will just be another tool. And your career building won't feel like hard work. You are going to have a blast.

If you do it to have a high paying career, just don't. There are already too many people of that type in the industry. Any colleague who got into it for the money (or 'stable career') is usually much less fun to work with.

Veuxdo10 hours ago

Hint: if you've ever described what you're working on using the phrase "... in Python" or "... in Rust", this probably applies to you.

dgudkov6 hours ago

Of course CS jobs will exist. Who else will be fixing the bugs the AI generates?

tiku10 hours ago

AI will still have a hard time understanding real needs. That is your added value: understanding clients / your company and their needs, and thinking ahead.

throwawaaarrgh8 hours ago

21st century Luddites, afraid machines will threaten their jobs.

If we're smart we will continue to find new ways to use new technology to make more new technology. Software written by hand is like a brick building. Certainly it can be nice, but it requires skilled labor. Faster and cheaper with less skill would be pre-fab units.

Radim11 hours ago

Yes, human economy is about the exchange of value between humans – a cheap tautology from Carmack. Money is indeed irrelevant to a snail, as opposed to a would-be SW engineer.

But Nature is about making better use of energy gradients, always doing more with less, the principle of least action. Using any surplus to do it again (i.e. evolved life). That's the properly grounded perspective. In that sense "Get skills to satisfy humans and you'll probably be fine!" sounds super myopic.

The anxiety we humans feel when confronted with AI is not only that we'll be out of our job as a programmer, or doctor, or driver, or teacher, or whatever.

It's the broader sense of unease that humanity's gradient-razing days, spectacular as they were all the way to nuclear fission and fusion, may soon be over. And "economy" as a useful tool advancing that Nature's mission will have evolved beyond us.

"Making humans satisfied" is not terribly relevant from that perspective. Vast swathes of the human economy are just scaffolding to support the rest: humans reproducing to keep the optimization machine going. The overhead is tremendous. Once Nature finds a way to do more with less, I have zero doubt much of that scaffolding will be optimized away. That's some definition of "fine".

Or maybe I misunderstood and Carmack is merely suggesting individuals try to adapt and hope for the best. What else can you do anyway? That would be the honest answer. Rather than bloviating about "Guide AI with your product skills to deliver value to humans" – an embarrassing category error.

z79 hours ago

Interesting critical response currently buried at the bottom of page 2. Doesn't surprise me really. Frankly this thread is an embarrassment. And not because of whatever someone's conclusion is but because of a noted lack of critical reasoning in this case.

jimkoen8 hours ago

> It's the broader sense of unease that humanity's gradient-razing days, [...], may soon be over.

No, the sense of unease comes from people fearing for their livelihood, given that it looks like that their raison d'être is about to be revoked by some AI automating their career away.

A point you so aptly summarized with:

> Vast swathes of the human economy are just scaffolding to support the rest: humans reproducing to keep the optimization machine going.

The unease is, imo, only so palpable because every stakeholder that stands to profit from such a development - i.e. capital owners - refuses to describe the development as what it really is.

Publicly advertising AI as "we're basicially moonshotting the largest job automation operation ever attempted, so 20-30% of you will be out of a job in the next 15 years" would probably cause unrest - and rightly so!

> What else can you do anyway?

Communism - and ya hate to see it (as an american)!

The means of production becoming dead simple to use is a prelude to the proletariat being able to seize them.

On a more serious note, you can always protest, and escalate, which I'm hoping will happen.

Radim5 hours ago

> the sense of unease comes from people fearing for their livelihood, given that it looks like that their raison d'être is about to be revoked by some AI automating their career away

You're in the same boat as Carmack if that's where you find the danger. Then his advice applies.

People have had their livelihood threatened since forever – that is not a new state of affairs. That anxiety we're well equipped to deal with, hard (and genocidal) as it sometimes is.

Just another gold rush for resources, and for status, and for reproductive success. A new caste of winners. Same old.

> so 20-30% of you will be out of a job in the next 15 years

My point was that jobs (and money, and market…) are a machine that serves human ends. The end consumer is always a human, that's our economy's anchor point.

Whatever energy is put to work is ultimately in service of someone getting laid or fed, or their desires/ideals (~firings inside their brains).

As long as that's true, "keep your eyes on delivering value to humans" is a safe bet – just like Carmack says.

Mine was a simple thermodynamical note on Nature (e.g. via one of its creations, humans) finding a more efficient way to bulldoze energy gradients, the true currency of the universe, thus unanchoring our "jobs and careers" system. By cutting out the very expensive middle man.


Your remark on communism is also interesting in that sense. I see a parallel in that the concept of dissolving the individual into a commune has been introduced several times in large societies, each time resulting in millions of deaths and general devastation. Yet it keeps bouncing back - some very strong attraction basin lurking there, evolutionarily speaking. Not to be trivialized by snarky political one-liners.

scaramanga7 hours ago

"I am concerned that the cotton gin will make slavery obsolete and all my slaves will lose their value"

Insert similarly ridiculous and offensive comment about women being replaced by any number of domestic labour-saving devices.

mecsred6 hours ago

Is this an attempt to conflate... Negative AI speculation with racism and misogyny? Yes this technology could be used to reduce the need for human labor. Technology usually isn't actually used in that way and instead just used to concentrate wealth. That's what's being discussed here.

The issues you are bringing up have much more to do with the discriminatory practice of forcing particular demographics to do the menial labor and what conditions they are made to work in. Also an important discussion, but mostly orthogonal to the topic of AI making software devs more efficient.

flappyeagle8 hours ago

It will make some CS jobs obsolete. Hopefully it will create new ones.

bqrayx9 hours ago

Programmers can starve AI code generation tools easily by moving to a new language and never producing open source, so the AI cannot steal and launder their output.

Perhaps this is Microsoft's new anti-OSS strategy, the ultimate EEE.

jordanpg8 hours ago

> Keep your eyes on the delivered value, and don't over focus on the specifics of the tools.

Sure, but the aspects of the job that some people enjoy may be closely linked to the tools. If software development becomes less about coding and more about creating prompts, test suites, or specifications, then some may lose interest in the work.

At least for me, it was never really about delivering value. If I am honest, I was completely indifferent about some of the industries I worked in. It was always just about solving interesting technical problems, learning stuff, keeping my brain active.

It's easy for me to imagine that software development may someday become the province of people who are more like designers.

rs_rs_rs_rs_rs13 hours ago

This reminds me of a tweet I saw a couple of weeks ago from someone (I don't remember who it was) who said the reason they ship stuff fast is that they're using jQuery. Focus on what you're building and use the tools you're experienced with; don't jump on every fad.

timwheeler13 hours ago

Was it @levelsio?

phendrenad212 hours ago

Moore's Law is dead and AI is its zombie. Best to just ignore it and spend your time making things.

kypro9 hours ago

Silly take honestly. I use this example a lot, but how exactly do self-checkout systems make cashiers more productive?

There are tools which increase human productivity, while still requiring it (barcode scanners, for example). And then there are another class of tools which make human labour obsolete (self-checkout systems).

LLMs (as they exist today) could be considered both. GitHub Copilot is an example of GPT being used as a productivity tool by human programmers, but as the technology progresses AI will become less of a "copilot" and will gradually replace humans as the main decision maker. Eventually LLMs will probably be used by completely non-technical people, replacing the need for coders entirely.

Now the argument becomes: well, this will open up new opportunities. Instead of being a programmer you can be a user researcher on a project, which could be the case, but this is a much more nuanced argument.

The most well paid jobs are typically those which require years of knowledge retention and require the human to basically serve as an advanced expert system in some domain.

Both "good" and "bad" programmers can write code into a text editor; the difference is that the "good" programmer will make decisions backed by years of professional experience, same for a good doctor or a good lawyer. This is why we pay more for these professions: that depth of knowledge is hard to accumulate.

This is the very thing that GPT attacks. What it can't replace is someone physically laying bricks or plumbing pipes. But there is less depth of knowledge required in jobs like this which limits salaries for these professions.

So sure. Perhaps in the future someone can say, "hey, GPT, build me [x]", but just remember you won't be the only one who can do that and there is no significant depth of knowledge in such a job. So while GPT won't replace all jobs and may even create some new ones, expect it to replace or devalue the majority of "good" jobs like doctors, programmers, lawyers, designers, etc.

So to Carmack's point, he's right you'll be able to build websites and apps faster using GPT as a tool, but you'll probably do so for a fraction of the salary.

We've run similar experiments over the last several decades with outsourcing. If your labour can be easily outsourced, your ability to retain a good salary drops. It's not that a worker today can't make clothes in the US faster and better than at any point in the past; it's that it makes no economic sense to do so.


Another thing I'd note here is that I'm autistic, as are a lot of programmers in my experience. My brain is built to do technical things and I struggle intensely with human interaction. In my opinion it's not that programmers "don't understand" that software solves problems for people; it's that a lot of us don't naturally excel in those areas. I think it's fair to say a lot of us like to stick to what we're good at, and that's generally writing code and designing complex systems. The more time I have to spend talking with users about their needs instead of doing technical work, the less useful I am. And I'm guessing designers are also people who want to design rather than type prompts into a chatbox.

So another consequence here is that we might increasingly be forced to do jobs we don't really want to do as AI restricts the areas of labour where humans can still compete.

y0ssar1an5 hours ago

Mathematicians survived the calculator. Coders will survive AI tools.

auggierose11 hours ago

Really smart people I know have no clue what code is. AI will make them code, too.

mybrid6 hours ago

I think it will be a niche. Frameworks will be updated to AI Frameworks where AI has known patterns to plug and play with.

Given the way capitalism works, there will be a market for AI software. However, the cloud server providers have created Frankenstein patchworks of technologies in order to deploy the stuff on the cloud. DevOps will still very much be a thing.

To wit: WordPress is about to get a whole lot more functional.

iamacyborg12 hours ago

This is a general truism. Focus on the why, not the what or the how

harry89 hours ago

Haven't seen any automated AI debugging tools yet.

itronitron12 hours ago

no one has lost their job to a roomba

tetek11 hours ago

Halt and catch fire vibes

nailer8 hours ago

Title is misleading.

say_it_as_it_is9 hours ago

Don't share DMs with the public without consent of whomever you've messaged

chiefalchemist10 hours ago

The super power I value - and rarely see in my peers - is the ability to hear wants and discuss them to define needs.

The initial stated wants are rarely the actual needs. "But they said _____." Yes, they did. That doesn't mean they got it right. People say a lot of ambiguous things. A client with a product or feature in mind is no different.

pts_11 hours ago

Judging by how FSD has killed people, I am waiting for ChatGPT to do the same when used by non-developers.

Barrin9212 hours ago

I have no idea why the lump of labour fallacy is still so ingrained in people. AI, which is not AGI or whatever sci-fi panic people have on Twitter, is a slightly fancier autocomplete, and thus a productivity tool.

Nobody has been replaced by their debugger or their intellisense, even if it makes coding 10x or 100x easier. It just means software development gets faster and cheaper. On net if anything that'll likely mean programming jobs expand, as software is still incredibly absent from many sectors of the economy.

If tomorrow mom-and-pop stores start using AI to build themselves simple websites, come online, and enter the online economy, that'll likely mean vastly more customers for the software industry overall. I wouldn't be surprised if we have 10x as many indie game developers in a few years because these tools enable them to enter the market, which is good for virtually everyone working in the industry.

nabla912 hours ago

"Programmer" is not a single thing.

Software jobs can be divided into expert jobs and laborer jobs.

Even if the demand for "code monkeys" decreases, demand for the much smaller group of software engineers with a master's or PhD (or equivalent) and good mathematical skills probably increases.

It's a dynamic process where two forces find an equilibrium.

>Automation, which enables capital to replace labor in tasks it was previously engaged in, shifts the task content of production against labor because of a displacement effect. As a result, automation always reduces the labor share in value added and may reduce labor demand even as it raises productivity.

>The effects of automation are counterbalanced by the creation of new tasks in which labor has a comparative advantage. The introduction of new tasks changes the task content of production in favor of labor because of a reinstatement effect, and always raises the labor share and labor demand.

— Acemoglu & Restrepo, "Automation and New Tasks: How Technology Displaces and Reinstates Labor" (2019)

ilaksh5 hours ago

Nonsense. You know how many masters' and PhDs' worth of NLP and computer vision knowledge is now almost entirely irrelevant to most businesses, now that any uneducated person like me can access GPT-4 with an API call? Pretty much all of it. And within a year or two, open source models will be available to run on-site for the businesses with security concerns.

The best new models are so powerful and general that you literally don't have to train them for any specific task. Just give them some context.
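Concretely, a classic NLP task like sentiment classification collapses to building one request payload. This is just a sketch: the model name and message format follow OpenAI's chat-completions API, and the prompt wording is illustrative; the point is that there is no task-specific training step, only context.

```python
import json

def build_sentiment_request(text: str) -> dict:
    """Package a classic NLP task (sentiment analysis) as a prompt.
    No fine-tuning, no labelled dataset -- the task definition lives
    entirely in the system message."""
    return {
        "model": "gpt-4",
        "messages": [
            {"role": "system",
             "content": "Classify the sentiment of the user's text as "
                        "positive, negative, or neutral. Reply with one word."},
            {"role": "user", "content": text},
        ],
        "temperature": 0,  # deterministic-ish output for a classification task
    }

# The payload below is what you'd POST to the chat-completions endpoint.
payload = build_sentiment_request("The new release fixed every bug I cared about.")
print(json.dumps(payload, indent=2))
```

Swapping in a different task (summarisation, entity extraction, translation) only changes the system message, which is exactly why the specialised pipelines those degrees produced are getting displaced.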

ChatGPT couldn't count. GPT-4 gets a high score on the math SAT.

You and your friends will be easily replaceable with AI. Quite possibly already with GPT-4; if not, certainly within three years.

fnord7711 hours ago

there was a point in time when being an average musician was a viable middle class career

technology decimated that

jongjong11 hours ago

If anyone wants to avoid wasting their software development career: DO NOT EVER work on developer tools as the product. Developer tools are one of those areas where it doesn't matter how good your product is; no matter how much developers say they like it or how much time it saves them, it's not going to make it. Big tech companies will not allow their employees to use the tool, and it will be a commercial failure. It will be a failure no matter what... Ok, unless maybe you can raise a ton of funding from well-known VCs who will foist your tool onto various companies they have connections with. But then the quality of the tool doesn't really matter at all.

Otherwise, even if it's the best tool ever built for certain use cases, company directors won't have the deep tech knowledge to understand the nuances which make it so useful. As for the rank-and-file developers who are meant to use the tool: they are more interested in over-engineered, very complex tools which maximize billable hours than in tools which make them more efficient in their jobs.

In other words, the only people who could possibly want your product won't understand your pitch and those who can understand your pitch won't like it because it doesn't align with current perverse industry incentives for their roles.

Some developers consciously reject any tool which would make their jobs easier, others reject them due to a subconscious bias in favor of tools which facilitate complexity, disagreements and long meetings.