I managed to get an AI to build pong in assembly. These things are pretty cool, but not sci-fi level just yet. I didn’t just say “build pong in assembly”, though; I had to hand-hold it a little bit. You need to be a programmer to understand how to guide the AI through the task.
That was something very simple; I doubt you can get it to do more complex tasks without a lot more back and forth.
To give you an example, I had a hard time getting it to understand that the ball needed to bounce off at an angle if it was intercepted at an angle; it just kept snapping the ball to 90° increments. I couldn’t fix it myself because I don’t know assembly well enough to really get into the weeds with it, so I was sort of stuck until I was finally able to get the AI to do what I wanted. I sort of understood what the problem was: there was a number somewhere in the system that needed to be made negative, but the AI just kept setting it to a fixed value instead. A non-programmer wouldn’t really understand that that was the problem, so they wouldn’t be able to explain to the AI how to fix it.
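Roughly what I mean, as a Python sketch (the real code was assembly; the names here are made up purely for illustration):

```python
# Hypothetical Python sketch of the bounce bug described above (the real
# thing was assembly; ball_dx/ball_dy are made-up names for illustration).

def bounce_off_paddle(ball_dx, ball_dy):
    """Reflect the ball off a vertical paddle without losing its angle."""
    # What the AI kept doing: overwrite the velocity with a fixed value,
    # e.g. ball_dx, ball_dy = 1, 0 -- which snaps movement to 90° increments.
    # What it needed to do: negate just the horizontal component, so the
    # vertical component (and therefore the angle) is preserved.
    return -ball_dx, ball_dy

print(bounce_off_paddle(-2, 3))  # (2, 3): same angle, mirrored horizontally
```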
I believe AI is going to become an unimaginably useful tool, and we probably don’t yet understand just how useful it’s going to be. But unless they actually make AGI, it isn’t going to replace programmers.
If they do make AGI, all bets are off; it will probably go build a Dyson sphere or something at that point, and we will have no way of understanding what it’s doing.
I tried to get it to build a game of checkers and spent an entire day on it; in the end I could have built the thing myself. Each iteration got slightly worse, and each fix broke more than it corrected.
AI can generate an “almost-checkers” game nearly perfectly every time, but once you start getting into more complex rules like double jumping, it just shits the bed.
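To give a sense of what “more complex rules” means, here’s a rough Python sketch (not the AI’s output, just an illustration of how captures have to chain recursively once double jumps are allowed):

```python
# Toy sketch (not from the original game) of why double jumps are harder:
# legal captures have to be found recursively from wherever the piece lands.

def jump_chains(board, pos, captured=frozenset()):
    """All capture sequences for a forward-only piece at pos.

    board maps (row, col) -> 'me' or 'opp'; empty squares are absent.
    """
    row, col = pos
    chains = []
    for dc in (-1, 1):                  # jump diagonally left or right
        over = (row + 1, col + dc)      # square being jumped over
        land = (row + 2, col + 2 * dc)  # square the piece lands on
        if not (0 <= land[0] < 8 and 0 <= land[1] < 8):
            continue                    # landing square is off the board
        if board.get(over) == 'opp' and over not in captured and land not in board:
            # Recurse from the landing square: this is where double and
            # triple jumps come from, and where "almost-checkers" breaks down.
            tails = jump_chains(board, land, captured | {over})
            chains += [[land] + t for t in tails] or [[land]]
    return chains

board = {(0, 0): 'me', (1, 1): 'opp', (3, 3): 'opp'}
print(jump_chains(board, (0, 0)))  # [[(2, 2), (4, 4)]] -- a double jump
```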
What these headlines fail to capture is that AI is exceptionally good at bite-sized, pre-defined tasks at scale, and that is the game changer. It’s still very far from being capable of building an entire app on its own. That feels more like 5-10 years out.
Yeah, I don’t see AI replacing any developers working on an existing, moderately complex codebase. It can help speed up some tasks, but it’s far from being able to take a requirement and turn it into code that edits the right places and doesn’t break everything.
How much longer until cloud CEOs are a thing of the past? Wouldn’t an AI sufficiently intelligent to solve technical problems at scale also be able to run a large corporate division? By the time this is actually viable, we are all fucked.
Yeah, how’s that goin’?
It can write really buggy Python code, so… Yeah, seems promising
It does a frequently shitty job of writing docstrings for simple functions, too!
Almost like dealing with real engineers…
The last time I tried to let some AI write actual code, it didn’t even compile 🙂 And another time, when it actually compiled, it was trash anyway, and I had to spend as much time fixing it as I would have spent writing it myself in the first place.
So far I can only use AI as a glorified search engine 😅
We are now X+14 months away from AI replacing your job in X months.
I don’t get why the framing isn’t that AI would help programmers build way better things. If it can actually replace a programmer, I think it’s probably just as capable of replacing a CEO. I bet replacing the CEO is the better use case.
You can hire a lot of programmers for the cost of one CEO.
Something I’ve always found funny about the “AI will replace programmers soon” claim is that it means AIs can create AIs, and isn’t that basically the end of the economy?
Every office worker is out of a job just like that, and labourers only have as long as it takes to sort out the robot bodies; then everyone is out of a job.
You thought the great recession was bad? You ain’t seen nothing!
amazon cloud CEO reveals that they have terminal CEO brain and have no idea what reality is like for the people they’re in charge of
checks out
It’s really funny how AI “will perform X job in the near future”, but you barely, if ever, see articles saying that AI will replace CEOs in the near future.
Somewhere there is a dev team secretly programming an AI to take over bureaucratic and managerial jobs but disguising it as a code-writing AI to their CTO and CEO
C-suites are like Russian elites.
The latter are some thieves who’ve inherited a state from the Soviet leadership. They have a layman’s idea of what a state and a country are, what history itself is, plus whatever a taxi driver would say. For the last 20 years they have been trying to apply that weird idea to reality, as if playing Hearts of Iron, because they want to be great and to be in charge of everything that happens.
The former heard in school that there were industrial revolutions and such, and they too want to be great, so they believe every stupid hype about someone being replaced by some great new technology, and of course they want to be in charge of that process.
While in actuality, with today’s P2P technologies, CEOs are the ones most likely to be replaced, if we use our common sense, and without “AI”, of course. Just by decentralized systems allowing much bigger, more powerful, more competitive cooperatives than before, ones that form and disband very easily.
Until an AI can get clear, reasonable requirements out of a client/stakeholder our jobs are safe.
So never, right?
If the assumption is that a PM holds all the keys…
They could churn out garbage and scams for the idiots on Facebook, sure.
Honestly I feel even an AI could write better code then what some big tech software uses lol
Big words from someone who can’t even write “than” properly.
?
The paramount+ app doesn’t even know how to properly hide the pause icon after you hit resume ffs. It’s been months.
But Human QAs … Human QAs everywhere!
Wasn’t it the rabbit 1 scammer who said programmers would be gone in 5 years, like 3 years ago?
The argument I see most is that AI is dumb and can’t do it yet, so we don’t need to worry about this.
To me, it’s not about whether it can or not. If the people in charge think it can, they’ll stop hiring. There is a lot of waste in some big companies so they might not realize it’s not working right away.
Source: I work for a big company that doesn’t do things efficiently.
That big company will go through a crisis, realize its mistake, and quickly start hiring again, or it will fail and disappear
AI is terrible at solving real problems through programming. As soon as the problem is not technical in nature and needs a decision to be made based on experience, it falls flat on its face.
It will never understand context and business rules and things of that nature to the same extent that actual devs do.
I’d believe AI will replace human programmers when I can tell it to produce the code for an entire video game in a single prompt, one that can stand up to the likes of New Vegas, has zero bugs, and offers hundreds of hours of content on a first playthrough thanks to vast exploration.
In other words, I doubt we’ll see human programmers going anywhere any time soon.
Edit:
Reading other replies made me remember how I once, for fun, tried using a jailbroken copilot program to do Python stuff slightly above my already basic coding skill, and it gave me code that tried importing something that absolutely doesn’t exist. I don’t remember what it was called, since I deleted the file while cleaning up my laptop the other day, but I sure as hell looked it up before deleting it and found nothing.
Could you imagine Microsoft replacing Windows engineers with a ChatGPT prompt? What would that prompt even look like?
To be honest, this could be an example of where AI could do marginally better. I don’t mean that because of code quality or functionality. I mean it in the sense of MS software getting absolutely fucked by the internal competition and stack-ranking fostered during the Ballmer years. The APIs are inconsistent and there is a ton of partially implemented stuff that will never be pushed to completion because everyone who worked on it was fired.
An AI might be able to implement things without intentionally sabotaging itself, but because LLMs are at the forefront of what would be used and do not have the capability of intention or of understanding context, I’m a bit pessimistic.
Honestly, GPT has strengthened my coding skills… for the simple reason that the handful of times I’ve asked it to do something, the response I get back is so outlandish that I go “That CAN’T be right” and figure out how to do it myself…
Research with extra steps… I get it, but still…
I feel like it’s whispering bad advice at me while I’m typing. It’s good at auto-completing the most rudimentary stuff, but I have a hard time imagining it completing even one file without injecting dangerous bugs, let alone a large refactor.
The best copilot can do is autofill lines that everyone’s written a million times. That’s not nothing, but it ain’t replacing a human brain any time soon.