ChatGPT and Programming

Soldato
Joined
20 Dec 2004
Posts
15,845
I think it will take longer than people expect to actually replace developers. I believe it will be a long time before AI can genuinely think and reason; what we're seeing now is a very powerful predictive text system, not actual machine intelligence. Don't get me wrong, generative AI has already changed, and will keep changing, the way a developer works. It will make us more productive. Junior developers will still exist, but their job will look and feel different.

Actual AGI is also extremely scary if you really think about it.
I know people like to dismiss LLMs as just predictive text systems, but describing the neural models being built inside them as just text prediction doesn't do them justice at all. There was quite an interesting article the other day about an LLM that was being tested on some data and worked out it was probably being tested, based on the data it was being fed.

Basically, just because the means of communicating with the NN is a natural language interface that predicts the next word doesn't mean the NN itself can't continue to evolve and display intelligence as the network grows larger and more complex.

Ultimately, any NN has to have physical IO, be it a text interface, a camera (an eye), or other senses such as temperature or sound.
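Worth spelling out what the "predicting the next word" interface actually is: just a thin loop around whatever the network computes. A minimal Python sketch of that loop, with a toy stand-in for the model (my invention, obviously nothing like a real LLM):

Code:
# Minimal sketch of the "predictive text" loop: the chat interface is just
# repeated next-token prediction, whatever is going on inside the network.
# toy_model() is a made-up stand-in for illustration, not a real LLM.
import math
import random

def toy_model(tokens):
    # Stand-in for the neural network: returns a score (logit) per candidate
    # next token. A real LLM computes these from billions of parameters.
    vocab = ["the", "cat", "sat", "on", "mat", "."]
    return {w: random.uniform(-1, 1) for w in vocab}

def sample_next(logits, temperature=1.0):
    # Softmax over the logits, then sample one token.
    exps = {w: math.exp(l / temperature) for w, l in logits.items()}
    total = sum(exps.values())
    r, cumulative = random.random() * total, 0.0
    for w, e in exps.items():
        cumulative += e
        if r <= cumulative:
            return w
    return w

tokens = ["the", "cat"]
for _ in range(5):                      # generate five more tokens
    tokens.append(sample_next(toy_model(tokens)))
print(" ".join(tokens))

The point being that the loop stays this simple however complex the network behind toy_model() gets.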
 
Soldato
Joined
2 May 2004
Posts
19,946
I'm not dismissing them at all. I understand there is, and will be, a lot more going on than predictive text; I'm just saying there's a really long way to go before they can actually replace a developer, and I don't believe it's going to be only 10-15 years before 99% of 'average' programmers are replaced.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,170
I had an interesting one with the much maligned DPD AI support chatbot the other day. I had a parcel where the sender had entered the address incorrectly: the house name was in the county part of the address, and the house name field was just a -. On the first two attempted deliveries the drivers just went to the postcode, went "meh" and failed the drop without even trying to figure it out. Human support just gave replies of "Sorry we can't change the address for this consignment" or "Sorry we don't allow customers to contact the driver", failing to recognise the problem or what I was actually asking, and seemingly just reacting to key words in the query.

I figured I had nothing to lose putting the same description of the issue to the AI chat, just describing the problem and not suggesting how to resolve it, and it came straight back with something like "Is this correct: I can add a delivery note for the driver that the house name is wrong in the address and to use X instead?" and it was sorted. It would have to have at least some interpretive ability, rather than just complex keyword matching, to have figured out the slightly less straightforward course of action.
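To illustrate the difference: a responder that just reacts to key words gives you exactly the canned brush-off I got from human support, because a description of the underlying problem trips the wrong trigger. A rough Python sketch, with the rules and replies invented for the example (this is obviously not DPD's actual system):

Code:
# Rough illustration of why keyword matching falls over on this kind of
# query. The keywords and canned replies are invented for the example.
CANNED_REPLIES = {
    "address": "Sorry we can't change the address for this consignment.",
    "driver": "Sorry we don't allow customers to contact the driver.",
}

def keyword_bot(query):
    # Reacts to the first keyword it spots, regardless of what is being asked.
    for word in query.lower().split():
        if word.strip(".,") in CANNED_REPLIES:
            return CANNED_REPLIES[word.strip(".,")]
    return "Sorry, I didn't understand that."

query = ("The sender put the house name in the county part of the address, "
         "so the drivers can't find the property.")
print(keyword_bot(query))   # prints the irrelevant "can't change the address" reply

# An LLM-backed bot, by contrast, is effectively asked to map the whole
# description onto whichever actions it is allowed to take (add a note,
# escalate, etc.). Hypothetical call, shown only for the contrast:
# action = interpret_with_llm(query, allowed_actions=["add_delivery_note", "escalate"])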
 
Associate
Joined
4 Jul 2009
Posts
1,004

That's both very cool and annoying!
While this looks like a simple case of changing one field in a delivery address, it won't always be, and it might not even be simple here because of how the system was designed in the first place.

The chatbot has likely been bolted on to a mixture of new and legacy software, with very limited access to control that underlying system.
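That's the pattern I'd expect: the bot sits in front of the old system and is only allowed a handful of narrow actions. A rough Python sketch of what that kind of bolt-on might look like; all of the names and actions here are guesses for illustration, not how DPD actually built it:

Code:
# Sketch of an AI front-end bolted onto a legacy system with very limited
# access: the bot can't edit the consignment itself, only request the few
# actions the integration exposes. Every name here is invented.
from dataclasses import dataclass

@dataclass
class LegacyConsignmentSystem:
    # Stand-in for the old software; the address record is read-only
    # from the bot's side in this sketch.
    address: dict

    def add_delivery_note(self, note: str) -> None:
        # The one narrow write the integration allows.
        print(f"Driver note added: {note}")

ALLOWED_ACTIONS = {"add_delivery_note", "escalate_to_human"}

def handle_request(system: LegacyConsignmentSystem, action: str, detail: str):
    # The chatbot's interpretation is funnelled through this whitelist, so
    # even a clever model can't do more than the bolt-on permits.
    if action not in ALLOWED_ACTIONS:
        return "Sorry, I can't do that for this consignment."
    if action == "add_delivery_note":
        system.add_delivery_note(detail)
        return "I've added a note for the driver."
    return "I've passed this to a human agent."

legacy = LegacyConsignmentSystem(address={"house": "-", "county": "<house name>"})
print(handle_request(legacy, "change_address", "fix the house name"))
print(handle_request(legacy, "add_delivery_note",
                     "House name is wrong in the address; use the name shown in the county field."))

Which would also explain why "add a note for the driver" was on the menu but "fix the address" wasn't.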
 