I am absolutely disgusted by the idea of people using ChatGPT for serious coding work.
Maybe I am just getting old, but the idea of using a non-deterministic tool that can hardly be reasoned about and will straight up hallucinate facts for any professional work sounds insane to me.
Yes, I do see the value for junior devs, as I am sure it can drastically increase their output in the short term, but aren't they shooting themselves in the foot in the long run? That might sound elitist, but at the end of the day you will need to learn to read technical documentation anyway, and once you understand it there is no need for ChatGPT.
Yes, if you are constantly hopping from one framework of the month to the next big thing, you just don't have the time to learn anything in depth and then again ChatGPT can help. But do you really want to live like this? Instead of band-aid solutions, we might want to push for less hype-driven and more pragmatic development styles that allow us the time to learn our frameworks.
I'm not an expert in every area I need to touch. I don't work at a place like Google, where there are teams I can rely on that are dedicated to solving the same problems across the org.
Getting a simple example roughly tailored to my needs, which I can use as a launching point, is often extremely useful.
For example, when I encounter scientific libraries with poor documentation, I could spend half my day reading through it or searching Stack Overflow for a good explanation… or I can ask GPT-4 for a specific implementation using a specific library, then ask it to explain anything I don't understand.
These tools aren't solving any new problems for me; they are replacing slower, less responsive systems I already relied on for research and development.
> a non-deterministic tool that can hardly be reasoned about and will straight up hallucinate facts for any professional work
This sounds like a decent enough description of the human brain.
Don't get me wrong: I'm not for a moment suggesting that what we have today is anything approaching "Artificial General Intelligence", and I am extremely worried about the inevitable massive damage AI is going to do to our world. But I do think it's a little funny that many people's specific objections to it amount to "it can do what people do". Yes. That's the idea.
The main concern with AI usage is not that it will write bad buggy code or lie: we already do that ourselves plenty, so that's not a novel skill in the professional arena. The main concern is that we can scale that stupidity.
But that's a problem of scale: your own usage isn't really going to do you notable damage at the individual level.
> Yes, if you are constantly hopping from one framework of the month to the next big thing, you just don't have the time to learn anything in depth and then again ChatGPT can help.
I actually think the opposite is true. Having used ChatGPT quite a lot for work (by mandate; I wouldn't have chosen to either, but am glad now that I had to), I've found it's really very good at generating bad code and being confidently wrong (again, much like people): if you ask it about a subject you're not deeply knowledgeable about, it's going to lead you astray. The most astute way to use it is actually in going the last mile on something you're already very confident in, so that you can correct it as needed.
It's good at generating simple things that are 60% to 100% correct.
It's good at summarizing high-level concepts, or how multiple high-level concepts relate, with 60% to 100% accuracy.
It's OK at helping you think through things to troubleshoot when you have an error you're not sure what to do with.
To be honest, I would say it's actually not going to be good for junior devs, because they don't have the skills to fact-check properly and quickly. But for more senior folks it's very easy to immediately see what's wrong, ignore those parts or ask it for clarification, and use what's right.
If you mostly know what you're doing, it can be very helpful to get immediate feedback, niche examples targeted exactly to what you're working on, or summaries of whatever high level concept you need a little more clarification on.
Much, much faster than spelunking through Google.
As a concrete example, just this morning I had it summarize Python's asyncio Queues versus Tasks: what they are, a few examples of using them, and when you would want to use one versus the other. All in just a few minutes.
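To give a flavor of the distinction it explained (this is my own minimal sketch from memory, not ChatGPT's verbatim output): a Queue passes work between coroutines, while Tasks are how you get coroutines running concurrently in the first place.

    import asyncio

    # Minimal sketch: a Queue hands work from a producer to a consumer,
    # while create_task wraps each coroutine in a Task so both run
    # concurrently on the event loop.

    async def producer(queue):
        for i in range(3):
            await queue.put(i)    # hand work off to the consumer
        await queue.put(None)     # sentinel: tell the consumer to stop

    async def consumer(queue):
        while True:
            item = await queue.get()
            if item is None:
                break
            print(f"processed {item}")

    async def main():
        queue = asyncio.Queue()
        await asyncio.gather(
            asyncio.create_task(producer(queue)),
            asyncio.create_task(consumer(queue)),
        )

    asyncio.run(main())

The rough rule of thumb: reach for a Queue when coroutines need to hand work to each other, and for a Task any time a coroutine should run concurrently with other work.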
I've been giving it a shot based on recommendations from my bosses, and I'm a senior+ dev.
80% of the time, the suggestions are too generic for our codebase and have to be patched up with our variable names, methods, etc. About half the time it completely misses the mark/intent of what is being written and offers an auto-complete for a different problem.
20% of the time, it actually acts like a full-line or full-section autocomplete. One area I've found it particularly helpful in is writing tests. For example, I added a new integration yesterday that touched a bunch of interface files for those integrations. Copilot was able to pick up on that and auto-generate a 90%-functional test of ~30 lines, including setup and assertions.
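For the curious, the result was roughly the shape of this pytest-style sketch. Every name in it (Record, NewIntegration, fetch, sync) is invented for illustration, since I can't share our actual code, and the class definitions are only there to make the sketch self-contained; Copilot generated just the test.

    from dataclasses import dataclass
    from unittest.mock import MagicMock

    # Hypothetical stand-ins for the real integration interfaces.
    @dataclass
    class Record:
        id: int
        name: str

    class NewIntegration:
        def __init__(self, client):
            self.client = client

        def sync(self):
            return [Record(**row) for row in self.client.fetch()]

    def test_sync_fetches_and_maps_records():
        # Setup: stub out the upstream client the interface depends on.
        client = MagicMock()
        client.fetch.return_value = [{"id": 1, "name": "widget"}]

        result = NewIntegration(client).sync()

        # Assertions: the client was called once and the payload was mapped.
        client.fetch.assert_called_once()
        assert result == [Record(id=1, name="widget")]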
I'm not convinced it's saving me time yet since I have to correct it/ignore it most of the time. When it works, it's cool. And like another comment said, typing speed has never been a bottleneck for me (record of 155 wpm on typing tests, I probably sit around 120-130 for average day-to-day typing).
I often start writing (short) programs based on something else I wrote - I take what I did before and hack it to start doing what I need now. Even when you end up changing everything there's some advantage for me in not having to start entirely from scratch. I can see it being useful for that.
I feel exactly as you do. I've reviewed code, found all sorts of weirdness, and then realised that the developer used ChatGPT to write it, and probably its unit tests too. It all worked within itself but clearly didn't work in the real world at all! :-)
Ultimately, I cannot really be bothered to use ChatGPT. It might be my backwardness, but I can be bothered to use an IDE when it helps (instead of Vim), so I don't think I'm a hopeless fundamentalist. Why has my brain written it off? I don't know, but I know I am lazy and I do things that make life easier.
Personally, _I'm_ terrified of people using ChatGPT for serious coding work. They will be seeding the future with vulnerable/buggy software that no one understands.
Granted, us humans are doing that now. Just many orders of magnitude slower. Probably slow enough that we can find/fix the important stuff.
The other aspect that terrifies me is the potential for nation state entities with deep pockets to inject vulnerabilities. What would it be worth to the NSA to seed the future with programs they could exploit?
LLMs are also capable of analyzing, describing, and debugging software. There have been several papers published on this. So I wouldn't worry too much about any new buggy software being produced. In a few years, LLMs will probably be issuing PRs on your repos to fix things you haven't gotten around to or noticed yet.
This actually seems like a great use case: read the code and flag potential bugs for review. Although if it wrote the code in the first place, how would it identify the bug? What it produced was already something close to the most likely result given what it had seen so far.
Are you able to back up the claim that ChatGPT code, when refined and deployed to production, is more buggy than code that is patchworked together from 10-20 Stack Overflow posts until something sticks?
I didn't make that assertion. My fears are based on the speed at which code of unknown quality, which is also poorly understood, can be produced. I specifically said:
> Granted, us humans are doing that now. Just many orders of magnitude slower. Probably slow enough that we can find/fix the important stuff.
What's so sacrosanct about writing code? If I vaguely know what I want to do in bash but don't know everything off the top of my head, I could search Stack Overflow, or I could just type it into ChatGPT, apply some judgement, and run it. We're not slaughtering a sacred cow; it's just another tool in the toolbox, alongside the manual and Google.
Don't be. There are some use cases that fit very well.
For example, we plan to use GPT-4 for generating text that ends up being seen and read by our customers. (I can't disclose the details, as it's close to the core of our business model.) We need a _lot_ of text, and it's in a style that is time-consuming to think through and write by hand. However, all the generated data is checked for correctness by humans. I think these tips are excellent for designing the generating process so that the end result is of as high a quality as could reasonably be expected.
Edit: Oh, you mean coding work in the sense of generating code for developers, not in the sense of using it as a component of a data-generating/transforming system? In that case, I pretty much agree.
Edit 2: Oof, I just realized that I mistook this whole thread for another one posted earlier on Hacker News, so I thought we were talking about engineering systems that have GPT-4 as a subsystem: https://platform.openai.com/docs/guides/gpt-best-practices
Text generation with human proofreading/editing is definitely a valid use case. Maybe not for top-tier prose, but if you need large amounts of decent-quality text, it is the way to go. It definitely makes some new business ideas viable.
I've found the opposite to be true, actually. I'd prefer it in the hands of someone more experienced.
Real-world example: Hey ChatGPT, write me some Camel routes to take an input off a JMS queue, wire it up to the Camel Salesforce getSObject endpoint, use these headers for the lookup, and write some unit tests.
Actual result: code that was 99% correct but messed up a couple of parameters and used a slightly wrong approach to mocking some things in the unit test. Having used all of these frameworks extensively, I took ~5 minutes to fix it, and it certainly saved time, partially because I'm used to reviewing code from people who make similar mistakes.
I can't imagine that working out as well if it were a truly junior engineer (not just junior in name) who didn't understand what was going on in the code very well.
And this is just the beginning. It is hard to imagine not using this for the bulk of the work within a few years.
I don’t think it’s more insane than copying and pasting code from anywhere on the internet. You have to understand what you use and you must not trust it blindly, whatever the source.
Well, I don't copy and paste code from the internet either. It's fine for learning but at some point you should be able to write your own code from scratch.
I've been moving a large HTML/Nunjucks CMS to Next.js. It's pretty decent at TSX, and when I said "this is the template, rewrite it based on this example" it did a decent job and definitely saved me time. For rewriting things like these it's been a time saver. There were also tasks where it mostly wasted my time.
Consider whether or not this is an ego-driven reaction. Do you have the same feelings about your compiler? It also writes code that you likely don't read and likely don't understand in depth.
A non-deterministic compiler would be pretty terrifying. In that case I would expect many people to still be writing assembly by hand for any serious work.
Yes, there are missing or unclear language specifications and compiler bugs, and sometimes you get surprised by certain optimizations, but you can reason about compilers well enough for practical purposes. Plus, it's not economically viable to do without them anymore, sadly.