I am jack's complete lack of surprise.
The biggest value I get from AI in this space is when I get handed a pile of spaghetti and ask it for an initial overview.
So when the AI bubble burst, will there be coding jobs available to clean up the mess?
There already are. People all over LinkedIn are changing their titles to "AI Code Cleanup Specialist".
About that "net slowdown". I think it's true, but only in specific cases. If the user already knows how to write code well, an LLM might be only marginally useful, or even useless.
However, there are ways to make it useful, but they require specific circumstances. For example, if you can't be bothered to write a simple loop, you can use an LLM to do it. Give the boring routine to an LLM, and you can focus on naming the variables in a fitting way or adjusting the finer details to your liking.
Can't be bothered to look up the exact syntax for a function you use only twice a year? Let an LLM handle that, and tweak the details. Now you didn't spend 15 minutes reading Stack Overflow posts that don't answer the exact question you had in mind. Instead, you spent 5 minutes on the whole thing, and that includes the tweaking and troubleshooting parts (see the sketch below for the kind of routine I mean).
If you have zero programming experience, you can use an LLM to write some code for you, but prepare to spend the whole day troubleshooting something that is essentially a black box to you. Alternatively, you could ask a human to write the same thing in 5-15 minutes depending on the method they choose.
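A hypothetical sketch of the "twice a year" routine described above, not anything from the original comment: reading a CSV and aggregating by a column, the kind of glue code an LLM can draft in seconds and you then tweak. The field names and CSV layout are made up for illustration.

```python
# Hypothetical illustration of the boring routine meant above.
# Column names ("city", "age") and the CSV layout are invented.
import csv
from collections import defaultdict

def average_age_by_city(path: str) -> dict[str, float]:
    """Read a CSV with 'city' and 'age' columns and return the mean age per city."""
    ages_by_city: dict[str, list[float]] = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            ages_by_city[row["city"]].append(float(row["age"]))
    return {city: sum(ages) / len(ages) for city, ages in ages_by_city.items()}
```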
This is a sane way to use an LLM. Also, pick your poison: some bots are better than others for specific tasks. It's kinda fascinating to see how other people solve coding problems, and that is essentially on tap with a bot; it will churn out as many examples as you want. It's a really useful tool for learning the syntax and libraries of unfamiliar languages.
On one extreme of LLMs there is this insane hype, and at the other extreme a great pessimism, but in the middle is a nice labour-saving educational tool.
No shit, Sherlock!
So is the profit it was foretold to generate, but it actually costs more money than it's generating.
According to Deutsche Bank the AI bubble is ~~a~~ the pillar of our economy now.
So when it pops, I guess that's kinda apocalyptic.
Edit - strikethrough
Only for taxpayers ☝️
It's great for stupid boobs like me, but only to get you going. It regurgitates old code; it cannot come up with new stuff. Lately there have been fewer Python errors, but again, the stuff you can do is limited. At least for the free stuff that you can get without signing up.
Yeah, I use it for Home Assistant. It's amazingly powerful... and so incredibly dumb.
It will take my if/and statements and shrink them to a third of the length, while being twice as robust... while missing that one of the arguments is entirely in the wrong place.
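A toy illustration of that failure mode, in plain Python rather than Home Assistant YAML, with entirely hypothetical names: the "improved" version is a third of the length and reads as more robust, but the comparison operands end up swapped.

```python
# Hypothetical illustration: the rewrite is shorter, but puts one argument
# of the comparison in the wrong place (target < temp instead of temp < target).

def should_heat_original(temp: float, target: float, is_home: bool, is_night: bool) -> bool:
    # Verbose original: repetitive, but correct.
    if not is_home:
        return False
    if is_night:
        return False
    if temp < target:
        return True
    return False

def should_heat_refactored(temp: float, target: float, is_home: bool, is_night: bool) -> bool:
    # LLM-style rewrite: compact and tidy... and subtly wrong.
    return is_home and not is_night and target < temp
```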
> It regurgitates old code, it cannot come up with new stuff.
The trick is, most of what you write is basically old code in new wrapping. In most projects, I'd say the new and novel part is maybe 10% of the code. The rest is things like setting up db models, connecting them to base logic, setting up views and API endpoints, decoding the message on the UI side, displaying it to the user, handling input back, threading things so the UI doesn't hang, error handling, input data verification, basic unit tests, setting up settings and supporting reading them from a file or env vars, making the UI look not horrible, adding translatable text, and so on and on and on. All of that has been written in some variation a million times before. All of it can be written (and verified) by a half-asleep competent coder.
The actual new interesting part is gonna be a small small percentage of the total code.
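A minimal, hypothetical sketch of the kind of boilerplate listed above (a model, input verification, persistence), using only the Python standard library. The table and field names are invented; a real project would typically reach for an ORM and a web framework for the same steps.

```python
# Hypothetical boilerplate sketch: model, input validation, persistence.
import sqlite3
from dataclasses import dataclass

@dataclass
class User:
    name: str
    email: str

def validate(payload: dict) -> User:
    """Basic input verification: required fields and a trivial email check."""
    name = str(payload.get("name", "")).strip()
    email = str(payload.get("email", "")).strip()
    if not name or "@" not in email:
        raise ValueError("name and a valid email are required")
    return User(name, email)

def save(conn: sqlite3.Connection, user: User) -> int:
    """Persist the user and return its new row id."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)"
    )
    cur = conn.execute(
        "INSERT INTO users (name, email) VALUES (?, ?)", (user.name, user.email)
    )
    conn.commit()
    return cur.lastrowid

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    print(save(conn, validate({"name": "Ada", "email": "ada@example.com"})))
```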
Oh wow. No shit. Anyway!
I'm not a programmer in any sense. Recently, I made a project where I used Python and a Raspberry Pi and had to train some small models on the KITTI dataset. I used AI to write the broad structure of the code, but in the end it took me a lot of time going through the Python documentation, as well as the documentation of the specific tools/modules I used, to actually get the code working. Would an experienced programmer get the same work done in an afternoon? Probably. But the code the AI output still had a lot of flaws. Someone who knows more than me would probably input better prompts and better follow-up requirements and probably get a better structure from the AI, but I doubt they'd get complete code. In the end, you have to know what you're doing to use AI efficiently, and you still have to polish the code into something that actually works.
AI companies and investors are absolutely overhyping its capabilities, but if you haven't tried it before I'd strongly recommend doing so. For simple bash scripts and Python it almost always gets something workable first try, genuinely saving time.
LLMs are pretty terrible for nearly every other task I've tried. I suspect it's because the same amount of quality training data just doesn't exist for other fields.
The people talking about AI coding the most at my job are architects and it drives me insane.
I'd much rather write my own bugs to have to waste hours fixing, thanks.
The good news is: AI is a lot less impressive than it seemed at first.
The bad news is: so are a lot of jobs.
I can't even get Copilot to write Vitest files for React without making a mountain of junk code that describes drivel.