this post was submitted on 27 Jun 2024
990 points (98.1% liked)

Programmer Humor

Cross posted from: https://lemm.ee/post/35627632

[–] bjoern_tantau@swg-empire.de 93 points 4 months ago (4 children)

Yeah, in the time it takes me to describe the problem to the AI, I could program it myself.

[–] takeda@lemmy.world 52 points 4 months ago (2 children)

This is why it's called a programming language: it only exists to tell the machine what to do in a way that, unlike natural language, is unambiguous.
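To make the ambiguity point concrete, here's a toy sketch (all names invented for illustration): even a request as simple as "sort the users by age" hides decisions that code is forced to state explicitly.

```python
# Natural language: "sort the users by age, oldest first"
# -- ambiguous: what about ties? missing ages? Code forces every
# one of those decisions to be made explicitly:

users = [
    {"name": "Ada", "age": 36},
    {"name": "Grace", "age": 85},
    {"name": "Linus", "age": None},  # missing age -- code must decide
]

# Explicit choices: missing ages sort last; ties keep input order
# (Python's sort is stable).
by_age = sorted(users, key=lambda u: (u["age"] is None, u["age"] or 0))

print([u["name"] for u in by_age])  # ['Ada', 'Grace', 'Linus']
```

The natural-language version never said what to do with Linus; the code can't avoid saying it.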

[–] catastrophicblues@lemmy.ca 17 points 4 months ago (1 children)

Ugh, I can't find the xkcd about this, where the guy goes, "You know what we call precisely written requirements? Code," or something like that.

[–] abcd@feddit.de 12 points 4 months ago (1 children)

This reminds me of a colleague who was always ranting that our code was not documented well enough. He did not understand that documenting code in easily understandable sentences for everybody would fill whole books, and that a normal person would not be able to keep the code path on their mental stack while reading page after page. Then he wanted at least the shortest possible summary of the code, which of course is the code itself.

The guy basically did not want to read the code to understand the logic behind it. When I took an hour and literally read the code to him, explaining what I was reading, including the well-placed comments here and there, everything was clear.

AI is like this, in my opinion. Some people waste hours generating code they then can't debug for days because they don't understand what they're reading, when it would take maybe two hours of thinking and a day of implementing and testing to get the job done.

I don't like this trend. It's like the people who can't read docs or longer texts anymore and need some random person's 43-minute YouTube video to write code they don't understand. Taking shortcuts in life rarely goes well in the long run. You have to learn and refine your skills every day to become and stay competent.

AI is a tool in our toolbox. You can use it to be more productive. And that’s it.

[–] Norgur@fedia.io 25 points 4 months ago (1 children)

This goes for most LLM tasks. The time it takes to get the word calculator to write a letter could just as easily be spent writing the damn letter.

[–] emptyother@programming.dev 15 points 4 months ago

It's doing pretty well when it's producing a few words at a time under supervision. It also does that better than newbies.

Now if only the people below newbies, the ones who don't even bother to learn, weren't hoping to use it to underpay average professionals... And if it weren't trained on copyrighted data. And didn't consume already limited resources like power and water.

[–] MagicShel@programming.dev 12 points 4 months ago (2 children)

I think there might be a lot of value in describing it to an AI, though. It takes a fair bit of clarity of thought to get something resembling what you actually want. You could use a junior or rubber duck instead, but the rubber duck doesn't make stupid assumptions to demonstrate gaps in your thought process, and a junior takes too long and gets demoralized when you have to constantly revise their instructions and iterate over their work.

Like the output might be garbage, but it might really help you write those stories.

[–] Distant_Foreground@lemm.ee 15 points 4 months ago (2 children)

When I'm struggling with a problem it helps me to explain it to my dog. It's great for me to hear it out loud and if he's paying attention, I've got a needlessly learned dog!

[–] 0x0@lemmy.dbzer0.com 11 points 4 months ago

The needlessly learned dogs are flooding the job market!

[–] IronKrill@lemmy.ca 6 points 4 months ago (1 children)

I have a bad habit of jumping into programming without a solid plan, which results in lots of rewrites and wasted time. Funnily enough, describing to an AI how I want the code to work forces me to lay out a basic plan and get my thoughts in order, which makes building the final product immensely easier.

This doesn't require AI, it just gave me an excuse to do it as a solo actor. I should really do it for more problems, because I can wrap my head around things better when thinking in human-readable terms rather than about which programming construct to use.

[–] CylustheVirus@beehaw.org 6 points 4 months ago

A rubber ducky is cheaper and not made by stealing others' work. Also cuter.

[–] Korne127@lemmy.world 62 points 4 months ago (3 children)

In my experience, you can't expect it to deliver great working code, but it can always point you in the right direction.
There were some situations in which I just had no idea how to do something, and it pointed me to the right library. The code itself was flawed, but with this information, I could use the library documentation and get it to work.

[–] uhN0id@programming.dev 13 points 4 months ago

ChatGPT has been spot on for my DDLs (SQL data definition scripts). I was working on a personal project and was feeling really lazy about setting up a Postgres schema, so I described the application in detail and asked for a Postgres DDL. It responded with pretty much what I would have written (maybe better): sensible relationships between tables and solid naming conventions, with very little work left for me. I love it for boilerplate, or, like you said, just getting me going. Super complicated code usually doesn't work perfectly, but I always use it for my DDLs and the like now.
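As a flavor of the kind of schema DDL being described (the commenter used Postgres; SQLite is used here only to keep the sketch runnable, and the table and column names are invented):

```python
import sqlite3

# A minimal two-table schema with a foreign-key relationship,
# the sort of boilerplate DDL an LLM is good at drafting.
ddl = """
CREATE TABLE author (
    id   INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE post (
    id        INTEGER PRIMARY KEY,
    author_id INTEGER NOT NULL REFERENCES author(id),
    title     TEXT NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)

# Confirm the schema actually materialized.
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
print(tables)  # ['author', 'post']
```

The generated DDL is easy to review precisely because it's declarative: you read the tables and relationships straight off the script.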

The real problem is when people don't realize something is wrong and then get frustrated by the bugs. Though I guess that's a great learning opportunity on its own.

[–] xia@lemmy.sdf.org 11 points 4 months ago (1 children)

It can point you in a direction, for sure, but sometimes you find out much later that it's a dead-end.

[–] danc4498@lemmy.world 5 points 4 months ago* (last edited 4 months ago)

It’s the same with using LLM’s for writing. It won’t deliver a finished product, but it will give you ideas that can be used in the final product.

[–] MystikIncarnate@lemmy.ca 41 points 4 months ago (13 children)

AI in the current state of technology will not and cannot replace understanding the system and writing logical and working code.

GenAI should be used to get a start on whatever you're doing, but shouldn't be taken beyond that.

Treat it like psychopathic boilerplate.

[–] CanadaPlus@lemmy.sdf.org 13 points 4 months ago* (last edited 4 months ago) (8 children)

Treat it like psychopathic boilerplate.

That's a perfect description, actually. People debate how smart it is - and I'm in the "plenty" camp - but it is psychopathic. It doesn't care about truth, morality or basic sanity; it craves only to generate standard, human-looking text. Because that's all it was trained for.

Nobody really knows how to train it to care about the things we do, even approximately. If somebody makes AGI soon, it will be by solving that problem.

[–] nikaaa@lemmy.world 41 points 4 months ago (1 children)

My dad's re-learning Python for work right now, and AI has saved him a couple of times: he'd have no idea how to even start, but AI points him in the right direction, mentioning the correct functions to use and all. He can then look up the details in the documentation.

[–] GodIsNull@discuss.tchncs.de 10 points 4 months ago* (last edited 4 months ago) (3 children)

You don't need AI for that; for years you could ask a search engine and get the answer on Stack Overflow.

[–] EatATaco@lemm.ee 8 points 4 months ago (2 children)

And before Stack Overflow, we used books. Did we need it? No. But Stack Overflow was an improvement, so we moved to that.

In many ways, AI is an improvement on Stack Overflow. I feel bad for people who refuse to see it, because they're missing out on a useful and powerful tool.

[–] xia@lemmy.sdf.org 40 points 4 months ago (1 children)

This is the experience of a senior developer using genai. A junior or non-dev might not leave the "AI is magic" high until they have a repo full of garbage that doesn't work.

[–] jaybone@lemmy.world 17 points 4 months ago (1 children)

This was happening before this “AI” craze.

[–] criss_cross@lemmy.world 12 points 4 months ago (2 children)

10 years ago it was copy/pasting from stack overflow

[–] crossmr@kbin.run 33 points 4 months ago (2 children)

Gen AI is best used with languages that you don't use that much. I might need a Python script once a year or once every six months. Yeah, I learned it ages ago, but don't have much need to keep up on it. I still remember the concepts, so I can take the time to describe to the AI what I need, step by step, and verify each iteration. That way, if it does make a mistake at some point that it can't get itself out of, you've at least got a working script up to that point.

[–] RestrictedAccount@lemmy.world 13 points 4 months ago

Exactly. I can't remember the syntax for all the languages I have used over the last 40 years, but AI can give me a pretty good start and saves hours of poring over reference books.

[–] Auzy@beehaw.org 6 points 4 months ago (1 children)

I actually disagree. I feel it's best used for languages you're good with, because it tends to introduce very subtle bugs that can be very difficult to debug, and code that looks accurate but isn't. If you're not totally familiar with the language, it can be even harder.

[–] RobotZap10000@feddit.nl 28 points 4 months ago (1 children)

Why is the AI speaking in a bisexual gradient?

[–] BoneALisa@lemm.ee 16 points 4 months ago (1 children)

It's the "new hype tech product background" gradient lol

[–] Fleur__@lemmy.world 8 points 4 months ago (3 children)

Because all robots are bisexual

[–] Snapz@lemmy.world 27 points 4 months ago (2 children)

Except AI doesn't say "Is this it?"

It says, "This is it."

Without hesitation and while showing you a picture of a dog labeled cat.

[–] NegativeLookBehind@lemmy.world 19 points 4 months ago

When I used to try and ask AI for help, most of the time it would just give me fake command combinations or reference some made-up documentation

[–] jaybone@lemmy.world 15 points 4 months ago (2 children)

It’s almost like working with shitty engineers.

[–] Knock_Knock_Lemmy_In@lemmy.world 7 points 4 months ago* (last edited 4 months ago)

Shitty engineers that can do grunt work, don't complain, don't get distracted and are great at doing 90% of the documentation.

But yes. Still shitty engineers.

Great management consultants though.

[–] Auzy@beehaw.org 14 points 4 months ago (1 children)

My workmate literally used copilot to fix a mistake in our websocket implementation today.

It made a one-line change... turns out it made the problem worse.

[–] MagicShel@programming.dev 6 points 4 months ago

AI coding in a nutshell. It makes the easy stuff easier and the hard stuff harder by leading you down thirty incorrect paths before you toss it and figure it out yourself.

[–] Blackmist@feddit.uk 13 points 4 months ago (1 children)

I guess whether it's worth it depends on which you hate more: writing code or reading it.

[–] anakin78z@lemmy.world 15 points 4 months ago (2 children)

Is there anyone who likes reading code more than writing it?

[–] Croquette@sh.itjust.works 9 points 4 months ago (2 children)

Probably a mathematician or physicist somewhere.

I hate reading the code I wrote two days ago.

[–] Deceptichum@sh.itjust.works 10 points 4 months ago (2 children)

So what it’s really like is only having to do half the work?

Sounds good, reduced workload without some unrealistic expectation of computers doing everything for you.

[–] nickwitha_k@lemmy.sdf.org 18 points 4 months ago (1 children)

So what it’s really like is only having to do half the work?

If it's automating the interesting problem solving side of things and leaving just debugging code that one isn't familiar with, I really don't see value to humanity in such use cases. That's really just making debugging more time consuming and removing the majority of fulfilling work in development (in ways that are likely harder to maintain and may be subject to future legal action for license violations). Better to let it do things that it actually does well and keep engaged programmers.

[–] jaybone@lemmy.world 5 points 4 months ago (3 children)

People who rely on this shit don’t know how to debug anything. They just copy some code, without fully understanding the library or the APIs or the semantics, and then they expect someone else to debug it for them.

[–] takeda@lemmy.world 8 points 4 months ago

In my experience, all the time it saves (probably even more) is wasted on spotting bugs, and the bugs are in very subtle places.

[–] jaschen@lemm.ee 5 points 4 months ago (1 children)

From a person who does zero coding: it's a godsend.

[–] RecluseRamble@lemmy.dbzer0.com 12 points 4 months ago* (last edited 4 months ago) (2 children)

Makes sense. It's like having your personal undergrad hobby coder. It may get something right here and there, but for professional coding it's still worse than the gold standard (googling Stack Overflow).

[–] SparrowRanjitScaur@lemmy.world 7 points 4 months ago (1 children)

Nah, you just need to be really specific in the requirements you give it. And if the scope of work you're asking for is too large, you need to do the high-level design yourself and decompose it into multiple parts for ChatGPT to implement.

[–] onlinepersona@programming.dev 5 points 4 months ago

Code is the most in-depth spec one can provide. Maybe someday we'll be able to iterate just by talking and saying "no, like this", but it doesn't seem like we're quite there yet. And even then, will it be productive?

Anti Commercial-AI license
