this post was submitted on 28 Feb 2024
180 points (97.4% liked)
World News
I think that’s the point. You can’t trust the average developer to do things safely. And remember, half of all programmers are even worse than average.
Maybe even more!
Wouldn't that be the median programmer instead of average?
The word "average" can mean many things: for example, the mean, the median, the mode, or even something like "within one standard deviation of the mean".
I was using it strictly as the mean, which divides the population exactly in half.
The median is the one that splits a data set in half and picks the middle.
You’re right of course, that was a stupid mistake on my part.
Half of all programmers constitute the so-called "average" group.
Which half am I in?
If you have to ask
You know
Yes. And 75% of car drivers believe they are above average as well...
99% of devs believe they are in the top 1%
Yea! I'm one of them!
Bell curves don't work to make this point. A bell curve is symmetrical, so half of developers will always be below average on a bell curve. But yes, it is true that for other types of distributions, more or less than half of the developers could be below average. What the person above you was looking for, in the general case, would be the median.
The mean is in the center of the bell curve, so I’m not sure what your point is.
What? How would you define "average"? His statement is technically correct.
Average is the mean (i.e. the sum of all "skill" divided by the number of programmers).
What they were thinking of is the median (the 50th percentile, or 0.5 quantile), which splits the group into two equally sized halves.
For a bell curve, they are the same value. But consider average incomes: nine people have an income of $10 and one has an income of $910. The mean income is $100 ((10*9 + 910)/10), but the median is basically $10.
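The arithmetic above is easy to check directly; a minimal sketch using Python's standard `statistics` module:

```python
import statistics

# Nine incomes of $10 and one of $910, as in the example above.
incomes = [10] * 9 + [910]

mean = statistics.mean(incomes)      # (10*9 + 910) / 10
median = statistics.median(incomes)  # midpoint of the sorted values

print(mean)    # 100
print(median)  # 10.0
```

One outlier drags the mean up tenfold while the median stays put, which is exactly why the two summaries disagree on skewed data.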
The distribution of skill in humans, for various tasks and abilities, can often be approximated by a normal distribution. In that case, as you know, the mean is equal to the median.
Yeah, fair enough
Actually, to test that assumption you'd need to measure skill quantitatively, which is already problematic in itself, and you'd also need to run a statistical test to confirm the distribution really is normal (Gaussian). People always forget the latter and often draw incorrect statistical inferences.
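As a quick sanity check short of a formal normality test (a real analysis would use something like the Shapiro-Wilk test from `scipy.stats`), you can look at the sample skewness: it is roughly zero for normal data and large for skewed data. A stdlib-only sketch, where `sample_skewness` is a hypothetical helper written for this example:

```python
import statistics

def sample_skewness(xs):
    """Moment-based skewness g1 = m3 / m2**1.5.

    Roughly zero for symmetric (e.g. normal) data; a large
    absolute value is quick evidence against normality.
    """
    n = len(xs)
    mean = statistics.fmean(xs)
    m2 = sum((x - mean) ** 2 for x in xs) / n  # variance (population)
    m3 = sum((x - mean) ** 3 for x in xs) / n  # third central moment
    return m3 / m2 ** 1.5

# The skewed income data from earlier in the thread:
incomes = [10] * 9 + [910]
print(round(sample_skewness(incomes), 3))  # 2.667 -- heavily right-skewed
```

A value near 2.7 is nowhere near the zero you'd expect from a normal distribution, so inferences that assume normality would be off for data like this.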
Literally.
Or rather it's a Dunning-Kruger issue: seniors who have spent significant time architecting and debugging complex applications tend to be big proponents of things like Rust.