What? How would you define "average"? His statement is technically correct.
Average is the mean (i.e. the sum of all "skill" divided by the number of programmers).
What they were thinking of is the median (50th percentile = 0.5 quantile), which splits the group into two equally sized halves.
For a bell curve, they are the same value. But think of the example of average incomes: nine people have an income of $10, and one has an income of $910. The average income is $100 ((10*9 + 910)/10 = 100). The median, however, is $10.
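A quick Python sketch of that arithmetic, using the made-up incomes from the example above:

```python
from statistics import mean, median

# Nine incomes of $10 and one of $910, as in the example above.
incomes = [10] * 9 + [910]

print(mean(incomes))    # 100  -> the mean is pulled up by the single outlier
print(median(incomes))  # 10.0 -> the median stays with the majority
```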
The distribution of skill in humans, for various tasks and abilities, can often be approximated by a normal distribution. In that case, as you know, the mean is equal to the median.
Yeah, fair enough
Actually, in order to test that assumption, you'd need to quantitatively measure skill, which is already problematic in itself, and you'd also need to run a statistical test to confirm that the distribution really is normal/Gaussian. People often forget the latter and end up producing incorrect statistical inferences.
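Just to illustrate that second step (not a rigorous procedure), here's a minimal Python sketch, assuming skill could somehow be boiled down to one number per person. The `skill_scores` sample is made up, and the Shapiro-Wilk test is only one of several possible normality tests:

```python
import numpy as np
from scipy import stats

# Purely hypothetical sample: pretend each number is one programmer's
# "skill score" on some made-up scale.
rng = np.random.default_rng(42)
skill_scores = rng.normal(loc=100, scale=15, size=200)

# Shapiro-Wilk test: the null hypothesis is that the sample
# comes from a normal distribution.
stat, p_value = stats.shapiro(skill_scores)
print(f"W = {stat:.3f}, p = {p_value:.3f}")

if p_value < 0.05:
    print("Evidence against normality at the 5% level.")
else:
    print("Failed to reject normality (which is not proof of normality).")
```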