this post was submitted on 14 May 2024
1354 points (99.1% liked)
Programmer Humor
You poor innocent soul... I can try to explain why decimal is even mentioned, but it would probably take a lot of time, and I'm not sure I'll be able to clear things up.
I can at least say this: a 2 TB HDD is indeed 2*10^12 B, but suddenly shindow$ in its File Explorer will show you that the drive is in fact only 1.82 TB. But WHY? everyone asks, feeling scammed. Because the HDD spec uses decimal units (SI; MB) while Window$ uses binary units (JEDEC; MB), i.e., 1.82 TiB (IEC; MiB). And macOS also uses JEDEC units, AFAIK.
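A quick sketch (not from the thread) of where the "missing" capacity goes: the drive maker counts in powers of 1000, while File Explorer divides by powers of 1024 but still labels the result "TB".

```python
# A "2 TB" drive as advertised, in decimal (SI) bytes.
capacity_bytes = 2 * 10**12

# What Windows actually computes: bytes / 1024^4 (i.e., TiB),
# even though it labels the result "TB".
shown = capacity_bytes / 1024**4

print(f"{shown:.2f}")  # → 1.82
```

No bytes are lost; the same number is just being divided by a larger unit.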
More and more FOSS software uses IEC units, and KDE Plasma is a good example: its file manager, package manager, etc. all use IEC units. Simply put, JEDEC added a binary meaning to the decimal SI prefixes, so MB carried only its decimal meaning until JEDEC shit out their standard. And the only reason "gibibyte" sounds ridiculous is that we all grew up with the JEDEC interpretation of SI units. So it will take many generations for everyone to adopt the xxbibyte words into daily conversation. I'm doing my part. It's just legacy that we have to deal with.
All international bodies (BIPM, NIST, EU) agree that the SI prefixes "refer strictly to powers of 10" and that the binary definitions "should not be used" for them.
https://en.wikipedia.org/wiki/Binary_prefix#IEC_1999_Standard
https://en.wikipedia.org/wiki/Binary_prefix#Other_standards_bodies_and_organizations
https://en.wikipedia.org/wiki/JEDEC_memory_standards#JEDEC_Standard_100B.01
Well, thank you for taking the time to write this detailed explanation!
Windows and macOS use the abbreviation "MB" to refer to binary units, correct? How come these big OSes use a different unit than the one these large international bodies recognize?
On a side note, I've always found it weird that HDDs and SSDs are/were sold as 128GB, 256GB, 512GB, etc. when those numbers refer to decimal units.
Yez. I'm only sure about the first one; I didn't test myself whether macOS uses powers of 2 or 10 under the hood of its MB. You can open the properties of something big and try converting the raw number of bytes with /1024^n and with /1000^n, then compare the results.
Legacy, legacy everywhere (IMO). And of course they don't want to confuse their precious users who don't know any better. This would also break some scripts that rely on that specific output. The GNU C library also uses JEDEC units by default, hence flatpak and other software do too.
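The "convert the raw byte count both ways" check described above could be sketched like this — `both_readings` is a hypothetical helper, not anything from the thread:

```python
# Unit names for n = 0..4 in each system.
SI  = ["B", "kB", "MB", "GB", "TB"]    # powers of 1000 (decimal)
IEC = ["B", "KiB", "MiB", "GiB", "TiB"]  # powers of 1024 (binary)

def both_readings(num_bytes: int, n: int = 3) -> tuple[str, str]:
    """Render the same raw byte count in decimal and binary units."""
    return (f"{num_bytes / 1000**n:.2f} {SI[n]}",
            f"{num_bytes / 1024**n:.2f} {IEC[n]}")

# 4,700,372,992 B is the capacity of a single-layer DVD:
print(both_readings(4_700_372_992))  # → ('4.70 GB', '4.38 GiB')
```

Whichever of the two numbers matches what the OS displays tells you which divisor it uses under the hood.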
It is weird for everyone, because we mostly count in powers of 2 when it comes to digital sizes of information. I haven't investigated why they use powers of 10, but I've seen that some other hardware also uses decimal units (I think at least RAM, though JEDEC is used, intentionally or not, for CPU cache memory). I had a link where the RAM thing was lightly addressed, but I couldn't find it.
spoiler
P.S. it's "OSes" and "macOS" BTW.
Maybe people would listen to you if you weren't such a prick.
Ok, show me what I did wrong and what I should do instead to not be a prick, please.
Don't start a comment with "you poor innocent soul".
What is bad about it? It wasn't an offensive statement; it was stating the fact that the person was new to the whole "MB vs. MiB" saga that has been an ongoing issue for at least ten years, and that I envy/pity the "cruel world they are in" (where everyone uses JEDEC units while IEC ones should be used instead).
If that is the only reason you started calling other people names and downvoting all of their comments, then you're overreacting. The person I talked with didn't even mention it. I heard the phrase in some movie or something.
I dunno, maybe I misinterpreted your tone, but your comments all sound really condescending to me, with the "iTs cALled macOS not MacOS" stuff.
The last part was wrapped in a spoiler and put under a postscript to indicate that it's not important and that you don't have to read it if you don't want to. I added it just to educate a bit more, since I had already started the "this is wrong and this is right" conversation. To be honest, I hate that that OS has had so many naming changes that everyone is just left confused in the end. Some still say OS X or whatever else.