I often find myself explaining the same things in real life and online, so I recently started writing technical blog posts.
This one is about why it was a mistake to call 1024 bytes a kilobyte. It’s about a 20-minute read, so thank you very much in advance if you find the time to read it.
Feedback is very much welcome. Thank you.
TLDR: the problem isn't using base-2 multipliers. The problem is doing so and then labeling the result with a base-10 prefix.
In 1998, when the IEC introduced the binary prefixes (kibi, mebi, gibi) to resolve the ambiguity, the discrepancy wasn't a big deal, but now the difference between a gigabyte and a gibibyte (about 7.4%) is large enough to cause real problems.
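To make that concrete, here's a quick sketch (plain Python, just the arithmetic) showing how the gap between the base-10 and base-2 interpretations grows with each prefix step:

```python
# The relative gap between SI (powers of 1000) and IEC (powers of 1024)
# prefixes compounds with each step up the scale.
prefixes = [("kB", "KiB"), ("MB", "MiB"), ("GB", "GiB"), ("TB", "TiB")]

for power, (si, iec) in enumerate(prefixes, start=1):
    decimal = 1000 ** power  # what the SI prefix means
    binary = 1024 ** power   # what storage/OS code historically used
    gap = (binary - decimal) / decimal * 100
    print(f"1 {iec} is {gap:.1f}% larger than 1 {si}")

# Output:
# 1 KiB is 2.4% larger than 1 kB
# 1 MiB is 4.9% larger than 1 MB
# 1 GiB is 7.4% larger than 1 GB
# 1 TiB is 10.0% larger than 1 TB
```

At the kilobyte level the 2.4% gap was easy to shrug off; at terabyte scale it's a full 10%, which is exactly why the naming confusion matters more today.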