I often find myself explaining the same things in real life and online, so I recently started writing technical blog posts.

This one is about why it was a mistake to call 1024 bytes a kilobyte. It’s about a 20-minute read, so thank you very much in advance if you find the time to read it.

Feedback is very much welcome. Thank you.

  • Lmaydev@programming.dev
    8 months ago

    In terms of storage, 1000 and 1024 take the same number of bits to represent. So from a computer’s point of view, 1024 makes a lot more sense.

    It’s just a binary vs. decimal thing. 1000 is not nicely represented in binary, just as 1024 isn’t in decimal.
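
    A quick illustration of that point (a minimal Python sketch, not from the original post): printing both numbers in binary shows that 1024 is a single power of two, while 1000 is not.

    ```python
    # 1024 is "round" in binary (a single 1 followed by zeros),
    # while 1000 is not; the reverse holds in decimal.
    for n in (1000, 1024):
        print(f"{n:>5} -> binary {n:b}")

    #  1000 -> binary 1111101000
    #  1024 -> binary 10000000000
    ```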

    Edit: I was talking about storing the actual number.