The C99 language has a similar classification through <stdint.h> (and C is even more fine-grained, with types like int_fast32_t); having that many integral types is useful for portability and efficiency. Compatibility with C99 and C++ may be reason enough for Go to have these types.
You may want to write code which runs efficiently on embedded microcontrollers, tablets (32-bit ARM), cheap laptops (32-bit x86), bigger desktops (64-bit x86-64), servers (perhaps also PowerPC or 64-bit AArch64 ARM), etc. And some operating systems offer several programming models or ABIs (e.g. x86, x32, and amd64 on my Linux desktop).
On various architectures, the cost of integral operations can vary greatly. On some machines, adding an int might be more costly than adding a C long (or an int64 in Go). On other machines (probably most of them), it is the opposite. CPU cache behavior can also matter a lot for performance, and in some cases (e.g. a billion-element array) data size matters a great deal. Finally, for binary data coming from outside, size, layout, alignment, and endianness matter a lot. Read about serialization.