I'm currently trying to port a compression algorithm from existing C code.
Encoding and decoding don't look difficult to me; my problem is the serialization to and from a stream (be it a file or a socket).
The input is 12 bits per sample and the compressed output is 7 bits, but writing to a stream always means writing whole 8-bit bytes.
Since each value falls 1 bit short of a byte, does that mean I have to buffer 8 values just to be able to write 7 full bytes? That would give the following byte layout (all 1s belong to the first value, all 2s to the second, and so on):
11111112
22222233
33333444
44445555
55566666
66777777
78888888
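The layout above can be sketched with a small bit accumulator in Go. This is just an illustration of the general packing technique, not G.711 itself; `pack7` is a made-up name:

```go
package main

import "fmt"

// pack7 packs a sequence of 7-bit values into bytes MSB-first,
// matching the layout sketched above. A bit accumulator collects
// bits until at least one full byte is available to emit.
func pack7(values []uint8) []byte {
	var out []byte
	var acc uint32 // bit accumulator; the valid bits are the low nbits bits
	var nbits uint // number of valid bits currently in acc
	for _, v := range values {
		acc = acc<<7 | uint32(v&0x7F) // append the next 7 bits
		nbits += 7
		for nbits >= 8 { // emit every complete byte
			nbits -= 8
			out = append(out, byte(acc>>nbits))
		}
	}
	// If len(values) is not a multiple of 8, nbits leftover bits
	// remain here and would need to be flushed with zero padding.
	return out
}

func main() {
	// Eight alternating values make the byte boundaries visible:
	// 8 values x 7 bits = 56 bits = exactly 7 bytes.
	packed := pack7([]uint8{0x7F, 0, 0x7F, 0, 0x7F, 0, 0x7F, 0})
	fmt.Printf("%08b\n", packed)
}
```

Note that with an accumulator you never need to buffer 7 whole bytes of input; at most 14 bits sit in `acc` at any time, and each byte can be written to the stream as soon as it is complete.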
Neither the actual codec nor the language really matters (for the record: the codec is G.711 and the language is Go), so maybe the go tag is inappropriate.
Any clue on this?