I am trying to develop a toy CPU architecture in Go to learn and familiarise myself with the language, something I have done before in C. However, one part of the learning process has surprised me: bit manipulation. In particular, I am struggling with concatenating two 8-bit values into a 16-bit value. I have translated this general-purpose C code I wrote:
uint16_t connect(uint8_t a, uint8_t b)
{
    return (uint16_t) a | (uint16_t) b << 8;
}
Into this Go code:
func DereferenceWord(addr uint32) uint16 {
    return uint16(memoryPointer[addr]) | uint16(memoryPointer[addr + 1] << 8)
}
To me at least, the code seems correct. However, when tested with the bytes 0xff, 0xff at address 0x0000 (an address in my VM pointing to the value 0xffff), the Go code outputs only 0xff, while the C code outputs the correct 0xffff. Why could this be?
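For reference, here is a minimal, self-contained sketch of what I am seeing. In the real project memoryPointer is the VM's memory and is set up elsewhere, so the plain byte slice below is a simplification:

package main

import "fmt"

// Simplified stand-in for the VM's memory (the real memoryPointer lives elsewhere).
var memoryPointer = make([]uint8, 16)

func DereferenceWord(addr uint32) uint16 {
    return uint16(memoryPointer[addr]) | uint16(memoryPointer[addr + 1] << 8)
}

func main() {
    memoryPointer[0x0000] = 0xff
    memoryPointer[0x0001] = 0xff
    fmt.Printf("%#x\n", DereferenceWord(0x0000)) // prints 0xff, not the 0xffff I expect
}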
CONTEXT: the function that sets a word in the VM's memory, which is tested and working:
func SetWord(addr uint32, data uint16) {
    // Write the 16-bit word as two bytes, least significant byte first.
    initial := 0
    for i := 0; i < 2; i++ {
        memoryPointer[addr + uint32(i)] = uint8((data >> uint32(initial)) & 0xff)
        initial += 8
    }
}
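And this is roughly the check I ran on SetWord (again using the stand-in memoryPointer slice from the sketch above, with a value whose two bytes differ so the byte order is visible):

func checkSetWord() {
    SetWord(0x0000, 0xabcd)
    // SetWord stores the low byte first, so I expect 0xcd then 0xab here.
    fmt.Printf("%#x %#x\n", memoryPointer[0x0000], memoryPointer[0x0001])
}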