I am wondering how many binary bits are used to represent a number greater than 255, such as 256. Is it represented as 100000000 (9 bits), or is it zero-padded to something like 0000000100000000 (more than 9 bits, say 16 bits)?
I am using C# to send numbers in the range 0-4095 (12 bits) over a serial port to a 12-bit DAC. If a number greater than 255 is sent out as 16 bits, then the extra 4 bits will shift the first 4 bits out of the DAC's register.
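To make the question concrete, here is a minimal sketch of what I mean by splitting a 12-bit value into two bytes before writing them. The method name `ToFrame` and the assumption that the DAC clocks data in most-significant byte first are my own; the real framing would depend on the DAC's datasheet.

```csharp
using System;

class DacFrameDemo
{
    // Hypothetical helper: split a 0-4095 value into two bytes,
    // with the upper 4 bits zero-padded into the first byte.
    static byte[] ToFrame(int value)
    {
        if (value < 0 || value > 4095)
            throw new ArgumentOutOfRangeException(nameof(value));
        byte high = (byte)((value >> 8) & 0x0F); // upper 4 of the 12 bits, zero-padded to 8
        byte low  = (byte)(value & 0xFF);        // lower 8 bits
        return new byte[] { high, low };
    }

    static void Main()
    {
        byte[] frame = ToFrame(256);
        Console.WriteLine($"{frame[0]:X2} {frame[1]:X2}"); // prints "01 00"
        // With a System.IO.Ports.SerialPort named port, this would send 16 bits:
        // port.Write(frame, 0, 2);
    }
}
```

Sending these two bytes transmits 16 bits total, which is the behavior I am asking about: whether the 4 leading zero bits end up shifting the first 4 bits out of the DAC's 12-bit register.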