I've been trying to wrap my mind around 1's and 2's complement (the number representation schemes used in computer hardware), and while reviewing an explanation I came across an example I cannot follow. The example is for 1's complement bitwise addition. First, let me start with an example I CAN follow.
This makes sense to me (1's complement bitwise subtraction):
  11111111
- 01001001
----------
  10110110
What I see, from left to right, is: 1 bit minus 1 bit equals 0, 1 bit minus 0 bits equals 1 bit, etc.
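To double-check that one, I wrote a tiny Python snippet (Python just because it's what I had handy). It confirms my reading: subtracting from all 1s never needs a borrow, so every column is independent, and the result is just the bottom operand with each bit flipped:

    # Subtracting an 8-bit value from all 1s never borrows, so each column
    # is 1 - 0 = 1 or 1 - 1 = 0 -- i.e. the bits of b, flipped.
    a = 0b11111111
    b = 0b01001001

    print(f"{a - b:08b}")      # 10110110 -- the subtraction above
    print(f"{~b & 0xFF:08b}")  # 10110110 -- same as inverting b's 8 bits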
This does NOT make sense to me (1's complement bitwise addition):
  001100111
+ 010011110
-----------
  100000101
From left to right, the first operation makes sense to me: 1 bit plus 0 bits equals 1 bit. The rest makes no sense to me. For some reason, in the second position 1 bit plus 1 bit equals 0, but in the third position 1 bit plus 1 bit equals 1 (exactly the opposite of the previous position, which had the same bits). Then, in positions four through seven, 1 bit plus 0 bits equals 0 (the exact opposite of what the operation produced in the first position, which had the same bits).

I understand that to finish this answer you take the ninth bit and add it back to the least significant bit. My problem is that I can't see the pattern in the bitwise addition that produces that answer in the first place. Could someone out there break it down for me, left bit to right bit? I'm not sure what I'm missing. Am I missing a pattern of XOR/NOR/NOT/AND operations, or do those operations have nothing to do with bitwise addition? I don't know.

I've put these searches into Google: "bitwise addition", "explain bitwise addition", and even "bitwise addition for dummies". Nothing comes back with an explanation of just exactly what is going on here, from left to right, bit for bit. So, if someone out there has the patience for this explanation, I'd REALLY appreciate it.
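In case it helps show where I'm at, here's my best guess in Python at what the addition might be doing: process the columns from the RIGHT, passing a carry leftward, with each column following the full-adder XOR/AND pattern. The function name and the 8-bit width are my own assumptions, not something from the explanation I was reading. It does reproduce the example, but I still can't square it with a left-to-right reading, which is why I'm asking:

    def add_columns(x, y, width=8):
        # My guess: handle bit 0 (rightmost) first, carrying into the
        # next column to the left, like pencil-and-paper addition.
        result = 0
        carry = 0
        for i in range(width):
            a = (x >> i) & 1
            b = (y >> i) & 1
            result |= (a ^ b ^ carry) << i       # sum bit: XOR of the three inputs
            carry = (a & b) | (carry & (a ^ b))  # carry into the next column
        return result, carry                     # final carry is the "ninth bit"

    x = 0b001100111
    y = 0b010011110
    raw, ninth = add_columns(x, y)
    print(f"{ninth}{raw:08b}")   # 100000101 -- the answer from the example
    print(f"{raw + ninth:08b}")  # 00000110 -- after adding the ninth bit back

If that guess is right, then the same pair of bits can produce different sum bits depending on the carry coming in from the right, which might explain what I'm seeing, but I'd appreciate confirmation and the bit-by-bit breakdown.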