Signed integer representation allows computers to distinguish between positive and negative integers by reserving a bit to indicate the number's sign. The most common methods are signed magnitude, one's complement, and two's complement. Two's complement is the most widely used because the same addition circuitry works unchanged for signed and unsigned values and because it has a single representation of zero, whereas the other two schemes each encode both +0 and -0.
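
The three schemes can be contrasted with a small sketch. The helper functions below are illustrative names, not a standard API; each encodes an integer into an 8-bit string under one of the three representations:

```python
def sign_magnitude(n, bits=8):
    # The highest bit holds the sign; the remaining bits hold |n|.
    if n >= 0:
        return format(n, f'0{bits}b')
    return '1' + format(-n, f'0{bits - 1}b')

def ones_complement(n, bits=8):
    # Negative values are formed by flipping every bit of |n|.
    if n >= 0:
        return format(n, f'0{bits}b')
    return format((-n) ^ ((1 << bits) - 1), f'0{bits}b')

def twos_complement(n, bits=8):
    # Negative values are the one's complement plus one,
    # equivalently n reduced modulo 2**bits.
    return format(n & ((1 << bits) - 1), f'0{bits}b')

print(sign_magnitude(-5))    # 10000101
print(ones_complement(-5))   # 11111010
print(twos_complement(-5))   # 11111011
```

Note that the two's complement pattern for -5 is exactly one greater than its one's complement pattern, and that adding the 8-bit patterns for -5 and 5 in two's complement overflows to all zeros, which is why ordinary binary addition needs no special-casing for signs.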