Difficulty: Easy
Correct Answer: 7
Explanation:
Introduction / Context:
ASCII (American Standard Code for Information Interchange) is the foundational character encoding for early computer systems and continues to influence modern encodings. Knowing its bit width clarifies how many distinct characters the original standard supports and helps distinguish ASCII from extended encodings such as ISO-8859-1 and UTF-8.
Given Data / Assumptions:
The question refers to the original ASCII standard, which assigns codes 0–127; parity bits and 8-bit "extended ASCII" variants are outside its scope.
Concept / Approach:
Original ASCII defines 128 unique codes numbered 0–127 inclusive. The minimum number of bits required to represent 128 distinct values is 7, because 2^7 = 128. Many serial links added an 8th parity bit for error detection, but that extra bit is not part of the ASCII code itself. Later "extended ASCII" sets used the full 8 bits (codes 0–255), but those extensions are not part of the ASCII standard proper.
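A minimal sketch of the arithmetic (Python is assumed here purely for illustration; the helper name min_bits is hypothetical):

```python
import math

# Minimum number of bits needed to distinguish n values:
# the smallest k with 2**k >= n.
def min_bits(n: int) -> int:
    return max(1, math.ceil(math.log2(n)))

print(min_bits(128))           # 7 -> original ASCII (codes 0-127)
print((128 - 1).bit_length())  # 7 -> equivalent check via int.bit_length()
```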
Step-by-Step Solution:
1. The original ASCII standard assigns one code to each of 128 characters (codes 0–127).
2. Find the smallest bit count n such that 2^n ≥ 128.
3. 2^6 = 64 is too few; 2^7 = 128 is exactly enough.
4. Therefore the original ASCII standard is a 7-bit code.
Verification / Alternative check:
Check a standard ASCII table: it lists 128 entries. Serial protocols often frame characters as 7 data bits plus parity/stop bits, aligning with the 7-bit standard.
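The same check can be sketched in code (assuming Python; the sample string is arbitrary). It confirms that every original ASCII code fits in 7 bits and that ASCII text survives in UTF-8 as one byte per character with the high bit clear:

```python
# Every code in the original ASCII table (0-127) fits in 7 bits.
assert all(code < 2**7 for code in range(128))

# ASCII text encoded as UTF-8 stays one byte per character, and the
# eighth (high) bit of every byte is 0 -- UTF-8 is ASCII-compatible.
data = "Hello, ASCII!".encode("utf-8")
assert len(data) == len("Hello, ASCII!")
assert all(byte < 0x80 for byte in data)
print([hex(b) for b in data])
```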
Why Other Options Are Wrong:
8: describes extended byte encodings or 7 bits plus a parity bit, not original ASCII.
16: pertains to encodings like UCS-2 or UTF-16 code units, not ASCII.
4: would allow only 16 characters (2^4 = 16), far too few to cover ASCII's control and printable sets; see the comparison sketch after this list.
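A short sketch (in Python, as an assumed illustration) comparing the capacity of each option against the 128 characters ASCII defines:

```python
# Capacity of each proposed bit width versus the 128 characters ASCII defines.
for bits in (4, 7, 8, 16):
    print(f"{bits:>2} bits -> {2**bits:>6} distinct values")
# Output:
#  4 bits ->     16   (too few for ASCII)
#  7 bits ->    128   (exactly the ASCII range 0-127)
#  8 bits ->    256   (extended/byte encodings)
# 16 bits ->  65536   (UCS-2 / UTF-16 code units)
```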
Common Pitfalls:
Common mistakes include conflating ASCII with "extended ASCII," assuming the parity bit makes ASCII an 8-bit code, and confusing locale-specific code pages with the ASCII definition they all share. Always differentiate the original 7-bit ASCII from later 8-bit extensions and from Unicode; the sketch below illustrates the parity case.
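A minimal sketch of the parity pitfall (the byte value and odd-parity framing are hypothetical examples, not part of the original question):

```python
# Hypothetical byte received over a serial link framed as 7 data bits
# plus an odd-parity bit in the most significant position.
received = 0b11000001          # parity bit = 1; lower 7 bits encode 'A' (65)

ascii_code = received & 0x7F   # strip the parity bit: ASCII itself is 7 bits
print(ascii_code, chr(ascii_code))  # 65 A
```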
Final Answer:
7