Character encoding fundamentals — ASCII bit width
How many bits are defined in the original ASCII character code standard?

Difficulty: Easy

Correct Answer: 7

Explanation:


Introduction / Context:
ASCII (American Standard Code for Information Interchange) is the foundational character encoding for early computer systems and continues to influence modern encodings. Knowing its bit width clarifies how many distinct characters the original standard supports and helps distinguish ASCII from extended encodings such as ISO-8859-1 and UTF-8.


Given Data / Assumptions:

  • The term “ASCII” here refers to the original, standard ASCII as published, not vendor-specific extensions.
  • We are not counting parity bits or extended code pages.
  • The question asks about the number of bits per code point in the original scheme.


Concept / Approach:

Original ASCII defines 128 unique codes numbered 0–127 inclusive. The minimum number of bits required to represent 128 distinct values is 7, because 2^7 = 128. While many serial links used an 8th parity bit for error detection, that extra bit is not part of the ASCII code itself. Later "extended ASCII" character sets used the full 8 bits (codes 0–255), but these are not standardized as ASCII proper.
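
To make the arithmetic concrete, here is a minimal Python sketch; the parity example at the end is purely illustrative, and, as noted above, the parity bit is not part of the ASCII code itself.

    # An n-bit code can represent 2**n distinct values; 7 bits give exactly 128.
    assert 2 ** 7 == 128

    # The highest ASCII code, 127, needs only 7 bits.
    print((127).bit_length())            # 7

    # A parity bit occupies the 8th bit position on a serial link but carries no
    # character information; masking it off leaves the ASCII code unchanged.
    code_with_parity = 0b1_1000001       # 'A' (65) with an odd-parity bit prepended
    print(code_with_parity & 0x7F)       # 65
    print(chr(code_with_parity & 0x7F))  # A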


Step-by-Step Solution:

  1. Compute the capacity: 2^7 = 128 unique values, matching ASCII's 0–127 range.
  2. Note that control codes occupy 0–31 and 127, while printable characters fill 32–126.
  3. Therefore the canonical ASCII width is 7 bits.
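
The same counts can be reproduced with a few lines of Python; the range split simply mirrors the control/printable boundaries stated above.

    codes = range(128)                               # 0-127: the original ASCII range
    control = [c for c in codes if c < 32 or c == 127]
    printable = [c for c in codes if 32 <= c <= 126]

    print(len(codes))                    # 128 total codes
    print(len(control), len(printable))  # 33 control codes, 95 printable characters
    print(max(codes).bit_length())       # 7 bits needed for the highest code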


Verification / Alternative check:

Check a standard ASCII table: it lists 128 entries. Serial protocols often frame characters as 7 data bits plus parity/stop bits, aligning with the 7-bit standard.
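
A quick spot check is also possible in Python, whose chr()/ord() functions expose the same 0–127 table; this only confirms the count and is not a substitute for the published standard.

    # Build the full original ASCII table and confirm it has 128 entries.
    table = {code: chr(code) for code in range(128)}
    print(len(table))                             # 128

    # A few familiar entries from the table.
    print(table[65], table[97], repr(table[10]))  # A a '\n'

    # Every code in the table fits in 7 bits.
    assert all(code & 0x7F == code for code in table)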


Why Other Options Are Wrong:

8: describes extended byte encodings or 7 bits plus a parity bit, not original ASCII.

16: pertains to encodings like UCS-2 or UTF-16 code units, not ASCII.

4: would allow only 16 characters, far too few to cover ASCII's control and printable sets.
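
The same 2^n capacity argument can be applied to each option; a short Python comparison, using nothing beyond the arithmetic above:

    # Compare the capacity of each candidate bit width against ASCII's 128 codes.
    for bits in (4, 7, 8, 16):
        capacity = 2 ** bits
        verdict = "exact fit" if capacity == 128 else (
            "too few" if capacity < 128 else "more than original ASCII defines")
        print(f"{bits:2d} bits -> {capacity:5d} codes ({verdict})")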


Common Pitfalls:

Conflating ASCII with “extended ASCII,” assuming parity makes ASCII 8 bits, or confusing code pages (which are locale-specific) with the global ASCII definition. Always differentiate the original 7-bit ASCII from later 8-bit extensions and from Unicode.
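
One concrete way to keep the distinction straight is to try encoding a non-ASCII character with Python's built-in codecs; the character 'é' below is just an arbitrary example outside the 0–127 range.

    text = "é"                                   # U+00E9, outside 7-bit ASCII
    try:
        text.encode("ascii")
    except UnicodeEncodeError:
        print("ascii: not representable (code point above 127)")

    print("latin-1:", text.encode("latin-1"))    # b'\xe9'     -- one 8-bit byte
    print("utf-8:  ", text.encode("utf-8"))      # b'\xc3\xa9' -- two bytes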


Final Answer:

7
