ASCII character width — number of code bits How many information bits (not counting start/stop in serial framing) are used by standard ASCII to represent one character?

Difficulty: Easy

Correct Answer: 7

Explanation:


Introduction / Context:
ASCII (American Standard Code for Information Interchange) is a long-standing character encoding used in computers and communication equipment. Distinguishing between the code width and transmission framing bits (start/stop, optional parity) prevents confusion when configuring serial links or interpreting files.


Given Data / Assumptions:

  • We refer to classic ASCII, not extended encodings.
  • Framing bits in UART (start/stop/parity) are excluded from the code width.
  • Extended ASCII or ISO/Windows codepages are outside the scope.
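To make the framing-versus-code-width distinction concrete, here is a minimal sketch of a common serial frame layout (the specific 7-data/1-parity/1-stop configuration is an illustrative assumption, not the only one used in practice):

```python
# Hypothetical UART frame: framing bits are transmission overhead,
# not part of the ASCII code width itself.
start_bits = 1   # marks the beginning of the frame
data_bits = 7    # the ASCII code width -- the only information bits
parity_bits = 1  # optional error-check bit (assumed enabled here)
stop_bits = 1    # marks the end of the frame

frame_bits = start_bits + data_bits + parity_bits + stop_bits
print(frame_bits)  # 10 bits travel on the wire per character
print(data_bits)   # but only 7 of them carry character information
```

With this 7E1-style setup, each character costs 10 bits on the wire, yet the answer to the question remains 7.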


Concept / Approach:
Standard ASCII defines 128 symbols encoded with 7 bits, ranging from 0x00 to 0x7F. Many systems store ASCII in 8-bit bytes, leaving the high bit 0 or using it for parity/extension, but the information content of ASCII itself is 7 bits.
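The claim that 128 codes fit in 7 bits can be checked directly; a small sketch:

```python
# Standard ASCII spans 0x00-0x7F: exactly 128 codes.
codes = range(0x00, 0x80)

print(len(codes))                    # 128 distinct codes
print(max(codes).bit_length())       # the largest code (127) needs 7 bits
print(all(c < 2**7 for c in codes))  # every code fits in 7 bits -> True
```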


Step-by-Step Solution:

1) Count of ASCII codes = 128.
2) 2^7 = 128 → requires 7 bits.
3) Therefore, ASCII uses 7 information bits per character.
4) Start/stop/parity bits are transmission overhead, not part of the character code width.
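The arithmetic in the steps above can be sketched as follows:

```python
import math

# 6 bits are too few for 128 symbols; 7 bits are exactly enough.
print(2**6)                       # 64  -- insufficient
print(2**7)                       # 128 -- exact fit

# Minimum bits needed for 128 distinct codes:
print(math.ceil(math.log2(128)))  # 7
```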


Verification / Alternative check:
ASCII table ranges 0x00–0x7F (0–127 decimal). The most significant bit is not used in standard ASCII.


Why Other Options Are Wrong:

  • 1 or 2: far too small for 128 characters.
  • 8: corresponds to extended encodings or storage width, not the base ASCII code width.


Common Pitfalls:
Equating 8-bit storage with 8-bit ASCII; “extended ASCII” is not the same as standard 7-bit ASCII.
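This pitfall is easy to demonstrate: even when ASCII text is stored in 8-bit bytes, the high bit of every standard ASCII character is 0, so only 7 bits carry information. A minimal check (the sample string is illustrative):

```python
# ASCII stored in 8-bit bytes: the most significant bit is always 0.
text = "Hello, ASCII!"

print(all(ord(ch) < 0x80 for ch in text))       # True: all codes below 128
print(all(ord(ch) & 0x80 == 0 for ch in text))  # True: high bit is clear
```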


Final Answer:
7
