Difficulty: Easy
Correct Answer: Incorrect
Explanation:
Introduction / Context:
Data sizing terms are foundational in digital electronics, embedded systems, and computer science. Misunderstanding the size of a byte leads to errors in memory mapping, protocol design, and register programming.
Given Data / Assumptions:
Statement under evaluation: "4 bits equal one byte." Standard conventions: 1 byte = 8 bits; 1 nibble = 4 bits; one hexadecimal digit represents 4 bits.
Concept / Approach:
Historical variations existed, but the de facto and standardized definition of a byte is 8 bits. Therefore, saying “4 bits equal one byte” is incorrect; 4 bits equal a nibble, and two nibbles compose a byte. This convention aligns with hexadecimal notation, where one hex digit corresponds to 4 bits (one nibble) and two hex digits correspond to one byte.
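As a quick illustration, the following minimal C sketch (the byte value 0xA7 is chosen arbitrarily for demonstration) splits one byte into its two nibbles, showing why two hex digits always describe exactly one 8-bit byte:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint8_t byte = 0xA7;               /* one byte: 8 bits, two hex digits */
    uint8_t high = (byte >> 4) & 0x0F; /* upper nibble: 4 bits -> 0xA */
    uint8_t low  = byte & 0x0F;        /* lower nibble: 4 bits -> 0x7 */

    printf("byte = 0x%02X (8 bits)\n", byte);
    printf("high nibble = 0x%X, low nibble = 0x%X (4 bits each)\n", high, low);
    return 0;
}
```

Recombining the halves as (high << 4) | low reproduces the original byte, which is exactly the "two nibbles compose a byte" relationship stated above.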
Step-by-Step Solution:
1. Recall the standard definition: 1 byte = 8 bits.
2. Compare with the claim: 4 bits is only half a byte.
3. Name the 4-bit quantity correctly: a nibble.
4. Conclude that the statement "4 bits equal one byte" is incorrect.
Verification / Alternative check:
Memory addressing and bus widths are commonly expressed in bytes; hex dumps group bytes as pairs of hex digits, confirming the 8-bit convention.
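As a concrete check, this small C sketch (the message string "byte" is an arbitrary example) prints each byte of a buffer as exactly two hex digits, reproducing the familiar hex-dump grouping:

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(void) {
    const char *msg = "byte";
    for (size_t i = 0; i < strlen(msg); i++) {
        /* %02X emits exactly two hex digits: one per nibble, two per 8-bit byte */
        printf("%02X ", (uint8_t)msg[i]);
    }
    printf("\n");  /* prints: 62 79 74 65 */
    return 0;
}
```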
Why Other Options Are Wrong:
Choosing "Correct" conflicts with the standard definition. Neither BCD encoding nor endianness redefines the size of a byte: packed BCD stores one decimal digit per nibble, and endianness only affects the ordering of bytes, not their width.
Common Pitfalls:
Confusing hex-digit counts with byte counts; assuming that a "character" is always one byte. In multi-byte encodings such as UTF-8, a character may span several bytes, but each byte remains 8 bits.
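The encoding pitfall can be seen directly in a minimal C sketch (assuming the string is UTF-8, written here with explicit byte escapes): the single character 'é' occupies two bytes, yet each of those bytes is still 8 bits:

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    const char *ch = "\xC3\xA9";  /* UTF-8 encoding of 'é' (U+00E9): two bytes */
    printf("characters: 1, bytes: %zu\n", strlen(ch));  /* bytes: 2 */
    return 0;
}
```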
Final Answer:
Incorrect