Difficulty: Easy
Correct Answer: Incorrect
Explanation:
Introduction / Context:
A precise understanding of bit-group terminology is essential in digital electronics and computer architecture. The term “byte” is now universally used in hardware manuals, programming languages, and communication protocols, and it must be used unambiguously when sizing memories, defining data paths, or describing file formats. This question tests whether a learner can distinguish a generic grouping of bits from the standardized definition of a byte.
Given Data / Assumptions:
The statement under evaluation asserts that a group of 6 bits constitutes 1 byte. Modern, standardized usage of the term "byte" is assumed.
Concept / Approach:
By modern convention, a byte equals 8 bits. Standards, operating systems, compilers, and memory chips encode and address data in 8-bit bytes (also called octets in networking standards). While historical systems sometimes used 6-bit or 7-bit character sets, these were not called “1 byte” in the modern sense. Therefore, equating 6 bits with a byte is incorrect in contemporary digital design practice.
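As a quick sanity check of the 8-bit convention, here is a minimal C sketch (assuming a typical hosted platform, where CHAR_BIT is 8) that prints the number of bits in a byte and the size of uint8_t:

#include <limits.h>  /* CHAR_BIT: number of bits in one byte (char) */
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* The C standard requires CHAR_BIT >= 8; on virtually all
       modern platforms it is exactly 8. */
    printf("Bits per byte (CHAR_BIT): %d\n", CHAR_BIT);

    /* sizeof is measured in bytes; uint8_t occupies exactly one byte. */
    printf("sizeof(uint8_t) = %zu byte(s)\n", sizeof(uint8_t));

    /* An 8-bit byte spans 2^8 = 256 distinct values (0..255). */
    printf("Distinct byte values: %d\n", 1 << CHAR_BIT);
    return 0;
}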
Step-by-Step Solution:
1. Recall the standardized definition: 1 byte = 8 bits (an octet).
2. Compare with the claim: the statement equates 6 bits with 1 byte.
3. Since 6 ≠ 8, the statement contradicts the standard definition and is therefore incorrect.
Verification / Alternative check:
Check typical RAM specifications (e.g., 1 MB = 10^6 bytes in SI usage, or 1 MiB = 2^20 bytes, each byte being 8 bits). Networking documentation uses "octet" explicitly to mean 8 bits. Instruction set manuals (x86, ARM, RISC-V) define byte-wide loads and stores as 8-bit transfers. All of these corroborate that a byte is 8 bits.
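The memory-size arithmetic can be illustrated with a small C sketch; the SI megabyte and binary mebibyte constants below are illustrative values, not data taken from the question:

#include <stdint.h>
#include <stdio.h>

int main(void) {
    const uint64_t BITS_PER_BYTE = 8;

    /* SI megabyte: 10^6 bytes; binary mebibyte: 2^20 bytes. */
    const uint64_t mb_bytes  = 1000000ULL;
    const uint64_t mib_bytes = 1048576ULL;

    printf("1 MB  = %llu bytes = %llu bits\n",
           (unsigned long long)mb_bytes,
           (unsigned long long)(mb_bytes * BITS_PER_BYTE));
    printf("1 MiB = %llu bytes = %llu bits\n",
           (unsigned long long)mib_bytes,
           (unsigned long long)(mib_bytes * BITS_PER_BYTE));
    return 0;
}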
Why Other Options Are Wrong:
The only alternative, "Correct," would accept that 6 bits equal 1 byte, which conflicts with the 8-bit definition used by every modern standard, compiler, instruction set, and memory device.
Common Pitfalls:
Confusing historical character width with the standardized byte, or assuming that transmission framing bits (start/stop/parity) change the definition of the stored byte.
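To see why framing bits do not change the stored byte, consider an illustrative 8N1 serial frame (the baud rate below is a hypothetical example, not part of the question): one start bit and one stop bit accompany the 8 data bits on the wire, yet only 8 bits are stored per byte.

#include <stdio.h>

int main(void) {
    /* Illustrative 8N1 UART frame: 1 start bit + 8 data bits + 1 stop bit.
       The framing bits exist only on the wire; the stored byte is still 8 bits. */
    const int data_bits  = 8;
    const int start_bits = 1;
    const int stop_bits  = 1;
    const int frame_bits = start_bits + data_bits + stop_bits; /* 10 bits per transmitted byte */

    const int baud = 9600; /* hypothetical line rate in bits per second */
    printf("Bits on the wire per byte: %d\n", frame_bits);
    printf("Effective throughput at %d baud: %d bytes/s\n", baud, baud / frame_bits);
    return 0;
}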
Final Answer:
Incorrect