Difficulty: Easy
Correct Answer: bit
Explanation:
Introduction / Context:
Understanding how digital information is measured is foundational for computer science, embedded systems, and networking. The smallest unit determines how data is stored, addressed, and manipulated at the hardware level.
Given Data / Assumptions:
The question asks for the smallest unit of digital data; the candidate options are bit, nibble, byte, word, and BCD.
Concept / Approach:
A single binary digit is the atomic piece of information in digital systems. Everything else—nibble (4 bits), byte (8 bits), word (platform-dependent, often 16/32/64 bits)—is composed of multiple such digits. Therefore, the smallest unit must be that single binary digit.
Step-by-Step Solution:
1. Identify the hierarchy: bit < nibble < byte < word.
2. Recall the definition: a bit is a binary digit (0 or 1).
3. Compare the options and select the smallest unit by definition.
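The hierarchy above can be sketched in a few lines of Python; the 32-bit word size is an assumption for illustration, since word width is platform-dependent:

```python
# Bit counts of common storage units (word size assumed 32-bit for illustration).
BIT = 1            # a single binary digit: 0 or 1
NIBBLE = 4 * BIT   # 4 bits
BYTE = 8 * BIT     # 8 bits
WORD = 32 * BIT    # platform-dependent; 32 assumed here

units = {"bit": BIT, "nibble": NIBBLE, "byte": BYTE, "word": WORD}
smallest = min(units, key=units.get)
print(smallest)  # -> bit
```

Whatever word size is chosen, the comparison always selects the bit, since every other unit is a multiple of it.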
Verification / Alternative check:
Any digital storage specification (e.g., memory sizes, bus widths) ultimately breaks down to bit counts, confirming the bit as the smallest unit.
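As a quick sanity check of this breakdown, a familiar memory size can be expanded into its bit count (using the conventional 8 bits per byte and 1024 bytes per KiB):

```python
# Expand a memory size down to its bit count.
BITS_PER_BYTE = 8
BYTES_PER_KIB = 1024

kib = 1
total_bits = kib * BYTES_PER_KIB * BITS_PER_BYTE
print(total_bits)  # -> 8192 bits in 1 KiB
```

Every such specification bottoms out in a whole number of bits, with no smaller unit beneath them.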
Why Other Options Are Wrong:
Nibble: equals 4 bits, so it is not the smallest.
Byte: equals 8 bits, so it is not the smallest.
Word: architecture-dependent and always spans multiple bits, so it is never the smallest.
BCD (binary-coded decimal): a coding scheme, not a unit of size.
Common Pitfalls:
Confusing coding formats (like BCD) with size units; assuming a byte is always the smallest because it is commonly referenced in software.
Final Answer:
bit