Difficulty: Easy
Correct Answer: Parallel
Explanation:
Introduction / Context:
Inside a computer, the central processing unit (CPU) moves and transforms data across registers, arithmetic logic units (ALUs), and internal buses. Although many external interfaces (for example, UART, SPI) use one-bit-at-a-time transmission, the CPU's core operations rely on multi-bit words being handled simultaneously. This question checks whether you recognize that internal computation is fundamentally parallel rather than serial.
Given Data / Assumptions:
A conventional digital computer with a standard CPU (registers, ALU, internal buses) is assumed. The question concerns how data is handled inside the processor, not over external serial links such as UART or SPI.
Concept / Approach:
Parallel operation means multiple bits are processed in the same clock event along dedicated bit-slices or wide datapaths. This enables high throughput and deterministic latency for arithmetic and logic. Serial operation inside the CPU would require cycling bit-by-bit through the same hardware, significantly reducing performance for general-purpose processing.
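A minimal Python sketch of this contrast, under assumed names (serial_add, parallel_add) and an assumed 8-bit word width; it is an illustrative model of the timing difference, not a description of real hardware:

```python
# Illustrative model: contrast serial vs. parallel handling of an N-bit addition.
# In the serial model each bit position costs one "clock"; in the parallel model
# the whole word is consumed in a single step. WIDTH is an assumption for the sketch.

WIDTH = 8  # assumed word width

def serial_add(a: int, b: int) -> tuple[int, int]:
    """Add two WIDTH-bit words one bit per 'clock', returning (sum, clocks used)."""
    result, carry = 0, 0
    for i in range(WIDTH):                 # one iteration == one clock event
        bit_a = (a >> i) & 1
        bit_b = (b >> i) & 1
        s = bit_a ^ bit_b ^ carry
        carry = (bit_a & bit_b) | (carry & (bit_a ^ bit_b))
        result |= s << i
    return result & ((1 << WIDTH) - 1), WIDTH

def parallel_add(a: int, b: int) -> tuple[int, int]:
    """Model a wide datapath: the full word is added in one clock event."""
    return (a + b) & ((1 << WIDTH) - 1), 1

if __name__ == "__main__":
    print(serial_add(0b10110101, 0b01101100))   # same sum, WIDTH clocks
    print(parallel_add(0b10110101, 0b01101100)) # same sum, 1 clock
```

Both functions produce the same sum; the difference is that the serial model needs WIDTH clock events where the parallel datapath needs one, which is the performance argument made above.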
Step-by-Step Solution:
1. Identify how arithmetic works: adders, shifters, and logic units are built as multi-bit structures (ripple-carry, carry-lookahead, etc.), as sketched below.
2. Observe the register file: it reads and writes full words (many bits) at once.
3. Consider the internal buses: they are 32 bits, 64 bits, or wider, moving many bits each cycle → parallel.
4. Therefore, internal CPU data handling is parallel by design.
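To make step 1 concrete, here is an illustrative Python model of a ripple-carry adder built from one-bit full-adder slices; the function names and the 4-bit example are assumptions for the sketch, not a description of any particular CPU:

```python
# Bit-slice model of a ripple-carry adder: one full-adder slice per bit, with all
# operand bits presented to the structure at once (a parallel datapath). The carry
# still ripples internally, which is why real designs add carry-lookahead logic,
# but the interface accepts and produces a full N-bit word per operation.

def full_adder(a: int, b: int, cin: int) -> tuple[int, int]:
    """One-bit slice: returns (sum_bit, carry_out)."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def ripple_carry_add(a_bits: list[int], b_bits: list[int]) -> list[int]:
    """N-bit adder built from N full-adder slices (bit lists are LSB first)."""
    carry, out = 0, []
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]  # final carry-out as the extra top bit

if __name__ == "__main__":
    # 4-bit example: 0b0101 (5) + 0b0011 (3) = 0b1000 (8), lists are LSB first
    print(ripple_carry_add([1, 0, 1, 0], [1, 1, 0, 0]))  # -> [0, 0, 0, 1, 0]
```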
Verification / Alternative check:
Microarchitecture diagrams consistently show N-bit ALUs and N-bit registers. Even superscalar and vector units expand parallelism further (SIMD lanes), reinforcing that internal processing is parallel.
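As an illustration of that additional lane-level parallelism, here is a hedged Python sketch of a SIMD-style element-wise add; the lane count and 32-bit lane width are assumptions, and the function simd_add is hypothetical:

```python
# Illustrative SIMD-lane model: several independent element-wise additions are
# performed in the same conceptual step, extending word-level parallelism across
# multiple lanes, much as real vector units do. LANES is an assumed lane count.

LANES = 4

def simd_add(a: list[int], b: list[int]) -> list[int]:
    """Element-wise add across LANES lanes, modelling one vector instruction."""
    assert len(a) == len(b) == LANES
    return [(x + y) & 0xFFFFFFFF for x, y in zip(a, b)]  # 32-bit lanes

if __name__ == "__main__":
    print(simd_add([1, 2, 3, 4], [10, 20, 30, 40]))  # -> [11, 22, 33, 44]
```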
Why Other Options Are Wrong:
Serial: serial handling moves one bit at a time through the same hardware. That is how some external interfaces (UART, SPI) transmit data, but applying it to registers, ALUs, and internal buses would multiply the cycles needed for every operation; it is not how the CPU's internal datapath works.
Common Pitfalls:
Confusing external serial communication interfaces (UART, SPI) with the CPU's internal operation; the presence of serial I/O on a system does not mean the processor computes serially.
Final Answer:
Parallel