Difficulty: Easy
Correct Answer: 0 and 1 (capacity C = 0; binary entropy H(0.5) = 1)
Explanation:
Introduction / Context:
Channel capacity quantifies the maximum reliable information rate over a noisy channel. For a binary symmetric channel (BSC), capacity depends on the crossover probability p. Understanding the edge case p = 0.5 (complete randomness) clarifies the limits of digital communication under severe noise.
Given Data / Assumptions:
A binary symmetric channel (BSC) with crossover probability p, evaluated at the edge case p = 0.5; inputs and outputs are single bits, and capacity is measured in bits per channel use.
Concept / Approach:
The capacity of a BSC is C = 1 − H(p), where H(p) = − p log2 p − (1 − p) log2 (1 − p) is the binary entropy function. At p = 0.5, H(0.5) = 1 bit (maximum uncertainty), so C = 1 − H(0.5) = 0: no information can be transmitted reliably because the output is statistically independent of the input (complete randomization).
Step-by-Step Solution:
Step 1: Write the capacity formula for a BSC: C = 1 − H(p).
Step 2: Evaluate the binary entropy at p = 0.5: H(0.5) = −0.5 log2 0.5 − 0.5 log2 0.5 = 0.5 + 0.5 = 1 bit.
Step 3: Substitute into the formula: C = 1 − 1 = 0 bits per channel use.
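The computation is easy to verify numerically. Below is a minimal Python sketch (the helper names binary_entropy and bsc_capacity are illustrative assumptions, not part of the original question):

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H(p) in bits, with H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """BSC capacity C = 1 - H(p), in bits per channel use."""
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(f"p = {p:.1f}: H(p) = {binary_entropy(p):.3f}, C = {bsc_capacity(p):.3f}")
```

Note that p = 0 and p = 1 both give C = 1: a channel that always flips the bit is just as informative as one that never does; only p = 0.5 destroys all information.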
Verification / Alternative check:
Intuition: with p = 0.5, each transmitted bit is flipped with probability 1/2, so the output is equally likely to be 0 or 1 regardless of what was sent; the output conveys no information about the input, hence zero capacity.
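As an alternative check, the mutual information I(X;Y) can be computed directly from the joint distribution. The sketch below assumes a uniform input, which is the capacity-achieving distribution for a BSC (the function name is illustrative):

```python
import math

def bsc_mutual_information(p: float, q: float = 0.5) -> float:
    """I(X;Y) in bits for a BSC with crossover p and input P(X=1) = q."""
    # Joint distribution P(x, y) = P(x) * P(y | x).
    joint = {
        (0, 0): (1 - q) * (1 - p), (0, 1): (1 - q) * p,
        (1, 0): q * p,             (1, 1): q * (1 - p),
    }
    px = {0: 1 - q, 1: q}
    py = {y: joint[(0, y)] + joint[(1, y)] for y in (0, 1)}
    return sum(
        pxy * math.log2(pxy / (px[x] * py[y]))
        for (x, y), pxy in joint.items() if pxy > 0
    )

print(bsc_mutual_information(0.5))  # 0.0 -> output independent of input
print(bsc_mutual_information(0.1))  # ~0.531, matches C = 1 - H(0.1)
```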
Why Other Options Are Wrong:
Common Pitfalls:
Applying the symmetric-channel formula C = log2 M − H without recognizing that M = 2 for a binary channel (so log2 M = 1); forgetting that H(p) peaks at exactly p = 0.5, which is the only crossover probability that drives the capacity to zero.
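A quick numerical scan (an illustrative sketch, not part of the original problem) confirms that the entropy peak sits at p = 0.5:

```python
import math

def binary_entropy(p: float) -> float:
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Scan a fine grid (excluding the endpoints, where H = 0) for the maximizer.
grid = [i / 1000 for i in range(1, 1000)]
p_star = max(grid, key=binary_entropy)
print(p_star, binary_entropy(p_star))  # 0.5 1.0
```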
Final Answer:
0 and 1: the channel capacity is C = 0 bits per channel use, and the binary entropy is H(0.5) = 1 bit.