In a binary symmetric channel (BSC), if the conditional probability of bit error is p = 0.5 due to extreme noise, what are the channel capacity and the binary entropy function H(p), respectively?

Difficulty: Easy

Correct Answer: 0 and 1

Explanation:


Introduction / Context:
Channel capacity quantifies the maximum reliable information rate over a noisy channel. For a binary symmetric channel (BSC), capacity depends on the crossover probability p. Understanding the edge case p = 0.5 (complete randomness) clarifies the limits of digital communication under severe noise.



Given Data / Assumptions:

  • BSC with error probability p = 0.5.
  • Capacity formula for BSC: C = 1 − H(p) bits per channel use (for binary input), where H(p) is the binary entropy function.


Concept / Approach:

The binary entropy function H(p) = − p log2 p − (1 − p) log2 (1 − p). At p = 0.5, H(0.5) = 1 (maximum uncertainty). Therefore, C = 1 − H(0.5) = 0: no information can be transmitted reliably because the output is statistically independent of the input (complete randomization).



Step-by-Step Solution:

1. Compute H(0.5): H(0.5) = −0.5 · log2 0.5 − 0.5 · log2 0.5 = 0.5 + 0.5 = 1.
2. Compute capacity: C = 1 − H(0.5) = 1 − 1 = 0 bits/use.
3. Thus, the pair (Capacity, H(p)) = (0, 1).
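The steps above can be checked numerically. The sketch below (function names are illustrative, not from any standard library) evaluates the binary entropy and the BSC capacity directly from their definitions:

```python
import math

def binary_entropy(p: float) -> float:
    """Binary entropy H(p) in bits; H(0) = H(1) = 0 by the convention 0·log 0 = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel: C = 1 - H(p), in bits per channel use."""
    return 1.0 - binary_entropy(p)

print(binary_entropy(0.5))  # 1.0
print(bsc_capacity(0.5))    # 0.0
```

At p = 0.5 both results are exact, since log2(0.5) = −1 is represented exactly in floating point.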


Verification / Alternative check:

Intuition: with p = 0.5, the channel effectively flips the bit randomly; the output conveys no information about the input, hence zero capacity.



Why Other Options Are Wrong:

  • Any nonzero capacity contradicts the total randomization at p = 0.5.
  • Entropy values other than 1 at p = 0.5 are incorrect; binary uncertainty is maximal at 0.5.


Common Pitfalls:

Applying the general formula C = log2 M − H(p) without recognizing that M = 2 for a binary channel; forgetting that H(p) attains its maximum of 1 bit at p = 0.5.



Final Answer:

0 and 1
