In information theory and digital communications, the amount of uncertainty or surprise associated with a source of symbols is called what?

Difficulty: Easy

Correct Answer: Entropy

Explanation:


Introduction / Context:
When engineers and computer scientists analyze data sources and messages, they often need a way to quantify uncertainty. Information theory provides a rigorous definition for how “surprising” or “unpredictable” a symbol stream is. This concept drives compression limits, error-control coding, and cryptography baselines.


Given Data / Assumptions:

  • We are discussing a system that emits symbols (e.g., bits, letters, packets).
  • The question asks for the name of the quantity that measures uncertainty.
  • No numerical calculation is required—only the correct term.


Concept / Approach:
The standard measure of uncertainty in a random variable (or symbol source) is entropy. For a discrete source emitting symbol i with probability p_i, entropy is defined as H = − Σ_i p_i · log2(p_i), summed over all symbols. Bandwidth, by contrast, is a physical-layer quantity (a frequency span in Hz or a rate in bit/s); “loss” refers to signal attenuation or data loss, not a formal uncertainty measure; “quantum” is a physics term with no role in classical information theory.
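As a quick illustration of this formula, here is a minimal Python sketch (the entropy helper and the toy distributions are ours, for illustration only, not from any standard library):

    import math

    def entropy(probs):
        """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit of uncertainty
    print(entropy([0.9, 0.1]))  # biased coin: ~0.469 bits
    print(entropy([1.0]))       # deterministic source: 0.0 bits

Note how the entropy falls as the distribution becomes more predictable, which is exactly the behavior the definition is meant to capture.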


Step-by-Step Solution:
  1. Identify the domain: information theory / communications.
  2. Recall the name of the uncertainty metric: entropy (measured in bits when using log base 2).
  3. Match the definition in the stem—“amount of uncertainty”—to the term “entropy.”


Verification / Alternative check:
If all outcomes are equally likely among N symbols, H = log2(N). For example, a fair coin (N = 2) has H = 1 bit per toss, which aligns with the intuitive “uncertainty” per outcome. If the source is biased or deterministic, entropy decreases toward zero, again matching our understanding of uncertainty.
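To check the H = log2(N) claim numerically, the short sketch below (reusing the same hypothetical entropy helper as above, redefined so the snippet runs on its own) confirms that a uniform source over N symbols yields exactly log2(N) bits:

    import math

    def entropy(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    for n in (2, 4, 8, 26):
        uniform = [1.0 / n] * n
        # entropy of a uniform source should match log2(n) exactly
        print(n, entropy(uniform), math.log2(n))
    # prints 1.0, 2.0, 3.0, and ~4.700 bits respectively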


Why Other Options Are Wrong:
Bandwidth: Measures capacity of a channel in Hz or bit/s, not uncertainty.


Loss: A qualitative term for degradation; not a formal uncertainty measure.


Quantum: A physics term for a discrete amount (e.g., of energy); it is not a measure of uncertainty in information theory.


None of the above: Incorrect because “entropy” is the standard name.



Common Pitfalls:
Confusing “information content” with “data rate.” Entropy is a property of the source distribution; achievable data rate depends on coding, channel, and protocol overheads.
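As a toy illustration of this distinction (the biased source below is a made-up example, not taken from the question): a binary source with P(0) = 0.9 and P(1) = 0.1 carries only about 0.469 bits of information per symbol, yet a naive fixed-length encoding still transmits 1 bit per symbol:

    import math

    # Hypothetical biased binary source: '0' with prob 0.9, '1' with prob 0.1.
    p0, p1 = 0.9, 0.1
    H = -(p0 * math.log2(p0) + p1 * math.log2(p1))  # ~0.469 bits/symbol

    # A fixed-length code spends 1 bit/symbol regardless of the distribution;
    # lossless compression can approach H on average, but never beat it.
    print(f"entropy: {H:.3f} bits/symbol vs. fixed-length rate: 1 bit/symbol")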



Final Answer:
Entropy
