Difficulty: Easy
Correct Answer: Entropy
Explanation:
Introduction / Context:
When engineers and computer scientists analyze data sources and messages, they need a way to quantify uncertainty. Information theory provides a rigorous definition of how “surprising” or “unpredictable” a symbol stream is. This concept sets the theoretical limits of lossless compression and underpins error-control coding and baseline security estimates in cryptography.
Given Data / Assumptions:
The question stem asks which term names the amount of uncertainty in a message source; the answer choices are entropy, bandwidth, loss, and quantum.
Concept / Approach:
The standard measure of uncertainty in a random variable (or symbol source) is entropy. For a discrete source with symbol probabilities p_i, entropy is defined as H = − Σ p_i log2(p_i), measured in bits per symbol. Bandwidth, by contrast, is a physical-layer rate or frequency span; “loss” refers to signal attenuation or dropped data, not a measure of uncertainty; “quantum” is unrelated to the classic information-theoretic term in this context.
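As a minimal Python sketch of this definition (the helper name entropy and its list-of-probabilities interface are our own, for illustration; the probabilities are assumed to sum to 1):

    import math

    def entropy(probs):
        """Shannon entropy H = -sum(p * log2(p)), in bits per symbol.

        Zero-probability terms are skipped, following the convention
        that p * log2(p) -> 0 as p -> 0.
        """
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits per symbol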
Step-by-Step Solution:
1. Identify the domain: information theory / communications.
2. Recall the name of the uncertainty metric: entropy (measured in bits when using log base 2).
3. Match the definition in the stem, “amount of uncertainty,” to the term “entropy.”
Verification / Alternative check:
If all outcomes are equally likely among N symbols, H = log2(N). For example, a fair coin (N = 2) has H = 1 bit per toss, which aligns with the intuitive “uncertainty” per outcome. If the source is biased or deterministic, entropy decreases toward zero, again matching our understanding of uncertainty.
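The same numbers fall out of a quick computation (a sketch assuming the entropy helper from the earlier example is in scope):

    print(entropy([0.5, 0.5]))   # fair coin, N = 2: 1.0 bit per toss
    print(entropy([0.25] * 4))   # uniform over N = 4: log2(4) = 2.0 bits
    print(entropy([0.9, 0.1]))   # biased coin: ~0.469 bits, below 1.0
    print(entropy([1.0]))        # deterministic source: 0.0 bits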
Why Other Options Are Wrong:
Bandwidth: Measures the capacity of a channel in Hz or bit/s, not uncertainty.
Loss: Refers to signal attenuation or dropped data, not a statistical measure of unpredictability.
Quantum: Unrelated to the classic information-theoretic term for uncertainty in this context.
Common Pitfalls:
Confusing “information content” with “data rate.” Entropy is a property of the source distribution; achievable data rate depends on coding, channel, and protocol overheads.
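To make the distinction concrete, here is a hedged sketch (the bias p = 0.9 is an assumed example value): a naive fixed-length code spends 1 bit per toss regardless of the source statistics, while entropy gives the floor that a good source coder can approach.

    import math

    p = 0.9  # assumed bias, for illustration only
    H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
    print(f"source entropy: {H:.3f} bits/symbol")  # ~0.469
    print("naive fixed-length rate: 1.000 bits/symbol")
    # Shannon's source coding theorem: no lossless code averages below
    # H bits/symbol, and good codes (e.g., arithmetic coding) approach it.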
Final Answer:
Entropy