Difficulty: Easy
Correct Answer: reduces
Explanation:
Introduction / Context:
Normalization aims to reduce redundancy and prevent update, insertion, and deletion anomalies. It does this by placing facts about different entities in separate tables and linking them via keys. The question asks how normalization impacts duplication.
Given Data / Assumptions:
A multiple-choice question on relational design: does normalization reduce, eliminate, increase, or maximize data duplication? We assume a standard relational database normalized through the usual normal forms (1NF–3NF/BCNF).
Concept / Approach:
Through functional dependency analysis, normalization decomposes tables so each non-key attribute depends on the key, the whole key, and nothing but the key. This structure greatly reduces duplicate storage (e.g., a customer’s address appears once in a Customer table, not on every Order row). While duplication is minimized, complete elimination is not always possible or practical; reference codes and indexes may replicate values for performance or integrity.
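The following is a minimal sketch of this idea using a hypothetical Customer/Orders schema (the table and column names are illustrative, not from the question). The customer's address lives in exactly one row; each order carries only the customer key, and the address is recovered through a join rather than being copied onto every order.

```python
# Minimal sketch: a normalized design stores the address once and links by key.
# Table/column names (Customer, Orders, customer_id, ...) are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        address     TEXT NOT NULL           -- stored exactly once
    );
    CREATE TABLE Orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES Customer(customer_id),
        order_date  TEXT NOT NULL           -- no address repeated here
    );
""")
conn.execute("INSERT INTO Customer VALUES (1, 'Ada', '12 Elm St')")
conn.executemany(
    "INSERT INTO Orders (customer_id, order_date) VALUES (?, ?)",
    [(1, "2024-01-05"), (1, "2024-02-10"), (1, "2024-03-15")],
)
# Each order row holds only the key; the address is reached via a join.
rows = conn.execute("""
    SELECT o.order_id, c.address
    FROM Orders o JOIN Customer c USING (customer_id)
""").fetchall()
print(rows)
```

Note that the foreign-key value itself is repeated on every order row, which is one reason duplication is reduced rather than eliminated.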
Step-by-Step Solution:
1. Recall the goal of normalization: organize attributes by their functional dependencies so each fact is stored once.
2. Decompose tables so that facts about different entities (customers, orders) live in separate tables linked by keys.
3. Observe that repeated attribute values (e.g., a customer's address on every order) collapse to a single master row.
4. Note that some repetition remains by design (foreign-key values, deliberate denormalization), so "eliminates" is too strong.
5. Conclude that normalization reduces duplication.
Verification / Alternative check:
Compare storage and update behavior before and after normalization; updates to a single master row replace mass updates across many rows.
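A quick self-contained check of that claim, continuing the same hypothetical schema (names are illustrative): after normalization, changing the customer's address is a single-row UPDATE on the master table, and every order immediately reflects the new value through the join, with no per-order mass update.

```python
# Sketch: one UPDATE on the master row vs. a mass update across many rows.
# Schema and names are hypothetical, for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Customer (customer_id INTEGER PRIMARY KEY, address TEXT);
    CREATE TABLE Orders (order_id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES Customer(customer_id));
    INSERT INTO Customer VALUES (1, '12 Elm St');
    INSERT INTO Orders (customer_id) VALUES (1), (1), (1);
""")
# A single-row update on the master table ...
conn.execute("UPDATE Customer SET address = '99 Oak Ave' WHERE customer_id = 1")
# ... and every order already sees the new address through the join.
for order_id, address in conn.execute(
    "SELECT o.order_id, c.address "
    "FROM Orders o JOIN Customer c USING (customer_id)"
):
    print(order_id, address)
```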
Why Other Options Are Wrong:
Eliminates: too absolute; some duplication may remain by design.
Increases/Maximizes: opposite of normalization’s purpose.
Common Pitfalls:
Believing that normalization must eliminate 100% of duplication; pragmatic designs sometimes keep controlled redundancy for performance, protected by safeguards such as triggers or ETL validation.
Final Answer:
reduces