Express α^2 + β^2 in terms of a, b, c: If α and β are roots of ax^2 + bx + c = 0 (a ≠ 0), find α^2 + β^2.

Difficulty: Easy

Correct Answer: (b^2 − 2ac) / a^2

Explanation:


Introduction / Context:
This is a direct application of Vieta’s relations and algebraic identities. The sum and product of roots of a quadratic allow you to express symmetric functions like α^2 + β^2 without solving for the roots explicitly.


Given Data / Assumptions:

  • α and β are roots of ax^2 + bx + c = 0.
  • a ≠ 0, so the equation is genuinely quadratic and Vieta's relations apply.


Concept / Approach:
Use the identity α^2 + β^2 = (α + β)^2 − 2αβ. From Vieta, α + β = −b/a and αβ = c/a. Substitute and simplify carefully to avoid sign or denominator errors.
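
These relations can be sanity-checked numerically. The short Python sketch below uses a hypothetical example quadratic, 2x^2 + 3x − 5 = 0 with roots 1 and −5/2 (chosen only for illustration, not taken from the question), to confirm both Vieta's relations and the target identity:

    # Hypothetical example: 2x^2 + 3x - 5 = 0, whose roots are 1 and -5/2
    a, b, c = 2, 3, -5
    alpha, beta = 1.0, -2.5

    print(alpha + beta, -b / a)     # -1.5 -1.5   (sum of roots = -b/a)
    print(alpha * beta, c / a)      # -2.5 -2.5   (product of roots = c/a)
    print(alpha**2 + beta**2, (b**2 - 2*a*c) / a**2)   # 7.25 7.25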


Step-by-Step Solution:

Step 1: α + β = −b/a, so (α + β)^2 = b^2/a^2.
Step 2: αβ = c/a.
Step 3: α^2 + β^2 = (α + β)^2 − 2αβ = (b^2/a^2) − 2(c/a).
Step 4: Put everything over the common denominator a^2: α^2 + β^2 = (b^2 − 2ac)/a^2.
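
For a general (symbolic) confirmation of the algebra, a minimal SymPy sketch is shown below; it assumes SymPy is available and is only a check, not part of the solution itself:

    import sympy as sp

    a, b, c, x = sp.symbols('a b c x')
    # Solve the general quadratic for its two roots
    alpha, beta = sp.solve(sp.Eq(a*x**2 + b*x + c, 0), x)

    # Difference between the direct sum of squares and the claimed formula
    diff = sp.simplify(alpha**2 + beta**2 - (b**2 - 2*a*c)/a**2)
    print(diff)   # 0, confirming alpha^2 + beta^2 = (b^2 - 2ac)/a^2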


Verification / Alternative check:
Pick a sample quadratic (e.g., x^2 − 5x + 6 = 0 with roots 2 and 3). Compute α^2 + β^2 = 4 + 9 = 13. Formula gives (25 − 12)/1 = 13. Checks out.


Why Other Options Are Wrong:
The ones with +2ac use the wrong sign; others have incorrect scaling or extraneous factors in the denominator.


Common Pitfalls:
Dropping the minus sign in the identity (writing +2αβ instead of −2αβ), and mishandling the square of −b/a: (−b/a)^2 = b^2/a^2, not −b^2/a^2 or b^2/a.


Final Answer:
(b^2 − 2ac) / a^2
