Compute α^3 + β^3 from ax^2 + bx + c = 0: If α and β are roots of ax^2 + bx + c = 0 (a ≠ 0), find α^3 + β^3 in terms of a, b, c.

Difficulty: Medium

Correct Answer: b(3ac − b^2)/a^3

Explanation:


Introduction / Context:
Power sums of roots can be expressed using elementary symmetric sums. For α, β as roots of ax^2 + bx + c = 0, we know α + β = −b/a and αβ = c/a. Using the identity α^3 + β^3 = (α + β)^3 − 3αβ(α + β), we can express the result entirely in a, b, c.

Given Data / Assumptions:

  • α + β = −b/a.
  • αβ = c/a.
  • a ≠ 0.


Concept / Approach:
Apply the cube sum identity and substitute Vieta’s formulas carefully. Simplify to a single rational expression with denominator a^3.


Step-by-Step Solution:

  1. Start from the identity: α^3 + β^3 = (α + β)^3 − 3αβ(α + β).
  2. Substitute Vieta's formulas: (−b/a)^3 − 3·(c/a)·(−b/a) = −b^3/a^3 + 3bc/a^2.
  3. Put everything over the common denominator a^3: (−b^3 + 3abc)/a^3 = b(3ac − b^2)/a^3.
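The closed form can be sanity-checked numerically by computing the roots directly. The sketch below uses hypothetical sample coefficients (a = 2, b = −7, c = 3) purely for illustration:

```python
import cmath

def cube_sum_of_roots(a, b, c):
    """Return alpha^3 + beta^3 for the roots of a*x^2 + b*x + c = 0
    using the closed form b*(3ac - b^2)/a^3."""
    return b * (3 * a * c - b * b) / a ** 3

# Direct check: compute the roots explicitly and cube them.
a, b, c = 2.0, -7.0, 3.0          # sample quadratic (roots 3 and 1/2)
disc = cmath.sqrt(b * b - 4 * a * c)
alpha = (-b + disc) / (2 * a)
beta = (-b - disc) / (2 * a)

direct = alpha ** 3 + beta ** 3   # 27 + 0.125 = 27.125
formula = cube_sum_of_roots(a, b, c)
assert abs(direct - formula) < 1e-9
```

Using `cmath.sqrt` keeps the check valid even when the discriminant is negative and the roots are complex conjugates; the sum α^3 + β^3 is still real in that case.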


Verification / Alternative check:
Test with a simple quadratic, e.g., x^2 − sx + p = 0 (a = 1, b = −s, c = p). Then α + β = s and αβ = p, so α^3 + β^3 = s^3 − 3ps. Substituting into b(3ac − b^2)/a^3 gives (−s)(3p − s^2)/1 = s^3 − 3ps, which matches.


Why Other Options Are Wrong:

  • b(b^2 − 3ac)/a^3: Sign reversed.
  • b(3ac + b^2)/a^3: Adds instead of subtracts, not supported by identity.
  • None of these: Incorrect; the expression b(3ac − b^2)/a^3 is exact.


Common Pitfalls:
Sign mistakes when cubing (−b/a) and distributing the negative in −3αβ(α + β). Keep denominators consistent.


Final Answer:

b(3ac − b^2)/a^3
