Show that a quadratic equation cannot have more than two roots

In this section we will show that a quadratic equation cannot have more than two roots.
Theorem: A quadratic equation cannot have more than two roots.
Proof: Suppose, if possible, that $\alpha$, $\beta$ and $\gamma$ are three distinct roots of the quadratic equation $ax^{2} + bx + c = 0$, where $a, b, c \in \mathbb{R}$ and $a \ne 0$. Then each of $\alpha$, $\beta$ and $\gamma$ satisfies this quadratic equation.
∴ $a\alpha^{2} + b\alpha + c = 0$ ------ (1)
$a\beta^{2} + b\beta + c = 0$ ------ (2)
$a\gamma^{2} + b\gamma + c = 0$ ------ (3)
Subtracting equation (2) from equation (1), we get
$a\alpha^{2} + b\alpha + c - (a\beta^{2} + b\beta + c) = 0$
⇒ $a(\alpha^{2} - \beta^{2}) + b(\alpha - \beta) = 0$
⇒ $a(\alpha - \beta)(\alpha + \beta) + b(\alpha - \beta) = 0$
⇒ $(\alpha - \beta)(a(\alpha + \beta) + b) = 0$
⇒ $a(\alpha + \beta) + b = 0$ ---------- (4) [since $\alpha \ne \beta$, so $\alpha - \beta \ne 0$]
Subtracting equation (3) from equation (2), we get
$a\beta^{2} + b\beta + c - (a\gamma^{2} + b\gamma + c) = 0$
⇒ $a(\beta^{2} - \gamma^{2}) + b(\beta - \gamma) = 0$
⇒ $a(\beta - \gamma)(\beta + \gamma) + b(\beta - \gamma) = 0$
⇒ $(\beta - \gamma)(a(\beta + \gamma) + b) = 0$
⇒ $a(\beta + \gamma) + b = 0$ ---------- (5) [since $\beta \ne \gamma$, so $\beta - \gamma \ne 0$]
Subtracting equation (5) from equation (4), we get
$a(\alpha - \gamma) = 0$
⇒ $\alpha = \gamma$ [since $a \ne 0$]
But this is impossible, because $\alpha$ and $\gamma$ are distinct.
Thus our assumption that a quadratic equation can have three distinct roots is wrong.
Hence, a quadratic equation cannot have more than two roots.
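
To see the algebra of the proof concretely, here is a minimal SymPy sketch (the library import and symbol names are my own additions, not part of the original text) that reproduces the two factorizations and the final subtraction:

```python
import sympy as sp

a, b, c = sp.symbols('a b c')
alpha, beta, gamma = sp.symbols('alpha beta gamma')

# Equations (1), (2), (3): each assumed root satisfies a*x^2 + b*x + c = 0.
e1 = a*alpha**2 + b*alpha + c
e2 = a*beta**2 + b*beta + c
e3 = a*gamma**2 + b*gamma + c

# (1) - (2) factors as (alpha - beta) * (a*(alpha + beta) + b), giving (4).
print(sp.factor(e1 - e2))  # (alpha - beta)*(a*alpha + a*beta + b)

# (2) - (3) factors as (beta - gamma) * (a*(beta + gamma) + b), giving (5).
print(sp.factor(e2 - e3))  # (beta - gamma)*(a*beta + a*gamma + b)

# (4) - (5) reduces to a*(alpha - gamma), which forces alpha = gamma when a != 0.
eq4 = a*(alpha + beta) + b
eq5 = a*(beta + gamma) + b
print(sp.factor(eq4 - eq5))  # a*(alpha - gamma)
```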

Solved examples on quadratic equations

Solve the following quadratic equations by the factorization method only.
1) $4x^{2} - 12x + 25 = 0$
Solution: We have,
$4x^{2} - 12x + 25 = 0$
⇒ $4x^{2} - 12x + 9 - 9 + 25 = 0$
⇒ $(2x - 3)^{2} + 16 = 0$
⇒ $(2x - 3)^{2} - 16i^{2} = 0$ [since $i^{2} = -1$]
⇒ $(2x - 3 + 4i)(2x - 3 - 4i) = 0$
⇒ $2x - 3 + 4i = 0$ or $2x - 3 - 4i = 0$
⇒ $2x = 3 - 4i$ or $2x = 3 + 4i$
⇒ $x = \frac{3}{2} - 2i$ or $x = \frac{3}{2} + 2i$
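
As a quick sanity check (my addition, not part of the original solution), both roots can be substituted back numerically using Python's built-in complex numbers:

```python
# Quick numerical check of example 1: both roots should make 4x^2 - 12x + 25 vanish.
for x in (1.5 - 2j, 1.5 + 2j):
    print(x, 4*x**2 - 12*x + 25)  # prints 0j for both roots
```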

2) $x^{2} + 1 = 0$
Solution: We have,
$x^{2} + 1 = 0$
⇒ $x^{2} - i^{2} = 0$ [since $i^{2} = -1$]
⇒ $(x - i)(x + i) = 0$
⇒ $x - i = 0$ or $x + i = 0$
⇒ $x = i$ or $x = -i$
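
The same verification can be done symbolically; the sketch below (again my addition, assuming SymPy is available) confirms that each equation has exactly two roots, in agreement with the theorem above:

```python
import sympy as sp

x = sp.symbols('x')
print(sp.solve(4*x**2 - 12*x + 25, x))  # [3/2 - 2*I, 3/2 + 2*I]
print(sp.solve(x**2 + 1, x))            # [-I, I]
```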


