
Mathematics

If \Big(a + \dfrac{1}{a}\Big)^2 = 3 and a ≠ 0, then show that:

a^3 + \dfrac{1}{a^3} = 0.

Expansions


Answer

Given,

\Big(a + \dfrac{1}{a}\Big)^2 = 3 \\[1em] \Rightarrow a + \dfrac{1}{a} = \pm\sqrt{3}

By the identity,

a^3 + \dfrac{1}{a^3} = \Big(a + \dfrac{1}{a}\Big)^3 - 3\Big(a + \dfrac{1}{a}\Big)
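
For reference, this identity is just the binomial cube rearranged:

\Big(a + \dfrac{1}{a}\Big)^3 = a^3 + 3a^2 \cdot \dfrac{1}{a} + 3a \cdot \dfrac{1}{a^2} + \dfrac{1}{a^3} = a^3 + \dfrac{1}{a^3} + 3\Big(a + \dfrac{1}{a}\Big),

and moving the last term to the left-hand side gives the form used above.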

Substituting a + \dfrac{1}{a} = -\sqrt{3}, we get:

a^3 + \dfrac{1}{a^3} = (-\sqrt{3})^3 - 3 \times (-\sqrt{3}) \\[1em] = -3\sqrt{3} + 3\sqrt{3} \\[1em] = 0.

Substituting a + \dfrac{1}{a} = \sqrt{3}, we get:

a^3 + \dfrac{1}{a^3} = (\sqrt{3})^3 - 3 \times \sqrt{3} \\[1em] = 3\sqrt{3} - 3\sqrt{3} \\[1em] = 0.

Hence, proved that a^3 + \dfrac{1}{a^3} = 0.
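
As an optional numerical check (not part of the textbook solution; a minimal Python sketch using only the standard cmath module): the condition a + \dfrac{1}{a} = \pm\sqrt{3} forces a to be a non-real complex number, since the quadratic a^2 \mp \sqrt{3}a + 1 = 0 has discriminant 3 - 4 = -1 < 0. The sketch below computes all four roots and confirms that a^3 + \dfrac{1}{a^3} is numerically zero for each.

import cmath

# Roots of a^2 - sqrt(3)*a + 1 = 0 and a^2 + sqrt(3)*a + 1 = 0
# (discriminant 3 - 4 = -1, so all roots are complex).
s = 3 ** 0.5
disc = cmath.sqrt(s * s - 4)                  # sqrt(-1) = 1j
for a in ((s + disc) / 2, (s - disc) / 2,
          (-s + disc) / 2, (-s - disc) / 2):
    value = a ** 3 + 1 / a ** 3               # expected to vanish
    print(a, abs(value) < 1e-9)               # prints True for each root

Each of these roots satisfies a + \dfrac{1}{a} = \pm\sqrt{3} exactly, and the check confirms a^3 + \dfrac{1}{a^3} = 0 up to floating-point error.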
