How do you prove that root 2 is irrational?

Proof that root 2 is an irrational number.

  1. Given: √2.
  2. To prove: √2 is an irrational number. Proof: Assume that √2 is a rational number, so it can be expressed in the form p/q, where p and q are co-prime integers and q ≠ 0: √2 = p/q.
  3. Solving: √2 = p/q. Squaring both sides gives 2 = (p/q)², so p² = 2q². Then p² is even, which means p is even; write p = 2m. Substituting gives 4m² = 2q², so q² = 2m² and q is even as well. But then p and q share the factor 2, contradicting the assumption that they are co-prime. Hence √2 is irrational.
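As a quick numerical sanity check (not a proof), a short Python search confirms that no fraction p/q with denominator up to 1000 squares exactly to 2:

```python
from math import isqrt

# Illustration only: search for integers p, q with p**2 == 2 * q**2.
# The proof above shows none exist; this just checks small denominators.
matches = [
    (p, q)
    for q in range(1, 1001)
    for p in (isqrt(2 * q * q), isqrt(2 * q * q) + 1)
    if p * p == 2 * q * q
]
print(matches)  # expect an empty list
```

Checking both isqrt(2q²) and the next integer covers every candidate p for each q, so an empty result means no exact match exists in that range.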

Why is √2 an irrational number?

The decimal expansion of √2 is non-terminating and non-repeating. Any number whose decimal expansion neither terminates nor repeats is irrational. So, √2 is an irrational number.
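Python's decimal module can display part of that expansion to high precision (any finite run of digits is only an illustration; it cannot by itself show that the expansion never repeats):

```python
from decimal import Decimal, getcontext

# Compute sqrt(2) to 50 significant digits (illustration only).
getcontext().prec = 50
root2 = Decimal(2).sqrt()
print(root2)
```

The output begins 1.41421356237309504880…, with no terminating or repeating pattern in view.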

Who proved square root of 2 irrational?

Hippasus of Metapontum
The proof of the irrationality of root 2 is often attributed to Hippasus of Metapontum, a member of the Pythagorean cult. He is said to have been murdered for his discovery (though historical evidence is rather murky) as the Pythagoreans didn’t like the idea of irrational numbers.

Is square root of 3 irrational?

It is more precisely called the principal square root of 3, to distinguish it from the negative number with the same property. The square root of 3 is an irrational number. It is also known as Theodorus’ constant, after Theodorus of Cyrene, who proved its irrationality.

How do you prove √3 is irrational?

A rational number is defined as a number that can be expressed as a division of two integers, i.e. p/q, where q is not equal to 0. √3 = 1.7320508075688772… and the expansion continues without terminating or repeating, so √3 is an irrational number. A rigorous proof mirrors the one for √2: assume √3 = p/q in lowest terms, square to get p² = 3q², and deduce that 3 divides both p and q, contradicting the assumption that the fraction is in lowest terms.

Is √2 a rational or irrational number?

The value that is squared to make 2 (i.e. the square root of 2) cannot be a rational number. In other words, the square root of 2 is irrational.

Why is √2 irrational?

Specifically, the Greeks discovered that the diagonal of a square whose sides are 1 unit long has a diagonal whose length cannot be rational. By the Pythagorean Theorem, the length of the diagonal equals the square root of 2. So the square root of 2 is irrational!
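The geometric statement is easy to check numerically; a minimal Python sketch (illustration only — floating-point arithmetic cannot prove irrationality):

```python
from math import hypot, sqrt

# Diagonal of a unit square: sqrt(1**2 + 1**2) = sqrt(2), by the
# Pythagorean theorem.  Floating-point check, for illustration only.
diagonal = hypot(1.0, 1.0)
assert abs(diagonal - sqrt(2)) < 1e-12
print(diagonal)
```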

Is √2 rational or irrational?

The square root of 2 is irrational.

Who proved root 2 is an irrational number?

Euclid proved that √2 (the square root of 2) is an irrational number. The proof was by contradiction. In a proof by contradiction, the contrary is assumed to be true at the start of the proof.

How do we know square root 2 is irrational?

The square root of 2, or root 2, is represented using the square root symbol √ and written as √2; its value is approximately 1.414, and it is widely used in mathematics. Root 2 is an irrational number: it cannot be expressed as a fraction of two integers, and its decimal expansion neither terminates nor repeats, so the exact value of √2 cannot be written out in full.

Is log 2 rational or irrational?

In short, a rational number can be written as a fraction of two integers and an irrational number cannot. Short proof that log 2 is irrational: assume log 2 is rational, that is, log 2 = p/q, where p and q are positive integers. Since log 1 = 0 and log 10 = 1, we have 0 < log 2 < 1, and therefore p < q. Raising 10 to both sides of log 2 = p/q gives 2 = 10^(p/q), so 2^q = 10^p. But 10^p ends in the digit 0 while a power of 2 never does, a contradiction. Hence log 2 is irrational.
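The contradiction hinges on last digits: every positive power of 10 ends in 0, while a power of 2 never does. A quick Python check of the last-digit cycle (an illustration, not a substitute for the proof):

```python
# Last digit of 2**q cycles through 2, 4, 8, 6 and is never 0,
# while 10**p always ends in 0, so 2**q == 10**p is impossible.
cycle = {pow(2, q, 10) for q in range(1, 101)}
print(sorted(cycle))  # [2, 4, 6, 8]
```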
