Proof by Contradiction

Tarmac27

Hello.

Just need some help with this proof by contradiction.

Thanks

(a) Let a, b, c, d ∈ Z. Prove that if a + b√2 = c + d√2, then a = c and b = d.
(b) Hence, find c, d ∈ Z if √(3 + 2√2) = c + d√2.
 
In order to learn, it is necessary that you do some thinking yourself. That's why we ask you to show us some work, or at least some thought, so we can see what help you need:

So start by assuming the conclusion is false, and show us what happens. (Are you allowed to use the fact that √2 is irrational?)
 
The problem with helping on proofs is that we do not know what theorems you can use. It is utterly impossible unless we can see what you have tried.

Here is how I might start.

[MATH]\text {ASSUME, for purposes of contradiction, } a \ne c \land b \ne d.[/MATH]
[MATH]\therefore \exists \, p, q \text { such that } p \ne 0, \ q \ne 0, \ a + p = c, \text { and } b + q = d.[/MATH]
Now what?
 
Thank you.

I was actually working on this for about an hour and got stuck. I didn't just instantly come here for the answer, but I understand it may have looked like that.

Anyway, writing the equation out again you get:

a+b√2 = (a+p) + (b+q)√2

Then rearrange:

a+b√2 - (b+q)√2 = (a+p)

irrational - irrational = integer

Is this enough to prove it, since a + b√2 will not be the same irrational number as (b+q)√2? It doesn't say you can't use the fact that √2 is irrational.

Is there another way to prove this without assuming √2 is irrational?

Thank you for the response, expressing the equation like this made a lot more sense.
 
The problem was not that you looked like you hadn't done any work, but that you gave us nothing to go by to help you.

Is this enough to prove it, since a + b√2 will not be the same irrational number as (b+q)√2? It doesn't say you can't use the fact that √2 is irrational.

Is there another way to prove this without assuming √2 is irrational?
No, the difference of two irrational numbers can be an integer, so there is no contradiction. For example, (√2 + 1) - (√2) = 1. You have to use a proven impossibility.

I myself wouldn't use the p and q, but just suppose that a-c and b-d are nonzero. It amounts to the same thing.

What you might do with a+b√2 = (a+p) + (b+q)√2 is to "solve" for √2: that is, isolate it on one side. Then think about what you got.
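The counterexample above, (√2 + 1) − √2 = 1, is easy to check numerically. A minimal Python sketch (floating-point arithmetic is approximate, so it compares with a tolerance rather than exact equality):

```python
import math

# (√2 + 1) - √2 is an integer even though both terms are irrational.
diff = (math.sqrt(2) + 1) - math.sqrt(2)

# Compare with a tolerance, since float subtraction is inexact.
assert math.isclose(diff, 1.0)
```

So "irrational − irrational = integer" is not by itself a contradiction.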
 
Alright, thank you.

Start with:

a+b√2 = (a+p) + (b+q)√2

Then rearrange to make √2 the subject:

√2 = ((a+b√2) - (a+p)) / (b+q)

Is this the contradiction: an irrational number being expressed in the form of a fraction?

Will this equation always be false even though there is a √2 in the numerator? Is it that, no matter what the numbers are, an irrational number can never be expressed as a fraction of two integers?
 
√2 should be isolated. It's not isolated if you have it on both sides.
Try working with the original equality, without p and q.
To get a contradiction you need a fraction with integers only.
 
Ok thanks,

Start with:

a+b√2 = c+d√2

Then:

a - c = d√2 - b√2

Factor out √2 :

a - c = √2 (d - b)

Divide by (d - b):

(a - c) / (d - b) = √2

Where (d - b) cannot equal zero.

Since √2 is isolated and is being expressed as a fraction of integers only, is this the contradiction?
 
Yes. You've shown that √2 is equal to a rational number, which is impossible. That's the contradiction.
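The impossibility leaned on here, that √2 is not a ratio of integers, can be illustrated (not proved) by a brute-force search: √2 = p/q with integers would force p² = 2q², and no pair in a finite range satisfies that. A small Python sketch:

```python
# √2 = p/q with integers p, q (q != 0) would force p*p == 2*q*q.
# Search a small range for any such pair; the actual impossibility
# for all integers is what the irrationality proof establishes.
hits = [(p, q) for p in range(-100, 101)
               for q in range(1, 101)
               if p * p == 2 * q * q]
assert hits == []  # no integer ratio equals √2 in this range
```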
 
Ok thanks, it makes a lot more sense now.

There's actually a second part to this question which I think I have already solved, but I just want some input on whether it is correct or not.

Part Two:

Hence, find c, d ∈ Z if √(3 + 2 √ 2 )= c + d √ 2

My Solution:

Square both sides:

3 + 2√2 = c^2 + 2cd√2 + 2d^2

In order for this equation to be true, we need c^2 + 2d^2 = 3 and cd = 1, since, as just proven, the integer parts and the coefficients of √2 must each be equal.

Simultaneous Equations:

1. c^2 + 2d^2 = 3
2. cd = 1

2a. cd=1
c = 1/d

Sub 2a into 1

(1/d)^2 + 2d^2 = 3

Combine Fractions

(1 + 2d^4)/ d^2 = 3

(1 + 2d^4) = 3d^2

2d^4 - 3d^2 + 1 = 0

Factor:

(d+1)(d-1)(2d^2 - 1) = 0

2d^2 - 1 = 0 has no integer solutions

So we have:

(d+1) = 0 or (d-1) = 0

d = 1 or d = -1

Substituting back into equation 2 (cd = 1), we find that c = 1 when d = 1, and c = -1 when d = -1.

Therefore, c = ±1 and d = ±1.

Is this reasonable? The steps seem a bit excessive; is there a more elegant solution? Unfortunately, the textbook I'm working from does not have the answers for the chapter this question is from.
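As a quick numerical sanity check (not part of the proof, and using floating-point comparison via `math.isclose`), the candidate c = d = 1 and the integer roots of the quartic can both be tested:

```python
import math

# If c = d = 1 is correct, then √(3 + 2√2) should equal 1 + √2.
assert math.isclose(math.sqrt(3 + 2 * math.sqrt(2)), 1 + math.sqrt(2))

# The only integer roots of 2d^4 - 3d^2 + 1 = 0 in a small range are d = ±1.
integer_roots = [d for d in range(-10, 11) if 2 * d**4 - 3 * d**2 + 1 == 0]
assert integer_roots == [-1, 1]
```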
 
That's correct, and I think it's about as elegant as it can get. (Watch someone else improve it, now!)

There's one little issue, though. You did excellently in finding both roots; but it's only asking for one! "The" square root means the principal root, which is ...
 
I'm confused. Is the principal root only the positive root? The question is asking for integers, and -1 does work in the equation...
 
Yes, the radical symbol means only the non-negative root; √2 = 1.414..., not -1.414... . That is so that it is a function, with only one value.

Therefore, in the equation √( 3 + 2√2 ) = c + d√2, the left-hand side is positive, and can't be equal to -1 - √2. That's the other square root, the negative one.

They want c and d to be integers, implying that either could be negative; and if the answer had been -1 + √2, that would have been valid. But that doesn't change the definition of the radical.
 
@Tarmac27 you need to be careful if you square both sides of an equation because this usually introduces extra solutions.


For example, consider...

x=y , call this equation A

if you square both sides...

(x)^2=(y)^2, equation B

Notice that "x=-y" is now a solution to B, but it isn't a solution to A. Therefore, every time you use the "squaring both sides" technique, you must then plug your solutions back into equation A (or, in fact, any point before you did the squaring) to check which solutions are valid for the original equation.
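This plug-back check can be sketched in Python with a concrete instance of the x = y example, say x = 2:

```python
# Equation A: x = 2. Squaring both sides gives equation B: x**2 == 4.
# B picks up the extraneous solution x = -2.
solutions_B = [x for x in range(-10, 11) if x**2 == 4]
assert solutions_B == [-2, 2]

# Plug each candidate back into equation A to discard extraneous ones.
solutions_A = [x for x in solutions_B if x == 2]
assert solutions_A == [2]
```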
 
That's correct, and I think it's about as elegant as it can get.

I think this method is the correct one to use nowadays.

@Tarmac27 You don't need to learn this, but I believe that many years ago some shortcuts for "denesting radicals" were taught; this was later dropped from the syllabus. I guess it's seldom useful in later life. The thing to perhaps bear in mind is that two expressions containing roots can look very different but occasionally have identical values.
 
Thanks guys.

So I understand now that -1 is not a solution but just suppose that the equation was written like this:

+/- √(3+2√2) = c + d√2

Would -1 now be an acceptable answer since it is now asking for the positive or negative root?
 
Yes, c = ±1 and d = ±1 is acceptable (EDIT: in that circumstance). Note that this assumes the normal convention that the two "±" are linked, so that either (c = 1 and d = 1) or (c = -1 and d = -1).
 