Hello,
Apologies if this is not in the right forum. Asking so I, as a paraeducator, can make sure I'm explaining correctly to my students (7th graders, language learners), as the math classes I service are very large with a teacher new to this content area.
If solving a square root results in a decimal, is that always irrational, even if it's a short and manageable decimal? I've been trying to explain to my students that an irrational number has any of the following 3 qualities:
1. Is, or contains, the number pi
2. Is a long, continuing decimal that does not repeat or have a pattern
3. Is a non-perfect square root (the example I give to explain a "non-perfect square root" is that the square root of 100 is 10, an easy, simple number -- so that is a perfect square, but the square root of 200 gives us 14.142..., which gets long and crazy)
I then emphasize that these are "irrational" numbers because they're big and long, and without a pattern/repetition we don't know where they'll end. I know that's not exactly the definition, but at this grade level it works well for what we're using it for. (If this is still incorrect, please let me know! Honestly, I also learned it this way and only have a loose grasp of the "without a ratio" definition that I've seen.)
However, the teacher today in the lesson mentioned that a square root that has any decimal would be irrational. I admittedly can't think of any square root they would be solving that would have a simple decimal, like x.25, but I was hoping to get some clarity on this. Is a decimal in a solved square root, even if it is simple, always irrational, or is it only irrational if it is a continuing decimal without a pattern / without repetition?
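For what it's worth, one concrete case I found while thinking about this: the square root of 6.25 is exactly 2.5, a short terminating decimal (since 6.25 = 25/4 and the square root of 25/4 is 5/2). Here's a small Python sketch (the function name is just my own, for illustration) that checks whether a fraction has an exact rational square root -- the idea being that a fraction in lowest terms has a rational square root exactly when its numerator and denominator are both perfect squares:

```python
from fractions import Fraction
from math import isqrt

def rational_sqrt(x: Fraction):
    """Return the exact rational square root of x if one exists, else None.

    Fraction keeps x in lowest terms, so sqrt(x) is rational exactly when
    both the numerator and the denominator are perfect squares.
    """
    num, den = x.numerator, x.denominator
    rn, rd = isqrt(num), isqrt(den)  # integer square roots (floor)
    if rn * rn == num and rd * rd == den:
        return Fraction(rn, rd)
    return None

print(rational_sqrt(Fraction(25, 4)))  # 6.25 -> 5/2 (i.e., 2.5), rational
print(rational_sqrt(Fraction(2)))      # sqrt(2) -> None, irrational
print(rational_sqrt(Fraction(200)))    # sqrt(200) -> None, irrational
```

So a solved square root with a simple decimal like 2.5 is rational; it's only the never-terminating, never-repeating decimals (like the square root of 200) that are irrational.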
Thanks! I wish a resource like this would have existed (or, would have been easier to find) when I was in school.