Percentage error

Agent Smith

So for a measuring instrument like a ruler or scale, if the actual length is A and the measured length is M, then as I understand it the percentage error is [imath]\frac{|A - M|}{A} \times 100[/imath]. Should I have that absolute value function in the formula?
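(As a quick check with made-up numbers: if [imath]A = 50[/imath] and [imath]M = 49[/imath], the formula gives [imath]\frac{|50 - 49|}{50} \times 100 = 2\%[/imath], and the absolute value keeps the result positive whether M overshoots or undershoots A.)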

Question: A micrometer measures lengths to the nearest micron (millionth of a meter). It's used to measure ...
a) An eyelash, typically 100 microns in width. What is the percent error? If the micrometer says that the length is 100 microns, it could be somewhere between 99.5 microns and 100.5 microns. So the percent error = [imath]\frac{|99.5 - 100|}{99.5} \times 100 \approx 0.50\%[/imath]

b) A red blood cell, typically 8 microns. If the micrometer reads 8 microns, then the RBC (red blood cell) could be anywhere between 7.5 microns and 8.5 microns. Percent error = [imath]\frac{|7.5 - 8|}{7.5} \times 100 \approx 6.67\%[/imath]

c) A Hydrogen atom, typically 1 picometer (1 trillionth of a meter). 1,000,000 picometers = 1 micron, i.e. 1 picometer = 0.000001 microns
A micrometer will give a reading of 0 microns for 0.000001 microns (the nearest micron). Percent error = [imath]\frac{|0.000001 - 0|}{0.000001} \times 100 = 100\%[/imath]
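(Unit check: 1 meter = [imath]10^6[/imath] microns = [imath]10^{12}[/imath] picometers, so 1 picometer = [imath]10^{-6}[/imath] microns = 0.000001 microns.)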

Is this correct?
 
@Agent Smith -- I went through with corrections of definitions, and I left the calculations for you.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

If the expected (actual) value is E and the measured value is M, then the
percentage error is \(\displaystyle \dfrac{ |M - E| }{E} \times 100\%\).
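For example, with an expected value of E = 100 microns and a hypothetical reading of M = 95 microns, this gives \(\displaystyle \dfrac{ |95 - 100| }{100} \times 100\% = 5\%.\)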

Question: A micrometer measures lengths (widths) to the nearest micron (millionth of a meter). It's used to measure ...

a) An eyelash, typically 100 microns in width. What is the percent error? If the micrometer says that the width
is 100 microns, it could be somewhere between 99.5 microns and 100.5 microns.
So the percent error = \(\displaystyle \dfrac{ |99.5 - 100| }{100} \) ×100% \(\displaystyle \ = \ \) ? %


b) A red blood cell, typically 8 microns. If the micrometer reads 8 microns, then the RBC (red blood cell) could
be anywhere between 7.5 microns and 8.5 microns.
Percent error = \(\displaystyle \dfrac{ |7.5 − 8| }{8} \) ×100% = ? %


c) A Hydrogen atom, typically 1 picometer (1 trillionth of a meter). 1,000,000 picometers = 1 micron, i.e. 1 picometer = 0.000001 micron

A micrometer will give a reading of 0 microns for 0.000001 micron (the nearest micron).
Percent error = \(\displaystyle \dfrac{ |0 - 0.000001| }{0.000001} \) ×100% = ? %
 
I dunno, but the measured length has to be a whole number of microns (the micrometer reads to the nearest micron). So if it reads 100 microns, the actual/expected length is [imath]100 \pm 0.5[/imath]. That would mean (?) the maximum percentage error = [imath]\frac{|99.5 - 100|}{99.5} \times 100[/imath]
 

I left for several hours, and I am looking at this post of yours late at night.
The definition I gave in post # 2 was looked up, so it was not something
just off the top of my head.

The 100 microns is the expected (typical) length. The 95 microns or
105 microns represents a reading. Let's use the reading of 95 microns.

How does the reading of 95 microns compare with the expected (typical) length of 100 microns? You need the absolute value of their difference divided by the expected length, not the reading, because you want to know how far off the reading is from the expected length as a percentage error.

Also, you are coming up with a percent. You are taking a decimal number
and converting it into an equivalent form with a percent symbol: you multiply
by 100%, not by 100. When you multiply by 100% (you must include the percent symbol), you are multiplying by 1.
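For instance, \(\displaystyle 0.0667 × 100\% = 6.67\%\); since 100% = 1, the value is unchanged, whereas writing \(\displaystyle 0.0667 × 100 = 6.67\) changes the value.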
 
I'm probably wrong about this, but if I see a micrometer reading of 95 microns, I'd know the true length would be [imath]95 \pm 0.5[/imath]. The expected/true length is between 94.5 and 95.5, no? That would give me a percentage error, I believe, of [imath]\frac{|94.5 - 95|}{94.5} \times 100 \approx 0.529\%[/imath]. Going the other way, [imath]\frac{|95.5 - 95|}{95.5} \times 100 \approx 0.524\%[/imath]
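To sketch the same reasoning in general: for a reading R on an instrument that reads to the nearest unit, the worst case is [imath]\frac{0.5}{R - 0.5} \times 100[/imath], which for R = 95 is [imath]\frac{0.5}{94.5} \times 100 \approx 0.529\%[/imath].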
 
Every source I can find defines percentage error as @lookagain did, relative to the actual value.

What you are talking about is a valid thing to calculate, but it is not called percentage error! You seem (at least in the OP) to be thinking not about the error (which requires knowing the actual value), but about the precision of the instrument (which may be entirely different from its accuracy). Actually, you are thinking more about tolerance than precision (which deals with repeatability). Or there may be an even better term.

Here is a good discussion of these various concepts:

Do you have a source for what you initially asked about?
 
I think I got it

[imath]\text{Relative Error} = \frac{\text{Absolute Error}}{\text{Known Measurement/Measured Value}}[/imath]
[imath]\text{Percent Error} = \text{Relative Error} \times 100[/imath]

Then ...
For eyelash [imath]\text{Percent Error} = \frac{0.5}{100} \times 100 = 0.5\%[/imath]
For RBC [imath]\text{Percent Error} = \frac{0.5}{8} \times 100 = 6.25\%[/imath]
For H atom [imath]\text{Percent Error} = \frac{0.5}{0.000001} \times 100 = 1000000000\%[/imath] 🤔
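(Writing the eyelash case out in full as a unit check: relative error = [imath]\frac{0.5}{100} = 0.005[/imath], and percent error = [imath]0.005 \times 100\% = 0.5\%[/imath].)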

Correct?

Also ...
[Screenshot of the article, including a section titled "Combining the formulas"]

I don't understand the "Combining the formulas" bit.
 
I don't understand the "Combining the formulas" bit.

I wrote what follows after the last thing I wrote, and chose not to send it ... yet. As I expected, you caught the same issue:


I went back and read that last link carefully, and saw that toward the bottom it shows two different ways to define relative error:

[Image: excerpt from the linked page showing two definitions of relative error: one dividing by the known (actual) value, the other dividing by the measured value when no actual value is known]
So there is at least one place that gives a definition similar to your version. But you still defined it incorrectly, since you talked about an actual value A, which is not presumed to be known in the right-hand definition. (I think the last line here is a little confusing, as in that case the only "known value" is the measurement.)
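For instance, with the eyelash numbers the two versions nearly agree: dividing the half-unit 0.5 by the measured 100 gives 0.5%, while dividing by the lower bound 99.5 gives about 0.503%.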


So, no, I don't like that last line, and have no desire to explain it.
Question: A micrometer measures lengths to the nearest micron (millionth of a meter). It's used to measure ...
a) An eyelash, typically 100 microns in width. What is the percent error? ...

b) A red blood cell, typically 8 microns. ...

c) A Hydrogen atom, typically 1 picometer ...
I'll accept for the moment that we should take this as a situation where the accepted or expected value is not known -- though the wording you give suggests there is an expected value.
Check that last calculation (the one for the hydrogen atom), which is obviously wrong.

On the other hand, the micrometer will not be able to read 0.000001 microns, so you are not calculating the right thing.

Now, I'd still like to see the source of the question, to be sure of the context, since we see that the correct definition depends on context (and my source is not the most sophisticated, but is the only one I found that says this).
 
@Dr.Peterson, regarding the last answer, which you said is wrong, I used the given formula, plug and chug is all I did:
[imath]\text{Percent Error} = \frac{\text{Absolute Error}}{\text{Known Measurement}} \times 100[/imath].

The article says that [imath]\text{Absolute Error} = 0.5[/imath] for a measurement given as [imath]x \pm 0.5[/imath]. In the case of the micrometer, it measures to the nearest micron/micrometer, and we're supposed to take half of [imath]1[/imath] micron as the maximum error with this instrument = [imath]0.5[/imath] microns. The known measurement of a Hydrogen atom is [imath]1[/imath] picometer = [imath]0.000001[/imath] microns. The percent error should be [imath]50{,}000{,}000\%[/imath]
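(Spelling out the arithmetic: [imath]0.5 \div 0.000001 = 500{,}000[/imath], and [imath]500{,}000 \times 100 = 50{,}000{,}000[/imath], i.e. [imath]50{,}000{,}000\%[/imath].)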


The question is from GeoGebra (I can't find it now and I didn't save a link). Here's a screenshot:
[Screenshot of the GeoGebra question]
 
For the hydrogen atom, I have to add a correction. It's 100 picometers, not 1 picometer.

So percent error = [imath]\frac{0.5}{0.0001} \times 100 = 500000\%[/imath]

A better answer can be had from using the other formula

Percent error = [imath]\frac{Measured - Expected}{Expected} \times 100 = \frac{0 - 0.0001}{0.0001} \times 100 = 100\%[/imath], ignoring the negative sign
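(Checking the arithmetic for both: [imath]\frac{0.5}{0.0001} = 5000[/imath] and [imath]5000 \times 100 = 500000[/imath], while [imath]\frac{|0 - 0.0001|}{0.0001} = 1[/imath] and [imath]1 \times 100 = 100[/imath], so the two formulas give 500000 percent and 100 percent respectively.)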
 
Percent error = [imath]\frac{Measured - Expected}{Expected} \times 100 \% = [/imath]

Do you see what I wrote at the end in the quote box? It's a percent symbol.
You need to write \(\displaystyle \ × 100 \%, \ \) not only \(\displaystyle \ × 100 \) after the fraction.
Everywhere in your posts where you wrote \(\displaystyle \ × 100 \ \) without a percent
sign immediately following it, that is incorrect.
 
I am not the person who is grading you.
Sorry if that was rude. I was just wondering how you'd penalize me for this error. I believe losing even half a mark on the SAT can be life-altering.

Why did you insist on me correcting that mistake? For my mathematical well-being, I suppose. Thanks.

Are the rest of my posts ok? It's too late for me to edit my posts. I should've acted earlier.
🙂
 