# Gage R&R - Uncertainty in Contamination

Discussion in 'Gage R&R and MSA - Measurement Systems Analysis' started by Richardjdrexler, Sep 28, 2020.

1. ### Richardjdrexler (New Member)

Folks,

Let me preface this with I am not six sigma certified.

I recently got a new job where we use a balance to measure the contamination on a tin. The general process is as follows:
• weigh tin
• add solvent w/ contaminant to tin
• evaporate solvent, leaving contaminant behind
• weigh tin again
• delta in weight is the weight of the contaminant
As of now, there is no accounting for uncertainty of the measurement in the calculation, which I thought was a little odd. We are just taking exactly what the scale shows for our calculation. Additionally, the balance's readings seem to fluctuate quite a bit, so I think we are overestimating its capability. I used to work with a Black Belt, and he and I performed a couple of Type 2 Gage R&R studies in the past to tighten up measurements. So I decided to perform a Gage R&R here.

I think my math is right, as I cross-checked it against different Gage R&R methods: Average & Range, ANOVA, and EMP. Additionally, I ran my numbers through QI Macros and got the same results. Lastly, my equipment variation matches the readability of the balance according to the manual. My questions revolve around interpreting my results.

Results:
• Equipment Variation - 0.00003 g (all methods)
• Appraiser Variation - 0.00001 g (all methods)
• Interaction Variation - 0.00001 g (ANOVA w/ Int.)
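For anyone wanting to reproduce this kind of breakdown, here is a minimal variance-components sketch of a crossed study (not the full AIAG Average & Range or ANOVA procedure, and the data below are synthetic, generated with an assumed 0.00003 g noise level rather than taken from my spreadsheet):

```python
import numpy as np

def grr_components(data):
    """Simplified Gage R&R variance components.

    data: array of shape (operators, parts, trials), weighings in grams.
    Returns (EV, AV): repeatability and reproducibility as standard
    deviations. A plain variance-components sketch, not the full
    AIAG Average & Range or ANOVA method.
    """
    data = np.asarray(data, dtype=float)
    # Repeatability (EV): pooled within-cell (same operator, same part) spread.
    ev = np.sqrt(data.var(axis=2, ddof=1).mean())
    # Reproducibility (AV): spread of the operator grand means. In a crossed
    # study every operator sees the same parts, so the part-to-part signal is
    # common to all operators and drops out of this spread.
    op_means = data.mean(axis=(1, 2))
    av = op_means.std(ddof=1)
    return ev, av

rng = np.random.default_rng(0)
# Synthetic study: 3 operators x 10 tins x 10 trials, noise sigma ~ 0.00003 g
tins = rng.normal(2.0, 0.0005, size=(1, 10, 1))      # true tin weights
noise = rng.normal(0.0, 0.00003, size=(3, 10, 10))   # measurement error
data = tins + noise
ev, av = grr_components(data)
print(f"EV ~ {ev:.6f} g, AV ~ {av:.6f} g")
```

With 300 synthetic points the recovered EV lands close to the 0.00003 g that was put in.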
Now for the questions:
• Is it proper to call the sum of the variations the uncertainty in the measurement; i.e. is a measurement of 2.00000 g actually 2.00000 +/- 0.00005 g?
• To get each variation we compute a standard deviation. If the variation can be considered the uncertainty, am I only accounting for about 68% of it (one standard deviation)? Should I double the variation, i.e. 2.00000 g is actually 2.00000 +/- 0.00010 g, to reach a higher confidence level of roughly 95% in accounting for uncertainty?
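On the first question, the usual GUM-style convention is worth noting: standard-deviation components combine in quadrature (root-sum-of-squares), not by simple addition, and a coverage factor k = 2 then gives roughly 95% coverage (k = 1 is the ~68% case). A small worked sketch using the component values quoted above:

```python
import math

# Standard-deviation components from the study (grams)
ev = 0.00003   # equipment variation (repeatability)
av = 0.00001   # appraiser variation (reproducibility)
iv = 0.00001   # operator x part interaction

# GUM-style combination: components add in quadrature, not linearly.
u_c = math.sqrt(ev**2 + av**2 + iv**2)

# Coverage factor k = 2 gives roughly 95 % coverage (k = 1 is ~68 %).
U = 2 * u_c
print(f"combined u_c = {u_c:.6f} g, expanded U (k=2) = {U:.6f} g")
```

Note the quadrature sum (~0.000033 g) is smaller than the straight sum of 0.00005 g in the first question.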
I have attached both my Excel spreadsheet and the PowerPoint presentation I gave my management. Thank you for your assistance!

Cheers.
Rich

2. ### Gejmet (Member)

Hi Richard,

Is there a specification for the weight of the contaminant left... as small as possible, I know!

I'm assuming that your study is effectively weighing 10 empty tins across 3 operators?

If so, your study shows no detectable difference between operators for test/retest or operator bias, so the repeatability, rounded to 5 decimal places, is 0.00003 g. The measurement has an effective demonstrable resolution (Probable Error) of 0.00002 g, so if your specification width is at least 5 or more times this PE you are going to have a high probability of "seeing it".
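For readers unfamiliar with the term: the Probable Error is the median measurement error, about 0.675 times the repeatability standard deviation (the quartile point of the normal distribution, per Wheeler's EMP framework). The numbers above follow directly:

```python
# Probable Error: the median measurement error, ~0.675 sigma (the
# quartile point of the normal distribution). With the study's
# repeatability of 0.00003 g this reproduces the 0.00002 g figure.
sigma_e = 0.00003            # repeatability standard deviation, grams
pe = 0.675 * sigma_e
print(f"PE = {pe:.6f} g")

# Rule of thumb quoted above: a specification width of at least ~5 PE
# is needed before the gauge can reliably "see" nonconformance.
min_spec_width = 5 * pe
```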

If you need uncertainty statements, then the MSA results are just a subset of the inputs; more work is required to justify the stated uncertainties.

For more information on the PE, please see Miner's MSA resource on here.

Hope this helps

3. ### Richardjdrexler (New Member)

Gejmet,

Thanks for the response! I am not sure I fully follow, but I will try to answer your questions and let you know how I interpreted what you wrote me.

1. There isn't a single specification for the weight of the contaminant left; there are defined levels, however. These levels are established in IEST-STD-CC1246. I have attached an image of the table to this post. Basically, to reach "X" cleanliness level I am allowed "Y" weight of contamination over "Z" surface area. Some parts are small, some large; some parts need to be very clean, others not so much. So, for instance, for a part with 0.1 m2 of surface, to achieve an A cleanliness level I need to come in at 0.00100 g or less of contaminant. Anything more and I would fail.
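The allowance appears to scale linearly with surface area. The helper below is a sketch of that scaling; the per-level values are inferred from the figures in this thread (A -> 1.0 mg, A/2 -> 0.5 mg, A/5 -> 0.2 mg, each per 0.1 m2), not quoted from IEST-STD-CC1246 itself:

```python
def allowed_contaminant_g(level_mg_per_0p1_m2, area_m2):
    """Allowed contaminant mass for a part, in grams.

    level_mg_per_0p1_m2: the cleanliness level's allowance in mg per
    0.1 m2 of surface. Example values are inferred from the numbers
    in this thread, not quoted from IEST-STD-CC1246 itself.
    """
    return level_mg_per_0p1_m2 * (area_m2 / 0.1) / 1000.0

print(round(allowed_contaminant_g(1.0, 0.1), 6))    # level A,   0.1 m2
print(round(allowed_contaminant_g(0.5, 0.01), 6))   # level A/2, 0.01 m2
print(round(allowed_contaminant_g(0.2, 0.05), 6))   # level A/5, 0.05 m2
```

These reproduce the 0.00100 g, 0.00005 g and 0.00010 g limits used in the examples below.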

2. I had three operators weigh ten empty tins ten times each, creating 300 points of data. So that is three operators, ten tins, ten measurements per tin.

Right now the assumption is that the balance is accurate down to 0.00001 g, since that is the number of digits on the display. I don't think that is correct.

My objective for doing this was to put bounds on the capability of the balance and process.

Imagine I have an arbitrary error of +/- 0.00003 g on any measurement I take with the balance. I take two measurements per calculation and subtract the pre-read weight from the post-read weight, so my total uncertainty would be +/- 0.00006 g. If that were true, I could say something along the lines of the statements below.
• I cannot prove that I have met an A/2 cleanliness level for a part with 0.01 m2 of surface area (0.00005 g max contaminant allowed), because what I am trying to measure is less than my measurement error (0.00006 g).
• I can meet an A/5 cleanliness level for a part with 0.05 m2 of surface area (0.00010 g max contaminant allowed), since the thing I am measuring is greater than my error. But my new max reading off the balance should be 0.00004 g rather than 0.00010 g to guarantee the part came in under the limit, because I have an error of 0.00006 g.
Additionally, my new limits would look like this (see attached).
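One side note on the +/- 0.00006 g figure: adding the two per-weighing errors linearly is the worst case. If the pre- and post-weighing errors are independent, they combine in quadrature, which is less pessimistic. A small sketch, assuming the 0.00003 g per-reading error from above:

```python
import math

u_single = 0.00003          # assumed error of one weighing, grams

# Worst case, as in the post: the two errors add linearly.
u_worst = 2 * u_single

# If the two weighings err independently, root-sum-of-squares applies:
# u(post - pre) = sqrt(u^2 + u^2) = sqrt(2) * u.
u_rss = math.sqrt(2) * u_single
print(f"worst case {u_worst:.6f} g, RSS {u_rss:.6f} g")
```

So the difference measurement carries roughly 0.000042 g of uncertainty under the independence assumption, versus 0.00006 g worst case.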

I hope what I am saying makes sense. Thanks again for your help.

Cheers.
Rich

4. ### Gejmet (Member)

Hi Rich,

Thanks for the clarifications.

I understand that you want to use an uncertainty interval so you can state some confidence in the measured output. Of course, you know the repeatability is 0.00003 g, and you could use this as an input into the uncertainty interval as you wish. You haven't mentioned bias yet, which becomes more critical as you near some of your tighter defined levels.

In order to establish bias you would need to weigh something whose weight you know and are confident of. If you took multiple readings and established a mean, comparing it to the reference weight would tell you whether you have any bias; if you did, you could always allow for it in your measurements.
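That bias check is a one-liner in practice. A sketch with hypothetical readings of a 2.00000 g reference weight (the numbers are made up for illustration):

```python
import statistics

def estimate_bias(readings, reference_g):
    """Bias of the balance: mean of repeated readings of a known
    reference weight, minus the reference value. A positive result
    means the balance reads high; it can be subtracted from routine
    measurements as a correction."""
    return statistics.mean(readings) - reference_g

# Hypothetical readings of a 2.00000 g reference weight:
readings = [2.00002, 2.00001, 2.00003, 2.00002, 2.00002]
bias = estimate_bias(readings, 2.00000)
print(f"bias = {bias:+.6f} g")
```

Here the balance would read about 0.00002 g high, which matters exactly at the tight levels discussed above.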

You mentioned that your scale is "accurate" in that it can read to 0.00001 g; that property is resolution, not accuracy. When I mentioned a PE of 0.00002 g, that is the demonstrable resolution of your scale based on your study.

Personally, my opinion is that uncertainty intervals/statements, although used for good reason, tend to over-emphasize the effect of measurement error on a specification, and there are many reasons for this. In a production environment you have to make pragmatic decisions about the result of a measurement, and saying that you doubt any result lying somewhere within that interval leads you either to look for other ways of measuring or to concessionary statements.

If you repeat a measurement multiple times, it can have the effect of lowering the PE. For example, your current PE is 0.00002 g; if you took 3 measurements, averaged them, and reported that value, it would lower the PE by the square root of 3, i.e. 0.00002 / 1.732 ≈ 0.00001 g.
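The sqrt(n) scaling above follows from the standard error of a mean of n independent readings, and the PE shrinks in proportion. A quick sketch:

```python
import math

pe_single = 0.00002            # PE of one reading, grams

def pe_of_mean(pe, n):
    """PE of the average of n independent readings: the standard
    error of the mean shrinks by sqrt(n), and PE scales with it."""
    return pe / math.sqrt(n)

for n in (1, 3, 9):
    print(n, round(pe_of_mean(pe_single, n), 6))
```

Averaging 3 readings takes the PE from 0.00002 g to about 0.00001 g, as stated above; 9 readings would be needed to cut it by a further factor of sqrt(3).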

Of course time is money, but this may be preferable to having to invest in something else or losing business.

This answer is already getting too long, but the other way of approaching this is to guard-band your specification, allowing for measurement error (multiples of PE). That way you can pick your own confidence level and state with confidence that your measurement result is within your desired limits.
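Guard banding can be sketched in a few lines. Assuming the 0.00002 g PE from the study and, for illustration, the A/5 limit of 0.00010 g quoted earlier in the thread:

```python
def guard_banded_limit(spec_limit_g, pe_g, k):
    """Tightened (guard-banded) upper limit: accept only results at
    least k probable errors inside the specification. Larger k buys
    more confidence at the cost of rejecting more borderline-good parts."""
    return spec_limit_g - k * pe_g

spec = 0.00010          # A/5 example limit from earlier in the thread
pe = 0.00002
for k in (1, 2, 3):
    print(k, round(guard_banded_limit(spec, pe, k), 6))
```

Note that k = 3 gives an acceptance limit of 0.00004 g, which happens to match the tightened limit Rich proposed above from his worst-case error budget.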