Over many years of manufacturing insulation coatings, Mascoat has searched for an infrared (IR) measurement device that reports surface temperature accurately. We have used instrumentation ranging from inexpensive Raytek™-style guns to very expensive IR cameras, and we have consulted with FLIR™ and other manufacturers, but none could explain why these devices consistently read higher temperatures than "touch or feel" results suggested. Nearly all vendors of IR instruments confirmed that our coatings produced very odd results, showing temperatures much higher than those obtained with a standard thermal probe. Most offered to research the phenomenon, but none came back to Mascoat with an explanation. At a recent trade show, Palmer / Wahl Instruments was posed with this challenge, kindly took it on, and provided Michael Stelmach (a Mascoat employee) with an explanation of why they believe IR-style instrumentation struggles with our coatings. The description below is copied directly from their email:
I've done some testing of your product with regard to demonstrating its thermal insulating properties with a thermal imaging camera. I used hot plates to heat the back of an aluminum plate. I took the sample you provided, cut out a smaller piece, and removed the coating from half of it. This piece was tested on a 200°F and a 400°F hotplate.
What I found was that when measuring the piece with both a thermal imaging camera (HSI3000) and a single-point infrared thermometer (DHS115), the coated section measured about 185°F on the 200°F hotplate and 380°F on the 400°F hotplate.
I then used an RTD meter and articulated surface probe (p/n's 392F and 121) to measure the temperature of the uncoated and coated sides of the plate. The uncoated side measured approximately 200°F and 400°F, as expected. The coated side measured approximately 145°F and 270°F, which differed greatly from the IR measurements. I know the surface measurement to be accurate because I could easily hold my finger on the 145°F side for a long time without being burned.

Why? What I think is happening is that as your material gets hotter, it transmits IR energy in the 8-14 micron region, which is the only wavelength band the thermal imager or IR thermometer sees. The emissivity of the imagers was set to 0.95; if the emissivity setting of the camera were lowered, the readings would be even higher, resulting in readings above the true temperature of the hotplate. Your Technical Data lists Transmittance at 0.0. What conditions was this measured under? I suspect that at 200°F, the % transmittance in the 8-14 micron region is not 0. I think, since your material has a lot of trapped air in it, some of the IR "bounces" its way through, resulting in the erroneous reading.

There is a concern that if a customer has piping or ovens coated with this material and measures its temperature with most IR thermometers or thermal imaging cameras on the market, they will likely get an incorrect reading that errs in the direction of making your product look as if it is not insulating as well as it actually is. I can't think of anything other than transmission in the 8-14 micron region as the cause.

As far as demonstrating your material, while thermal imaging would be a neat way to do it, I don't recommend it. I suggest a hot plate with one half coated with this material, demonstrated with a handheld meter and a surface "touch" probe. The meter and probe help quantify the demonstration and put "real numbers" on it. It would also cost less than a thermal imaging camera.
James S. Eldridge
Product Development Manager
Palmer / Wahl Instruments, Inc.
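The mechanism described above can be sketched numerically with a simple graybody radiation model: the instrument sees both the radiation emitted by the coating's surface and a fraction transmitted from the hot plate behind it, then converts the total back to a temperature using its emissivity setting. The code below is a hypothetical illustration only; the emissivity and transmittance values are invented for demonstration (they are not Mascoat or Palmer / Wahl data), and reflected ambient radiation is ignored.

```python
# Illustrative graybody model of an IR thermometer reading a partially
# IR-transmissive coating. All material values here are assumptions.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def f_to_k(f):
    return (f - 32.0) * 5.0 / 9.0 + 273.15

def k_to_f(k):
    return (k - 273.15) * 9.0 / 5.0 + 32.0

def apparent_temp_f(surface_f, source_f, emissivity, transmittance,
                    emissivity_setting=0.95):
    """Temperature an IR thermometer would report when the coating emits
    with `emissivity` at its true surface temperature `surface_f` and also
    passes a fraction `transmittance` of the radiation from the hotter
    source behind it (`source_f`)."""
    t_surf = f_to_k(surface_f)
    t_src = f_to_k(source_f)
    # Radiance reaching the detector: emitted plus transmitted components.
    radiance = SIGMA * (emissivity * t_surf**4 + transmittance * t_src**4)
    # The instrument assumes all radiance was emitted by a surface with
    # its emissivity setting, and inverts Stefan-Boltzmann accordingly.
    t_apparent = (radiance / (emissivity_setting * SIGMA)) ** 0.25
    return k_to_f(t_apparent)

# Example with invented values: a true 145 F coated surface over a 200 F
# hotplate. Even modest transmission inflates the IR reading well above
# the probe-measured surface temperature.
print(round(apparent_temp_f(145, 200, emissivity=0.75, transmittance=0.25), 1))
```

With zero transmittance and a matching emissivity setting, the model returns the true surface temperature; adding a transmitted component pushes the reading toward the hotplate temperature, which is consistent with the IR readings in the letter landing between the probe-measured surface temperature and the hotplate setpoint.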