A typical large chemical manufacturing plant may have hundreds of gas detectors and gas monitors that warn operators about lurking hazards from leaking chemicals and vapors. These detectors are installed during plant startup, or are added later after incidents or as the result of HAZOP or other safety studies. So far so good. But how do you, as the plant engineer, plant manager, or safety manager, ensure that they continue to work as intended?
Well, you calibrate them or just “bump test” them with a known gas mixture from a gas bottle. Fine, but after how many months or years?
How often should you check whether your installed gas monitors and gas detectors are working OK? In other words, what is the ideal calibration frequency for these instruments? This is one of the questions that many engineers and technicians ask us after they learn about gas monitors (from our excellent e-learning course on gas detectors and gas monitors).
Nobody seems to have a definitive answer.
Some experts suggest every year, some every six months, and others every quarter. So who is right? Some may feel that the more frequent the testing, the better. The catch, however, is that in most electrochemical gas detectors, every calibration (or even a bump test) depletes a small amount of the electrolyte. The more you test the detector, the shorter its useful life. It may so happen that on a particular test (say the fifth one on the same sensor since its installation), almost all of the electrolyte gets depleted. Because sufficient electrolyte was still present during the test, the detector passes with flying colors. BUT suppose there is a gas leak the next day, with the electrolyte now depleted: the instrument that was declared healthy only yesterday will fail in the actual emergency!
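To make this “pass today, fail tomorrow” trade-off concrete, here is a minimal Python sketch. The depletion-per-test and failure-threshold figures are invented purely for illustration; real numbers vary by sensor model and manufacturer.

```python
# A minimal sketch, not a real sensor model: the depletion-per-test and
# failure-threshold figures below are assumed for illustration only.

THRESHOLD = 0.20   # assumed minimum electrolyte fraction needed to detect gas
PER_TEST  = 0.18   # assumed electrolyte fraction consumed by each test

def simulate(interval_months):
    """Bump test at a fixed interval until the sensor can no longer work."""
    reserve, month = 1.0, 0
    while True:
        month += interval_months
        reserve -= PER_TEST            # the test itself consumes electrolyte
        if reserve < THRESHOLD:
            # The paradox from the text: enough electrolyte was present
            # DURING this test, so the sensor passed it, yet it can no
            # longer respond to a real leak tomorrow.
            return month

for interval in (3, 6, 12):
    print(f"Bump test every {interval:>2} months -> "
          f"sensor passes its last test at month {simulate(interval)}, "
          f"then cannot detect a real leak")
```

With these made-up figures, quarterly testing exhausts the sensor in fifteen months, while annual testing stretches the same electrolyte budget over five years. The point is not the specific numbers but that every test spends part of a finite reserve.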
Does this bother you? It should. Perhaps some manufacturer will start indicating the level of useful electrolyte left (there may be some models that already do this, though I am not sure), or offer some other kind of sensor diagnostic; but in the vast majority of these detectors, no such feature seems to be present.
What about the catalytic combustion type? Well, frequently exposing them to %LEL gas mixtures may damage the bead (repeated combustion taking place on the bead) and could render them ineffective in an actual gas leak.
The only types that seem to escape this “destruction by calibration” are the semiconductor and the optical/infrared types, which may be unaffected by frequent exposure to the gases they sense. Does that mean we should replace all of the electrochemical and catalytic combustion types with IR and semiconductor sensors? Not really practical, as those are much more expensive.
So do we use the “lightbulb replacement method” that used to be practiced in many factories in the 80s? (For those who have forgotten it, here is a refresher: some smart fellow had calculated that replacing all the light bulbs, even the working ones, every six months was cheaper than replacing only those bulbs that failed, once you added up the inventory carrying costs, the payroll costs of the “light-bulb changers”, and so on.)
Do we simply replace all the electrochemical sensors and catalytic combustion sensors every two years? Any thoughts on this issue?
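In that spirit, here is a back-of-the-envelope Python sketch of the lightbulb-style calculation applied to sensors. Every figure below (sensor count, unit cost, labour rates, failure rate) is a placeholder assumption; plug in your own plant’s numbers. Note also that it prices only money, not the safety exposure of a dead detector sitting unnoticed until the next test.

```python
# A back-of-the-envelope comparison in the spirit of the "lightbulb
# replacement method". Every figure is an assumed placeholder; substitute
# your own plant's costs and failure rates.

SENSORS          = 300     # installed electrochemical / catalytic sensors
SENSOR_COST      = 150.0   # assumed unit cost of a replacement sensor
PLANNED_LABOUR   = 20.0    # assumed labour per sensor in a planned campaign
UNPLANNED_LABOUR = 200.0   # assumed labour per ad-hoc callout (permits, travel)
ANNUAL_FAIL_RATE = 0.30    # assumed fraction of sensors failing per year

def blanket_replacement_per_year(cycle_years):
    """Annualised cost of replacing every sensor on a fixed cycle."""
    return SENSORS * (SENSOR_COST + PLANNED_LABOUR) / cycle_years

def replace_on_failure_per_year():
    """Annualised cost of replacing only the sensors that fail."""
    failures_per_year = SENSORS * ANNUAL_FAIL_RATE
    return failures_per_year * (SENSOR_COST + UNPLANNED_LABOUR)

print(f"Replace all every 2 years: ${blanket_replacement_per_year(2):,.0f}/yr")
print(f"Replace only on failure:   ${replace_on_failure_per_year():,.0f}/yr")
# Not priced here: the safety exposure while a failed detector sits
# undiscovered between bump tests -- often the deciding factor.
```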
Please use the comments form below.
(P.S. By the way, if you wish to know more about gas detection, or want a good training course on gas monitors, why don’t you download our excellent e-learning course?)