Industrial gases are exceptionally important both within the industry and beyond. The safe control and regulation of these gases is a critical task, and it is gas detectors and analysers that are chiefly responsible for it, as explored by gasworld.
In the important and ever-growing industry of industrial gases, there can be a number of potentially explosive pitfalls or hazardous environments and situations.
Complex gas mixtures, combustible gases and highly toxic gases are all par for the course within the industry. Such hazardous atmospheres and environments need accurate assessment and control, making the role of detection and analysis immensely important. So what is an analyser, and just how important is it?
While detectors, sensors and analysers are intrinsically linked, these devices are actually fundamentally different.
Gas sensors interact with a gas to initiate the measurement of its concentration and then provide output to a gas instrument to display the measurements. Sensors can output a measurement in a number of ways, including percent lower explosive limit (LEL), percent volume, trace, leakage or density.
An important specification for the detection and measurement of gases is the response time – the time required from initial contact with the gas to the sensor’s processing of the signal. Other critical factors are distance and flow rate: distance is the maximum range from which the sensor can detect a leak or gas source, and flow rate is the rate of air or gas flow across the sensor needed to produce a signal.
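To illustrate the percent LEL output mentioned above, a concentration reading can be expressed as a fraction of a gas’s lower explosive limit. The sketch below is a hypothetical example, not an instrument specification; the LEL figures used are approximate illustrative values.

```python
# Approximate lower explosive limits in %volume, for illustration only.
LEL_PERCENT_VOL = {"methane": 5.0, "hydrogen": 4.0, "propane": 2.1}

def percent_lel(gas: str, concentration_ppm: float) -> float:
    """Express a ppm reading as a percentage of the lower explosive limit."""
    lel_ppm = LEL_PERCENT_VOL[gas] * 10_000  # 1 %vol = 10,000 ppm
    return 100.0 * concentration_ppm / lel_ppm

# 12,500 ppm of methane is a quarter of its ~5 %vol LEL
print(percent_lel("methane", 12_500))  # -> 25.0
```

A reading of 100 %LEL would mean the atmosphere has reached the concentration at which ignition becomes possible, which is why safety alarms are typically set at a small fraction of the LEL.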
A sensor is the basic element or component within a detection product and differs for each gas or application. So while a sensor may be seen as a basic technology in this area, an analyser often provides a more complete package.
Analysers provide a comprehensive reading, measurement or analysis of what is happening in an atmosphere at any given moment, detailing specific concentrations of gases, levels of moisture, imbalances, emissions and a host of other such data. This kind of product and analysis naturally has a high degree of functionality within the gases industry and throughout an entire range of associated industries, such as the medical and healthcare sectors.
The distinction between detection and analysis is underlined by Lisa Bergson, president and CEO of Tiger Optics LLC, as she says, “Detectors give an indication, and it’s more of a ‘go/no go’ device. An example would be a toxic release in the workplace, where you don’t need parts per trillion of the gas, you only need to know if it’s time to evacuate. An analyser, as the word implies, is a more thorough and complete quantitative analysis of the target species or group of species. It’s not that one is more complete than the other; it really depends on the application.”
Tiger Optics is a pioneer in 21st century spectroscopy, developing cutting-edge gas analysers that offer absolute accuracy and high sensitivity for a wide range of demanding applications. Six years ago the company introduced the world’s first commercial Continuous Wave-Cavity Ring Down Spectroscopy (CW-CRDS) analyser, and this summer it released the HALO+ mini Cavity Ring-Down Spectroscopy (CRDS) analyser, capable of measuring at parts-per-trillion (ppt) levels.
“Gas detection and analysis is vital to the safe controlling of a potentially hazardous environment. When dealing with releases at extremely low levels, like parts per trillion, an analyser is preferred, if not required. Also, when there could be a problem of mistakes or false alarms, an analyser is also preferable for a more reliable alarm,” Bergson added.
Analysers come in a variety of types and specifications for a diverse range of gases or applications. A Residual Gas Analyser (RGA) for example, is a small and usually rugged mass spectrometer, typically designed for process control and contamination monitoring in the semiconductor industry.
Residual gas analysers
RGAs may be found in high vacuum applications such as research chambers and science set-ups, accelerators and scanning microscopes, and are most often used to monitor minute traces of impurities in a low-pressure gas environment. Such impurities can be measured down to levels as low as 10⁻¹⁴ Torr (see boxout), with sub-ppm (parts-per-million) detectability.
RGAs come in two implementations, using either an open ion source (OIS) or a closed ion source (CIS), with the OIS thought to be the most widely available type.
For applications requiring measurement of pressures between 10⁻⁴ and 10⁻³ Torr, the problem of interference from ambient and process gases can be reduced by using the other type of RGA – the CIS.
Evidently, the presence of gas analyser equipment is of great importance within the industrial gases community and beyond. Even blood-gas component analysing in the medical gases industry is a field of significance for analytical devices.
Blood gas measurements are used to evaluate the oxygenation and acid-base status of the blood and, consequently, of the body. Many acute and chronic conditions can cause an imbalance in the blood and, while a blood gas test may not point to the direct cause of the problem, it can determine whether the issue is respiratory or metabolic.
Once a syringe sample is taken from an accessible artery, it is taken to a blood gas analyser, where the machine aspirates the blood from the syringe and measures the pH and the partial pressures of oxygen and CO2. Generally, an arterial blood gas test measures the partial pressure of oxygen (PaO2); the partial pressure of CO2 (PaCO2); the pH of the blood; bicarbonate (HCO3); and the oxygen content (O2CT) and oxygen saturation (O2Sat) values. Such analysis can therefore carry a high level of responsibility in the diagnosis of conditions needing urgent medical attention.
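The respiratory-versus-metabolic distinction described above can be sketched with a simplified textbook rule. This is purely an illustration with assumed reference ranges, not a clinical tool; real interpretation must account for compensation and the full clinical picture.

```python
def acid_base_status(ph: float, paco2_mmhg: float, hco3_meq_l: float) -> str:
    """Simplified textbook classification of an arterial blood gas result.

    Assumed reference ranges: pH 7.35-7.45, PaCO2 35-45 mmHg, HCO3 22-26 mEq/L.
    Illustration only -- not for clinical use.
    """
    if ph < 7.35:  # acidaemia: is CO2 retention or low bicarbonate the driver?
        return "respiratory acidosis" if paco2_mmhg > 45 else "metabolic acidosis"
    if ph > 7.45:  # alkalaemia: is CO2 loss or high bicarbonate the driver?
        return "respiratory alkalosis" if paco2_mmhg < 35 else "metabolic alkalosis"
    return "pH within normal range"

print(acid_base_status(7.30, 55, 24))  # -> respiratory acidosis
```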
A method of analysis or identification used prominently in the industry is gas chromatography (GC). GC is a type of analysis used for complex samples in which an identification is made via gas chromatographic separation using a GC device. A gas chromatograph is an instrument for chemical analysis, comprising a number of components and involving an accurate process of identification.
The different chemical constituents of a sample pass in a gas stream through a narrow flow-through tube called the column, at different rates depending on their various chemical and physical properties. The gas stream is known as the mobile phase, for which a carrier gas such as helium or nitrogen is used; through interaction with the stationary phase inside the column, the different sample components are separated and each exits the column at a different time to be detected and electronically identified. A detector monitors the outlet stream from the column, and a number of different detector types are found in gas chromatography.
A different and more qualitative method of gas analysis is that of mass spectrometry (MS), an analytical technique to measure the mass-to-charge ratio of ions.
MS works by generating a mass spectrum representing the masses of sample components and finding the composition of a physical sample. The technique, which is conducted using a mass spectrometer device, has several applications and can determine the isotopic composition of elements, the structure of a compound, the amount of a compound and determine a number of other important properties.
This, just as with the results and analysis of gas chromatography, is obviously of great significance when studying the composition of a given atmosphere and safeguarding against potentially harmful or extremely hazardous situations.
Moisture analysis covers a wide range of methods for measuring the moisture content of liquids and gases, with measurement applications including hydrocarbon processing, pure semiconductor gases and bulk pure gases. Due to the nature of this process, in which a sensor is exposed to the elements of a particular environment, the sensor is susceptible to damage in potentially corrosive atmospheres.
This problem makes conventional moisture sensors unreliable and expensive, but it is now being solved through the introduction of laser-based moisture analysers – technology which is commercially available from SpectraSensors Inc. Rapid and dependable, laser-based moisture analysers utilise an isolated sample cell and therefore avoid exposure to harsh chemicals and corrosive damage.
Al Kania, representative of SpectraSensors, explains the problems faced by conventional analysers, “This continual cycle of high maintenance costs for conventional moisture sensor probes is primarily due to the constant exposure to damaging chemicals in the stream they are sampling. Since moisture affects the life of the very costly catalysts used in producing aromatics, refineries have been forced to live with outmoded moisture analysis technology that is both unreliable and expensive.”
Kania affirms the role of laser-based moisture analysers in the future of analysis, “This new technology is completely unique because it does not expose the laser, which measures moisture, to the harsh chemicals in the stream. Instead, this SpectraSensors analyser is designed to extract samples out of the process into an isolated sample cell, where it is analysed.”
“The problem of critical response time does not occur with the laser-based sensors, which provide instantaneous results,” Kania added.
It seems that the main points for concern and improvement within the gas detection and analysis sector are precision, functionality, automated operation and expense. Just as they are at present, these factors are likely to be those most developed and addressed in the future as the gas analysers market progresses and further etches its significance in a wide variety of applications and walks of life.
The choice of carrier gas or mobile phase is important to the gas chromatographic process, with hydrogen thought to be the most efficient and productive.
Helium, however, has a larger range of flowrates at which its efficiency is comparable to hydrogen’s. Its added advantages are that it is non-flammable and works with a greater variety of detectors – reasons why helium is the most commonly used carrier gas.
What is a Torr?
Named after Evangelista Torricelli, the Italian physicist and mathematician who discovered the principle of the barometer in 1644, the torr (symbol Torr) is a non-SI unit of pressure defined as 1/760 of an atmosphere.
The SI unit of pressure is the pascal, defined as one newton per square metre. Over time, 760mm of mercury came to be regarded as standard atmospheric pressure.
The current accepted definition of an atmosphere is equal to 101,325 pascals (Pa) and so, the torr became defined as 1/760 of one atmosphere.
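The definitions above give a direct conversion between the torr and the pascal, which a short sketch can confirm:

```python
ATMOSPHERE_PA = 101_325            # one standard atmosphere in pascals
TORR_PA = ATMOSPHERE_PA / 760      # 1 Torr = 101,325/760 Pa, about 133.322 Pa

def torr_to_pa(torr: float) -> float:
    """Convert a pressure in torr to pascals."""
    return torr * TORR_PA

print(round(torr_to_pa(1), 3))   # -> 133.322
print(torr_to_pa(1e-14))         # an RGA-scale vacuum level, ~1.33e-12 Pa
```

At the 10⁻¹⁴ Torr level cited for residual gas analysers, the pressure amounts to roughly a trillionth of a pascal, which shows how extreme those vacuum environments are.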
A piece of sensitive analytical equipment like a gas chromatograph or mass spectrometer is only as effective as the calibration gas that feeds it.
A calibration gas, or gas mixture, guides the analyser, telling it what to look for and what the sample composition should be. Integral to an accurate analysis, a calibration gas sets the standard against which the instrument measures the sample gas or atmosphere, within a specified range.
Critical to a successful process, then, making the correct selection is a fundamental aspect of gas detection, as calibration gases are application specific – the right gas differs for each application, depending on the gas or atmosphere to be tested and scrutinised.
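As a minimal illustration of how a calibration standard anchors an instrument’s readings, a two-point (zero and span) linear calibration might look like the sketch below. The voltages and span concentration are hypothetical values chosen for the example.

```python
def two_point_calibration(zero_reading: float, span_reading: float,
                          span_concentration: float):
    """Build a function mapping raw readings to concentration, from a
    zero-gas reading and a span-gas reading at a known concentration.
    A hypothetical sketch of linear two-point calibration."""
    slope = span_concentration / (span_reading - zero_reading)
    return lambda raw: (raw - zero_reading) * slope

# Example: zero gas reads 0.02 V; a 100 ppm span gas reads 1.02 V
to_ppm = two_point_calibration(0.02, 1.02, 100.0)
print(to_ppm(0.52))  # a 0.52 V raw reading -> 50.0 ppm
```

The span gas here plays the role of the calibration standard described above: the analyser’s raw output only becomes a meaningful concentration once it is referenced to a mixture of known composition.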