Toxic Gas Detection with Enhanced Laser Diode Spectroscopy

A common problem in the oil and gas industry is unreliable readings causing false alarms that cost millions of dollars in lost revenue annually. Safety systems triggered by the detection of potentially hazardous gases typically shut down processes and facilities to prevent potential catastrophe. But when a false alarm triggers the safety system, productivity is lost while the alarm origin is investigated, false alarm status is verified, and processes are slowly brought back on-line.

High false alarm rates, slow detection and poor reliability are problems that frequently plague work site operations using traditional NDIR or LDS-based open path gas detection systems, particularly in toxic gas leak detection applications.

Sensient Fence Line

A detector with a greater detection range is generally considered an advantage, since it means greater ability to detect the hazard either earlier or from greater distances. However, it is important that the detector does not lead to an increased incidence of false alarms.

Enhanced Laser Diode Spectroscopy (ELDS™) is a newer class of laser diode spectroscopy for gas detection that significantly increases the sensitivity and reliability of traditional laser-diode-based gas detection and measurement, even in extreme environments.

ELDS™ laser-based gas detection is gas-species specific and will only respond to the target gas. This removes cross interference and unwanted alarms. ELDS™ products are currently available for Ammonia, Carbon Dioxide, Hydrogen Sulfide, Hydrogen Chloride and Hydrogen Fluoride.

There is also a detector capable of detecting both Hydrogen Sulfide and Methane in one device. Each of the two detection channels is configured for, and specific to, its target gas.

Three big benefits of ELDS™ over conventional toxic gas detectors are:

  • Elimination of false alarms
  • Failsafe operation
  • Reduced maintenance


Elimination of false alarms

Key to the elimination of false alarms is a harmonic fingerprint detection method. A harmonic fingerprint is a specific set of harmonic components introduced by target gas absorption, where the relative amplitudes and phases of the components are known and specific to the target gas absorption line being scanned. This technique eliminates the distortion and interference that typically affect conventional laser diode spectroscopy (LDS): the harmonic fingerprint means that genuine target gas can be reliably distinguished from non-target-gas effects. In addition, this technology is less prone to water vapor interference, reducing the negative repercussions of false alarms while improving detection capability for general plant safety.

ELDS Harmonic Fingerprint
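As a rough illustration of the idea (a simulation sketch of harmonic matching in general, not MSA's actual ELDS algorithm; the line shape, modulation depth and tolerance are all assumptions), the following demodulates the relative harmonic amplitudes from a detector signal and compares them against a stored fingerprint. A genuine absorption line reproduces the expected ratios, while a non-target disturbance does not:

```python
import numpy as np

def harmonic_components(signal, fs, f_mod, n_harmonics=4):
    """Lock-in style demodulation: amplitude and phase at k*f_mod, k = 1..n."""
    t = np.arange(len(signal)) / fs
    out = []
    for k in range(1, n_harmonics + 1):
        c = 2 * np.mean(signal * np.exp(-2j * np.pi * k * f_mod * t))
        out.append((abs(c), np.angle(c)))
    return out

def matches_fingerprint(measured, template_amps, tol=0.15):
    """Compare normalized harmonic amplitude ratios against a stored fingerprint."""
    m = np.array([a for a, _ in measured])
    t = np.asarray(template_amps, dtype=float)
    return bool(np.all(np.abs(m / m.max() - t / t.max()) < tol))

fs, f_mod = 10_000, 50.0
t = np.arange(0, 1.0, 1 / fs)
scan = np.sin(2 * np.pi * f_mod * t)        # modulated wavelength scan position

def line(x):                                # illustrative target-gas line shape
    return np.exp(-(x / 0.3) ** 2)

# Reference ("fingerprint") run and a noisy field measurement of the same gas:
clean = 1.0 - 0.05 * line(scan)
noisy = clean + 0.002 * np.random.default_rng(0).standard_normal(t.size)
template = [a for a, _ in harmonic_components(clean - clean.mean(), fs, f_mod)]
measured = harmonic_components(noisy - noisy.mean(), fs, f_mod)

# A non-target disturbance (a fringe at 1f) produces the wrong harmonic ratios:
fringe = 0.05 * np.sin(2 * np.pi * f_mod * t)
interference = harmonic_components(fringe, fs, f_mod)

print(matches_fingerprint(measured, template))       # genuine gas: ratios agree
print(matches_fingerprint(interference, template))   # interference: rejected
```

A real instrument also verifies the harmonic phases, not just the amplitude ratios, and operates on a specific absorption line of the target gas.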

Failsafe operation

Most fixed gas detectors do not provide the end user with live information on the operational status of the device, whereas ELDS uses a daily auto-self-test called SimuGas, a unique auto-test feature available on all units. This test provides daily verification of the system's operation without the need for manual intervention or externally applied test gas or optical filters. As well as raising an alarm in the event of a test failure, each test result is held in an internal event log for future retrieval.

For example, a common trait of solid-state semiconductor sensors is their tendency to “go to sleep” when exposed to H2S-free air for prolonged periods of time, whereas ELDS will still respond to H2S after long periods of H2S-free exposure.

Electrochemical sensors are effective at detecting H2S gas, yet they are not resilient in high heat or in prolonged low- or high-humidity conditions.


Reduced Maintenance

The reduction in maintenance is achieved in at least two ways. First, ELDS™ has no consumable parts, unlike other fixed-point gas detectors, which need their sensing elements replaced on a regular basis. This saves the cost associated with spares and service labor.

In addition, the SimuGas auto-self-test provides integral, daily self-testing information, negating the need for regular manual testing using test gases and all the associated costs. Gas detector functional testing is the bane of operators responsible for maintaining conventional fixed gas detector systems, yet it is essential for ensuring the safety integrity of an entire system. SimuGas gives operators the means to perform remote, on-command functional testing of an ELDS gas detector more easily, more safely and less expensively than with traditional laser diode spectroscopy gas detector technologies or point detectors.


Our conclusion is that Enhanced Laser Diode Spectroscopy can be utilized to improve the performance and safety of toxic gas detection systems, eliminate lost production resulting from false alarms and reduce maintenance costs. Additional information on permanent gas monitoring is available in the MSA Gas Detection Handbook.

Please let us know your thoughts and comments.




Liquid Ultrasonic Flowmeters; Design Changes to Reduce Engineering and Purchasing Costs


The adoption of ultrasonic flow meter technology in process control applications for liquids has been somewhat limited, often applied only as a “flow technology of last resort” when process conditions are beyond the capabilities of other technologies.

It seems that utilizing ultrasonic flow meters (UFMs) has often required more engineering hours to write the specifications, longer straight-run pipe to meet performance requirements, and a more complicated procurement process to address a flow meter system comprised of multiple components.

An additional concern is the perception that a UFM's response time to a step change in flow rate was too slow for control purposes unless the computing power of a custody-transfer-type UFM was employed. Because so many transit-time signals are processed by the CPU to achieve one velocity data point, if there are many “rejections” during signal processing the measurement update could take seconds, which is a long time in the world of process control. This problem has been solved with more powerful processors. Sample rates in current (non-custody-transfer) UFMs are 256 samples per second or more, providing measurement resolution from nanoseconds (10⁻⁹ s) into the range of picoseconds (10⁻¹² s).
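To see why that timing resolution matters, the standard transit-time relations for a single ultrasonic path show how small the measured time difference really is (the geometry and velocities below are illustrative; note that the sound speed cancels out of the velocity result):

```python
import math

def transit_times(v, L, theta_deg, c):
    """Up- and downstream transit times for one chordal path."""
    k = math.cos(math.radians(theta_deg))
    return L / (c - v * k), L / (c + v * k)

def velocity_from_times(t_up, t_down, L, theta_deg):
    """Recover axial flow velocity from the measured transit-time pair.
    The sound speed c cancels out of this expression."""
    k = math.cos(math.radians(theta_deg))
    return L * (t_up - t_down) / (2 * k * t_up * t_down)

L, theta, c = 0.20, 45.0, 1480.0    # path length (m), path angle, sound speed in water (m/s)
t_up, t_down = transit_times(2.0, L, theta, c)
v = velocity_from_times(t_up, t_down, L, theta)
print(f"dt = {(t_up - t_down) * 1e9:.1f} ns, v = {v:.3f} m/s")
```

With this geometry a 2 m/s flow yields a time difference of only a few hundred nanoseconds, so picosecond-range resolution is what makes low-flow and high-accuracy readings possible.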

Today, UFMs are available that are priced competitively and respond fast enough to be used in many of the more common control applications.

What are some of the other implementation costs that manufacturers can eliminate for users?

  • Compact meter body designs that reduce footprint size

PanaFlow LZ

Custom-manufactured flow bodies often require complicated transducer installation methods and multiple components. The installed system footprint can be much smaller than the working clearance that must be provided so that the system can be serviced; for example, perhaps 2′-3′ of working clearance on both sides of a pipe to allow for transducer removal.

For example, a custom-manufactured flow body in a 3″ nominal size with 150# ANSI flanges would be 30″ face to face and 40″ wide, versus the compact cast flow body meter at 20″ face to face and 13″ wide.

PanaFlow Z3

  • Shorter pipe straight run requirements for accuracy

While the best practice to assure a “Fully Developed Flow Profile” (for liquids) is 10 pipe diameters of straight-run pipe upstream and 5 pipe diameters downstream, multiple chordal paths and computational fluid dynamics software allow manufacturers to anticipate and correct for a lack of available straight run. Today we are seeing accuracies of 0.15%-0.50% in applications where we used to hope for 1%-2%.

LCT4 transducer

The newer compact meter body retains the advantage of a fully enclosed transducer that is protected from direct contact with the process fluid and is field replaceable. The small insert design eliminates buffers and reduces port effects that can degrade low-flow accuracy.

  • Lower cost cast meter bodies

Custom-made flow cells can offer benefits such as the ability to meet flange specifications, biased angles of approach for transducer installation so that higher or lower than usual velocities can be measured, and a wide selection of flow body materials of construction. However, for the vast majority of applications, cast meter bodies will meet the process requirements at a much lower cost, with shorter lead times, no weld points, no external cables and improved accuracy statements.


These advancements make ultrasonic flowmeters much more user friendly to implement and a cost-effective solution for process control measurements. We look forward to feedback and user experience in real-world applications.


Heavy Resid Flow; A Safe & Reliable Solution

It is safe to say that everyone's goal in process control is to increase the reliability of their process measurements. Facilities spend endless resources to identify and displace problematic points, also known as “bad actors”. One common problem area in a refinery is reliably measuring the heavy residual hydrocarbons from the “bottoms” of the distillation towers in the crude unit. Typically, these applications are Vacuum Distillation Unit (VDU) feed and Delayed Coker Unit (DCU) feed, also known as furnace pass flow.

These flow measurements most commonly incorporate orifice and wedge meter installations using a differential pressure (DP) transmitter, requiring a flow restriction to be placed inline. This restriction creates pressure drop and is susceptible to clogging due to the high viscosity of the heavy oil residual, so steam purging is incorporated to alleviate buildup. The cost of regular maintenance for DP transmitters, along with the downtime required for purging, decreases the unit's output capacity.
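A further limitation of DP metering is its square-root characteristic. A quick sketch (the full-scale values here are illustrative, not from any particular installation) shows how little differential pressure remains at low flow:

```python
import math

def flow_from_dp(dp_kpa, dp_full_kpa, q_full):
    """Square-root scaling of a DP meter: Q = Q_full * sqrt(dP / dP_full)."""
    return q_full * math.sqrt(dp_kpa / dp_full_kpa)

Q_FULL, DP_FULL = 500.0, 25.0    # m3/h at full-scale differential (kPa), illustrative
for frac in (1.0, 0.5, 0.1):
    dp = DP_FULL * frac ** 2     # differential falls with the square of flow
    q = flow_from_dp(dp, DP_FULL, Q_FULL)
    print(f"{frac:4.0%} flow -> {dp:6.3f} kPa ({q:6.1f} m3/h)")
```

At 10% of flow the differential is only 1% of span, which is why DP meters struggle with wide-ranging flows and why any partial clogging of the restriction or impulse lines biases the reading.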



A growing solution for this application is the use of ultrasonic flow meters with waveguide transducers. Here we look at a few major benefits of implementing the ultrasonic technology.

  • Unobstructed Flow Path
    • Traditional orifice and wedge meter installations (shown below) employ a differential pressure technique, requiring a flow restriction to be placed inline. This restriction creates pressure drop and is susceptible to clogging due to the high viscosity of the heavy oil residual, and it can require personnel exposure time in the unit. In addition, the amount of purging that is required to prevent clogging increases maintenance costs and decreases the unit output capacities.


  • Bundle Wave Technology
    • Typical wetted ultrasonic meters require transducers to be located inside the process, exposing them to extreme temperatures and requiring isolation capability in order to service the instrument. Bundle Wave Technology (BWT) allows the transducer to be removed from the process, outside of the harsh environment. This prevents maintenance issues related to thermal shock when using solid-state buffers and allows the instrument to be fully serviced under process conditions.


  • SIL Rated System
    • Typical systems incorporate redundancy for operational control and safety shutdown and are capable of being SIL certified by design (SIL 2 or SIL 3, depending on configuration), with the ability to incorporate redundant sensors and electronics in a single packaged solution for operations and safety shutdown. This provides safety and reliability without the operational expense of maintaining DP measurements.


In conclusion, ultrasonic flow meter technology coupled with waveguide transducers provides a safe, highly reliable solution to difficult flow applications with high temperatures and high-viscosity liquids. Implementing this solution, a facility can lower maintenance costs while increasing safety, measurement reliability and overall plant efficiency.





Trace moisture measurements in gas applications often appear to be simple and straightforward because the theory is relatively well understood; however, when the analyzer system is in service and we see unexpected measurement results, the head scratching begins.

Electrically everything appears to be correct, so we assume the probe/sensor must be out of specification, right?

Changing the sensor is not the best first action. There are other key aspects that should be checked and confirmed first. Low dew point ranges make every detail critical: you are actually measuring molecules, so attention to the fine points will be beneficial.


Pressure, temperature and flow rate, as well as the analyzer configuration, have a significant effect on the resulting measurement because you are measuring a very small component of the total sample. The sample conditioning system design and operation are just as important as the analyzer and sensor. You might consider these nine common points before going to the expense of changing probes.

  • The sampling system should be allowed to equilibrate, typically for up to 24 hours, following exposure to ambient moisture (atmospheric air).
    • 80% of the dry-down can be reached in a few minutes, yet the last 10%-20% of dry-down can take hours.
    • The lower the true dew point is, the longer equilibration can take.


  • A purge-type flowmeter aids in determining the equilibration point and in qualifying the integrity of seals.
    • If the hygrometer reading changes with a change of flow rate, one or more of the following is to blame:
      • saturation indices (a subject we plan to discuss in a separate blog post)
      • the system is leaking
      • the system is dirty
  • Measurements should be made at system pressure, or the highest possible pressure, to minimize outgassing of metal parts, wall effects, etc. Higher pressure also yields higher dew points and therefore increased sensitivity. Other things to remember in regard to pressure and temperature effects:
    • It is necessary to convert readings taken at atmosphere to the equivalent dew point at line pressure if line-pressure dew point is what you want, or vice versa.
      • For example, a -40°F (-40°C) dew point at 100 psig line pressure is 16.2 PPMv, which equals a -70.4°F (-56.9°C) dew point and the same 16.2 PPMv at atmospheric pressure.
      • We intend to discuss partial pressure measurements and Dalton's law of partial pressures in a future blog post.
    • Diffusion of atmospheric water vapor is always present. Even if a “pigtail” discharge system is fashioned, with a throttle valve at the end, the diffusion is only minimized. Results are influenced so much by ambient conditions that the precise measurement of low (less than -60°C) moisture levels may be very difficult and requires attention to good technique.
    • Be aware of the diurnal (day/night) effect. This effect produces a true change in moisture content due to heating/cooling of process surfaces.
  • Any cross-checking between two or more similar instruments should be done at the same time and under identical conditions.
    • This is even more important when dissimilar measurement technologies are compared. It is not possible to use some makes of moisture meters under flowing conditions (in situ), on stream or at line pressures. Some are unstable or are affected by temperature changes and other variables.
  • Cleanliness is essential. Manufacturers' specifications on probe cleaning, sampling and measuring techniques, and interpretation of data should be followed to the letter.
    • Some contaminants can cause false readings or deactivate the probe. The system will have to dry out before data can be taken again. Remember, the sample must be chemically compatible with aluminum or the sensor will fail.
    • Cleaning an aluminum oxide sensor is relatively simple. All you need is reagent-grade hexane or toluene and deionized water.


  • Recalibration should be performed every 12 months, or sooner based on your experience in the specific application. There should always be spare probes available. This allows checking probes against one another and avoids having the analyzer out of service while the probes are being recalibrated.
  • Engage the process specialist or other resources to assure a common understanding of the process conditions.
  • Define problems carefully. The right approach to equipment selection and measurement techniques will instill confidence in the end results. Remember that you are effectively dealing with a pressure gauge when interpreting sample system phenomena: the dew point is affected by the pressure the sensor experiences. If the sample pressure goes down, the measured dew point of the sample will go down as well.
  • Remain open to the possibility that an unexpected dew point is actually correct and something could be amiss with the process itself. A perfectly functioning analyzer system is unbiased; it simply reports what it experiences.
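The line-pressure to atmospheric dew point conversion mentioned above can be sketched numerically. This assumes a Buck-type saturation vapor pressure correlation over ice (our assumption; other correlations such as Magnus give similar results) and reproduces the 16.2 PPMv example:

```python
import math

def frost_point_vapor_pressure(t_c):
    """Buck-type saturation vapor pressure over ice, in Pa (t_c in degC, below 0)."""
    return 611.15 * math.exp((23.036 - t_c / 333.7) * t_c / (279.82 + t_c))

def ppmv_from_dew_point(td_c, p_total_pa):
    """Water mole fraction (ppmv) = partial pressure / total pressure."""
    return frost_point_vapor_pressure(td_c) / p_total_pa * 1e6

def dew_point_from_ppmv(ppmv, p_total_pa, lo=-100.0, hi=0.0):
    """Invert the vapor-pressure curve by bisection."""
    target = ppmv * 1e-6 * p_total_pa
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if frost_point_vapor_pressure(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

PSI = 6894.76                                  # Pa per psi
p_line = 101325.0 + 100.0 * PSI                # 100 psig as absolute pressure
ppmv = ppmv_from_dew_point(-40.0, p_line)      # -40 degC (= -40 degF) at line pressure
td_atm = dew_point_from_ppmv(ppmv, 101325.0)   # same gas expanded to atmosphere
print(f"{ppmv:.1f} ppmv; atmospheric dew point {td_atm:.1f} degC")
```

The ppmv (mole fraction) stays constant as the gas expands; only the water partial pressure, and hence the dew point, changes.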

Everyone involved in testing the analyzer system – process engineers, operators and technicians – should be as knowledgeable as possible on the challenges of moisture measurement, the use of equipment and the meaning of the data.

Do not be afraid to ask questions or request support from your vendor or equipment manufacturer, or to reach out to us here at goatnuggets. It is likely we have previously encountered something similar to your question.

Custody Transfer Flowmeters – An Evolution in Technology

Custody Transfer (also known as Fiscal Metering) in the oil & gas industry refers to the transactional transfer of a substance, raw or refined, from one party (owner) to another.

Typically when custody transfer is taking place, the end goal is to determine the value or payment between two parties for the physical substance (gas or liquid) that has exchanged hands via pipeline. For that reason, the flow instrument that measures the total amount of product changing ownership can be viewed as a cash register between the two parties.

Flow measurements in typical process applications are focused on repeatability rather than absolute accuracy, meaning engineers and operators are OK with some level of inaccuracy as long as it is constant and repeatable. Because of the monetary value that changes hands in custody transfer, however, there can be significant fiscal risk if a measurement has even the smallest of errors.
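To put that fiscal risk in perspective, a back-of-the-envelope calculation (the throughput, price and bias figures below are purely illustrative assumptions) shows how a small measurement bias compounds over a year:

```python
# Illustrative figures only: throughput, price and error are assumptions.
barrels_per_day = 100_000
price_per_barrel = 80.0
error_fraction = 0.0025          # a 0.25% flow measurement bias

annual_exposure = barrels_per_day * 365 * price_per_barrel * error_fraction
print(f"${annual_exposure:,.0f} per year from a {error_fraction:.2%} bias")
```

Even a quarter-percent bias on a mid-size pipeline represents millions of dollars per year moving, unaccounted, between the two parties.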

The level of risk involved in custody transfer applications has led to the regulation of fiscal metering in the oil and gas industry through standards developed by organizations such as NIST, API and AGA, as well as others outside of the United States.

While the number of flow meter technologies used in process control today continues to grow, not all are suitable or industry accepted for custody transfer. Here we look at five technologies that the petroleum industry has deemed suitable in fiscal metering applications.

Differential Pressure (orifice plates, Venturi tube, pitot tubes)



  • As early as the 1920s, the American Gas Association began studying custody transfer using differential pressure (DP) with orifice plates. DP measurements have long been used in process control, leading to an industry-wide understanding of how they work and the maintenance required to ensure they function properly. However, they do require regular maintenance and do not provide a wide measurement range for applications with large variance in flow rate. In addition, they do not typically provide the accuracy that is required for liquid custody transfer in the petroleum industry.


Turbine Meter

  • In 1981, the AGA Transmission Measurement Committee published Report #7, outlining the use of turbine meter technology in gas applications. This began an industry trend of transitioning to turbine meters for fiscal metering of gases, displacing many of the DP installations. Turbine meters were able to achieve equal or greater accuracy while measuring a wider range of flow rates, and they also performed well in liquid applications where orifices may not have had the flexibility.


Positive Displacement

  • While the AGA is geared toward gas products, API tends to focus more on petroleum liquids. In 1987, the API published MPMS 5.2, a report outlining the use of displacement flow meters in liquid hydrocarbon applications. PD meters are a great solution in small line sizes and low-flow-rate applications. But, similar to DP measurements, they do create pressure drop, and because they are a mechanical technology, the moving parts require regular maintenance.



Coriolis Meter

  • Flow meters using the Coriolis effect measurement principle were developed as far back as the 1970s; however, it was not until 2002 that the API published its acceptance of their use in custody transfer applications (API MPMS 5.6). Today, Coriolis is a preferred technology for high-accuracy flow applications because of its ability to accurately measure both liquids and gases in a large range of line sizes (<1″ to >12″), its lack of moving parts, and its ability to make a direct mass flow measurement.



Ultrasonic Meter

  • Seen as one of the newer technologies in fiscal metering, multi-path ultrasonic flow meters were covered by AGA Report #9, published in 1998, which outlines their use in the measurement of natural gas in pipelines. Ultrasonic flow meters function on transit-time measurement using ultrasound, providing the velocity of the fluid being transferred. With high accuracy, high turndown ratio and zero moving parts, they are ideal for natural gas lines ranging from 2″ to 40″ and larger. There is little to no pressure drop when using ultrasonic flow meters, which can improve the efficiency of pump stations used when transferring both liquid and gas products down pipelines.

Through development and testing, the petroleum industry has come to accept a variety of technologies as adequate for the high expectations of a custody transfer measurement. All of these technologies have advantages along with their limitations, and each has its niche in the industry. And while we hope you found the information in this post informative, it is also important to understand that the full scope of a proper custody transfer system requires much more than simply installing a high-accuracy flow meter; a complete metering system also needs to include flow computers, provers, sample analysis, etc. We will look at explaining more about each of these components in future nugget posts.

A Pump Control Solution for Sump Overflow Protection

Pump control in sump applications is an important task in refining and chemical process plants. Although important, many of these pumping systems are not managed by the plant's central control system, and in most cases there is little or no method in place for monitoring liquid level within the sump or the performance of the pumping system. The right control technology and implementation in sump applications have many positive economic outcomes, such as overflow risk reduction, lower energy use and longer equipment life.

Sumps are part of a plant drainage system intended to collect fluids from process drains, oily-water drains, fire water run-off and rainwater drainage, so that all of the run-off is controlled and treated appropriately and environmental risk is eliminated.


Floats, bubblers, ultrasonic, and capacitive probes are common forms of level measurement for sump pump control, yet these technologies have proven maintenance intensive due to failures resulting from buildup, plugging and foaming. Additionally, viscosity, process conditions and product variance can also cause inaccurate level measurements with these types of technologies.

The result of unreliable level measurement and pump control can be a pump “dry run” condition or, worse yet, a sump overflow and possibly an EPA-reportable event. Dry-run conditions create costly problems such as excessive wear on a pump's bearings, seals and impeller. The best-case outcome of a dry run is early maintenance to repair or replace worn components; complete replacement of the pump is also a possibility. One user noted a pump replacement cost of $80,000. An additional risk of an undetected dry run is pump failure when the pump is most needed, resulting in a sump overflow incident.

So how do we solve this problem? Is there a better way to monitor liquid levels within the sump and protect the equipment?

One solution is an advanced pump control system: a pump control panel with a SIL-qualified signal conditioning instrument for pump control and system performance monitoring, integrated with a SIL 2 microwave radar for continuous level detection and a SIL 2 point level detection switch. Both transmitters can be mounted on one common flange as small as 4″.


The pump control system should be customizable to meet specific application needs and the existing electrical system.

This type of packaged system can be specified and purchased for delivery pre-wired and ready for installation so that minimal field wiring is required. The simplified installation greatly reduces the field labor costs.

System components might include:

  • Continuous level measurement and/or point level safety switches
  • A Signal Conditioning Instrument with appropriate hazardous area ratings
  • Pressure sensors
  • Enclosure per application requirements
  • Selector switches, indicators, etc… as desired
  • All Electrical and dimensional drawings for approval & documentation


Major benefits of this type of control system are:

  • Over-fill protection according to SIL 2
  • Dry Run protection according to SIL 2
  • FM approvals for Class I, Division 2 hazardous areas
  • Lead-Lag pump switching to extend pump life
  • Reduced risk of an environmental incident
  • Increased life expectancy of rotating equipment via pump monitoring
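As a sketch of how lead-lag switching extends pump life, here is a minimal sequencer with hypothetical level setpoints (the actual panel logic and its SIL-rated implementation would differ):

```python
class LeadLagController:
    """Minimal lead-lag sump pump sequencer (illustrative setpoints in % level)."""

    def __init__(self, lead_on=60.0, lag_on=80.0, all_off=20.0):
        self.lead_on, self.lag_on, self.all_off = lead_on, lag_on, all_off
        self.lead_running = self.lag_running = False
        self.lead_is_a = True                 # alternated each cycle to share wear

    def scan(self, level):
        """One control scan: returns (pump_a_run, pump_b_run)."""
        if level <= self.all_off:
            if self.lead_running:             # completed a cycle: swap lead duty
                self.lead_is_a = not self.lead_is_a
            self.lead_running = self.lag_running = False
        elif level >= self.lag_on:
            self.lead_running = self.lag_running = True
        elif level >= self.lead_on:
            self.lead_running = True          # lag keeps its state (hysteresis)
        else:
            self.lag_running = False          # below the lag band, lag drops out
        lead, lag = self.lead_running, self.lag_running
        return (lead, lag) if self.lead_is_a else (lag, lead)

ctrl = LeadLagController()
for level in (30, 65, 85, 15, 65):
    print(level, ctrl.scan(level))
```

Each time the sump pumps down to the stop level, the lead duty swaps to the other pump, so run hours accumulate evenly across the pair.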

Goat Nuggets thinks this is a pretty good solution for ensuring reliable sump control performance, yet we would love to hear from you in the comments section about what others may be implementing.



Non-Contact Radar; A Brief Focus On Frequency

Radar level transmitters are often referred to as “non-contact” radar because the antenna or “horn” does not come into contact with the product or process being measured. This is in contrast to Guided Wave Radar (GWR), which requires contact with the product being measured.

Radar (Non-Contact)
Guided Wave Radar


The measurement principle of radar is time-of-flight (ToF) using microwave energy. Extremely short microwave pulses at a given wavelength are transmitted by the antenna system toward the measured product. The pulses are reflected by the product surface and received back by the antenna system. The time from transmission to reception of the signals is proportional to the distance to the product surface, and hence the product level in the vessel. A special time-stretching procedure ensures reliable and precise measurement of these extremely short transit times and their conversion into a level measurement.
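The ToF arithmetic itself is simple; what is hard is the timing. A short sketch (with an illustrative tank geometry) shows why a time-stretching procedure is needed:

```python
C = 299_792_458.0                        # speed of light, m/s (approx. for air)

def distance_from_tof(t_seconds):
    """One-way distance from the round-trip time of flight."""
    return C * t_seconds / 2.0

def level_from_tof(t_seconds, sensor_height_m):
    """Product level = mounting height minus measured distance to the surface."""
    return sensor_height_m - distance_from_tof(t_seconds)

# A 10 m tall vessel with the product surface 4 m below the antenna:
t = 2 * 4.0 / C                          # expected round-trip time
print(f"t = {t * 1e9:.1f} ns -> level = {level_from_tof(t, 10.0):.2f} m")
```

A 4 m distance corresponds to a round trip of only about 27 ns, and millimeter-level resolution requires resolving a few picoseconds, hence the special time-expansion techniques.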

The typical radar sensor operates with low emitted power in the C, K, or W band frequency ranges, each of which offers high reliability and tight accuracy in its respective applications:

Low-frequency C band sensors (~6 GHz) are typically used for continuous level measurement of liquids under difficult process conditions. They are suitable for applications in storage tanks, process vessels or standpipes. C band radars perform well in process vessels where build-up, foaming or strong agitation is present. The C band frequency can have difficulties when measuring short ranges or when nozzle space is limited, due to the larger antenna systems it requires.

Mid-frequency K band sensors (~26 GHz) are suitable for continuous level measurement of almost all liquids. They are likely the most commonly used radar frequency and are suitable for storage containers, reactors and process vessels, even under difficult process conditions. With a large variety of antenna systems and materials, K band sensors are a common solution for almost all typical applications and processes, the exception being where heavy vapors or dense foams are expected.

High-frequency W band sensors (~80 GHz) are suitable for continuous level measurement of liquids and bulk solids; in addition, they can have particular advantages where K and C band radars have historically struggled. Small process fittings available with W band sensors offer flexibility in small vessels or tight mounting spaces. The increased signal focus improves reliability in narrow silos or vessels with stirrers, agitators, baffles, heating spirals, etc., avoiding the common false signals from obstructions. The W band is also successful in measuring very low dielectric liquids such as liquefied natural gases.
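The sharper focus of higher frequencies follows from basic antenna physics: beamwidth scales with wavelength over antenna diameter. A sketch using the common ~70·λ/D aperture approximation (the antenna diameters here are illustrative assumptions, not specific products):

```python
C_LIGHT = 3.0e8   # speed of light, m/s

def beamwidth_deg(freq_hz, antenna_diameter_m):
    """Approximate -3 dB beamwidth of a circular aperture: ~70 * lambda / D degrees."""
    wavelength = C_LIGHT / freq_hz
    return 70.0 * wavelength / antenna_diameter_m

for band, f, d in (("C", 6e9, 0.10), ("K", 26e9, 0.04), ("W", 80e9, 0.04)):
    print(f"{band} band {f / 1e9:>4.0f} GHz, {d * 1000:.0f} mm antenna "
          f"-> ~{beamwidth_deg(f, d):.1f} deg beam")
```

Even with a much smaller antenna, the W band sensor produces a beam several times narrower than a C band unit, which is what lets it thread past agitators, baffles and nozzle walls.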

Over the past three to four decades, radar has emerged as one of the preferred level technologies in the process control industry. While every level application has specific details that require thought and consideration, the development and optimization of these three frequency ranges has made an ever-increasing number of applications solvable (and, more importantly, reliable) with non-contacting technology.

What is the Difference: Instrument Data Sheets and Application Data Sheets

An Application Data Sheet may seem redundant when you already have a completed Instrument Data Sheet from the buyer/user, yet there are differences in the kinds of information provided by each that help assure the best possible instrument selection. Because today's equipment is complex, it is very easy for manufacturers to imply, and customers to infer, that a function or feature is standard when it actually requires an upgrade, or is not even available or suitable for the specific application being addressed.

An Instrument Data Sheet (IDS) is a document containing the specification and information for an instrument. It specifies general information such as tag number, service description, location (line number/equipment number), P&ID or drawing number reference, process data (if applicable), calibrated range (if applicable), materials, performance details (such as accuracy and linearity, if applicable), hazardous-area certification (for electrical devices), required accessories, etc. The details in the data sheet may differ among instrument types such as transmitters, switches, gauges, control valves, etc. The Instrument Data Sheet is typically completed and supplied by the user or purchaser.

The Application Data Sheet (ADS) is typically specific to the manufacturer's proposal. It allows a manufacturer to qualify which technology best fits the application described and which available options will deliver the performance requested on the IDS. The ADS is also a useful tool for technical review of the sales proposal by the buyer/user, so they can better understand exactly what is being offered.

The user often has the challenge of deciding between two or more offers that may or may not be equal. Sometimes there is inconsistent nomenclature describing similar, if not identical, instrument features. Different units may be used to express the same specifications, and different test conditions are often used by manufacturers to create specifications that favor their equipment.

For example, the terms digital flow control, electronic flow control and programmable pneumatic control all appear in discussions of GC carrier gas flow control. Manufacturers are obviously trying to differentiate their offerings, but it places a burden on the user to determine the exact function being specified. Other discrepancies occur because of a lack of precision in language; for example, linearity, linear dynamic range and dynamic range are all used as technical names for the same detector property.

A completed Application Data Sheet based on the information contained on the Instrument Data Sheet will help ensure that the equipment selected is a “best fit” for the application so that performance and reliability meets expectations.



Flare systems are designed to safely burn excess hydrocarbon gases that cannot be recovered or recycled. Flare stacks are primarily used for burning off flammable gas released by pressure relief valves during unplanned over-pressuring of plant equipment. During flaring, the excess gases are combined with steam and safely burned in the flare. This is safer and more environmentally friendly than releasing the hydrocarbons directly into the atmosphere.


During normal operations, hydrocarbons are refined, collected and routed for further processing into products such as gasoline. When a facility experiences a process interruption, such as an unplanned loss of power, the system may be unable to send the hydrocarbons through for further refining. Flares are also used to ensure safety during the startup and shutdown of equipment when gases generated by those processes cannot be safely recycled into the refinery.

In both cases, the excess hydrocarbons are routed to the flare system where they are combined with steam and safely burned. Combining the excess hydrocarbons with steam ensures maximum combustion so that chemical destruction is complete and  emissions are minimized.

Improperly operated flares may emit methane and other volatile organic compounds, as well as sulfur dioxide and other sulfur compounds, which are known to exacerbate asthma and other respiratory problems. Other emissions from improperly operated flares may include aromatic hydrocarbons (benzene, toluene, xylenes) and benzo[a]pyrene, which are known to be carcinogenic.


Similar to the pilot light on a household gas stove or hot water heater, a small flame at the top of the flare burns continuously, ensuring the system is ready for immediate use. Depending upon the construction of the flare, weather conditions, and ambient lighting, the pilot flame may or may not be visible from the ground. When flaring occurs, the flame and noise level will increase due to the increased volume and pressure of the gases being burned by the flare. Refinery operators continuously monitor the flare system to minimize noise and smoke levels, while still burning the gases cleanly and safely.

You can learn more about the important role flares play in the daily operational safety of refineries and their communities with this refinery flare fact sheet.