A $10 million satellite will be launched next week to do something that has proved elusive from the ground: comprehensively measuring the carbon dioxide and methane emissions billowing from Alberta’s sprawling tar sands operations, and eventually fossil fuel operations anywhere.
The satellite, built by Montreal-based GHGSat, is funded by a group including major oil sands producers and the Canadian government and will blast off into space from the Satish Dhawan Space Centre in India on June 21. The hope is that, once in place 318 miles above Earth, the satellite will produce measurements with a resolution 400 times finer than current satellites can achieve.
While everyone agrees on the need for better measurements, nobody knows if the satellite can produce them.
Most of the scientists contacted by InsideClimate News questioned whether the company would succeed because satellite technology hasn’t evolved enough to map emissions from specific sources, and its modeling would be too difficult to substantiate.
“There’s a healthy degree of skepticism out there that we’re going to do this. I hope we’re going to be able to prove to them that we can,” said Stephane Germain, a physics engineer and the president of GHGSat.
Zeroing in on individual emissions sources is tricky. The satellite will have to measure a concentration of methane from space and then somehow track those molecules back to a single source. It is extremely difficult to account for wind and for other emissions that could drift into the observed area.
Initially, the satellite will attempt to pinpoint methane and CO2 leak rates at two tailings ponds that store oily waste and one mine. If successful, the oil sands industry contends the method could become the new industry standard for measuring emissions from all tar sands operations. The larger goal is to become the global standard for tracking emissions as the need for better data increases after the Paris climate agreement.
Methane, a potent greenhouse gas, has been the most elusive to understand. The difficulty in gathering data stems from both technological and logistical shortcomings: Ground measurements are costly and time-consuming, and private firms can deny access to regulators and scientists. Current measurements from satellites and aircraft, meanwhile, are too imprecise or fail to provide the around-the-clock coverage needed to gauge emissions from individual sources and identify major leaks. The result is data that vastly underestimates the methane problem.
The launch comes just as Alberta readies its first crackdown on emissions from the oil industry. Even more stringent reporting standards could be in the works as part of Alberta’s sweeping climate change strategy. Alberta’s new premier, Rachel Notley of the New Democratic Party, has pledged to cap oil sands emissions for the first time—at 100 million metric tons, compared with current emissions of around 70 million metric tons—while also imposing a $30 per ton carbon price. Her government also promised to work with industry to slash methane emissions 45 percent below projected 2020 levels by 2025.
Four of the biggest oil sands producers—ExxonMobil subsidiary Imperial Oil, Shell, Canadian Natural Resources Limited and Suncor Energy—are behind the project and part of the 13-member Canada Oil Sands Innovation Alliance (COSIA) that focuses on bringing environmental technologies to market. Other GHGSat partners include Boeing, which provided the systems engineering and advised on the space vehicle design, Hydro-Quebec and LOOK North, a remote sensing company.
If companies are serious about cutting methane pollution, then satellite monitoring makes economic sense, according to Robert Jackson, an earth systems science professor at Stanford University.
“To check facilities now, they send one or two people in a pickup truck, and they drive from well pad to well pad. That’s time-consuming and expensive,” he said. “I think the satellites are particularly useful for companies because they will help people find the big leaks.”
‘The Curse of Remote Sensing’
Measuring greenhouse gas emissions from space isn’t new. But GHGSat is attempting to assess emissions with more clarity than any other space endeavor to date.
Most satellite technologies look at a 1-square-kilometer field. GHGSat’s satellite will be more precise because it will focus on a 2,500-square-meter resolution—an area 50 meters on a side. GHGSat will use its own low-power, lightweight imaging spectrometer—to better pinpoint emission sources in a large atmospheric column—and what’s called “inverse dispersion modeling” to track the methane concentration in a specific plume back to the source of the emissions.
Jackson said he’s “more skeptical” of the company’s ability to accurately model the source of the methane than the technology’s basic ability to map methane at a higher resolution.
“The closer you are to the source the stronger the signal will be,” he said. “But then you need very fine resolution in space, and you need very good coverage across that space. Technologically I’m just not sure that that’s feasible at this time.” He said the “curse” of remote sensing is that it often senses the entire atmospheric column, not just the portion of the atmosphere close to the surface.
Others said that remote sensing technology is advancing fast enough to make GHGSat’s mission plausible.
Riley Duren, chief engineer at NASA’s Jet Propulsion Laboratory, pointed to “the advanced imaging spectrometer technology we’ve been field testing” as evidence that satellites are capable of addressing the “formidable” methane leak problem globally.
If successful, GHGSat’s technology and modeling could help bridge the gap between the two current methods for gathering methane emissions data: adding up components from oil and gas and other operations on the ground, and observing atmospheric plumes from aircraft or space—two approaches that yield drastically different results.
Eventually, Germain wants the technology and the data to become a global standard that would help industrial and agricultural emitters figure out their exact atmospheric impact. He said providing greater clarity for emitters and governments would help facilitate trading and selling of carbon credits and allowances in emerging cap-and-trade schemes as the world heads for zero-carbon emissions.
“We think it is sufficiently unique and powerful in that it can serve as a neutral, fair and objective way to measure emissions everywhere in the world,” Germain said. “In that sense, there’s already interest from NGOs and groups that would like to see the data, especially from places where the data is not currently available.”
Potential to Solve a U.S. Problem
Several scientists in the U.S. said the promise of fine-resolution satellite technology could yield progress in cracking down on methane emissions at home. That issue has grown in importance with increasing concern that huge amounts of the gas are escaping from oil and gas operations that have expanded wildly as a result of the modern fracking boom.
One of the big hopes is that GHGSat’s approach will better locate super-emitters. Much of the understanding of methane emissions has emerged in the past three years, with some estimates suggesting that large leaks, or super-emitters, are responsible for the bulk of the 25 to 75 percent of emissions that some scientists believe the EPA is undercounting. Understanding this could help reframe how regulators like the EPA attack the problem.
“We have good population level data sets now about components. But the challenge about that is now we have to take that to the actual site to see what mitigation has to be done,” said Steve Hamburg, the chief scientist for the Environmental Defense Fund who has looked extensively at super-emitters. “That’s the task that GHGSat is attempting to address—how an individual site knows what their actual emissions are.”
Super-emitters have been a bugaboo in the U.S. For one, they’re extremely difficult to predict; many of them result from a technology failure, and can be missed by ground-up or fly-by aircraft methods because they sometimes last for just a matter of hours. They are also hard to detect because very few oil and gas sites are under continuous monitoring.
The concept of super-emitters is so new that the EPA wasn’t even aware the problem was widespread until more academics and environmental groups started looking at methane in the wake of the U.S. oil and gas boom that began in the late 2000s. EPA collected most of its atmospheric data in the 1990s, before technological advancements in fracking, combined with lateral drilling, turned the vacant plains of North Dakota and Texas into black goldmines.
“The EPA is using the same approach they’ve been using since the 1990s and they’re pretty attached to it. And for a long time I don’t think there were a lot of people paying attention to the data,” said Robert Howarth, a Cornell University professor and methane expert whose 2011 research estimated that shale gas had a similar or higher greenhouse gas footprint than coal. Howarth added he’s cautiously optimistic about GHGSat’s chance at success because he thinks it would represent an improvement over current monitoring approaches. “The EPA numbers are clearly wrong,” he said.
The agency’s shortfall largely stems from gathering its data through ground-up measurements that involve assessing the methane leakage rates of individual components of the natural gas production and distribution system. Those provide accurate measurements for when a system operates properly, but they increase the likelihood of missing a super-emitting event because the EPA cannot continuously monitor sites.
An EPA spokeswoman said the agency continually updates its greenhouse gas inventory with new data.
GHGSat will not continuously monitor sites, which Hamburg said is necessary to get an accurate read on super-emitters. The company’s first satellite will orbit Earth on a two-week cycle, meaning it could easily miss a large methane event that gets fixed within that window, Germain acknowledged.
Germain said, however, that the company plans to add more satellites if there’s enough demand. He said conservative growth estimates for GHGSat suggest the firm could have 20 satellites in space within 10 years, which would allow for daily monitoring. Germain also believes his company’s sensor technology would work on an aircraft, which could be used intermittently to gauge emissions from areas with numerous, but smaller, emissions sources.
Aside from ironing out any possible technological constraints, scientists said GHGSat would need to make its methodology and data public to ensure its results can be verified.
“They should be trying to be objective as they can and other people should be watching them to make sure they do,” Howarth said. “I’m not one that’s opposed to working with industry.”
Germain said he plans to make his company’s methodology public, though not the engineering that goes into its sensor technology. Data will also be available to the public at a cost—after all, Germain said he’s trying to run a business—though GHGSat is exploring making some data free for academics and researchers to access.
“To me, success would be that these companies find value in us being able to locate their super-emitters in a large array of wells in a fracking basin, for example,” Germain said. “But I think we’ll get there in stages. I think we first have to demonstrate to them that this works.”
Correction: An earlier version of this article misstated the position of Kenneth Davis of Penn State on the challenges facing remote sensing of greenhouse gases. The story originally quoted him as saying that detecting small signals like wind close to the source of emissions has been “the curse of remote sensing of greenhouse gases.” Davis actually said that wind data would not pose a problem. Instead he said that the “curse” of remote sensing is that the technology often scans an entire atmospheric column, not just the portion of the atmosphere close to the surface.