Frequent Fox News contributor Steven Milloy retweeted a Politico story about climate change to suggest that CO2 won’t kill Earth because Venus’s atmosphere is mostly CO2 — the only trouble is that humans don’t live on Venus, as far as we know.
Milloy is no stranger to ignoring accurate and verified scientific findings. A lawyer and frequent commentator for Fox News, he calls himself a libertarian thinker and runs a Twitter account, @JunkScience, through which he ironically, but not facetiously, often peddles what most scientists would call junk science. His close financial and organizational ties to tobacco and oil companies have drawn criticism from a number of sources going back to the early 2000s, as Milloy has consistently disputed the scientific consensus on climate change and the health risks of second-hand smoke. Given those ties, it’s not difficult to understand why.
Among the topics Milloy has addressed are what he believes to be false claims regarding DDT, global warming, Alar, breast implants, second-hand smoke, ozone depletion, and mad cow disease. This time, however, he attempts to equate planet Earth with planet Venus, saying that CO2 won’t destroy the Earth because Venus’s atmosphere is largely CO2.
DeFazio on climate: "This is the existential threat to the future of the planet."
For comparison, the atmosphere of Venus is 96.5% CO2 — and the planet is still there.
The obvious problem to scientists (and most people with a high school science education) is that humans don’t live on Venus, and couldn’t, since it is so darn hot, with an average surface temperature of 864 degrees Fahrenheit.
It’s obvious that Milloy is being paid to promote bad science in an effort to persuade Fox News viewers that climate change is a hoax. The trick he uses here is to make it seem as if people who accept human-induced global warming through greenhouse gases such as carbon dioxide believe the Earth will cease to exist with too much CO2. That isn’t what climate scientists and activists think at all.
On the contrary, climate scientists and activists are concerned that human and animal life will cease to exist — the way it doesn’t exist on Venus.
The danger in having to explain this is that it’s easier to look at things Milloy’s way. Despite it being wrong, lazy thinkers will read what he tweets and hear what he says on Fox News without doing any more research or thinking on the matter. When people say convincing things with authority, it often doesn’t matter whether what they’re saying is true.
Beijing — China makes a massive move towards a smogless society with its ban of over 500 car models that have been proven to contribute to urban air pollution.
Responding to anti-pollution measures established recently, the Chinese government has halted sales of over 500 models of vehicles that don’t meet fuel-consumption standards.
The halt in production of some 553 models will begin in early January and will include models from Audi, Beijing Benz and Chevrolet, said the China Vehicle Technology Service Center in a statement to the press Thursday.
China’s anti-pollution plan has taken effect in the form of regulated output from steel production, restrictions on coal usage, and a never-before-seen measure to eventually phase out fossil-fuel-powered vehicles within the next few years. This ban is the first of its kind, according to Wang Liusheng, a Shanghai-based analyst at China Merchants Securities.
Wang said in an email to Bloomberg,
“To emphasize a cut back on energy consumption, such documents will surface frequently in the future. It’s an essential move to ensure the healthy development of the industry in the long run.”
The move sounds and looks sweeping; however, Cui Dongshu, secretary general of the China Passenger Car Association, said that the models make up a “very small percentage” of polluting vehicles. Meanwhile, Beijing is set to record its most impressive improvement in air quality in nine years, with an almost 20 percent drop in pollution over the past year alone.
If humans have enough turbines running in the ocean, we could generate enough electricity to power the entire human race, says new research from the National Academy of Sciences.
In a paper titled Geophysical potential for wind energy over the open oceans, authored by two scientists at the Department of Global Ecology at the Carnegie Institution for Science in Stanford, California, the researchers provide strong evidence that there is substantial potential for greater downward transport of kinetic energy from the overlying atmosphere. As a result, they write, “wind power generation over some ocean areas can exceed power generation on land by a factor of three or more.”
A factor of three or more is more than significant in the search for renewable energy to replace fossil fuels and nuclear power, both of which have had disastrous effects on the environment over the past 100 years, the former contributing to global warming at an alarming rate.
While naysayers might point to the fact that the cost of developing, building and deploying floating turbines is likely to be very high, the fact remains that MIT scientists have been working on floating turbines for at least the past four years, including floating turbine technology that can produce power when there is no wind.
The downside to these findings is that harvesting all that wind energy could drastically alter the climate, since wind has a great effect on how plants and animals live. And considering the high cost, the researchers say the study really only provides enough evidence for those already in the wind turbine arena to expand, rather than replace, current energy generation.
The paper compares a theoretical floating wind farm covering almost 2 million square kilometers in the Atlantic Ocean with one occupying the same amount of land in the U.S., finding that covering much of the central U.S. with wind farms still wouldn’t generate enough energy to power both the U.S. and China — some 7 terawatts, or seven trillion watts, of power.
However, floating turbines in the North Atlantic could theoretically power those two countries and a whole lot more, considering the amount of potential energy that can be extracted over the ocean in the same amount of area.
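The back-of-envelope arithmetic behind these figures is easy to check. The numbers below are my own rounding of the article’s figures (2 million square kilometers, 7 terawatts), not values taken from the paper itself:

```python
# Back-of-envelope check (rounded figures from the article, not the
# paper): the average power density a 2-million-km^2 wind farm would
# need in order to supply ~7 TW for the U.S. and China combined.
AREA_KM2 = 2_000_000      # farm area cited above
TARGET_WATTS = 7e12       # ~7 terawatts

area_m2 = AREA_KM2 * 1e6                      # 2e12 square meters
required_density = TARGET_WATTS / area_m2     # watts per square meter
print(required_density)  # -> 3.5
```

Large land-based wind farms are commonly estimated to extract on the order of 1 watt per square meter once turbines start shadowing one another, well below the 3.5 W/m² required here; the paper’s finding that open-ocean generation can run three or more times higher than on land is what lets the North Atlantic scenario clear the bar.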
An academic study published Tuesday in the journal Environmental Research Letters provides more evidence that oil giant ExxonMobil spent millions of dollars misleading the American public on climate change.
While it’s not a new revelation that Exxon has intentionally misled the public regarding climate change, a team of Harvard climate scientists has now proved it beyond a doubt through a peer-reviewed study, the first of its kind, on Exxon’s role in swaying public opinion on climate change over the past 25 years.
Although Exxon conducted its own research acknowledging both climate change and the human role in its increasing severity, the oil giant was not public about this internal research program and instead has sat on the skeptical side of the climate change debate for decades. In 2015, Exxon issued a challenge on its website urging critics of its stance and communications regarding climate change to “read the documents.”
In a study titled Assessing ExxonMobil’s climate change communications (1977–2014), climate scientists Geoffrey Supran and Naomi Oreskes of Harvard University took Exxon up on its challenge and read the documents, only to find that Exxon did indeed mislead the public about climate change.
Supran painted an intense picture for the energy company, saying,
“The ExxonMobil corporation is under a lot of scrutiny right now, on at least five fronts — we’ve got the Attorneys General of Massachusetts and New York, we’ve got the Securities and Exchange Commission, and not to mention some of Exxon’s own shareholders and employees. Basically, they’re all asking roughly the same question, and that’s: has ExxonMobil in the past, through the way it’s communicated about climate change, misled its customers, its shareholders, or the general public?”
It turns out, they did.
While Exxon promoted skepticism about climate change through advertorials in large publications such as The New York Times and The Wall Street Journal, its own research team acknowledged the danger of fossil fuels to the environment and their growing contribution to global climate change, yet kept that work behind closed doors.
Wet weather at the end of the last ice age appears to have helped drive the ecosystems of large grazing animals, such as mammoths and giant sloths, extinct across vast swathes of Eurasia and the Americas, according to our new research.
The study, published in Nature Ecology and Evolution today, shows that landscapes in many regions became suddenly wetter between 11,000 and 15,000 years ago, turning grasslands into peat bogs and forest, and ushering in the demise of many megafaunal species.
By examining the bone chemistry of megafauna fossils from Eurasia, North America and South America over the time leading up to the extinction, we found that all three continents experienced the same dramatic increase in moisture. This would have rapidly altered the grassland ecosystems that once covered a third of the globe.
The period after the world thawed from the most recent ice age is already very well studied, thanks largely to the tonnes of animal bones preserved in permafrost. The period is a goldmine for researchers – literally, given that many fossils were first found during gold prospecting operations.
Our work at the Australian Centre for Ancient DNA usually concerns genetic material from long-dead organisms. As a result, we have accrued a vast collection of bones from around the world during this period.
But we made our latest discovery by shifting our attention away from DNA and towards the nitrogen atoms preserved in the fossils’ bone collagen.
Nitrogen has two stable isotopes (atoms with the same number of protons but differing numbers of neutrons), called nitrogen-14 and nitrogen-15. Changes in environmental conditions can alter the ratio of these two isotopes in the soil. That, in turn, is reflected in the tissues of growing plants, and ultimately in the bones of the animals that eat those plants. In arid conditions, processes like evaporation preferentially remove the lighter nitrogen-14 from the soil, enriching it in nitrogen-15. This contributes to a useful correlation seen in many grassland mammals: less nitrogen-15 in the bones means more moisture in the environment.
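Isotope ratios like these are conventionally reported as δ15N, the per-mil deviation of a sample’s 15N/14N ratio from that of atmospheric N2. As an illustration of the conversion (the sample ratios below are invented for the example, not data from our study):

```python
# Illustrative delta-15N calculation. R_AIR is the standard 15N/14N
# ratio of atmospheric N2; the sample ratios are made-up examples.
R_AIR = 0.003676  # 15N/14N of atmospheric N2 (the reference standard)

def delta_15n(r_sample: float) -> float:
    """Per-mil deviation of a sample's 15N/14N ratio from air."""
    return (r_sample / R_AIR - 1) * 1000

arid_signal = delta_15n(0.003713)  # enriched in 15N: drier conditions
wet_signal = delta_15n(0.003640)   # depleted in 15N: wetter conditions
print(round(arid_signal, 1), round(wet_signal, 1))  # -> 10.1 -9.8
```

Plotting values like these against the radiocarbon age of each bone is, in essence, how a continent-wide moisture history can be read out of a fossil collection.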
We studied 511 accurately dated bones, from species including bison, horses and llamas, and found that a pronounced spike in moisture occurred between 11,000 and 15,000 years ago, affecting grasslands in Europe, Siberia, North America, and South America.
At the time of this moisture spike, dramatic changes were occurring on the landscapes. Giant, continent-sized ice sheets were collapsing and retreating, leaving lakes and rivers in their wake. Sea levels were rising, and altered wind and water currents were bringing rains to once-dry continental interiors.
As a result, forests and peatlands were forming where grass, which specialises in dry environments, once dominated. Grasses are also specially adapted to tolerate grazing – in fact, they depend upon grazers to distribute nutrients and clear dead litter from the ground each season. Forest plants, on the other hand, produce toxic compounds specifically to deter herbivores. For decades, researchers have discussed the idea that the invading forests drove the grassland communities into collapse.
Our new study provides the crime scene’s smoking gun. Not only was moisture affecting the grassland mammals during the forest invasion and the subsequent extinctions, but this was happening right around the globe.
This discovery prompts a rethink on some of the key mysteries in the extinction event, such as the curious case of Africa. Many of Africa’s megafauna — elephants, wildebeest, hippopotamus, and so on — escaped the extinction events, and unlike their counterparts on other continents have survived to this day.
It has been argued that this is because African megafauna evolved alongside humans, and were naturally wary of human hunters. However, this argument cannot explain the pronounced phase of extinctions in Europe. Neanderthals had existed there for at least 200,000 years, while anatomically modern humans arrived around 43,000 years ago.
We suggest instead that the moisture-driven extinction hypothesis provides a much better explanation. Africa’s position astride the Equator means that its central forested monsoon belt has always been surrounded by continuous stretches of grassland, which graded into the deserts of the north and south. It was the persistence of these grasslands that allowed the local megafauna to survive relatively intact.
Our study may also offer insights into how the current climate change might affect today’s ecosystems.
Understanding how climate changes affected ecosystems in the past is imperative to making informed predictions about how climate changes may influence ecosystems in the future. The consequences of human-induced global warming are often depicted using images of droughts and famines. But our discovery is a reminder that all rapid environmental changes — wet as well as dry — can cause dramatic changes in biological communities and ecosystems.
In this case, warming expressed itself not through parched drought but through centuries of persistent English drizzle, with rain, slush and grey skies. It seems like a rather unpleasant way to go.
A major, almost overnight shift from Democratic to Republican values in the White House has had chilling effects throughout the country, with climate science affected most of all.
Yesterday, Badlands National Park tweeted climate change data, only for the tweets to be taken down hours later, with the park releasing a statement saying the person who tweeted was a “former employee who was not authorized” to use the Twitter account. This came amid the gag order placed on the Environmental Protection Agency and the removal of all mentions of “climate change” from the White House website.
And today the CDC announced via its mailing list that it is postponing its February conference on climate change, without offering a reason. The American Public Health Association’s executive director, Georges Benjamin, said agency officials decided to preemptively call off the event:
“They ran it up the flagpole and realized that it was so close to the inauguration, the chances of it being canceled were pretty real with the administration that was coming in. Some might argue they should have said, ‘We’re going to do this and make them tell us no.’ But that was the decision they made. We should think of this as a strategic retreat.”
We seriously need to do something about CO2 emissions. Besides shifting to renewable energy sources and increasing energy efficiency, we need to start putting some of the CO2 away before it reaches the atmosphere. Perhaps the impacts of human-induced climate change will be so severe that we might even have to capture CO2 from the air and convert it into useful products such as plastic materials or put it someplace safe.
A group of scientists from several European countries and the United States including myself met in the middle, in Iceland, to figure out how CO2 could be put away safely – in the ground. In a recently published study, we demonstrated that two years after injecting CO2 underground at our pilot test site in Iceland, almost all of it has been converted into minerals.
Iceland is a very green country; almost all of its electricity comes from renewable sources including geothermal energy. Hot water from rocks beneath the surface is converted into steam which drives a turbine to generate electricity. However, geothermal power plants there do emit CO2 (much less than a comparable coal-fired power plant) because the hot steam from deep wells that runs the turbines also contains CO2 and sometimes hydrogen sulfide (H2S). Those gases usually just get released into the air.
Is there another place we could put these gases?
Conventional carbon sequestration deposits CO2 into deep saline aquifers or depleted oil and natural gas reservoirs. CO2 is pumped under very high pressure into these formations and, since they have held gases and fluids in place for millions of years, the probability of CO2 leaking out is minuscule, as many studies have shown.
In a place like Iceland with its daily earthquakes cracking the volcanic rocks (basalts), this approach would not work. The CO2 could bubble up through cracks and leak back into the atmosphere.
However, basalt also has a great advantage: it reacts with CO2 and converts it into carbonate minerals. These carbonates form naturally and can be found as white spots in the basalt. The reactions also have been demonstrated in laboratory experiments.
Dissolving CO2 in water
For the first test, we used pure CO2 and pumped it through a pipe into an existing well that tapped an aquifer containing fresh water at about 1,700 feet of depth. Six months later we injected a mixture of CO2 and hydrogen sulfide piped in from the turbines of the power plant. Through a separate pipe we also pumped water into the well.
In the well, we released the CO2 through a sparger – a device for introducing gases into liquids similar to a bubble stone in an aquarium – into water. The CO2 dissolved completely within a couple of minutes in the water because of the high pressure at depth. That mixture then entered the aquifer.
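A rough order-of-magnitude estimate shows why the depth matters. The sketch below assumes simple hydrostatic pressure and a naive linear Henry’s-law solubility, which is crude at these pressures; the numbers are illustrative only, not figures from our study:

```python
# Rough estimate (illustrative assumptions, not the study's numbers):
# hydrostatic pressure at ~1,700 ft of water depth, and the CO2
# solubility it implies under a linear Henry's-law extrapolation.
RHO_WATER = 1000.0    # kg/m^3, density of water
G = 9.81              # m/s^2, gravitational acceleration
DEPTH_M = 520.0       # ~1,700 feet
PA_PER_ATM = 101325.0
KH_CO2 = 0.033        # mol/(L*atm), Henry's constant for CO2 near 25 C

pressure_atm = RHO_WATER * G * DEPTH_M / PA_PER_ATM   # ~50 atm
solubility_mol_per_l = KH_CO2 * pressure_atm          # ~1.7 mol/L
print(round(pressure_atm), round(solubility_mol_per_l, 1))  # -> 50 1.7
```

Even at a fraction of that naive solubility figure, tens of atmospheres of pressure comfortably explain why the sparged CO2 dissolved completely within minutes instead of bubbling back toward the surface.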
We also added tiny quantities of tracers (gases and dissolved substances) that allow us to differentiate the injected water and CO2 from what’s already in the aquifer. The CO2 dissolved in water was then carried away by the slowly flowing groundwater.
Downstream, we had installed monitoring wells that allowed us to collect samples to figure out what happened to the CO2. Initially, we saw some of the CO2 and tracers coming through. After a few months, though, the tracers kept arriving but very little of the injected CO2 showed up.
Where was it going? Our pump in the monitoring well stopped working periodically, and when we brought it to the surface, we noticed that it was covered by white crystals. We analyzed the crystals and found they contained some of the tracers we had added and, best of all, they turned out to be mostly carbonate minerals! We had turned CO2 into rocks.
The CO2 dissolved in water had reacted with the basalt in the aquifer and more than 95 percent of the CO2 precipitated out as solid carbonate minerals – and it all happened much faster than anticipated, in less than two years.
This is the safest way to put CO2 away. By dissolving it in water, we already prevent CO2 gas from bubbling up toward the surface through cracks in the rocks. Finally, we convert it into stone that cannot move or dissolve under natural conditions.
One downside of this approach is that water needs to be injected alongside the CO2. However, because of the very rapid removal of the CO2 from the water in mineral form, this water could be pumped back out of the ground downstream and reused at the injection site.
Will it work elsewhere?
Ours was a small-scale pilot study, and the question is whether these reactions would continue into the future, or whether pores and cracks in the subsurface basalt would eventually clog up and no longer be able to convert CO2 to carbonate.
Our Iceland geothermal power plant has increased the amount of gas injected several times over in the years since our experiment began, now using a different nearby location. No clogging has been encountered yet, and the plan is to soon inject almost all waste gases into the basalt. This will also prevent the toxic and corrosive gas hydrogen sulfide from reaching the atmosphere; it can currently still be detected at low levels near the power plant by its characteristic rotten-egg smell.
The very reactive rocks found in Iceland are quite common on Earth; about 10 percent of the continents and almost all of the ocean floors are made of basalt. This technology, in other words, is not limited to emissions from geothermal power plants but could also be used for other CO2 sources, such as fossil fuel power plants.
The commercial viability of the process still needs to be established in different locations. Carbon mineralization adds costs to a power plant’s operation, so this, like any form of carbon sequestration, needs an economic incentive to make it feasible.
People like to live near coasts, and many power plants have been built near their customers. Perhaps this technology could be used to put away CO2 emissions in coastal areas in nearby offshore basalt formations. Of course, there would be no shortage of water to co-inject with the CO2.
If we are forced to lower atmospheric CO2 levels in the future because we underestimate the damaging effects of climate change, we could perhaps use wind or solar-powered devices on an ocean platform to capture CO2 from the air and then inject the CO2 into basalt formations underneath.
Carbon mineralization, as demonstrated in Iceland, could be part of the solution of our carbon problem.
Using a new satellite-based method, scientists at NASA, Environment and Climate Change Canada, and two universities have located 39 unreported and major human-made sources of toxic sulfur dioxide emissions.
A known health hazard and contributor to acid rain, sulfur dioxide (SO2) is one of six air pollutants regulated by the U.S. Environmental Protection Agency. Currently, sulfur dioxide monitoring relies on emission inventories derived from ground-based measurements and factors such as fuel usage. The inventories are used to evaluate regulatory policies for air quality improvements and to anticipate future emission scenarios that may accompany economic and population growth.
But, to develop comprehensive and accurate inventories, industries, government agencies and scientists first must know the location of pollution sources.
“We now have an independent measurement of these emission sources that does not rely on what was known or thought known,” said Chris McLinden, an atmospheric scientist with Environment and Climate Change Canada in Toronto and lead author of the study published this week in Nature Geoscience. “When you look at a satellite picture of sulfur dioxide, you end up with it appearing as hotspots – bull’s-eyes, in effect — which makes the estimates of emissions easier.”
The 39 unreported emission sources, found in the analysis of satellite data from 2005 to 2014, are clusters of coal-burning power plants, smelters, oil and gas operations found notably in the Middle East, but also in Mexico and parts of Russia. In addition, reported emissions from known sources in these regions were — in some cases — two to three times lower than satellite-based estimates.
Altogether, the unreported and underreported sources account for about 12 percent of all human-made emissions of sulfur dioxide – a discrepancy that can have a large impact on regional air quality, said McLinden.
The research team also located 75 natural sources of sulfur dioxide — non-erupting volcanoes slowly leaking the toxic gas throughout the year. While not necessarily unknown, many volcanoes are in remote locations and not monitored, so this satellite-based data set is the first to provide regular annual information on these passive volcanic emissions.
“Quantifying the sulfur dioxide bull’s-eyes is a two-step process that would not have been possible without two innovations in working with the satellite data,” said co-author Nickolay Krotkov, an atmospheric scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland.
First was an improvement in the computer processing that transforms raw satellite observations from the Dutch-Finnish Ozone Monitoring Instrument aboard NASA’s Aura spacecraft into precise estimates of sulfur dioxide concentrations. Krotkov and his team now are able to more accurately detect smaller sulfur dioxide concentrations, including those emitted by human-made sources such as oil-related activities and medium-size power plants.
Being able to detect smaller concentrations led to the second innovation. McLinden and his colleagues used a new computer program to more precisely detect sulfur dioxide that had been dispersed and diluted by winds. They then used accurate estimates of wind strength and direction derived from a satellite data-driven model to trace the pollutant back to the location of the source, and also to estimate how much sulfur dioxide was emitted from the smoke stack.
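The back-tracing step can be caricatured in a few lines. This is a toy constant-wind sketch, not the authors’ method, which uses full satellite-data-driven wind fields:

```python
# Toy back-trajectory (illustrative only): step a detected plume
# position backward in time against a constant wind to estimate
# where the emission originated.
def back_trajectory(x_km, y_km, wind_u_kmh, wind_v_kmh, hours, dt=1.0):
    """Walk the plume position backward along the wind; dt in hours."""
    elapsed = 0.0
    while elapsed < hours:
        x_km -= wind_u_kmh * dt   # undo eastward transport
        y_km -= wind_v_kmh * dt   # undo northward transport
        elapsed += dt
    return x_km, y_km

# Plume observed 60 km east and 20 km north of a grid origin, after
# 4 hours of wind blowing 15 km/h eastward and 5 km/h northward:
origin = back_trajectory(60.0, 20.0, 15.0, 5.0, 4.0)
print(origin)  # -> (0.0, 0.0): the source sits at the grid origin
```

The real analysis inverts the same relationship statistically, using the observed downwind dilution pattern to estimate both the source location and the emission rate from the stack.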
“The unique advantage of satellite data is spatial coverage,” said Bryan Duncan, an atmospheric scientist at Goddard. “This paper is the perfect demonstration of how new and improved satellite datasets, coupled with new and improved data analysis techniques, allow us to identify even smaller pollutant sources and to quantify these emissions over the globe.”
The University of Maryland, College Park, and Dalhousie University in Halifax, Nova Scotia, contributed to this study.
NASA uses the vantage point of space to increase our understanding of our home planet, improve lives, and safeguard our future. NASA develops new ways to observe and study Earth’s interconnected natural systems with long-term data records. The agency freely shares this unique knowledge and works with institutions around the world to gain new insights into how our planet is changing.
There is general agreement that America’s landscapes, certainly its wildlands, are out of whack with their fires. Wildfires are bigger, hotter, more savage and more expensive than in the past.
There is wide agreement, too, that America’s deeper fire problem is not that malignant megafires are crashing into our communities. Instead, it’s that we’ve lost the older benign versions of fire that once washed over and benefited our ecosystems. Surely, the thinking goes, restoring fire’s former regimes would quell the outbursts and bolster forests’ ecological resilience to multiple threats.
But active restoration has proved trickier, more controversial, and more limited than advocates assumed. It works, but not everywhere, and not everyone wants it.
The roots of suppression
For 50 years after the Great Fires of 1910 traumatized the U.S. Forest Service, the country committed to a program of what we might call fire resistance. It sought both to quit lighting fires and to extinguish every fire that did occur before it could grow large and damaging. Then in the 1960s, the fire community reconsidered because the project was self-defeating and had suppressed good fires as well as bad ones. Many biotas were adapted to particular kinds of fires and suffered when those fires vanished or changed character.
By 1978 the federal agencies adopted a program to restore the fire regimes that had prevailed before the ax and hoof of settlement, and the onset of organized fire suppression, had confirmed our new disorder. The project embodied not only the prevailing science but a kind of atonement for the wreckage done. Fire officers would light fires under prescriptions and they would allow natural fires more room to roam.
On restoration as a guiding principle, consensus exists. On its practice, however, confusion and confrontation abound. Why?
Consider how varied some of the best studied landscapes are. Tallgrass prairie requires fire. Probably most tallgrass environments burned every three years or so before European settlement. Longleaf pine, once pervasive on the southeastern coastal plains, burned like a savanna, its wiregrass understory carrying flames among the woods nearly annually. Its western counterpart, ponderosa pine, also behaved like a grassland with big trees clumped throughout it. Likely it burned every 3-8 years. These are all surface fires that occasionally torched pockets of woody thickets or trees during drought and high wind.
Lodgepole pine, by contrast, burns in eruptive patches, killing the existing stand and preparing for a mass reseeding in the ash. The patches burned perhaps every 80-120 years. And then there is California chaparral. Forty years ago the best science suggested it burned weakly until the primary species reached 20-25 years, and then more fiercely with each passing year. No fuel like an old fuel.
Politics of wildland fire
Advocates of restoration argued that more good fire would reduce bad fire and improve ecosystem health. Our understanding of past fire patterns would help write the necessary guidelines.
But some prescribed fires escape control (probably a comparable fraction to those that escape initial suppression). Smoke drifts with the wind. Some sites need preburn preparations. And there are always dissenters. All this costs not only money but social and political capital.
Most tallgrass preserves, for instance, are tiny; there is always a butterfly or beetle, with human partisans for its cause, that thrives best in a more varied mixture of fire. This complicates the social politics of actually putting fire on the ground.
Longleaf – the “forest that fire made” – displays its greatest biodiversity by having a range of fires across seasons and years. Overall, it’s probably impossible to overburn it, but practice requires guidelines, and that demands social consensus beyond the belief that fire belongs.
Ponderosa forests have generally become overgrown with understories of young trees that can carry fire from the surface to the canopy – a revived fire but not one that allows the ponderosa to survive. This has led to arguments for thinning, a kind of woody weeding, to restore the former structure of the forest, so that it can sustain the right kind of fire.
Lodgepole patches have grown more extensive with fire’s removal, which not only feeds larger fires but has encouraged beetle invasions, which further unhinge the structure of fuels and complicate putting fire back in. Since controlled crown fires are at best tricky, and prescribed commercial logging (rather than thinning) is generally unwanted, the options for deliberate restoration are few.
And the chaparral? There are researchers who insist that wind, not fuel, is the driving factor, and argue that fuel mitigation measures, including prescribed fire, will only invite invasive species, destroy native ones, and not make a whit of difference to fire size and intensity. Besides, they say, the strategic issue is urban sprawl, and the fire concern is overall ecological integrity and resilience, not fuel.
A pragmatic hybrid
For several decades restoration has been an informing theme for America’s fire community. It can point to many successes. Florida now burns over two million acres a year under prescription, and the Florida model has propagated throughout the region. A template for southwestern ponderosa pine, loosely known as the Flagstaff model for the site of its demonstration plots, has disseminated throughout many montane forests in the West.
But the Florida model does not work in the chaparral shrublands of Southern California. The Flagstaff model does not work in the pinyon-juniper complex of the eastern Great Basin or the lodgepole of the west-side central Rockies. Each biota needs its own guidelines. Active restoration programs cost money. And prescribed burning becomes more encumbered with restrictions and caveats each year.
Plenty of partisans would prefer we let nature sort out the imbalances, not pretend, with costly hubris, that we know enough or are skilled enough to do the right thing. People caused the problem; removing them altogether is the surest means to set matters right. Less active management, not more, is the way to reconcile past conditions with future wishes.
And for those obsessed with the no-analog future promised by that constellation of global changes lumped under the label Anthropocene, restoration is beside the point. The future will be radically different. We need to prepare for it, not waste scarce resources on recreating a prelapsarian past.
In brief, fire regimes are varied, the science is frequently conflicted, and restoration is intellectually compromised by an irony that adds no cultural value: we can never truly go back.
The responses to these challenges will vary – as they should. In the American West, however, the cumulative burdens are pushing fire officers away from the former restoration ideal into something akin to a resilience model. They know they need more fire. Their experience tells them they won’t get it waiting for Congress or navigating, project by project, the reviews required by the National Environmental Policy Act.
Instead of attacking the fire problem head on, they are trying to flank it. Of course there are some fires that bolt away from the moment of ignition, or threaten communities, municipal watersheds, or critical biotic assets and must be fought from the first kindling.
But many other fires allow for varied responses. Backing off and burning out – not letting fires roam freely but loose-herding them with selective firefights and burnouts along their perimeter – is a way to get some good fire on the ground.
It’s not restoration as the old order understood it. It’s not a case of science informing and management applying, of rationally getting ahead of the problem. They accept they won’t get ahead of the problem: they have to ride it out. Some patches will burn too severely; some patches won’t burn at all. In a way it’s a pragmatic solution, replacing a goal we can’t agree on with a process – returned fire – that we can. The hand is solving what the head cannot.
It now appears that while restoration may be a permanent principle, one widely adopted, it is not a transcendent one. It only has meaning in particular places and practices and, we might add, times. It has to compete with other values like wilderness. What is replacing it is a kind of intellectual and institutional mashup, the paradox of a managed wildfire. It’s a way to improve control by loosening our standards of control. This is not what the new era imagined as it sought to tame the bright-burning tiger, but it offers us a means to ride that tiger into the future.
Rising sea levels are something climate change deniers can no longer ignore, as five entire islands in the Pacific are now almost completely submerged.
The idea that islands could be swallowed whole had always been a fear of the far future, reserved for a time after the ice caps had melted and the atmosphere had become more and more difficult to breathe. But in the picturesque and often paradise-like Pacific Ocean, we are losing islands much faster than previously thought.
The Solomon Islands, a sovereign nation in the South Pacific, are home to six major islands and around 900 smaller ones. While the nation’s travel industry, known for scuba diving and WWII-era relics, doesn’t look likely to be affected much any time soon, it’s definitely something everyone there is beginning to worry about. Residents of these islands had long been saying that their homes are disappearing; now it’s finally been confirmed scientifically. CNN reports,
“In the past 20 years, sea levels in the archipelago have risen 7 to 10 mm (.28 to .39 inches) annually, three times the global average. According to the International Panel for Climate Change, global rises will reach 5mm annually in the second half of the century.”
Among its acclaimed diving destinations are enormous Marovo Lagoon and Iron Bottom Sound, which is littered with dozens of sunken warships. Guadalcanal, a province and one of the archipelago’s largest islands, honors fallen Allied soldiers at its U.S. War Memorial.