Tropical rain forests have declined from 14% of the Earth's land to just 6%. They…
The post We Earthlings: Shop To Stop Deforestation appeared first on Earth911.
Faster warming in the Arctic will be responsible for a global 2C temperature rise being reached eight years earlier than if the region were warming at the average global rate, according to a new modelling study led by UCL researchers.
The Arctic is currently warming nearly four times faster than the global average rate. The new study, published in the journal Earth System Dynamics, aimed to estimate the impact of this faster warming on how quickly the global temperature thresholds of 1.5C and 2C, set down in the Paris Agreement, are likely to be breached.
To do this, the research team created alternative climate change projections in which rapid Arctic warming was not occurring. They then compared temperatures in this hypothetical world with those of the “real-world” models and examined the timing with which the critical Paris Agreement thresholds of 1.5C and 2C were breached. They found that, in the models without fast Arctic warming, the thresholds were breached five and eight years later, respectively, than the “real-world” projected dates of 2031 and 2051.
In addition, they found that disproportionately fast Arctic warming, known as Arctic amplification, added substantial uncertainty to forecasts, because model projections vary more for the region than for the rest of the planet.
Alistair Duffey (UCL Earth Sciences), a PhD candidate and lead author of the study, said: “Our study highlights the global importance of rapid Arctic warming by quantifying its large impact on when we are likely to breach critical climate thresholds. Arctic warming also adds substantial uncertainty to climate forecasts.”
“These findings underscore the need for more extensive monitoring of temperatures in the region, both in-situ and via satellites, and for a better understanding of the processes occurring there, which can be used to improve forecasts of global temperature rise.”
The study does not attempt to quantify the ways in which Arctic warming affects the rest of the world, for instance through the retreat of sea ice which helps to keep the planet cool, but instead estimates the direct contribution of Arctic warming to global temperature increases.
Co-author Professor Julienne Stroeve (UCL Earth Sciences, the University of Manitoba, Canada, and the U.S. National Snow and Ice Data Center) said: “While our study focuses on how Arctic warming affects global temperature change, the local impacts should not be overlooked. A 2C temperature rise globally would result in a 4C annual mean rise in the Arctic, and a 7C rise in winter, with profound consequences for local people and ecosystems.
“In addition, rapid warming in the Arctic has global consequences that we do not account for in this study, including sea level rise and the thawing of permafrost which leads to more carbon being released into the air.”
Co-author Dr Robbie Mallett (University of Manitoba and Honorary Research Fellow at UCL Earth Sciences) said: “Arctic climate change is often overlooked by politicians because most of the region is outside national boundaries. Our study shows how much the Arctic impacts global targets like the Paris Agreement, and hopefully draws attention to the crisis that’s already unfolding in the region.”
Arctic amplification, which is strongest in the winter months, is caused by several factors. One is the retreat of sea ice, meaning more sunlight (and heat) is absorbed by water instead of being reflected back into space. Another is that there is less vertical mixing of air at the poles than in the tropics, which keeps warmer air close to the Earth’s surface.
For the study, researchers looked at an ensemble of 40 climate models that informed the UN’s 2021 climate change report*. These models divide Earth’s surface into a three-dimensional grid of cells, modelling physical processes occurring within each cell.
The research team modified the output of the models to create an alternative world in which rapid Arctic warming was not occurring, by setting the rate of change of temperature in the region north of 66° North equal to that of the rest of the planet. They looked at how the removal of rapid Arctic warming would affect temperature projections in a plausible intermediate emissions scenario and calculated the average temperature projection across all models.
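The counterfactual construction described above can be sketched in a few lines. This is a simplified illustration on a zonal-mean grid with hypothetical data, not the study's actual code: each latitude band north of 66°N has its warming replaced by the area-weighted mean of the rest of the planet.

```python
import numpy as np

def remove_arctic_amplification(temps, lats):
    """Replace warming north of 66N with the rest-of-planet mean.

    temps: array of shape (years, lat) - zonal-mean temperature anomalies (C)
    lats:  array of shape (lat,)       - latitude of each band
    Returns a copy in which each Arctic band's time series is replaced by
    the area-weighted mean anomaly of all non-Arctic bands.
    """
    temps = np.asarray(temps, dtype=float)
    lats = np.asarray(lats, dtype=float)
    arctic = lats >= 66.0
    weights = np.cos(np.deg2rad(lats))             # area weighting by latitude
    w = np.where(arctic, 0.0, weights)             # zero out Arctic bands
    rest_mean = (temps * w).sum(axis=1) / w.sum()  # (years,) mean outside the Arctic
    out = temps.copy()
    out[:, arctic] = rest_mean[:, None]            # Arctic now warms at the mean rate
    return out
```

Averaging the modified field then yields the "no Arctic amplification" global temperature series that the authors compared against the unmodified projections.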
In addition, they looked at how removing rapid Arctic warming from the models would affect more pessimistic or optimistic scenarios. For example, in a more optimistic scenario, where emissions are cut sharply and net zero is reached shortly after 2050, Arctic amplification causes a seven-year difference in the time of passing 1.5°C.
Temperature projections for the Arctic varied more substantially between the models than for other parts of the globe, accounting for 15% of the uncertainty in projections, despite the region only making up 4% of the global surface area.
The 1.5C and 2C limits are regarded as having been breached when average global temperatures over a 20-year period are 1.5C or 2C higher than in pre-industrial times.
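That 20-year-average definition can be made concrete with a short sketch (a simplified illustration of the definition, not the study's method): the breach year is taken as the midpoint of the first 20-year window whose mean anomaly reaches the threshold.

```python
import numpy as np

def breach_year(years, anomalies, threshold):
    """Return the central year of the first 20-year window whose mean
    temperature anomaly (C above pre-industrial) reaches `threshold`,
    or None if no window qualifies."""
    years = np.asarray(years)
    anomalies = np.asarray(anomalies, dtype=float)
    for i in range(len(years) - 19):
        if anomalies[i:i + 20].mean() >= threshold:
            return int(years[i + 10])  # midpoint of the 20-year window
    return None

# Hypothetical series: 1.0C in 2000, warming 0.02C per year
years = np.arange(2000, 2100)
anoms = 1.0 + 0.02 * (years - 2000)
print(breach_year(years, anoms, 1.5))  # 2026
```

Under this definition a threshold is confirmed as breached only once the long-run average, not a single hot year, exceeds it.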
The goal of the Paris Agreement, an international treaty, is to keep the global average temperature to “well below 2°C above pre-industrial levels” and pursue efforts “to limit the temperature increase to 1.5°C.”
The Arctic is thought to have warmed by 2.7C since the pre-industrial era, and this warming is believed to have accelerated since the start of the 21st century.
The study was supported by the Natural Environment Research Council (NERC), the European Space Agency (ESA), and the Canada 150 Research Chairs Program.
*The Intergovernmental Panel on Climate Change’s Sixth Assessment Report.
Victoria Atkins has been made the new health secretary. She was financial secretary to the Treasury (a mid-ranking Treasury minister), and so this is a big promotion.
This is from Yvette Cooper, the shadow home secretary, on James Cleverly’s appointment as her opposite number. He is the eighth home secretary in eight years, she says.
Lisa O’Carroll
David McAllister, the German MEP who heads the European parliament’s foreign affairs committee, has welcomed David Cameron’s “surprise” return to the international stage.
He said Cameron would help rebuild the partially repaired relations with the EU. He said:
I have known him for 15 years and I wish him well. He is a very experienced politician and knows the international business of politics, knows the EU and knows the EU institutions and I think he should be given a chance.
On his role in causing Brexit, he said:
He was responsible for the referendum and it turned out the way it did, but to be fair to David he was in favour of remaining in the EU.
McAllister said credit and thanks should also be given to Cameron’s predecessor, James Cleverly, for “improving the relations between the UK and the EU” after the premierships of Liz Truss and Boris Johnson.
The EU’s chief diplomat Josep Borrell has also welcomed David Cameron’s return to British politics as an opportunity to strengthen relations between the bloc and the UK on security and foreign relations in Ukraine, Middle East and elsewhere.
Greg Hands, the former Tory chair, has been made a minister of state in the Department for Business and Trade, No 10 says.
And John Glen, who was chief secretary to the Treasury, has been made paymaster general in the Cabinet Office.
Lee Rowley is the new housing minister, No 10 has announced. He was local government minister.
Environmental groups seem to be taking a “good riddance” approach to Thérèse Coffey’s departure as environment secretary. (See 12.54pm.)
This is from Paul de Zylva, nature campaigner at Friends of the Earth.
Thérèse Coffey’s time as environment secretary was mired in controversy. Her lasting legacy will be the complacency she showed in dealing with the ongoing sewage scandal, which has seen the near-complete deterioration of our precious rivers and seas.
While she did ensure the UK played a positive role in last year’s UN biodiversity talks, she will also be remembered for unhelpful speeches that pitted the interests of farmers, business leaders and environmental groups against each other instead of working to unite them.
Steve Barclay is picking up a brief that has been neglected throughout the majority of his party’s time in office – there is a lot of lost time to make up for. Given the dire state of nature in the UK, he must start by urgently addressing the poor performance of polluting water companies and the regulator Ofwat. He must support farmers to work in harmony with nature and slash harmful emissions, and properly resource and restore trust in the government’s wildlife and environment watchdogs.
And this is from Rebecca Newsom, head of politics at Greenpeace UK.
At the last election, the Conservative party was promising “the most ambitious environmental programme of any country on earth.” Now the in-tray for the incoming environment secretary is filling up faster than a river downstream from a sewage plant.
The issues are stark and require urgent leadership: clean up our waterways, get a grip on plastic pollution, help to deliver breathable cities, ratify the Global Ocean Treaty and make farming deliver for nature. That’s the success we need. So Steve Barclay needs to act fast, because unfortunately the British public are already seeing what failure looks like.
Aletha Adu
Wes Streeting, the shadow health secretary, used his speech in the king’s speech debate in the Commons this afternoon to criticise David Cameron’s appointment as foreign secretary. Addressing the health minister Helen Whateley, Streeting said:
What kind of message does it send to their [Tory MPs constituents] that their own leader cannot find a suitable candidate for foreign secretary among those who sit in this house?
Lord Cameron has a lot to answer for when it comes to the NHS, the architect of austerity, a £3bn disaster that has led straight to the biggest health crisis in the history of the NHS. And that’s before we take into account his record of ushering in the golden age between Britain and China.
I am grateful to readers who have been adding to the list of European former PMs who went on to serve as a foreign minister. We mentioned some at 12.50pm. There is also:
Laurent Fabius, who was prime minister of France in the 1980s and who became foreign minister almost 30 years later;
Bjarni Benediktsson, who is foreign minister of Iceland, after being PM about six years ago;
And Kalevi Sorsa, who was prime minister of Finland three times in the 1970s and 1980s before becoming foreign minister.
Sir Lindsay Hoyle has told MPs that he is investigating how MPs might get the chance to question the new foreign secretary, David Cameron. It won’t happen in the Commons chamber because Cameron will be in the Lords.
Addressing MPs in the chamber this afternoon, Hoyle said:
This is not the first time in recent years that a cabinet minister has been appointed in the House of Lords, but given the gravity of the current international situation, it is especially important that this house is able to scrutinise the work of the Foreign, Commonwealth and Development Office effectively.
I have therefore commissioned advice from the clerks about possible options for enhancing [scrutiny] of the work of the foreign secretary when that post is filled by a member of the other house.
I also look forward to hearing the government’s proposals on how the foreign secretary will be properly accountable to this house.
Peers will, of course, get the chance to question Lord Cameron. But a more junior foreign office minister will take questions from MPs on Foreign Office policy, as happened in the past when the foreign secretary was a peer. (See 1.50pm.)
In fact, there is no need for Hoyle to ask the clerks to produce a briefing on how MPs can hold to account a foreign secretary sitting in the House of Lords. Before the 2010 election the Commons procedure committee produced a report on this very topic, inspired by the fact that Gordon Brown had made two peers, Lord Mandelson and Lord Adonis, business secretary and transport secretary respectively.
The procedure committee pointed out that until the early nineteenth century MPs could question peers by getting them to enter the Commons chamber and stand, or sit, at the bar of the house (the bit near the entrance, facing the speaker’s chair, marked by a thick line in the carpet). But it said that a better option would be to allow secretaries of state in the Lords to be questioned by MPs in Westminster Hall (the mini-chamber used for minor debates).
In its report, the committee recommended trialling this system on a pilot basis. It went on:
If the experiment were considered successful and if it were felt necessary to further develop and strengthen the scrutiny it permitted, there might then be a case for considering more radical options including, perhaps, questions on the floor of the house. Until that point we would consider it prudent to take a measured approach. The experiment in Westminster Hall may reveal new issues, be they constitutional or practical in nature, which could be valuable in deciding whether and how such scrutiny should continue.
Soon after the report was published, the election was held, Mandelson and Adonis lost their jobs, and the idea was dropped. But now there may be an appetite to revisit it.
The sacking of Rachel Maclean (see 2.01pm) means the UK is about to get its 16th housing minister since 2010. Housing campaigners are appalled, saying the government is failing to tackle problems in the sector because ministers change too frequently.
This is from Polly Neate, chief executive of the housing charity Shelter.
The revolving door of housing ministers over the past decade, and in particular the last 18 months, proves the government’s failure to grasp the scale and urgency of the housing emergency. Rents are rocketing, evictions are soaring and homelessness is at a record high, yet we haven’t had a minister stay in the job long enough to get to grips with the problem.
The 16th housing minister since 2010 has to hit the ground running and the first thing on their to do list must be to pass a watertight renters (reform) bill and scrap no fault evictions.
And this is from Tom Darling, campaign manager at the Renters’ Reform Coalition.
Rachel Maclean attended our events and, though we don’t believe the government are going far enough on rental reform, she was always willing to engage with us – we wish her well for the future.
It is frankly shambolic that we will now be on to our 16th housing minister since 2010, and incredibly 9 just since the government promised to end no-fault evictions.
Now, just before the first day of the important committee stage, which involves poring over the detail of the bill, she is sacked – it makes a mockery of government and shows a shocking lack of respect for England’s 11 million private renters.
Hannah White from the Institute for Government thinktank has a chart illustrating the pace at which the job has changed hands.
Tamara Cohen from Sky News also argues that David Cameron’s appointment as foreign secretary makes an early election even less likely than it was.
Another aspect of the Cameron appointment is it seems to suggest No10 will play it long to the election.
One of DC’s allies says: “he wouldn’t want to do it for five months.”
Today’s reshuffle means it is much less likely that this government will go into the next election floating the prospect of the UK withdrawing from the European convention on human rights.
Suella Braverman, who has been sacked as home secretary, was the minister in government most in favour of ECHR withdrawal.
Her replacement, James Cleverly, spoke out against the idea earlier this year, the FT’s George Parker reports.
And Jo Johnson, the former universities minister who was head of policy at No 10 when Cameron was PM, has told Times Radio that his old boss would not be backing the government to put ECHR withdrawal on the agenda. Johnson said:
I can’t really see David Cameron returning to the Foreign Office and the first thing he’s doing is to lead a campaign for us to leave the European court of human rights. It seems to me highly unlikely.
This is from Adrian Ramsay, co-leader of the Green party, on the reshuffle.
This reshuffle looks desperate and is a sign that Rishi Sunak has run out of talent. David Cameron started the programme of cuts to our public services which has now brought the NHS to near breaking point. Since his disastrous exit he has cashed in on dodgy lobbying for global oligarchs. And on the odd occasion where Cameron did take a principled stand – such as on maintaining the international aid budget – the government has since reneged.
As to the departure of Thérèse Coffey as environment secretary, nature can at least temporarily breathe a sigh of relief as we await to see who replaces her. She put in place a subsidy system which is not working for farmers or the environment, and she has failed to tackle the blight of sewage in our rivers – a situation she herself described as ‘a scandal’ when I challenged her on it at a public meeting in Suffolk last month.
Lisa O’Carroll
The outgoing Dutch prime minister, Mark Rutte, is generally a man of few words. But when it comes to David Cameron he has uttered two more than the vast majority of EU leaders.
The Netherlands was a key ally of the UK in the membership years and a key ally of Ireland and campaigner for a soft Brexit post-2016.
And Laura Trott has been promoted from pensions minister to chief secretary to the Treasury, No 10 says.
Here is video of journalists reacting as David Cameron arrived at No 10 this morning.
A new element of the catastrophic impacts of climate change is emerging — how global warming is impacting the human brain.
In a paper published today in Nature Climate Change, an international team of academics explore the ways in which research has shown that a changing environment affects how our brains work, and how climate change could impact our brain function in the future. The paper is led by the University of Vienna with input from the universities of Geneva, New York, Chicago, Washington, Stanford, Exeter in the UK and the Max Planck Institute in Berlin. It also explores the role that neuroscientists can play in further understanding and addressing these challenges.
Lead author Dr Kimberly C. Doell, of the University of Vienna, said: “We’ve long known that factors in our environment can lead to changes in the brain. Yet we’re only just beginning to look at how climate change, the greatest global threat of our time, might change our brains. Given the increasingly frequent extreme weather events we’re already experiencing, alongside factors such as air pollution, the way we access nature and the stress and anxiety people experience around climate change, it’s crucial that we understand the impact this could all have on our brains. Only then can we start to find ways to mitigate these changes.”
Since the 1940s, scientists have known from mouse studies that changing environmental factors can profoundly change the development and plasticity of the brain. This effect has also been seen in humans in research on the effects of growing up in poverty, which linked disturbances in brain systems to factors including lack of cognitive stimulation, exposure to toxins, poor nutrition, and heightened childhood stress. While not entirely surprising, this research highlights the profound impact that one’s environment can have on the brain.
Now, the authors are calling for research to explore the impact on the human brain of being exposed to more extreme weather events, such as heatwaves, droughts, and hurricanes, and associated forest fires and floods. They believe such events may change brain structure, function, and overall health, and also call for more research to evaluate how this may explain changes in well-being and behaviour.
The paper also explores the role that neuroscience can play in influencing the way we think about climate change, our judgments and how we respond.
Dr Mathew White, of the Universities of Exeter and Vienna, is a co-author on the study. He said: “Understanding neural activity that is relevant to motivations, emotions and temporal horizons may help predict behaviour and improve our understanding of the underlying barriers preventing people from behaving as pro-environmentally as they might wish. Both brain function and climate change are highly complex areas. We need to start seeing them as interlinked, to take action to protect our brains against the future realities of climate change, and to start using our brains better to cope with what is already happening and prevent the worst-case scenarios.”
Weather extremes, such as heatwaves and torrential rainfalls, are becoming more frequent and more intense across the United States under climate change.
In late September of this year, flash-flooding surged down neighborhood streets and subway stairways in New York City, as a historic rainfall led to canceled flights and closed roads and city officials urged people to stay at home or shelter in place. Some areas of the city saw up to 2.58 inches of rain in one day, nearly 50% more than the city sewer system’s maximum capacity, causing wastewater problems for many low-lying homes and businesses.
Intuitively, when an extreme weather event hits a city, the more residents it has, the larger the number of people affected. Currently, 83% of the United States population lives in urban settings, according to the U.S. Census. This number is expected to grow over the coming decades, making urban climate resilience extraordinarily important. As a result, many people have the impression that the growing size of cities is making weather extremes worse for the people who live there.
However, cities are designed and built by people. So, it stands to reason that if some methods of land development increase population exposures to extreme weather conditions, others might hold the potential to moderate or even reduce population exposures as the climate changes over the coming decades.
To explore this idea, University of Delaware researcher Jing Gao, assistant professor in the College of Earth, Ocean and Environment and a resident faculty member in the Data Science Institute, and colleague Melissa Bukovsky, associate professor in the Haub School of Environment and Natural Resources at the University of Wyoming, investigated how changes in urban land and population will affect future populations’ exposures to weather extremes under climate conditions at the end of the 21st century.
The researchers looked at urban areas across the continental United States, including cities large and small, with various development densities and in different climate regions. They used a data-driven model developed by Gao to predict how urban areas across the country will grow by 2100, based on development trends observed over the past 40 years. The research team considered how these urban land changes might affect weather extremes like heat waves, cold waves, heavy rainfalls and severe thunderstorms. They then analyzed how many people would be exposed to these extremes under different climate and urban development conditions at the end of the century.
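The exposure metric described above can be sketched as a person-events calculation over grid cells. This is a simplified illustration with hypothetical numbers, not the team's model: two layouts with the same total population differ in exposure because of where people sit relative to where extremes hit.

```python
def population_exposure(populations, event_counts):
    """Total person-events: sum over grid cells of
    (people in the cell) x (extreme-weather events hitting that cell)."""
    return sum(p * e for p, e in zip(populations, event_counts))

# Hypothetical three-cell city under two layouts with identical totals (100,000 people).
# Cells 2 and 3 are hit by more events (e.g. urban heat-island hot spots).
compact  = population_exposure([90_000, 5_000, 5_000], [1, 3, 3])
sprawled = population_exposure([40_000, 30_000, 30_000], [1, 3, 3])
print(compact, sprawled)  # 120000 220000
```

With the same population and the same weather, the clustered layout yields far fewer person-events, which is the sense in which spatial arrangement, rather than city size alone, drives exposure.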
The research team’s simulations showed that at the end of the 21st century, how a city is laid out or organized spatially, often called an urban land pattern, has the potential to reduce population exposures to future weather extremes, even for heat waves under very high urban expansion rates. Further, how the urban landscape is designed, meaning how buildings are clustered or dispersed and how they fit into the surrounding environment, seems to matter more than simply the size of a city. This is true even while climate change is increasing population exposures.
These findings apply to all cities, from large metropolitan areas like New York City to smaller towns in more rural contexts, such as Newark, Delaware.
“Regardless of the size of a city, well planned urban land patterns can reduce population exposures to weather extremes,” Gao said. “In other words, cities large and small can reduce their risks caused by weather extremes by better arranging their land developments.”
These findings differ from current common perceptions. For example, existing literature in this area has almost exclusively focused on limiting the amount of urban land development, Gao said.
In contrast, the new findings from this research encourage researchers and practitioners from a wide range of related fields to reconsider how cities are designed and built so that they can be in harmony with their regional natural surroundings and more resilient to potential climate risks over the long run.
Gao likened the effects of climate change and urban land patterns on extreme weather risks to the effects of a person’s diet and activity level on their risk for health problems. Properly designed urban land patterns, she said, are like physical exercises that work to counteract poor dietary choices, contributing to a reduced risk for disease, while helping a person become more fit in general.
“Carefully designed urban land patterns cannot completely erase increased population exposures to weather extremes resulting from climate change, but they can generate a meaningful reduction of the increase in risks,” Gao said.
And the cost to start is small, Gao said. No extravagant measure, such as leveling and rebuilding a large area at once, is required.
“Instead, when building new and renovating existing parts of a city, we should adjust our mindset to consider how the new development and renovation will change the way the city as a whole situates in its natural surroundings, and how the city and its surrounds can be one integrated human-environment system at large scales over the long run,” Gao said. “The key is to start adjusting how we think about development now.”
Next steps in the work
The researchers are working to identify specific characteristics about the spatial arrangement of a city that can make it more — or less — resilient to future weather extremes. Identifying these patterns can help guide development that is more sustainable in the face of increasing instances of extreme weather. Through their efforts, the research team hopes to provide actionable suggestions for how to design and build urban areas that reduce their residents’ exposures to weather extremes in the long run.
Importantly, the researchers emphasized that these characteristics will likely vary from region to region, now and as climate changes. For instance, what works in arid Phoenix, Arizona, will probably differ from what will work in humid New Orleans, Louisiana. Likewise, what might work today for a city could differ from what will work in the future, as climate conditions evolve.
“Eventually, we want our work to be directly useful to urban design and planning efforts, offering insights and tools for decision makers to influence long-term social and environmental well-being at scale,” Bukovsky said. “First, though, we need to identify what development patterns can improve various cities’ long-term climate resilience. We will continue collaborating in the future.”
An analysis by NASA’s sea level change science team finds that if a strong El Niño develops this winter, cities along the western coasts of the Americas could see an increase in the frequency of high-tide flooding that can swamp roads and spill into low-lying buildings.
El Niño is a periodic climate phenomenon characterized by higher-than-normal sea levels and warmer-than-average ocean temperatures along the equatorial Pacific. These conditions can spread poleward along the western coasts of the Americas. El Niño, which is still developing this year, can bring more rain than usual to the U.S. Southwest and drought to countries in the western Pacific like Indonesia. These impacts typically occur in January through March.
The NASA analysis finds that a strong El Niño could result in up to five instances of a type of flooding called a 10-year flood event this winter in cities including Seattle and San Diego. Places like La Libertad and Baltra in Ecuador could get up to three of these 10-year flood events this winter. This type of flooding doesn’t normally occur along the west coast of the Americas outside of El Niño years. The researchers note that by the 2030s, rising seas and climate change could result in these cities experiencing similar numbers of 10-year floods annually, with no El Niño required.
“I’m a little surprised that the analysis found these 10-year events could become commonplace so quickly,” said Phil Thompson, an oceanographer at the University of Hawaii and a member of NASA’s sea level change science team, which performed the analysis. “I would have thought maybe by the 2040s or 2050s.”
Ten-year floods are those that have a one in 10 chance of occurring in any given year. They’re a measure of how high local sea levels become: The extent of flooding in a particular city or community depends on several factors, including a region’s topography and the location of homes and infrastructure relative to the ocean. Ten-year floods can result in what the National Oceanic and Atmospheric Administration classifies as moderate flooding, with some inundation of roads and buildings, and the possible need to evacuate people or move belongings to higher ground.
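A quick back-of-envelope check shows what "a one in 10 chance in any given year" implies over time. Assuming independent years, a standard simplification that is not part of NASA's analysis, the chance of seeing at least one such flood in a decade is:

```python
# Probability of at least one 10-year flood over n years,
# assuming independent years with annual probability p = 0.1.
p = 0.1
n = 10
at_least_one = 1 - (1 - p) ** n
print(round(at_least_one, 3))  # 0.651: roughly a 65% chance per decade
```

This is why "10-year flood" should not be read as "once every 10 years": even at the baseline rate, a given decade has only about a two-in-three chance of containing one, and the El Niño and sea level rise effects described above raise the annual probability well past 0.1.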
NASA’s coastal flooding analysis finds that by the 2030s, during strong El Niño years, cities on the west coast of the Americas could see up to 10 of these 10-year flood events. By the 2050s, strong El Niños may result in as many as 40 instances of these events in a given year.
Water expands as it warms, so sea levels tend to be higher in places with warmer water. Researchers and forecasters monitor ocean temperatures as well as water levels to spot the formation and development of an El Niño.
“Climate change is already shifting the baseline sea level along coastlines around the world,” said Ben Hamlington, a sea level researcher at NASA’s Jet Propulsion Laboratory in Southern California and lead for the agency’s sea level change science team.
Sea levels are rising in response to planetary warming, as Earth’s atmosphere and ocean are heating up and ice sheets and shelves melt. This has already increased the number of high-tide, or nuisance, flooding days coastal cities experience throughout the year. Phenomena like El Niños and storm surges, which temporarily boost sea levels, compound these effects.
Missions that monitor sea levels, including the Surface Water and Ocean Topography (SWOT) satellite and Sentinel-6 Michael Freilich, help to monitor El Niños in the near term. SWOT, in particular, collects data on sea levels right up to the coast, which can help to improve sea level rise projections. That kind of information could aid policymakers and planners in preparing their communities for rising seas in the coming decades.
“As climate change accelerates, some cities will see flooding five to 10 times more often. SWOT will keep watch on these changes to ensure coastal communities are not caught off guard,” said Nadya Vinogradova Shiffer, SWOT program scientist and director of the ocean physics program at NASA Headquarters in Washington.
The escalating plastic pollution crisis and inefficiencies in the plastic recycling system have turned many against single-use plastics and led to national and state bans on some plastic packaging. Now, the fossil fuel and petrochemical industries have launched a category of plastic processing technology called chemical recycling or advanced recycling. The plastic industry describes it as a potential panacea that can clean up millions of tons of plastic waste produced annually. Is it everything claimed?
The Ocean Conservancy recently hosted a forum to discuss their findings after examining chemical recycling. The implications of this technology are intricate, and the technology is still evolving. However, the early evidence is that chemical recycling still requires immense energy, generating large amounts of planet-warming CO2. At the same time, it does not significantly reduce the volume of plastic toxins.
“Chemical recycling is an umbrella term that captures a suite of disparate technologies,” said Dr. Anja Brandon, Associate Director of U.S. Plastics Policy at the Ocean Conservancy. She suggested that fossil fuel and plastic companies fudge these terms to confuse consumers and policymakers. “These terms are constantly changing. It’s ‘chemical recycling,’ ‘advanced recycling,’ ‘molecular recycling,’ and ‘renewable technologies.’ Different companies all use different terms.”
One clear message from the event was the importance of reducing the use of plastic. As much as 40% of plastic becomes single-use packaging, which accounts for much of the plastic pollution in the oceans and landfills.
“Recycling mitigates the harm of waste and extraction, but not as much, of course, as reuse and certainly reduction is our primary strategy,” said Lynn Hoffman, Co-President of Eureka Recycling in Minneapolis and National Coordinator for the Alliance for Mission-Based Recyclers.
Hoffman noted that mechanical recycling is not without its environmental flaws but suggests that most plastics, especially single-use plastic packaging, are not recycled because of the broken economics of today’s system. It’s often cheaper to use virgin plastic because of the complexity and cost of sorting and processing plastic.
Chemical recycling describes several technologies for breaking down or separating plastics, categorized into three main types: purification, depolymerization, and conversion. Depolymerization and conversion break the polymers in plastic down toward their underlying monomers, the hydrocarbon building blocks extracted from petroleum, while purification leaves the polymer intact and strips out everything else.
Purification involves using solvents or heat to dissolve plastic, separating it from additives and impurities without altering its molecular structure. The recovered plastic can then be reprocessed into new materials. This method requires very clean post-consumer plastic, which must be carefully sorted and cleaned before processing, and it works only with specific types of plastic.
Depolymerization breaks plastic down into its constituent monomers using solvents, heat, catalysts, or enzymes. Like purification, it requires pure feedstocks, and there are limits on the types of plastics that can be run through the process.
Two older conversion technologies, pyrolysis and gasification, are being repurposed and rebranded under the chemical recycling banner. Both rely on extreme heat to “crack” the polymers, breaking them down into smaller hydrocarbons. And most of that heat is produced by burning oil, coal, or natural gas.
Pyrolysis relies on intense heat and pressure to break plastic molecules into shorter-chain hydrocarbons, not monomers, resulting in a substance known as pyrolysis oil. This process is energy-intensive, requiring temperatures between 600 and 1,600 degrees Fahrenheit.
Like pyrolysis, but operating at even higher temperatures between 1,000 and 2,000 degrees Fahrenheit, gasification breaks plastic into synthetic natural gas.
Despite its potential to address the plastic pollution crisis, chemical recycling has significant environmental and social impacts, especially concerning carbon emissions. Current data suggests that the carbon footprint of chemical recycling is significantly higher than that of traditional mechanical recycling that grinds and melts plastic for reuse. Because chemical recycling facilities are built near oil refineries and plastic factories, they are usually next to low-income communities of color. The emissions and other toxins these chemical recycling facilities may produce raise critical questions about the technology’s role in a sustainable future.
The Ocean Conservancy panelists said that policy implications also need to be at the forefront of the discussion. The classification and regulation of chemical recycling facilities, whether treated as manufacturing entities or solid waste management facilities, have significant implications for where they are placed and the reviews required for permitting. So far, 24 U.S. states have passed laws reclassifying chemical recycling facilities as manufacturing entities. In other words, those states have chosen to regulate the facilities as factories rather than as waste management operations.
Tracking the success of chemical recycling is a complex task, because the oil and gas produced is not directly comparable to recycled plastic.
The recycling industry has long relied on mass balance accounting, which compares the volume of materials sent to a recycling facility with the output. That works well for mechanical recycling, which ingests plastic and produces plastic that is weighed to estimate the percentage of materials recovered. However, because chemical recycling turns a solid plastic into an oil or gas, there is no easy way to estimate yields. Consequently, petrochemical companies can make audacious claims about the efficacy of chemical recycling that cannot be compared with mechanical recycling.
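The accounting gap described above can be sketched in a few lines. The figures and function names below are hypothetical, invented purely to illustrate why a mechanical recovery rate is a direct measurement while a chemical-recycling "rate" rests on an allocation assumption:

```python
# Hypothetical illustration (not industry data): mass balance is simple when
# plastic goes in and plastic comes out, but ambiguous when oil comes out.

def mechanical_yield(plastic_in_t: float, plastic_out_t: float) -> float:
    """Plastic in, plastic out: the recovery rate is a direct weighed ratio."""
    return plastic_out_t / plastic_in_t

def pyrolysis_claimed_yield(plastic_in_t: float, oil_out_t: float,
                            oil_to_plastic_fraction: float) -> float:
    """Pyrolysis produces oil, not plastic. The claimed 'recycling rate'
    depends on how much of that oil is assumed to become new plastic --
    an accounting choice, not a weighed output."""
    return (oil_out_t * oil_to_plastic_fraction) / plastic_in_t

print(mechanical_yield(100, 85))              # 0.85 -- measurable
print(pyrolysis_claimed_yield(100, 70, 1.0))  # 0.70 if all oil is counted
print(pyrolysis_claimed_yield(100, 70, 0.2))  # 0.14 if 20% becomes plastic
```

The same facility can thus report very different "recycling rates" depending on the allocation fraction chosen, which is why the panelists argued these claims cannot be compared directly with mechanical recycling.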
State laws will play a crucial role in shaping the recycling landscape. They help finance improvements in recycling systems and establish accountability for producers of packaging materials. However, how chemical recycling is defined and regulated by different states varies. California, for example, excludes chemical recycling, treating it as a form of manufacturing rather than recycling. On the other hand, Oregon may treat chemical recycling as a waste management practice but imposes a high burden of proof that the technology is effective before allowing a facility to operate.
The ever-growing issue of textile waste, particularly from fast fashion, has made chemical recycling appear to be an attractive solution. Startups are exploring depolymerization to tackle the challenge of recycling polyester materials. However, as Dr. Brandon notes, the priority should be reducing production and consumption.
Chemical recycling facilities, like all recycling operations, have proven environmental and health impacts. As the technology evolves, it may become cleaner. For example, oil companies have suggested the CO2 emitted by depolymerization and conversion processes may be captured by a scrubber and sequestered permanently. The potential to extract additives and other impurities from mixed plastic waste using purification could produce more food-grade plastics, albeit at the potential cost of more plastic packaging.
But do we need more plastic? That’s a question each of us answers when we shop. By reducing plastic use, society can eliminate a substantial part of the problem.
Chemical recycling’s effectiveness, environmental impacts, and societal implications require careful consideration and transparent discussion. As humanity strives to create a sustainable future, the panelists emphasized that all options must be assessed carefully with a critical eye to ensure that solutions like chemical recycling are responsibly and equitably implemented.
The rapid adoption of zero-emission electric vehicles could move the nation close to an 80% or greater drop in transportation greenhouse gas emissions by 2050 from the 2019 level, according to researchers from the U.S. Department of Energy’s National Renewable Energy Laboratory (NREL).
The researchers came to that conclusion after running thousands of computer simulations on the steps needed to decarbonize passenger and freight travel, which together form the largest source of U.S. greenhouse gas emissions. While they advised that “no single technology, policy, or behavioral change” is enough by itself to reach the target, eliminating tailpipe emissions would be a major factor.
“There are reasons to be optimistic and several remaining areas to explore,” said Chris Hoehne, a mobility systems research scientist at NREL and lead author of a new paper detailing the routes that could be taken. “In the scientific community, there is a lot of agreement around what needs to happen to slash transportation-related greenhouse gas emissions, especially when it comes to electrification. But there is high uncertainty for future transportation emissions and electricity needs, and this unique analysis helps shed light on the conditions that drive these uncertainties.”
The paper, “Exploring decarbonization pathways for USA passenger and freight mobility,” appears in the journal Nature Communications. Hoehne’s co-authors from NREL are Matteo Muratori, Paige Jadun, Brian Bush, Artur Yip, Catherine Ledna, and Laura Vimmerstedt. Two other co-authors are from the U.S. Department of Energy.
While most vehicles today burn fossil fuels, a zero-emission vehicle (ZEV) relies on alternate sources of power, such as batteries or hydrogen. Transportation ranks as the largest source of greenhouse gas emissions in the United States and the fastest-growing source of emissions in other parts of the world.
The researchers analyzed in detail 50 deep decarbonization scenarios, showing that rapid adoption of ZEVs is essential alongside a simultaneous transition to a clean electric grid. Equally important is managing travel demand growth, which would reduce the amount of clean electricity supply needed. The researchers found that the most dynamic variable in reducing total transportation-related emissions is the set of measures supporting the transition to ZEVs.
Using a model called Transportation Energy & Mobility Pathway Options (TEMPO), the researchers performed more than 2,000 simulations to determine what will be needed to decarbonize passenger and freight travel. The study explores changes in technology, behavior, and policies to envision how passenger and freight systems can successfully transition to a sustainable future. Policy changes may require new regulations that drive the adoption of electric vehicles, for example. Technology solutions will call for continued advancements in batteries, fuel cells, and sustainable biofuels, among others. Behavior comes into play in considering shifts in population and travel needs. Someone moving away from an urban core, for example, might have to travel longer distances to work.
“The transportation sector accounts for about a quarter of greenhouse gas emissions in the United States, and about two-thirds of all that is from personal vehicle travel,” Hoehne said.
By employing a combination of strategies, the study shows that the maximum potential for 2050 decarbonization across the simulated scenarios is a staggering 89% reduction in greenhouse gases relative to 2019, equivalent to an 85% reduction from the 2005 baseline.
“Recent progress in technology coupled with the pressing need to address both the climate crisis and air quality issues have elevated the importance of clean transportation solutions,” said Muratori, manager of the Transportation Energy Transition Analysis group and architect of the TEMPO model. “This shift has made transitioning the entire sector towards sustainability an achievable goal and a top priority in the United States and worldwide.”
Funding was provided by the U.S. Department of Energy Office of Energy Efficiency and Renewable Energy Strategic Analysis Team.
Scientists have run the first proof of concept of their DNA ‘time machine’ to shed light on a century of environmental change in a freshwater lake — including warming temperatures and pollution, leading to the potentially irreversible loss of biodiversity.
Their approach, which applies AI to DNA-based biodiversity data, climate variables, and pollution records, could help regulators to protect the planet’s existing biodiversity levels, or even improve them.
Researchers from the University of Birmingham, in collaboration with Goethe University in Frankfurt, used sediment from the bottom of a lake in Denmark to reconstruct a 100-year-old library of biodiversity, chemical pollution, and climate change levels. This lake has a history of well-documented shifts in water quality, making it a perfect natural experiment for testing the biodiversity time machine.
Publishing their findings today (7 Nov) in eLife, the experts reveal that the sediment holds a continuous record of biological and environmental signals that have changed over time — from (semi)pristine environments at the start of the industrial revolution to the present.
The team used environmental DNA — genetic material left behind by plants, animals, and bacteria — to build a picture of the entire freshwater community. Assisted by AI, they analysed the information, in conjunction with climate and pollution data, to identify what could explain the historic loss of species that lived in the lake.
Principal investigator Luisa Orsini, Professor of Evolutionary Systems Biology and Environmental Omics at the University of Birmingham and Fellow of the Alan Turing Institute, explained: “We took a sediment core from the bottom of the lake and used biological data within that sediment like a time machine — looking back in time to build a detailed picture of biodiversity over the last century at yearly resolution. By analysing biological data with climate change data and pollution levels we can identify the factors having the biggest impact on biodiversity.
“Protecting every species without impacting human production is unrealistic, but using AI we can prioritise the conservation of species that deliver ecosystem services. At the same time, we can identify the top pollutants, guiding regulation of chemical compounds with the most adverse effect. These actions can help us not only to preserve the biodiversity we have today, but potentially to improve biodiversity recovery. Biodiversity sustains many ecosystem services that we all benefit from. Protecting biodiversity means protecting these services.”
The researchers found that pollutants such as insecticides and fungicides, alongside a 1.2-1.5-degree rise in minimum temperature, caused the most damage to biodiversity levels.
However, the DNA present in the sediment also showed that over the last 20 years the lake had begun to recover, as water quality improved when agricultural land use declined in the surrounding area. Yet although overall biodiversity increased, the communities were not the same as in the (semi)pristine phase. This is concerning because different species deliver different ecosystem services, so the failure of some species to return to a site can prevent the reinstatement of specific services.
Niamh Eastwood, lead author and PhD student at the University of Birmingham said: “The biodiversity loss caused by this pollution and the warming water temperature is potentially irreversible. The species found in the lake 100 years ago that have been lost will not all be able to return. It is not possible to restore the lake to its original pristine state, even though the lake is recovering. This research shows that if we fail to protect biodiversity, much of it could be lost forever.”
Dr Jiarui Zhou, co-lead author and Assistant Professor in Environmental Bioinformatics at the University of Birmingham, said: “Learning from the past, our holistic models can help us to predict the likely loss of biodiversity under a ‘business as usual’ and other pollution scenarios. We have demonstrated the value of AI-based approaches for understanding historic drivers of biodiversity loss. As new data becomes available, more sophisticated AI models can be used to further improve our predictions of the causes of biodiversity loss.”
Next, the researchers are expanding their initial single-lake study to lakes in England and Wales. This new study will help them understand how replicable the patterns they observed are and, therefore, how far their findings on the effects of pollution and climate change on lake biodiversity can be generalised.
News of the likely closure of the UK’s steel blast furnaces has prompted calls for the government to reconsider approval for a controversial Cumbrian coalmine that had been planned to supply the industry.
On Monday, British Steel announced that it plans to replace its two blast furnaces at Scunthorpe, while Tata Steel is considering closing its two at Port Talbot, in a dramatic reshaping of the UK steel industry. Both companies will instead rely on much cleaner electric arc furnaces, which use 87 times less coal.
West Cumbria Mining plans to produce 2.8m tonnes of coking coal a year at Woodhouse colliery in Whitehaven for use by “steelmakers in the UK and EU”. However, the blast furnace closures would mean a dramatic reduction in coal use by the UK industry, and would probably mean that the vast majority of the Cumbrian coal would have to be exported.
Tim Farron, the MP for Westmorland and Lonsdale in Cumbria and former leader of the Liberal Democrats, said the announcement from British Steel “means that any economic case for a new coalmine in Cumbria is now completely dead in the water”.
He said: “We need to see the government wake up to the fact that the steel industry is now going full steam ahead to decarbonise steel and start to invest in long-term renewable jobs for the future.”
Michael Gove, the government minister in charge of planning, approved the UK’s first new coalmine in 30 years last December, despite criticism from former Conservative ministers including Alok Sharma and Lord Deben.
Electric arc furnaces require only 9kg of coking coal per tonne of steel, against 780kg per tonne of blast furnace steel, according to the lobby group UK Steel.
British blast furnaces produced 4.8m tonnes of steel in 2022, suggesting they may have used 3.7m tonnes of coking coal. Based on UK Steel’s figures, producing the same amount of steel in electric arc furnaces would require only 43,000 tonnes of coal, or about 1.7% of the Cumbrian mine’s output.
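A back-of-the-envelope check of the figures quoted above, using UK Steel's intensities, confirms the article's arithmetic (the percentage share comes out near 1.5%, in the same ballpark as "about 1.7%"; the difference is rounding in the inputs):

```python
# Sanity-checking the article's figures with UK Steel's coal intensities.
BLAST_FURNACE_KG_PER_T = 780   # kg coking coal per tonne of blast furnace steel
ARC_FURNACE_KG_PER_T = 9       # kg coking coal per tonne of arc furnace steel
UK_STEEL_2022_T = 4.8e6        # UK blast furnace steel output in 2022, tonnes
MINE_OUTPUT_T = 2.8e6          # planned Cumbrian coking coal output, tonnes/year

ratio = BLAST_FURNACE_KG_PER_T / ARC_FURNACE_KG_PER_T           # ~87x less coal
blast_coal_t = UK_STEEL_2022_T * BLAST_FURNACE_KG_PER_T / 1000  # ~3.7m tonnes
arc_coal_t = UK_STEEL_2022_T * ARC_FURNACE_KG_PER_T / 1000      # ~43,000 tonnes
mine_share = arc_coal_t / MINE_OUTPUT_T                         # ~1.5% of output

print(f"{ratio:.0f}x less coal; {blast_coal_t:,.0f} t vs {arc_coal_t:,.0f} t; "
      f"{mine_share:.1%} of mine output")
```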
The UK government’s decision notice last December approving the coalmine made reference to electric arc furnaces and other low-coal technologies, but said that there was “no certainty that electric arc furnaces will make a significant contribution to UK steel production”.
Tony Bosworth, coal campaigner at Friends of the Earth, said: “Michael Gove’s justification for approving the mine last December was largely that the steel industry would need coking coal for decades to come.
“But it now seems the UK market will soon disappear. This follows similar signals from EU steelmakers who have already announced they’re moving to greener production methods.
“This is all before construction on the mine has even begun, with the promised local jobs looking increasingly shaky in the medium term.”
However, the mine still has the support of some Conservative MPs who argue the UK would benefit from 500 local jobs and would not have to import coal.
Mark Jenkinson, the Conservative MP for Workington in west Cumbria and a former apprentice with British Steel, said he still 100% supported the new mine because of the continued need for coking coal in electric arc furnaces, and the desire to avoid emissions associated with transporting it into the UK.
“They do use a lot less [coal], as they’re not using it for its thermal properties,” he said. “That would not be a good excuse to ship it from halfway across the world with the incumbent emissions.”
West Cumbria Mining did not respond to a request for comment.