The science of climate change is clear, so shouldn’t passing new climate policy be simple? The folks at Climate Interactive have a way for you to decide for yourself: the En-ROADS (Energy Rapid Overview and Decision-Support) climate solutions simulator. According to their website, En-ROADS is a “global climate simulator that allows users to explore the impact that dozens of policies — such as electrifying transport, pricing carbon, and improving agricultural practices — have on hundreds of factors like energy prices, temperature, air quality, and sea level rise.” Practically speaking, it’s an easy and intuitive way to look at future climate scenarios based on the latest climate science.
In October, I introduced En-ROADS to the Princeton Energy and Climate Scholars (PECS), a diverse group of graduate students interested in climate issues from departments all across the university (politics, chemistry, engineering, geosciences, atmospheric and oceanic sciences, ecology and evolutionary biology, and more). I have led these En-ROADS events for the past three years and was excited to share the simulator with a group of Princeton students who were deeply invested in climate mitigation. For this event, I led a role-playing game in which players act as climate stakeholders in a simulated emergency climate summit organized by the United Nations (UN). Acting as the UN secretary-general, I called on the group to propose a climate action plan that would keep total global warming under 2 degrees Celsius by the year 2100. Over the course of three rounds of play, the PECS attendees, role-playing interest groups like “Agriculture, forestry and land use” or “Clean Tech”, suggested policies and mitigation strategies to reduce net greenhouse gas emissions that could be implemented in the En-ROADS simulator.
We started the first round with the baseline scenario in En-ROADS, which results in 3.3°C of warming by the year 2100 (Figure 1). This baseline assumes that social and technological progress continues at the current rate, without additional climate policies or action, similar to the “Current Policies Scenario” used in Integrated Assessment Models and IPCC reports.
In the first round, each stakeholder group unilaterally proposed one mitigation strategy that aligned with its own priorities, but the groups quickly found issues with this approach. One group proposed subsidizing renewable energy sources and another subsidizing nuclear energy, both with the hope of reducing fossil fuel emissions from the energy sector. Adding a $0.03/kWh subsidy to renewable energy expanded the energy market share of renewables, thus reducing the more fossil-fuel-intensive sources like coal and oil. This change reduced the projected 2100 warming by a modest tenth of a degree, to 3.2°C. However, adding a subsidy for nuclear not only reduced fossil-fuel-intensive energy sources as intended, but also ate into the market share of renewable energy. This is because nuclear and renewable energy compete with each other as much as with coal, oil, and gas (Figure 1). As a result, adding the nuclear subsidy on top of the renewable energy subsidy had a negligible effect, keeping projected 2100 warming at 3.2°C. This surprising result demonstrated a need for more synergistic mitigation strategies, fueling negotiations between stakeholder groups in the next round.
Round two saw periods of intense debate between stakeholder groups, as individual actors formed coalitions, with two blocs emerging. One coalition, led by “Conventional Energy” and including “Clean Tech”, “Industry and Commerce”, and the “Developed Nations”, proposed investment in technological carbon removal, or carbon capture, to offset emissions. This would prop up the “Industry and Commerce” and “Clean Tech” sectors by increasing funding for research and development without negatively impacting the status quo of “Conventional Energy” and the “Developed Nations”. Alternatively, the second coalition, led by the “Climate Justice Hawks” in coordination with “Agriculture, forestry and land use” and the “Developing Nations”, focused on reducing methane, nitrous oxide, and other greenhouse gas emissions that were more relevant to their sectors. This two-pronged approach resulted in a total 2100 temperature increase of 2.7°C, a significant improvement from the 3.3°C starting baseline, but short of the 2°C goal set out at the beginning of the game.
By the start of the third round, no one had proposed any mitigation strategies that would directly reduce CO2 emissions from fossil fuel burning. This was, in part, to accommodate the strong personalities of the “Conventional Energy” team, who had aggressively lobbied for anything else. However, in the third round, the PECS role-players were encouraged to open the En-ROADS simulator themselves and quickly discovered the power of a substantial carbon tax.
“We just need the carbon tax to be $100 [per ton CO2]” said one member of the “Agriculture, forestry and land use” team.
“Who’s going to pay the carbon tax?” asked a member of “Conventional Energy”.
The conversation quickly escalated as the groups sparred over which mitigation strategies should be implemented in the final round. In short order, the two blocs became one, with every PECS member gathered around a single table, advocating for their stakeholder interests. Ultimately, with a compromise on the carbon tax, increased energy efficiency, a tax on oil, investment in carbon removal technologies and decreased deforestation, the group argued and laughed their way to the targeted 2°C of warming by 2100.
The event was a great success, and the players walked away with a new appreciation for the various perspectives at play when discussing our climate future. “We need to invest more in research; we need to put money into new technologies,” said one student who thought that negative emissions were a powerful factor in their final scenario. Another student said, “This really validates the ‘all of the above’ energy policy of the Obama era,” noting that there are still paths to the 2°C threshold that do not slash the fossil fuel industry. In the long run, the PECS members all agreed that the solution was complex, requiring cooperation across sectors and a variety of mitigation strategies, with no “one-size-fits-all” answer. The En-ROADS Climate Action Simulation worked; it facilitated an informative, engaging, and fun discussion of climate change mitigation and policy.
The role-playing game is my favorite way to introduce the En-ROADS simulator, but Climate Interactive also offers other ways of engaging. If you’re interested in bringing En-ROADS to your classroom, group, or workshop, check out https://www.climateinteractive.org/en-roads/ or reach out to Abigale Wyatt in the Geosciences department for more information.
Abigale Wyatt is a PhD Candidate in Geosciences working with Prof. Laure Resplandy to study how ocean physics and climate variability impact surface ocean biogeochemistry. After completing her degree, Abigale will join the team at [C]Worthy, an initiative investigating the efficacy of marine-based carbon dioxide removal, as a postdoctoral research scientist.
Written by Shashank Anand, Hezekiah Grayer II, Anna Jacobson, and Harrison Watson
Sustainability is the notion that we should consume with caution, as the Earth is a delicately balanced ecosystem with limited natural resources. Social justice aims to eliminate disparities and inequities between demographic groups, including inequalities between persons of different socioeconomic status, race, gender, and sexual orientation. Environmental justice (EJ) intersects both of these movements: EJ is the notion that the ecological burdens of society should be shared equitably across communities. Historical trends suggest that as we expand, consume, pollute, and produce, the benefits and costs of industrialization are inequitably distributed. This inequality comes at the cost of poor health for those living in highly polluted areas. The inequitable distribution of pollutants has recently brought EJ to the center of political discourse due to its correlation with increased COVID-19 mortality and racially skewed disease outcomes.
Unfair treatment of workers at farms and manufacturing plants is a prime example of an injustice that ethical spending can aim to rectify. The misuse of pesticides, low worker wages, poor living conditions for farmers, and child labor are all sources of social and environmental injustices in food production. Socially conscious purchasing could be key in fighting these injustices. Academic institutions, which often purchase food en masse to serve thousands of individuals, have a sizable impact on humanity’s social and environmental footprint. Institutions like Princeton thus have a practical interest in reducing their footprint and a deontological obligation to mitigate their negative societal impact.
In general, it is difficult to assess the relative social and EJ impact of discrete products due to the inherently unquantifiable nature of justice. Certifications like Fair Trade and the Rainforest Alliance attempt to assuage buyers’ concerns by identifying and establishing environmentally just organizations. Certifications like USDA Organic and the Non-GMO Project endorse products and operations from an environmental sustainability standpoint.
CASE STUDIES
Rainforest Alliance (RA) is an international NGO that provides certifications in sustainable agriculture, forestry, and tourism. RA seeks to “protect forests… and forest communities.” For farmers, the certification process involves site audits that check for compliance with the Rainforest Alliance Standards for Sustainable Agriculture. The Standards include child labor protections and protection of workers against the harmful pesticides listed in the Sustainable Agriculture Network Prohibited Pesticide List. RA Standards address economic and gender disparities on farms through an “assess-and-address” approach: farms are responsible for setting goals that mitigate the effects of “child labor, forced labor, discrimination, and workplace harassment and violence”. RA Standards also require implementation of a “salary matrix tool” for collecting comprehensive wage data and identifying wage gaps.
Support from RA has historically proven impactful, most notably on certified cocoa farms in Côte d’Ivoire, the world’s largest cocoa producer. A 2011 survey conducted by the Committee on Sustainability Assessment analyzed the impact of RA on the economic, environmental, and social dynamics of these cocoa farms. Compared to uncertified farms, RA certification was shown to increase school attendance (measured as the percentage of children who have completed the appropriate number of grades for their age) by 392%, thereby reducing child labor; increase crop yields by 172%; and improve farm income by 356% (see figures 4, 6, and 10 at this link). Despite these documented successes, there has been a history of exploitation of previous Standards on certified farms. In 2019, for example, pineapple farms in Costa Rica were cited for employing undocumented workers and using illegal agrochemicals despite RA restrictions.
Fair Trade USA (FTU) is a certification that, much like RA, focuses on social and EJ. FTU’s ideals center on democratic and fair working conditions for workers. Toward these ends, FTU employs an Impact Management System (IMS), which is used to assess the social and economic impact of growers’ practices. FTU is distinct from its well-known former parent organization, Fairtrade International: the two split in 2012 over a dispute about the company size of certified growers.
FTU implements a price floor: if the market value of a product falls, FTU products retain a minimum price on store shelves, ensuring workers earn at least some minimum income. FTU also requires a small additional fee, the “Fair Trade Premium”, on top of the purchase price of the product. The premium is used to improve local infrastructure for the producers, and how it is spent is decided democratically by workers at the farm. In a poor economy, Fair Trade products are likely to be pricier than their uncertified counterparts; in a thriving economy with high demand, this difference will be negligible (see figure 1 at this link). A 2009 case study of coffee production in Nicaragua found that many Fair Trade coffee producers still had trouble finding places to sell their coffee, and in times of high coffee prices, producers found that they reaped little financial benefit from the Fair Trade label.
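To see how these two mechanisms interact, consider a minimal sketch in Python; the floor price and premium below are hypothetical placeholders, since the actual values vary by product and are set by FTU.

```python
# Hypothetical illustration of the Fair Trade floor price and premium.
# Real floor prices and premiums vary by product; these numbers are invented.

FLOOR_PRICE = 1.40         # $/lb, minimum paid to producers when markets fall
FAIR_TRADE_PREMIUM = 0.20  # $/lb, extra fee spent democratically by workers

def fair_trade_price(market_price):
    # Producers receive at least the floor price, plus the premium on top.
    return max(market_price, FLOOR_PRICE) + FAIR_TRADE_PREMIUM

# Weak market: certified coffee costs 1.60 vs. 1.00 uncertified (a large gap).
print(fair_trade_price(1.00))
# Strong market: certified coffee costs 2.40 vs. 2.20 uncertified (small gap).
print(fair_trade_price(2.20))
```

This toy calculation mirrors the pattern above: the certified-versus-uncertified price gap is large in a weak market and nearly vanishes in a strong one.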
The Non-GMO Project (NGP) certifies distributors and farms whose procedures align with “standards consumers expect.” Certification is obtained after evaluating the presence of genetically modified organisms (GMOs) in produced foods. GMO crops are often bred to be more resistant to drought or pests, which may lead them to outcompete local crops and flora. Combined with the potentially unknown behavior of these nonnative crop variants and the risk of gene flow, e.g. through cross-pollination, many communities want to keep excessive GMO cultivation out of their neighborhoods. NGP upholds the long-standing Non-GMO Standard, which outlines requirements for companies looking to sport the butterfly label. These standards necessitate greater care in cleaning and transferring products between storage facilities (termed “elevators”), as well as increased investment in process monitoring to account for the potential introduction of GMOs along the production process. NGP partners with third-party certification bodies (also known as technical administrators) that audit businesses and farms for compliance with all Non-GMO Standards. Application fees, as well as Non-GMO product premiums, contribute to the conservation of environmental health through the protection of genetic diversity in organic agriculture.
USDA Organic was created by the Organic Foods Production Act (OFPA) of 1990, which mandated that the USDA develop federal-level regulations for organic food in the US. The certification took effect in 2002, after 10 years of public debate, as a compulsory program requiring producers and handlers with annual organic sales greater than $5,000 to discontinue the use of prohibited substances. To insulate policy from special interest groups, OFPA also instituted the National Organic Standards Board (NOSB), a panel of 15 volunteers representing consumers, organic farmers/handlers, retailers, scientists, and environmental conservationists. A two-thirds majority of the NOSB is required to add a material to the National List of Allowed and Prohibited Substances (NLAPS). Third-party certifying agents certify a product as organic after confirming that the producer or handler has not used prohibited substances for three years.
USDA Organic and a growing market for organic produce have resulted in high product premiums. Unfortunately, a booming market does not guarantee good wages, living standards, or fair treatment for farm labor. There are recorded cases in which working conditions worsened due to the heavy work and time demands of organic farming. Some new programs build on USDA Organic’s structure with additional standards for animal welfare and worker fairness; Regenerative Organic Certification (ROC) is one example. It is too early to determine whether these certification programs will be successful or will earn the trust of the market.
DISCUSSION
Consumer activism flourishes with effective metrics on desired qualities (e.g., EJ) to inform conscientious purchasing. Certification efficacy for social and EJ depends on two main questions. On a policy level, how relevant are the certifications’ guidelines to the social and EJ movement? In practice, how successfully are the rules enforced: are audits thorough, unbiased, and based on clear criteria? These questions help us establish whether certifications actually impact procedure at the farm level. Certifications lacking in the first quality risk being irrelevant to social and EJ, while certifications lacking in the second risk being inconsequential.
The missions of certifications like RA and FTU, to enable sustainable livelihoods for farmworkers and promote environmental stewardship, are in line with core tenets of social and EJ. However, the auditing processes of these certifications have demonstrated weaknesses, as seen on the RA-certified pineapple farms in Costa Rica noted above. Furthermore, the guidelines for these certifications may be poorly communicated to farm workers, as shown by a study from Valkila and Nygren on Nicaraguan Fair Trade-certified coffee farms.
USDA Organic and NGP are more closely aligned with environmental sustainability than with social or EJ, yet they have more streamlined auditing processes because sustainability can be more directly quantified (e.g., unit volume of water usage). USDA Organic, for example, strictly regulates pesticides and herbicides, thus protecting farm workers’ health. Chemicals prohibited under NLAPS include methyl bromide, sulfuryl fluoride, and phosphine (aluminum phosphide or magnesium phosphide), exposure to which can affect fetal development and lead to irreversible damage. NGP, on the other hand, does not regulate chemical substances; on the contrary, by excluding GM crops, the products it promotes forgo the health benefits associated with the reduced pesticide use of pest-resistant GM varieties. In general, many larger social justice themes (minimum wage, underage labor, unfair working conditions) are not addressed by these sustainability certifications.
The cost of buy-in is one major obstacle for smaller distributors. For example, the harvest processes for GMO and non-GMO crops must be separated to prevent contamination, leading to more labor for farmworkers. Investigations check for the use of USDA Organic’s prohibited substances for three years leading up to product harvest, a waiting period that may prove prohibitive to some smaller farms. These smaller farms may not be able to afford the fees of the certification process, or the costs of regulations/liability insurance as required by schools’ procurement offices. Interviews with local players in food distribution, however, alleviated these concerns: Ms. Linda Recine of Princeton Dining Services confirmed that many small farms have difficulty affording the certification label, but asserted that a network of farmers, larger distributors, and university support systems helps small businesses obtain necessary certifications and build a sustainable customer base. She cited a pilot conference hosted by the Princeton University Department of Finance and Treasury and Princeton University Central Procurement. This conference focused on woman-, veteran-, and minority-owned businesses; through the conference, Princeton offered to subsidize the first year of various certifications at no cost to the vendor. Outside help also proves paramount for obtaining expensive liability insurance: Ms. Recine says that many small farms may be able to get their goods onto campus by partnering with larger distributors. Jim Kinsel of Honeybrook Organic Farm stated that open communication with customers about the certification waiting period usually assuages their concerns about uncertified crops.
The cost of buy-in means that many certifiable farms may lack a formal label. Additionally, if farms pursuing certification already employ environmentally just practices before they apply for the label, we may see selection biases that interfere with our ability to assess certification efficacy objectively. A recent meta-study confirmed that many reports investigating the efficacy of certifications did not control for possible selection bias.
With certifications alone, we are left with an incomplete picture of ethical consumption. If EJ certifications rely on vague self-improvement, sustainability certifications are not as justice-relevant, and all certifications are audited by third parties whose reliability is hard to ascertain, is a certification stamp on a unit of packaging truly enough to assert that a product was ethically produced? The ethical consumer is caught between a rock and a hard place; incomplete information makes it impossible to gauge EJ using certification labels alone. We will need additional information from producers to rely more comfortably on the value of consumer certifications.
The solution to these concerns may lie in local purchasing. Sarah Bavuso and Linda Recine of Princeton Dining Services emphasized the importance of forming relationships with producers, citing the value of allowing farmers to see the campus and of university officials taking trips to farms and production sites. This relationship allows Princeton to be more hands-on with its food and to interfere when questions of ethics arise. Indeed, a 2007 study suggests that forming relationships with local farms decreases the distance that products travel, allows for cooperative relationships with individual farmers, and introduces flexibility in verification processes.
Decreasing food-miles through local purchasing may be a critical component of both sustainability and EJ: as food travels and the supply chain lengthens, more middlemen get involved, and there are more opportunities for injustices and unsustainable practices. Each border that food passes through serves as another regulatory vulnerability for the introduction of harmful pesticides and food contamination. At each stop on the road, food loses freshness, and transit emits greenhouse gases (GHGs) through the burning of fossil fuels. Additionally, laws and regulations are more easily ascertained locally: consumers are more likely to know the minimum wage and the regulations on working conditions for farms near their own homes.
Local farms may also be smaller and more sustainable than larger national chains. Mr. Kinsel claims that larger farms are more likely to cut corners in the name of profit. While Ms. Recine confirms that larger producers may be less inclined to act ethically, she states that these farms have “come a long way” towards humane and ethical behavior, largely thanks to students and universities vocally lobbying for causes that were important to them. Purchasing certified food that is also locally grown may address many of the concerns introduced by the information gap mentioned above.
CONCLUSION
Rather than relying entirely on certifications like USDA Organic, a supply chain can be created in which the university shares the risk of crop production under unpredictable hydroclimatic conditions with the local farming community. One realization of a more local supply chain is Community Supported Agriculture (CSA), in which schools purchase memberships for a season and receive fixed volumes of freshly harvested produce from local farms. Students receive fresh, nutritious food from farms that abide by local regulations. Farmers get subscription money upfront, allowing them to expand and invest early. Schools build working relationships with constituent farms and their management, creating a point person on the farm grounds who can verify safe conditions for farmers. Many local farms in the Princeton area (like the Snapping Turtle Farm and the Cherry Grove Organic Farm) already have some of the same certifications as larger factory farms.
A CSA supply chain would fit neatly into many residential colleges for small portions like salads, boiled eggs, and meats. Non-perishable products like crackers and cereals could still be purchased from larger certified producers. In this supply chain, certifications are relied upon for goods that are difficult to buy from local producers, while the CSA program for fruits, vegetables, and meats enhances the local economy around the university. There are, of course, logistical questions to be resolved: a supply chain where crop proportions are not predetermined is quite different from the institutional status quo. The feasibility of such a supply chain will likely need to be vetted through a pilot program or a case study of other institutions implementing similar programs. CSAs have been implemented on some scale at schools like the University of Kentucky, Rutgers University, and the New Jersey Institute of Technology. We suggest schools start small, by implementing a CSA supply chain in an on-campus cafe or residential college. The program can be scaled up over time, after feasibility studies and conversations with local farmers.
The feasibility of establishing a local supply chain will depend on how universities currently source their food. Ms. Bavuso indicated that many schools fall into one of two classes: self-operated schools, whose food procurement departments are university-run and in-house, and non-self-operated schools, whose food procurement is outsourced via contracts. Many schools employ some combination of these operations, with state schools being particularly strictly regulated via contracts (Aramark, University of Delaware; Sodexo, The College of New Jersey). Self-operated schools like Princeton will likely have more flexibility in vetting and choosing vendors. Non-self-operated schools aiming for social change will likely have to lobby distributors through the schools’ purchasing power or threaten to withdraw their business if practices are not improved. Not all schools will have the means to investigate each food product on their shelves; it will likely be useful to leverage an inter-school consortium for food procurement research (see the National Association of College & University Food Services), allowing procurement departments across institutions to swap findings and relevant research.
The authors of this article do not wish to claim that certifications are entirely ineffective in gauging the social and environmental justice of food procurement. But certifications are not a panacea for ethical supply chains. Universities relying solely on these certifications to assess food safety and social and EJ are not doing due diligence when it comes to ethical spending. It may take additional effort to switch to a CSA-style supply chain like the one suggested above; but if institutions are serious about the values they promote in their dining services brochures, this added effort will be well worth the improvement in the quality and justice of campus food.
Princeton’s president Christopher Eisgruber wrote in June of 2020: “As a University, we must examine all aspects of this institution — from our scholarly work to our daily operations — with a critical eye and a bias toward action. This will be an ongoing process, one that depends on concrete and reasoned steps[.]” The authors of this article believe that a CSA pilot program would be one such concrete step toward action, a step directly in line with the larger themes of environmental and social justice that have become more pronounced in the societal collective consciousness in recent years. At the very least, it is the duty of university procurement departments to state the steps they intend to take to address inequity. Princeton’s recent Supplier Diversity Plan is one example of such an effort, in that it aims to support more diverse-owned businesses. As entities with large economic impacts, universities do have the power to effect real societal change.
Shashank Anand: I am a Ph.D. Candidate in the Department of Civil and Environmental Engineering, working with Prof. Amilcare Porporato. My research focuses on understanding the role of ecohydrological and geomorphological processes in the evolving landscape topography by analyzing process-based models and learning from the available observations.
Hezekiah Grayer II: I am a 2nd year PhD candidate in the Program in Applied and Computational Mathematics, where I am fortunate to be advised by Prof. Peter Constantin. My academic interests lie at the intersection of fluid mechanics, plasma physics, and partial differential equations.
Anna Jacobson: I am a 3rd year PhD candidate in the department of Quantitative and Computational Biology. I am affiliated with the Andlinger Center for Energy and the Environment and the High Meadows Environmental Institute. For my thesis work, I study energy systems and environmental policy.
Harrison Watson: I am a Ph.D. Candidate in the Department of Ecology and Evolutionary Biology working with Professors Lars Hedin, Rob Pringle, and Corina Tarnita. My work currently focuses on clarifying the forces that influence land carbon cycles, using eastern and southern African savannas as a study system.
If you’re reading this, you probably don’t need to be persuaded that the planet is on fire, and we need to do something to put it out fast. We see evidence all around us: California is again in the throes of a record wildfire season, glaciers the size of Manhattan are sliding into the sea, and in some of the most densely populated parts of the world, massive cities are being swallowed by the tide. There is little dispute that these disasters stem from our burning of fossil fuels, and that by most any measure, we are failing to prevent the worst.
Meanwhile, in balmy Princeton, New Jersey, the university’s Carbon Mitigation Initiative (CMI) and Andlinger Center for Energy and the Environment have signed splashy agreements with BP and Exxon (respectively) to fund research into renewable fuels, carbon capture and storage, and other climate innovations. Since 2000, these companies have pumped over $30 million into CMI and the Andlinger Center, with the latter recently extending its Exxon contract for another five years.
To put it politely, we of Divest Princeton say these partnerships do more harm than good. True, they may create new and valuable knowledge, but that isn’t really why they exist. In one leaked exchange from 1998, Exxon representatives strategized about the need to “identify and establish cooperative relationships with all major scientists whose research in the field supports our position,” and to “monitor and serve as an early warning system for scientific development with the potential to impact on the climate science debate, pro and con.”
Taking this statement literally — and why shouldn’t we? — BP and Exxon’s support for Princeton is more than simple altruism. It’s more than good PR. Rather, it’s part of a years-long effort not to aid, but to manage climate research toward ends not in conflict with their extractive business model. Tellingly, these do-gooder oil companies plan to increase production 35% by 2030. This would be cataclysmic.
Their schemes are made possible by funding and power gifted by Princeton. We cannot tolerate, let alone enable, these activities any longer. Not when they pose such obvious conflicts with our university’s core values and threaten our fellow students and faculty working around the world. Princeton must stand up for itself. How better than by divesting from fossil fuels?
The divestment movement has grown rapidly in recent years, with institutions like Georgetown University, Brown, Cornell, and Oxford recently joining its ranks. Collective actions have taken a toll: Goldman Sachs says that divestment is partly to blame for widespread credit de-ratings in the coal industry, and Shell is on record saying divestment could present “a material adverse effect on the price of our securities and our ability to access equity capital markets.” Essentially, divestment works.
We argue that the moral imperative of divestment should be compelling enough on its own; if Princeton moved to divest and the markets didn’t budge an inch, at least then our conscience would be clean. At least then we could call ourselves “sustainable” with a straight face and live honestly by our motto: “in the nation’s service, and the service of humanity.”
Detractors maintain that any “demands” on Princeton’s endowment would constrain its ability to earn huge returns, depriving students of the financial support they need to prosper. This is absurd. Billion-dollar endowments like the Rockefeller Brothers Fund have demonstrated that divestment can be a net positive, and fossil fuel stocks have been declining for years. It looks increasingly clear that an investor gains little by “diversifying” into fossil fuels, and that the risks of divestment have been well overblown. Shareholders — especially shareholders with a fiduciary responsibility like Princeton’s — should be looking for the exit.
In order to remain within 1.5°C of global warming by mid-century — the threshold beyond which the IPCC and Princeton’s own Sustainability Action Plan say “catastrophic consequences” will be unavoidable — the fossil fuel industry’s ambitious exploration and development plans will need to be mothballed. Undrilled oil fields and unmined coal will become stranded assets, dead weight on their companies’ books. To have faith in these investments, Princeton must believe those stranded assets will actually be put to use, in which case Princeton ignores its own scientists and legitimizes the activities central to our climate crisis.
Others have argued that regardless of donors’ ulterior motives, divesting would only leave good money and research on the table. To these people, the “greenwashing” corporations seek from partnering with elite institutions is both inevitable and of little consequence compared to the novel scholarship their funding provides. The catch here is that quality research and a morally invested endowment are not mutually exclusive. There isn’t a rule saying our research must be funded by BP or Exxon — if Princeton truly valued this knowledge, it would channel its creative energies toward finding funding elsewhere.
“Elsewhere” could very easily be the university’s own wallet. Princeton is quick to remind us it holds the biggest per-student endowment in the country. The endowment today is a bit larger than $26 billion, roughly the size of Iceland’s GDP and larger than the GDPs of half the world’s countries. In the last ten years alone, Princeton’s endowment has more than doubled. In this light, the money needed to sustain current research is practically a rounding error. If just a few Trustees put their donations together, they could recoup Exxon’s latest $5 million donation in under five seconds!
We tried to anticipate these doubts in our divestment proposal, which was given to Princeton’s administration last February. Since then, we have met with Princeton’s Resources Committee and invited experts — former Committee Member Shannon Osaka, President of the Rockefeller Brothers Fund Stephen Heintz, and Stanford researcher Dr. Ben Franta — to help present our case. Discussions will continue through the end of 2020, culminating in a forum with 350.org’s Bill McKibben in November.
As a reward for our persistence, the Resources Committee has indicated it might decide on our proposal by Christmas. If it approves, the proposal goes to the Board of Trustees, and the clock starts over. This, dear readers, is the “fast track.”
It has been demoralizing to watch Princeton, one of the world’s great centers of higher learning and a temple to empirical evidence, run interference for companies that have scorned the truth, knowingly endangered billions, and literally confessed to their ill intent. From its byzantine system for proposing divestments to its arbitrary requirement saying divestment must take the form of complete dissociation (a prohibitively high bar), Princeton’s strategy is to frustrate and outlast causes like ours. Most of the time, it succeeds.
But our cause is different from the others. With climate change, waiting is simply not an option. The immovable object will meet an unstoppable force, and the unstoppable force will win.
The longer we delay, the longer we allow fossil fuel companies to weaponize Princeton’s gravitas, spreading disinformation and quack science while purporting to be part of “the solution.” Until Princeton inevitably divests from these bad actors, we will continue to withhold our donations, continue to protest, and continue to organize, fighting fire with fire.
Divest Princeton is a volunteer movement of Princeton students, alumni, parents, faculty, and staff. Sign their “No Donations Until Divestment” petition and learn more here.
The rising popularity and falling capital costs of renewable energy make its integration into the electricity system appear inevitable. However, major challenges remain. In part one of our ‘integrating renewable energy’ series, we introduced key concepts of the physical electricity system and some of the physical challenges of integrating variable renewable energy. In this second installment, we introduce how electricity markets function and the policies relevant to renewable energy development.
Modern electricity markets were first mandated by the Federal Energy Regulatory Commission (FERC) in the United States at the turn of the millennium to allow market forces to drive down the price of electricity. Until then, most electricity systems were managed by regulated, vertically-integrated utilities. Today, these markets serve two-thirds of the country’s electricity demand (Figure 1), and the price of wholesale electricity in these regions is historically low due to cheap natural gas and subsidized renewable energy deployment.
The primary objective of electricity markets is to provide reliable electricity at least cost to consumers. This objective can be broken down into several sub-objectives. The first is short-run efficiency: making the best use of existing electricity infrastructure. The second is long-run efficiency: ensuring that the market provides the proper incentives for investment in electricity system infrastructure so that future electricity demand can be satisfied. Other objectives are fairness, transparency, and simplicity. This is no easy task; there is uncertainty in both the supply and demand of electricity, and many physical constraints need to be considered.
While the specific structure of electricity markets varies slightly by region, they all provide a competitive structure in which electricity generators compete to sell their electricity. The governance of these markets involves several actors: the regulator, the board, participant committees, an independent market monitor, and a system operator. FERC is the regulator for all interstate wholesale electricity markets (all except ERCOT in Texas). In addition, reliability standards and regulations are set by the North American Electric Reliability Council (NERC), to which FERC granted authority in 2006. Lastly, markets are operated by independent system operators (ISOs) or regional transmission organizations (RTOs) (Figure 1). In tandem, the regulations set by FERC, NERC, and system operators drive the design of wholesale markets.
Before we get ahead of ourselves, let’s first learn how electricity markets work. A basic electricity market functions as follows: electricity generators (i.e., power plants) submit bids to a centralized market to generate a given amount of electricity. In a perfectly competitive market, the price of these bids is based on an individual power plant’s cost of generating electricity. Generally, costs are grouped by technology and organized along a “supply stack” (Figure 2). Once all bids are placed, the ISO/RTO accepts the cheapest assortment of generation bids that satisfies electricity demand while also meeting physical system and reliability constraints (Figure 2a). The price of the most expensive accepted bid becomes the market-clearing price and sets the price of electricity that all accepted generators receive as compensation (Figure 2a). In reality it is a bit more complicated: ISO/RTOs operate day-ahead, real-time, and ancillary services markets and facilitate forward contract trading to better orchestrate the system and lower physical and financial risks.
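As a concrete illustration of this merit-order clearing, here is a minimal Python sketch; the bids, quantities, and demand level below are invented for illustration and are not drawn from any real market.

```python
# A toy uniform-price market clearing. Bids are (generator, MW, $/MWh);
# all numbers are hypothetical.
bids = [
    ("wind", 400, 0.0),
    ("solar", 300, 0.0),
    ("nuclear", 800, 12.0),
    ("combined_cycle_gas", 900, 30.0),
    ("coal", 600, 38.0),
    ("gas_peaker", 400, 80.0),
]

def clear_market(bids, demand_mw):
    dispatch, supplied, clearing_price = [], 0.0, None
    # Sort bids by price to form the supply stack, then accept the cheapest
    # bids until demand is met.
    for name, qty, price in sorted(bids, key=lambda b: b[2]):
        if supplied >= demand_mw:
            break
        take = min(qty, demand_mw - supplied)  # marginal unit may run partially
        dispatch.append((name, take))
        supplied += take
        clearing_price = price  # most expensive accepted bid sets the price
    return dispatch, clearing_price

dispatch, price = clear_market(bids, demand_mw=2500)
print(dispatch)  # coal is the marginal unit, dispatched at 100 of its 600 MW
print(price)     # every accepted generator is paid the clearing price: $38/MWh
```

Note that every accepted generator, including the zero-bid wind and solar plants, is paid the same clearing price; this is why cheap bids entering the stack can depress settlement prices, a point we return to below.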
Because real electricity markets are not completely efficient and competitive (for a number of reasons), some regions have trouble providing enough incentives for the long-run investment objective. As a result, several ISO/RTOs have designed an additional “capacity market.” In capacity markets, power plants bid for the ability to generate electricity in the future (one to three years ahead). If a generator clears this market, it receives extra compensation for that future capability (regardless of whether it is ever called upon to generate electricity), and it faces financial penalties if it cannot deliver. While experts continue to debate the merits of these secondary capacity markets, some ISO/RTOs argue that they provide the additional financial incentives necessary to ensure a reliable electricity system in the future.
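Continuing the sketch, capacity compensation might look roughly like this; the clearing price and penalty rule are hypothetical, and real ISO/RTO capacity auctions are considerably more detailed.

```python
# Hypothetical capacity-market compensation for one power plant.
CAPACITY_PRICE = 100.0  # $/MW-day, set by the capacity auction (invented)
DAYS = 365

def capacity_revenue(cleared_mw, shortfall_mw=0.0, penalty_per_mw=50_000.0):
    # Base payment for the cleared capability, earned whether or not the
    # plant is ever called upon to generate.
    base = cleared_mw * CAPACITY_PRICE * DAYS
    # Financial penalty for any committed capability it fails to deliver.
    return base - shortfall_mw * penalty_per_mw

print(capacity_revenue(200))                   # delivers as promised: $7,300,000
print(capacity_revenue(200, shortfall_mw=50))  # falls 50 MW short:    $4,800,000
```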
Sound complicated? It is! Luckily, ISO/RTOs have sophisticated tools to continuously model the electricity system and orchestrate the purchasing and transmission of wholesale electricity. Two key features of electricity markets are time and location. First, market-clearing prices are time-dependent because demand and supply change continuously. During periods of high electricity demand, prices rise because more expensive electricity generators are needed to meet demand, which increases the settlement price (Figure 2a); in extreme cases, these increases are referred to as price spikes. Second, market-clearing prices are regional because of electricity transmission constraints. In regions where supply is low and the transmission capacity to import electricity from elsewhere is limited, electricity prices can increase even more.
Several recent developments have complicated the economics of generating electricity in wholesale markets. First, low natural gas prices and the greater efficiency of combined-cycle power plants have resulted in low electricity bids, restructuring the supply stack and lowering market settlement prices (Figure 2b). Second, renewable power plants, which have almost-zero operating costs, submit almost-zero bids. As such, renewables fall at the beginning of the supply stack and push other technologies toward the right, into higher-demand periods where they are utilized less, further depressing settlement prices (Figure 2c). A recent study by the National Renewable Energy Laboratory expects these trends to continue with increasing renewable deployment.
In combination, these developments have reduced revenues for and challenged the operation of less competitive generation technologies, such as coal and nuclear energy, and elicited calls for government intervention to save financial investments. While the shutdown of coal plants is welcome news for climate advocates, nuclear power provided 60% of U.S. carbon-free electricity in 2016. Several states have already instituted credits or subsidies to prevent these low-emission power plants from going bankrupt. However, some experts argue that the retirement of uneconomic resources is a welcome indication that markets are working properly.
While renewable energy advocates support such policies, system operators and private investors argue that these out-of-market policies could distort wholesale electricity markets by suppressing prices and imposing regulatory risks on investors. Importantly, they argue that this leads to inefficient resource investment decisions and reduced competition that ultimately increases costs for consumers. As a result, several ISO/RTOs are attempting to reform capacity market rules to address these complaints, but they are having difficulty finding a solution that satisfies all stakeholders. How FERC, operators, and stakeholders will handle future policies remains to be resolved.
As states continue to enact new renewable energy mandates, and as technologies not yet well integrated with wholesale markets, such as battery storage, continue to evolve and show promise, wholesale market structures and policies will need to adapt. In the end, the evolution of electricity market rules and policies will depend on a complex interplay between technological innovation, stakeholder engagement, regulation, and politics. Exciting!
Kasparas Spokas is a Ph.D. candidate in the Civil & Environmental Engineering Department and a policy-fellow in the Woodrow Wilson School of Public & International Affairs at Princeton University. Broadly, he is interested in the challenge of developing low-emissions energy systems from a techno-economic perspective. Follow him on Twitter @KSpokas.
While capital cost reductions and popularity are key to driving widespread deployment of renewables, there remain significant challenges for integrating renewables into our electricity system. This two-part series introduces key concepts of electricity systems and identifies the challenges and opportunities of integrating renewables.
What are electricity systems? Physically, they are composed of four main interacting elements: electricity generation, transmission grids, distribution grids, and end users (Figure 1). In addition to the physical elements, regulatory and governance structures guide the operation and evolution of electricity systems (these are the focus of part two in this series). These include the U.S. Federal Energy Regulatory Commission (FERC), the North American Electric Reliability Council (NERC), and numerous state-level policies and laws. The interplay between the physical and regulatory elements has guided electricity systems to where they are today.
In North America, the electricity system is segmented into three interconnected regions (Figure 2). These regions are linked by only a few low-capacity transmission wires and often operate independently. The regions are further segmented into areas where independent organizations operate wholesale electricity markets and areas where regulated, vertically-integrated utilities manage all the physical elements (Figure 2). Roughly two-thirds of U.S. electricity demand is now located in wholesale electricity markets. Lastly, some of these broad areas are further subdivided into smaller balancing authorities that are responsible for supplying electricity to meet demand under regulations set by FERC and NERC.
The main objective of electricity systems is to orchestrate electricity generation, transmission, and distribution so as to maintain an instantaneous balance between supply and continuously changing demand. To maintain this balance, coordination of electricity system operations is vital: electricity systems need to provide electricity where and when it is needed.
Historically, electricity systems have been built to suit conventional electricity generation technologies, such as coal, oil, natural gas, nuclear, and hydropower. These technologies rely on fuel that can be transported to power plants, allowing plants to be sited wherever electricity demand is present; the one exception is hydropower, which requires that plants be sited along rivers. In addition, the timing of electricity generation at these power plants can be controlled. The ability to control where and when electricity is generated simplifies the orchestration of an electricity system.
Enter solar and wind power. These technologies lack both features of conventional generation, the ability to control where and when electricity is generated, and they make the objective of instantaneously balancing supply and demand even more challenging. For starters, solar and wind technologies depend on natural resources, which limits where they can be situated. The areas best for sun and wind do not always coincide with where electricity demand is highest. For example, the most productive region for on-shore wind stretches along a “wind-belt” through the middle of the U.S. (Figure 3). For solar, the sparsely populated Southwest presents the most attractive sunny skies (Figure 3). As of now, long-distance transmission infrastructure to transport electricity from renewable-resource-rich regions to high-demand regions is limited.
In addition, the timing of electricity generation from wind and solar cannot be controlled: solar panels only produce electricity when the sun is shining, and wind turbines only function when the wind is blowing. Scaling up renewables alone would therefore result in instances where renewable supply does not match customer demand (Figure 4). When renewable energy production suddenly drops (due to cloud cover or a lull in wind), the electricity system must coordinate other generators to quickly make up the difference. In the inverse situation, where renewable energy generation suddenly increases, operators often curtail the renewable electricity to avoid dealing with the variability. The challenge of forecasting how much sun and wind there will be in the future adds more uncertainty to the enterprise.
A well-known challenge in solar-rich regions is the “duck curve” (Figure 5). The typical duck curve (so named because the curve resembles a duck) depicts the electricity demand remaining after subtracting solar generation at each hour of the day. In other words, the graph depicts the electricity demand that needs to be met by power plants other than solar, called the “net load.” During the day, the sun shines and solar panels generate electricity, resulting in low net loads. However, as the sun sets and people turn on electric appliances after returning home from work, the net load increases quickly. Electricity systems often respond by calling on natural gas power plants to quickly ramp up their generation. Unfortunately, natural gas plants that can quickly increase their output are less efficient and have higher emission rates than slower-ramping natural gas plants.
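The net-load arithmetic behind the duck curve is simple enough to sketch; the hourly demand and solar numbers below are invented to mimic the shape of Figure 5, not taken from any real system.

```python
# Toy duck curve: net load = demand - solar, hour by hour (all MW, invented).
demand = [650, 620, 600, 600, 620, 680, 750, 800, 820, 830, 840, 850,
          860, 860, 850, 840, 860, 920, 980, 1000, 970, 900, 800, 700]
solar  = [0, 0, 0, 0, 0, 10, 60, 150, 260, 350, 410, 440,
          450, 430, 380, 300, 200, 90, 20, 0, 0, 0, 0, 0]

# Net load is the demand that must be met by power plants other than solar.
net_load = [d - s for d, s in zip(demand, solar)]

# The evening ramp: the largest hour-over-hour jump in net load, which
# fast-ramping (and less efficient) gas plants are often called to cover.
ramps = [net_load[h + 1] - net_load[h] for h in range(len(net_load) - 1)]
h = max(range(len(ramps)), key=lambda i: ramps[i])
print(f"Steepest ramp: +{ramps[h]} MW between hours {h} and {h + 1}")
```

The midday dip and the steep evening climb in net_load are the duck’s belly and neck, respectively.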
These challenges result in economic costs. A study of California concluded that increasing renewable deployment could result in only modest emission reductions at very high abatement costs ($300-400/ton of CO2). This is because the added variability and uncertainty of more renewables would require higher-emitting, quickly-ramping natural gas power plants to balance sudden imbalances between electricity supply and demand. In addition, more renewable power would be curtailed in order to maintain stability (Figure 6), reducing the return on investment and increasing costs.
Although solar and wind power do pose these physical challenges, technological advances and electricity system design enhancements can facilitate their integration. Several key strategies for integrating renewables will be: the development of economical energy storage that can store energy for later use; demand-response technologies that help consumers reduce electricity demand during periods of high net load; and the expansion of long-distance electricity transmission to transport electricity from resource-rich (sunny and windy) areas to high-demand areas (cities). Which solutions succeed will depend on the interplay of future innovation, state and federal incentives, and improvements in electricity market design and regulation. For example, regulations that facilitate long-distance electricity transmission could significantly reduce the technical challenges of integrating renewables using current-day technologies. To ensure efficient integration of renewable energy, regulatory and energy market reform will likely be necessary. For more about this topic, check out part two of our series here!
Kasparas Spokas is a Ph.D. candidate in the Civil & Environmental Engineering Department and a policy-fellow in the Woodrow Wilson School of Public & International Affairs at Princeton University. Broadly, he is interested in the challenge of developing low-emissions energy systems from a techno-economic perspective. Follow him on Twitter @KSpokas.