Back in June, E&E’s Emily Holden broke the story that EPA Administrator Scott Pruitt is planning a “red team, blue team” exercise to critique climate change science. He told Breitbart he got the idea from a Wall Street Journal editorial by physicist Steven Koonin. Rick Perry is also a big fan.
Then on Thursday, at his first oversight hearing before the House Energy and Commerce Subcommittee on Environment, Pruitt told Rep. Joe Barton (R-TX) that he has a timeline: the red team/blue team process will be underway by "early next year."
Pruitt reportedly plans to assemble one group of climate "skeptics" (people who doubt some or all of the results of mainstream climate science) as the red team. Another group of climate scientists would defend the existing body of research as the blue team. And there would be some kind of public debate or exchange between them, perhaps televised.
This is, for many, many obvious reasons, a terrible idea. But one of its worst consequences is perhaps the least obvious: It will discredit the idea of a red team, blue team exercise focused on climate change. That is too bad, because the concept could be used for a more productive conversation about what to do about this gigantic threat.
I’ll get to why in a second. First, briefly, the two main reasons that Pruitt’s idea is bad and wrong.
The red-team exercise is not suited to basic science
The red-team exercise has its origins in the military, which has done some of the most rigorous thinking of any institution about decision-making in the face of deep uncertainty. Military planners do not typically use cost-benefit analysis in pursuit of ideally efficient results. They are not focused on optimality, but on resilience and robustness. They seek to create systems — computer systems, systems of rank and discipline, combat systems, decision-making systems, war plans — that are robust against a variety of risks.
Risk, crudely speaking, means probability multiplied by severity. So lower probability does not necessarily mean lower risk. It depends on the shape of the curve.
Picture two distributions of possible outcomes. A black line shows a more probable risk, with outcomes grouped tightly around a narrow (and low) range of severities. A red line shows a less probable risk but with longer "tails," encompassing some extremely severe outcomes. The red scenario is less likely, but contains more total risk.
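The arithmetic behind that claim can be sketched in a few lines. The numbers below are purely illustrative (not from any climate model): each scenario is a list of (probability, severity) pairs, and total risk is the probability-weighted sum of severities.

```python
# Risk, crudely, is probability multiplied by severity, summed over outcomes.
# These distributions are made-up numbers chosen only to illustrate the point.

def total_risk(distribution):
    """Expected severity: sum of probability * severity over all outcomes."""
    return sum(p * s for p, s in distribution)

# (probability, severity) pairs; probabilities in each list sum to 1.
# "black": outcomes grouped tightly around a narrow, low range of severities.
black = [(0.2, 1), (0.6, 2), (0.2, 3)]

# "red": bad outcomes are individually less likely, but the long tail
# includes an extremely severe one.
red = [(0.5, 0), (0.3, 1), (0.15, 5), (0.05, 40)]

print(total_risk(black))  # 2.0
print(total_risk(red))    # 3.05
```

Even though the red scenario's most likely outcome is harmless, the small chance of a severity-40 outcome dominates the sum, so it carries more total risk than the black one. Lower probability does not mean lower risk; it depends on the shape of the curve.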
One way to assess a system’s resilience is to attack it — or rather, to simulate an attack. That, in a nutshell, is the origin of the red-team exercise. The idea is to create a group devoted to taking the system down and find out how it holds up under stress. The blue team and red team do faux battle, whether it’s hackers trying to penetrate a computer system or terrorist cells trying to take down US installations. Then blue teams figure out how to ruggedize their systems and run the exercise again. (For an extensive account, see Red Team: How to Succeed by Thinking Like the Enemy, by Micah Zenko of the Council on Foreign Relations.)
It’s a great way to help figure out what to do. (More on that in a minute.) It’s not necessarily suited to determining what’s true, scientifically speaking.
Or rather, what you’d want for science is something with the same spirit, but slower, more deliberate, more systematized. Call it “peer review.”
As many scientists and others have already pointed out, science basically is one giant, slow-moving red-team exercise, and it’s working just fine. Scientists spend a great deal of time scrutinizing one another’s work. Correcting or amending important work can make a scientific career.
There is always the possibility of cultural bias in peer review, as there is in everything. But after years of attacks, there is no evidence that the many disciplines falling under the rubric of "climate science" face any unique problems in that regard, or have ignored significant results.
There is certainly no reason to believe that our collective understanding would be improved by a televised debate.
No body of science has been more reviewed and stress-tested than climate science. The roster of most-cited skeptics hasn’t changed in years, in some cases decades. They are kept in the public eye entirely through the efforts of an organized ideological campaign, and they rarely add to their ranks.
Which brings us to the second reason this is a dumb idea.
Trump’s a denier, Pruitt’s a denier, and the whole point of the exercise is denialist
Whatever theoretical good a red-team exercise might do for climate science — Niskanen Center’s Joseph Majkut makes a decent case for them — there is every reason to think that this administration will approach the exercise as it has everything else: incompetently and in bad faith.
They have not thought a lot about climate science, certainly not about systemic weaknesses in the peer review process. They don’t even really know what those words mean.
“What the American people deserve,” Pruitt told Breitbart, “is a true, legitimate, peer-reviewed, objective, transparent discussion about CO2.”
You’re thinking “uh, that’s science.” But the thing to remember, the thing that becomes clear when listening to any Pruitt speech or interview, is that he’s just saying words. He’s heard liberals say “peer review” a lot, so he’s gathered that it’s some kind of synonym for “good” and just co-opted it.
Almost nothing Pruitt has ever said about climate change has made sense or been true. He is not a man one might reasonably trust to run an objective stress test on the findings of a large and diverse scientific field.
Meanwhile, in a recent budget hearing, Sen. Al Franken explained to Perry that the red-team thing is basically what scientists do all the time. He mentioned the red team of skeptics assembled by the Koch brothers several years ago, run by Dr. Richard Muller. At the end of the process, Muller renounced his skepticism and urged action on climate change.
Here is Muller’s New York Times op-ed: “The Conversion of a Climate-Change Skeptic.”
“I don’t believe that,” Perry said. “I don’t buy it.”
These are not people who care about climate science.
Red team exercises would be great for climate decision-making
Where red-team exercises excel is in helping with decision-making — figuring out what to do.
Climate science confronts us with a skein of overlapping risks of varying probabilities, some of them high probability and moderately severe, many of them low probability but extremely severe. Even among the “known” risks, like droughts and crop failures, it is difficult for scientists to predict regional effects with confidence, especially at the temporal scale humans need (decades, not centuries).
Risks arise not only from the physical manifestations of atmospheric heat, but from its social effects — the heat stress, starvation, migration, and conflict it will drive. Climate is what the military calls a “risk multiplier.” Faced with these rising and potentially catastrophic risks, and not much time, we very urgently need to figure out what to do.
And climate science, whatever its merits, can never tell us what to do.
In a large and complex culture, any collective decision involves myriad conflicting interests and influences. Decisions must take climate science into account, but also how to balance risks, how to judge the economic and political challenges against the atmospheric one, how to weigh lives, money, and time.
Decisions require not just true facts and accurate projections, they require wisdom, a balancing of risks and interests.
The military’s experience making urgent decisions in the face of deep uncertainty will soon prove extremely valuable to all of us. Like military strategists, we will have to think in terms of risk management, make decisions based not primarily on optimality — the situation is too urgent and too uncertain for economic optimization — but on resilience. We will need to ruggedize our systems, in Alex Steffen’s words.
We need ways to make good, reliable decisions quickly (especially, I would argue, at the city level). That is the kind of thing red-team exercises can help with. Come up with plans, budgets, and strategies and then assign teams to tear them apart: Point out overlooked risks, challenge the weighting of values, or expose possible unanticipated effects and feedback loops. Run the exercise until you find a plan that is rugged against multiple attacks, points of failure, and unanticipated outcomes.
Red team exercises are something other parts of the government could adopt from the military to healthy effect. It will be a shame if their name and reputation are tarnished by Pruitt’s hackish attempt to prosecute old and long-settled disputes over basic science.
Back in 2012, I did a couple of posts for Grist on these themes.
More Info: www.vox.com