Being apolitical doesn’t mean politics doesn’t exist

A few years ago, we had a writer who would constantly pitch articles to us about how the Indian government should be doing X, Y or Z in the fight against this or that disease. Their submissions quickly grew tiresome, and then wholly ridiculous when, in one article (well before the pandemic), they wrote that “the government should distribute good-quality masks for TB patients to use”. That the government should do this is a banal truism. But to make this recommendation over and over risks hiding from sight the fact that the government probably isn’t failing to do it out of ignorance but because it has decided that what it is doing instead is more important, more necessary.

I find myself contending with many similar articles today. It is people’s right to express themselves, especially on counts on which the Indian government has dropped the ball vis-à-vis the country’s COVID-19 epidemic. But I fear that repeating recommendations that are staring most of us in the face could be harmful: reminding us over and over of what needs to be done but hasn’t been deepens the elision, and then the forgetting, of the real reason why it hasn’t been done.

This doesn’t mean reminders are redundant; on the contrary, there is important value in repetition, so that we don’t lose sight of which outcomes are ultimately desirable. But in tandem, we also need to start acknowledging what could be standing in the way and contemplating honestly whether what we’re advocating for could surmount that barrier. (This issue is also of a piece with the one about processes and outcomes – whereby some commentators stress what the outcomes can or should be but have nothing to say about the processes that will get us there.)

For example, what happened to the rapid self-administered COVID-19 tests that many scientists in India developed last year? A reporter with an appetite for a small investigation could speak to the researchers, university administrators, the DST or the DBT as the case may be, and finally to officials in the Union health ministry, and weave together a story about where exactly in this pipeline of translation from the lab to the market the product vanished. There is value in knowing this but it is not paramount value. It is on equal footing with the view, from the perch of the political economy of public healthcare, that the Modi government is unlikely to okay the widespread use of such tests because many Indian states, especially BJP strongholds like Uttar Pradesh and Gujarat, are widely underreporting cases and deaths, and a state-managed project to suppress this data is easier to do with centralised testing facilities instead of freely distributed rapid tests whose results can also be quickly crowdsourced.

Quite a few authors of articles (many of them scientists) also like to say that we shouldn’t politicise the pandemic. They ignore, deliberately or otherwise, the fact that all pandemics are political by default. By definition, a pandemic is an epidemic of the same disease occurring in multiple geographically distinct regions at the same time. Governments have to get involved to manage them. Pandemics are not, and should never be, carte blanche for scientists to assume power, their prescriptions to assume primacy and their priorities to assume importance – by default. This can only lead to tunnel vision that is blind to problems, and in fact solutions, that arise from social and political compulsions.

Instead, it would be much more valuable if scientists, and in fact experts in any field, could admit the politically motivated parts of a government’s response to its local epidemic instead of forcing everyone else to work around their fantasies of separation – and even better if they could join the collaborative efforts to develop solutions instead of trying to treat the pandemic as purely a science problem.

Anthony Fauci demonstrates this same… attitude (for lack of a better word) in an interview to The Indian Express. When asked how he might respond to India’s crisis, he said:

The one thing I don’t want to do and I hope it doesn’t turn out this way, is to get involved in any sort of criticism of how India has handled the situation because then it becomes a political issue and I don’t want to do that since I’m a public health person and I’m not a political person.

It just seems to me that, right now, India is in a very difficult and desperate situation. I just got off, in preparation for this interview, I watched a clip from CNN… it seems to me it’s a desperate situation. So when you have a situation like that you’ve got to look at the absolute immediate.

I mean, first of all, I don’t know if India has put together a crisis group that would meet and start getting things organised. I heard from some of the people in the street bringing their mothers and their fathers and their sisters and their brothers searching for oxygen. They seem to think there really was not any organisation, any central organisation.

When asked about what India should do towards getting more people vaccinated:

You’ve got to get supplies. You’ve got to make contractual arrangements with the various companies that are out there in the world.

😑 And what about the fact that the US didn’t just advance-book the doses it needed but hoarded enough to vaccinate its population thrice over, and blocked a petition by India, South Africa and some other countries to release the patents on US-made vaccines to increase global supply?

Fauci’s answers are, again, a reminder of which outcomes are or ought to be ultimately desirable – what goals we should be working towards – but simply repeating this needs to stop being a virtue. Fauci, like many others before him, doesn’t wish to consider why we’re not on the path to achieving these outcomes despite fairly common knowledge of their existence. He may not be a political person but being apolitical doesn’t mean politics isn’t involved. The bulk of India’s response to its COVID-19 epidemic has been driven by political strategy. Is the idea that even the ideal part science can play in this enterprise is decidedly finite so off-putting?

And even if there is a legitimate aspiration to expand the part science should be allowed to play in pandemic governance, scientists need to begin by convincing political institutions – and not attempt to seize power. They may be tempted to, as we all are, because our current national government seems to think accountability is blasphemy, and without being accountable it has stopped speaking for the people of the country, even those who put it in power. Nonetheless, the fruits of scientific work need to be democratic, too.

I would also contend that Fauci complicates the picture by implying that there can be a clean separation of political and scientific issues on this matter; many scientists, and perhaps too many people, in India have an elevated opinion of Fauci, to the point of considering his words to be gospel. As one friend put it recently, “Unbelievable – the idea that a single white man is the foremost disease epidemiologist in the world” (emphasis in the original). “How do people say it with a straight face?”

This post isn’t intended to disparage Fauci, even if our exalted opinion of him deserves to be taken down a few notches. Instead, I hope it highlights how Fauci nicely demonstrates a deceptively trivial prejudice against politics that, I could argue, helped land India in its latest disaster. Even when he pitches, for example, that India should lock itself down for a few weeks – instead of a few months like it did last year – he is at liberty to ignore the aftermath. We are not. Does that mean a lockdown shouldn’t come to be? No. But if he accommodated the political in his considerations, would a man of his smarts be able to meaningfully contemplate what the problem really is? Maybe.

Featured image: Former US President Donald Trump, VP Mike Pence and NIAID director Anthony Fauci at a press briefing at the White House on April 16, 2020. Credit: Public domain.

COVID-19, AMR and India

Maybe it’s not a coincidence that India is today the site of the world’s largest COVID-19 outbreak and the world’s most prominent source of pathogens exhibiting antimicrobial resistance (AMR), a.k.a. ‘superbugs’. The former fiasco is the product of failures on multiple fronts – including policy, infrastructure, logistics, politics and even ideology – even before we consider faster-spreading variants of the novel coronavirus. I’m not sure of all the factors that have contributed to AMR’s burgeoning in India; some of them are the irrational use of broad-spectrum antibiotics, poor public hygiene, laws that disprivilege ecological health and subpar regulation of hospital practices.

But all this said, both the second COVID-19 wave and the rise of AMR have benefited from being able to linger in the national population for longer. The longer the novel coronavirus keeps circulating in the population, the more opportunities there are for new variants to appear; the longer pathogens are exposed repeatedly to antimicrobial agents in different environments, the more opportunities they have to develop resistance. And once these things happen, their effects on their respective crises are exacerbated by the less-than-ideal social, political and economic contexts in which they manifest.

Again, I should emphasise that if these afflictions have been assailing India for such a long time and in increasingly stronger ways, it’s because of many distinct, and some overlapping, forces – but I think it’s also true that the resulting permission for pathogens to persist, at scale to boot, makes India more vulnerable than other countries might be to problems of the emergent variety. And given the failures that give rise to this vulnerability, this can be one hell of a vicious cycle.

The constructionist hypothesis and expertise during the pandemic

Now that COVID-19 cases are rising again in the country, the trash talk against journalists has been rising in tandem. The Indian government was unprepared and hapless last year, and it is this year as well, if only in different ways. In this environment, journalists have come under criticism along two equally unreasonable lines. First, many people, typically supporters of the establishment, either don’t or can’t see the difference between good journalism and contrarianism, and don’t or can’t acknowledge the need for expertise in the practice of journalism.

Second, the recognition of expertise itself has been sorely lacking across the board. Just like last year, when lots of scientists dropped what they were doing and started churning out disease-transmission models, each one more ridiculous than the last, this time – in response to a more complex ‘playing field’ involving new and more variants, intricate immunity-related mechanisms and labyrinthine clinical trial protocols – too many people have been shooting their mouths off, and getting most of it wrong. All of these misfires have reminded us, again and again, of two things: that expertise matters, and that unless you’re an expert on something, you’re unlikely to know how deep it runs. The latter isn’t trivial.

There’s what you know you don’t know, and what you don’t know you don’t know. The former is the birthplace of learning. It’s the perfect place from which to ask questions and fill gaps in your knowledge. The latter is the verge of presumptuousness — a very good place from which to make a fool of yourself. Of course, this depends on your attitude: you can always be mindful of the Great Unknown, such as it is, and keep quiet.

As these tropes have played out in the last few months, I have been reminded of an article written by the physicist Philip Warren Anderson, called ‘More is Different’, and published in 1972. His idea here is simple: that the statement “if everything obeys the same fundamental laws, then the only scientists who are studying anything really fundamental are those who are working on those laws” is false. He goes on to explain:

“The main fallacy in this kind of thinking is that the reductionist hypothesis does not by any means imply a ‘constructionist’ one: The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe. … The constructionist hypothesis breaks down when confronted with the twin difficulties of scale and complexity. The behaviour of large and complex aggregates of elementary particles, it turns out, is not to be understood in terms of a simple extrapolation of the properties of a few particles. Instead, at each level of complexity entirely new properties appear, and the understanding of the new behaviours requires research which I think is as fundamental in its nature as any other.”

The seemingly endless intricacies that beset the interaction of a virus, a human body and a vaccine are proof enough that the “twin difficulties of scale and complexity” are present in epidemiology, immunology and biochemistry as well – and testament to the foolishness of any claims that the laws of conservation, thermodynamics or motion can help us say, for example, whether a particular variant infects people ‘better’ because it escapes the immune system better or because the immune system’s protection is fading.

But closer to my point: not even all epidemiologists, immunologists and/or biochemists can meaningfully comment on every form or type of these interactions at all times. I’m not 100% certain, but at least from what I’ve learnt reporting topics in physics (and conceding happily that covering biology seems more complex), scale and complexity work not just across but within fields as well. A cardiologist may be able to comment meaningfully on COVID-19’s effects on the heart in some patients, or a neurologist on the brain, but they may not know how the infection got there even if all these organs are part of the same body. A structural biologist may have deciphered why different mutations change the virus’s spike protein the way they do, but she can’t be expected to comment meaningfully on how epidemiological models will have to be modified for each variant.

To people who don’t know better, a doctor is a doctor and a scientist is a scientist, but as journalists plumb the deeper, more involved depths of a new yet specific disease, we bear from time to time a secret responsibility to be constructive and not reductive, and this is difficult. It becomes crucial for us to draw on the wisdom of the right experts, who wield the right expertise, so that we’re moving as much and as often as possible away from the position of what we don’t know we don’t know even as we ensure we’re not caught in the traps of what experts don’t know they don’t know. The march away from complete uncertainty and towards the names of uncertainty is precarious.

Equally importantly, at this time, to make our own jobs that much easier, or at least less acerbic, it’s important for everyone else to know this as well – that more is vastly different.

The Government Project

Considering how much the Government of India has failed to anticipate – the rise of a second wave of COVID-19 infections, the crippling medical oxygen shortage, the circulation of new variants of concern – I have been wondering about why we assemble giant institutions like governments: among other things, they exist to weather uncertainty as best as our resources and constitutional moralities will allow. Does this mean the bigger the institution, the farther into the future it will be able to see? (I’m assuming here a heuristic that we are normally able to see, say, a day into the future with 51% certainty – slightly better than chance – for each event in this period.)

Imagine behemoth structures like the revamped Central Vista in New Delhi and other stonier buildings in other cities and towns, the tentacles of state control dictating terms in every conceivable niche of daily life, and a prodigious bureaucracy manifested as tens of thousands of civil servants most of whom do nothing more than play musical chairs with The Paperwork.

Can such a super-institution see farther into the future? It should be able to, I’d expect, considering the future – in one telling – is mostly history filtered through our knowledge, imagination, priorities and memories in the present. A larger government should be able to achieve this feat by amassing the talents of more people in its employ, labouring in more and more fields of study and experiment, effectively shining millions of tiny torchlights into the great dark of what’s to come.

Imagine one day that the Super Government’s structures grow so big, so vast that all the ministers determine to float it off into space, to give it as much room as it needs to expand, so that it may perform its mysterious duties better – something like the City of a Thousand Planets.

The people of Earth watch as the extraterrestrial body grows bigger and bigger, heavier and heavier. It attracts the attention of aliens, who are bemused and write in their notebooks: “One could, in principle, imagine ‘creatures’ that are far larger. If we draw on Landauer’s principle describing the minimum energy for computation, and if we assume that the energy resources of an ultra-massive, ultra-slothful, multi-cellular organism are devoted only to slowly reproducing its cells, we find that problems of mechanical support outstrip heat transport as the ultimate limiting factor to growth. At these scales, though, it becomes unclear what such a creature would do, or how it might have evolved.”

One day, after many years of attaching thousands of additional rooms, corridors, cabinets and canteens to its corpus, the government emits a gigantic creaking sound, and collapses into a black hole. On the outside, black holes are dull: they just pull things towards them. That the pulled things undergo mind-boggling distortions and eventual disintegration is a triviality. The fun part is what happens on the inside – where spacetime, instead of being an infinite fabric, is curved in on itself. Here, time moves sideways, perpendicular to the direction in which it flows on the outside, in a state of “perpetual freefall”. The torch-wielding scientists, managers, IAS officers, teachers and thinkers are all trapped on the inner surface of a relentless sphere, running round and round, shining their lights to look not into the actual future but to find their way within the government itself.

None of them can turn around to see who it is that’s chasing them, or whom they’re chasing. The future is lost to them. Their knowledge of history is only marginally better: they have books to tell them what happened, according to a few historians at one point of time; they can’t know what the future can teach us about history. And what they already know they constantly mix and remix until, someday, like the progeny of generations of incest, what emerges is a disgusting object of fascination.

The government project is complete: it is so big that it can no longer see past itself.

Exporting risk

I’m torn between admitting that our cynicism about scientists’ solutions for the pandemic is warranted and savouring the palliative effects of reading this Reuters report, about seemingly nothing more than the benevolence of richer nations not wasting their vaccine doses:

Apart from all the other transgressions – rather, business-as-usual practices – that have transpired thus far, this is one more testimony that all those instances of insisting “we’re all in this together” were just platitudes uttered to move things along. And if it weren’t enough already that poorer nations must make do with the leftovers of their richer counterparts – which ordered not as many doses as they needed but as many as would reassure their egos (a form of pseudoscience not new to the western world) – the doses they’re going to give away have been rejected for fear of causing rare but life-threatening blood clots. To end the pandemic, what kills you can be given away?

US experiments find hint of a break in the laws of physics

At 9 pm India time on April 7, physicists at an American research facility delivered a shot in the arm to efforts to find flaws in a powerful theory that explains how the building blocks of the universe work.

Physicists are looking for flaws in it because the theory doesn’t have answers to some questions – like “what is dark matter?”. They hope to find a crack or a hole that might reveal the presence of a deeper, more powerful theory of physics that can lay unsolved problems to rest.

The story begins in 2001, when physicists performing an experiment in Brookhaven National Lab, New York, found that fundamental particles called muons weren’t behaving the way they were supposed to in the presence of a magnetic field. This was called the g-2 anomaly (after a number called the gyromagnetic factor).

An incomplete model

Muons are subatomic and can’t be seen with the naked eye, so it could’ve been that the instruments the physicists were using to study the muons indirectly were glitching. Or it could’ve been that the physicists had made a mistake in their calculations. Or, finally, what the physicists thought they knew about the behaviour of muons in a magnetic field was wrong.

In most stories we hear about scientists, the first two possibilities are true more often: they didn’t do something right, so the results weren’t what they expected. But in this case, the physicists were hoping they were wrong. This unusual wish was the product of working with the Standard Model of particle physics.

According to physicist Paul Kyberd, the fundamental particles in the universe “are classified in the Standard Model of particle physics, which theorises how the basic building blocks of matter interact, governed by fundamental forces.” The Standard Model has successfully predicted the numerous properties and behaviours of these particles. However, it’s also been clearly wrong about some things. For example, Kyberd has written:

When we collide two fundamental particles together, a number of outcomes are possible. Our theory allows us to calculate the probability that any particular outcome can occur, but at energies beyond which we have so far achieved, it predicts that some of these outcomes occur with a probability of greater than 100% – clearly nonsense.

The Standard Model also can’t explain what dark matter is, what dark energy could be or if gravity has a corresponding fundamental particle. It predicted the existence of the Higgs boson but was off about the particle’s mass by a factor of 100 quadrillion.

All these issues together imply that the Standard Model is incomplete, that it could be just one piece of a much larger ‘super-theory’ that works with more particles and forces than we currently know. To look for these theories, physicists have taken two broad approaches: to look for something new, and to find a mistake with something old.

For the former, physicists use particle accelerators, colliders and sophisticated detectors to look for heavier particles thought to exist at higher energies, and whose discovery would prove the existence of a physics beyond the Standard Model. For the latter, physicists take some prediction the Standard Model has made with a great degree of accuracy and test it rigorously to see if it holds up. Studies of muons in a magnetic field are examples of this.

According to the Standard Model, a number associated with the way a muon swivels in a magnetic field is equal to 2 plus 0.00116591804 (with some give or take). This minuscule addition is the handiwork of fleeting quantum effects in the muon’s immediate neighbourhood, and which make it wobble. (For a glimpse of how hard these calculations can be, see this description.)

Fermilab result

In the early 2000s, the Brookhaven experiment measured the deviation to be slightly higher than the model’s prediction. Though it was small – off by about 0.00000000346 – the context made it a big deal. Scientists know that the Standard Model has a habit of being really right, so when it’s wrong, the wrongness becomes very important. And because we already know the model is wrong about other things, there’s a possibility that the two things could be linked. It’s a potential portal into ‘new physics’.

“It’s a very high-precision measurement – the value is unequivocal. But the Standard Model itself is unequivocal,” Thomas Kirk, an associate lab director at Brookhaven, had told Science in 2001. The disagreement between the values implied “that there must be physics beyond the Standard Model.”

This is why the results physicists announced today are important.

The Brookhaven experiment that ascertained the g-2 anomaly wasn’t sensitive enough to say with a meaningful amount of confidence that its measurement was really different from the Standard Model prediction, or if there could be a small overlap.

Science writer Brianna Barbu has likened the mystery to “a single hair found at a crime scene with DNA that didn’t seem to match anyone connected to the case. The question was – and still is – whether the presence of the hair is just a coincidence, or whether it is actually an important clue.”

So to go from ‘maybe’ to ‘definitely’, physicists shipped the 50-foot-wide, 15-tonne magnet that the Brookhaven facility used in its Muon g-2 experiment to Fermilab, the US’s premier high-energy physics research facility in Illinois, and built a more sensitive experiment there.

The new result is from tests at this facility: that the observation differs from the Standard Model’s predicted value by 0.00000000251 (give or take a bit).

The Fermilab results are expected to become a lot better in the coming years, but even now they represent an important contribution. The statistical significance of the Brookhaven result was just below the threshold at which scientists could claim evidence but the combined significance of the two results is well above.
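The statistics behind such a combination can be sketched with inverse-variance weighting, the standard way to average independent measurements. The sketch below, in Python, uses the deviations quoted in this article; the one-sigma uncertainties are hypothetical placeholders, not the experiments’ published errors, so the numbers are illustrative only:

```python
# Inverse-variance weighted combination of two independent measurements.
# The deviations from the Standard Model value are the ones quoted in
# the text; the uncertainties are illustrative, NOT the published ones.

SM_G = 2 + 0.00116591804  # Standard Model prediction for the muon g-factor

# (measured value, hypothetical one-sigma uncertainty)
brookhaven = (SM_G + 0.00000000346, 1.2e-9)
fermilab = (SM_G + 0.00000000251, 1.0e-9)

def combine(measurements):
    """Weighted mean and its uncertainty, with weights = 1/sigma^2."""
    weights = [1 / s ** 2 for _, s in measurements]
    mean = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    return mean, (1 / sum(weights)) ** 0.5

mean, sigma = combine([brookhaven, fermilab])
combined_significance = (mean - SM_G) / sigma
```

The real analyses are far more involved (correlated systematics, different run conditions), but this is the statistical intuition for how two results that individually sit below the evidence threshold can jointly clear it: the combined uncertainty shrinks below either individual uncertainty, so the same-sized deviation becomes more significant.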

Potential dampener

So for now, the g-2 anomaly seems to be real. It’s not easy to say if it will continue to be real as physicists further upgrade the Fermilab g-2’s performance.

In fact there appears to be another potential dampener on the horizon. An independent group of physicists has had a paper published today saying that the Fermilab g-2 result is actually in line with the Standard Model’s prediction and that there’s no deviation at all.

This group, called BMW, used a different way to calculate the Standard Model’s value of the number in question than the Fermilab folks did. Aida El-Khadra, a theoretical physicist at the University of Illinois, told Quanta that the Fermilab team had yet to check BMW’s approach, but if it was found to be valid, the team would “integrate it into its next assessment”.

The ‘Fermilab approach’ itself is something physicists have worked with for many decades, so it’s unlikely to be wrong. If the BMW approach checks out, then, according to Quanta, the very fact that two approaches lead to different predictions of the number’s value is likely to become a new mystery of its own.

But physicists are excited for now. “It’s almost the best possible case scenario for speculators like us,” Gordan Krnjaic, a theoretical physicist at Fermilab who wasn’t involved in the research, told Scientific American. “I’m thinking much more that it’s possibly new physics, and it has implications for future experiments and for possible connections to dark matter.”

The current result is also important because the other way to look for physics beyond the Standard Model – by looking for heavier or rarer particles – can be harder.

This isn’t simply a matter of building a larger particle collider, powering it up, smashing particles and looking for other particles in the debris. For one, there is a very large number of energy levels at which a particle might form. For another, there are thousands of other particle interactions happening at the same time, generating a tremendous amount of noise. So without knowing what to look for and where, a particle hunt can be like looking for a very small needle in a very large haystack.

The ‘what’ and ‘where’ instead come from the different theories that physicists have worked out based on what we know already; physicists then design experiments depending on which theory they need to test.

Into the hospital

One popular theory is called supersymmetry: it predicts that every elementary particle in the Standard Model framework has a heavier partner particle, called a supersymmetric partner. It also predicts the energy ranges in which these particles might be found. The Large Hadron Collider (LHC) at CERN, near Geneva, was powerful enough to access some of these energies, so physicists used it and went looking last decade. They didn’t find anything.

A table showing searches for particles associated with different post-standard-model theories (orange labels on the left). The bars show the energy levels up to which the ATLAS detector at the Large Hadron Collider has not found the particles. Table: ATLAS Collaboration/CERN

Other groups of physicists have also tried to look for rarer particles: ones that occur at an accessible energy but only once in a very large number of collisions. The LHC is a machine at the energy frontier: it probes higher and higher energies. To look for extremely rare particles, physicists explore the intensity frontier – using machines specialised in generating very large numbers of collisions.

The third and last is the cosmic frontier, in which scientists look for unusual particles coming from outer space. For example, early last month, researchers reported that they had detected an energetic anti-neutrino (a kind of fundamental particle) coming from outside the Milky Way participating in a rare event that scientists predicted in 1959 would occur if the Standard Model is right. The discovery, in effect, further cemented the validity of the Standard Model and ruled out one potential avenue to find ‘new physics’.

This event also recalls an interesting difference between the 2001 and 2021 announcements. The late British scientist Francis J.M. Farley wrote in 2001, after the Brookhaven result:

… the new muon (g-2) result from Brookhaven cannot at present be explained by the established theory. A more accurate measurement … should be available by the end of the year. Meanwhile theorists are looking for flaws in the argument and more measurements … are underway. If all this fails, supersymmetry can explain the data, but we would need other experiments to show that the postulated particles can exist in the real world, as well as in the evanescent quantum soup around the muon.

Since then, the LHC and other physics experiments have sent supersymmetry ‘to the hospital’ on more than one occasion. If the anomaly continues to hold up, scientists will have to find other explanations. Or, if the anomaly whimpers out, like so many others of our time, we’ll just have to put up with the Standard Model.

Featured image: A storage-ring magnet at Fermilab whose geometry allows for a very uniform magnetic field to be established in the ring. Credit: Glukicov/Wikimedia Commons, CC BY-SA 4.0.

The Wire Science
April 8, 2021

13 years

I realised some time ago that I completed 13 years of blogging around January or March this year (the month depends on which post I consider to be my first; archives on this blog only go back to March 2012, and the older posts are just awful to read today). Regardless of how bad my writing in this period has been, I consider the unlikely duration of this habit to be one of the few things that I can be, and enjoy being, unabashedly proud of. I’m grateful at this point to two particular groups of people: readers who email notes (of appreciation or criticism) in response to posts, and reviewers who go through many of my posts before they’re published. Let me thank the latter by name: Dhiya, Thomas, Madhusudhan, Jahnavi, Nehmat and Shankar. Thomas in particular has been of tremendous help – an engaged interlocutor of the sort that’s hard to find on any day. Thank you all very much!

On the NASEM report on solar geoengineering

A top scientific body in the US has asked the government to fund solar geoengineering research, so that researchers and policymakers know the full extent of the options available to help the US deal with climate change.

Solar geoengineering is a technique in which sunlight-reflecting aerosols are pumped into the air to reduce the amount of solar energy reaching Earth’s rapidly warming surface.

The technique is controversial because the resulting solar dimming is likely to harm ecosystems and because, without the right policy safeguards, its use could allow polluting industries to continue polluting.

The US National Academies of Sciences, Engineering and Medicine (NASEM) released its report on March 25. It describes three solar geoengineering strategies: stratospheric aerosol injection (described above), marine cloud brightening and cirrus cloud thinning.

“Although scientific agencies in the US and abroad have funded solar-geoengineering research in the past, governments have shied away from launching formal programmes in the controversial field,” Nature News reported. In addition, “Previous recommendations on the subject by elite scientific panels in the US and abroad have gone largely unheeded” – including NASEM’s own 2015 recommendations.

To offset potential roadblocks, the new report asks the US government to set up a transparent research administration framework, including a code of conduct, an open registry of researchers’ proposals for studies and a fixed process by which the government will grant permits for “outdoor experiments”. And to achieve these goals, it recommends a dedicated allocation of $100-200 million (Rs 728-1,456 crore).

According to experts who spoke to Nature News, Joe Biden being in the Oval Office instead of Donald Trump is crucial: “many scientists say that Biden’s administration has the credibility to advance geoengineering research without rousing fears that doing so will merely displace regulations and other efforts to curb greenhouse gases, and give industry a free pass.”

This is a significant concern for many reasons – including, notably, countries’ differentiated commitments to ensuring outcomes specified in the Paris Agreement and the fact that climate is a global, not local, phenomenon.

Data from 1900 to 2017 indicates that US residents had the world’s ninth highest carbon dioxide emissions per capita; Indians were 116th. This disparity, which holds between the group of large developed countries and of large developing countries in general, has given rise to demands by the latter that the former should do more to tackle climate change.

The global nature of climate is a problem particularly for countries with industries that depend on natural resources like solar energy and seasonal rainfall. One potential outcome of geoengineering is that climatic changes induced in one part of the planet could affect outcomes in a faraway part.

For example, the US government sowed the first major seeds of its climate research programme in the late 1950s after the erstwhile Soviet Union set off three nuclear explosions underground to divert the flow of a river. American officials were alarmed because they were concerned that changes to the quality and temperature of water entering the Arctic Ocean could affect climate patterns.

For another, a study published in 2007 found that when Mt Pinatubo in the Philippines erupted in 1991, it spewed 20 million tonnes of sulphur dioxide into the atmosphere, cooling the whole planet by 0.5° C. As a result, rainfall around the world dropped as well.

In a 2018 article, Rob Bellamy, a Presidential Fellow in Environment at the University of Manchester, had also explained why stratospheric aerosol injection is “a particularly divisive idea”:

For example, as well as threatening to disrupt regional weather patterns, it, and the related idea of brightening clouds at sea, would require regular “top-ups” to maintain cooling effects. Because of this, both methods would suffer from the risk of a “termination effect”: where any cessation of cooling would result in a sudden rise in global temperature in line with the level of greenhouse gases in the atmosphere. If we hadn’t been reducing our greenhouse gas emissions in the background, this could be a very sharp rise indeed.

A study published in 2018 had sought to quantify the extent of this effect – a likely outcome of, say, projects losing political favour or funding. The researchers created a model in which humans pumped five million tonnes of sulphur dioxide a year into the stratosphere for 50 years, and suddenly stopped. One of the paper’s authors told The Wire Science at the time: “This would lead to a rapid increase in temperature, two- to four-times more rapid than climate change without geoengineering. This increase would be dangerous for biodiversity and ecosystems.”

Prakash Kashwan, a political scientist at the University of Connecticut and a senior research fellow of the Earth System Governance Project, has also written for The Wire Science about the oft-ignored political and social dimensions of geoengineering.

He told the New York Times on March 25, “Once these kinds of projects get into the political process, the scientists who are adding all of these qualifiers and all of these cautionary notes” – such as “the steps urged in the report to protect the interests of poorer countries” – “aren’t in control”. In December 2018, Kashwan also advised caution in the face of scientific pronouncements:

The community of climate engineering scientists tends to frame geoengineering in certain ways over other equally valid alternatives. This includes considering the global average surface temperature as the central climate impact indicator and ignoring vested interests linked to capital-intensive geoengineering infrastructure. This could bias future R&D trajectories in this area. And these priorities, together with the assessments produced by eminent scientific bodies, have contributed to the rise of a de facto form of governance. In other words, some ‘high-level’ scientific pronouncements have assumed stewardship of climate geoengineering in the absence of other agents. Such technocratic modes of governance don’t enjoy broad-based social or political legitimacy.

For now, the NASEM report “does not in any way advocate deploying the technology, but says research is needed to understand the options if the climate crisis becomes even more serious,” according to Nature News. The report itself concludes thus:

The recommendations in this report focus on an initial, exploratory phase of a research program. The program might be continued or expand over a longer term, but may also shrink over time, with some or all elements eventually terminated, if early research suggests strong reasons why solar geoengineering should not be pursued. The proposed approaches to transdisciplinary research, research governance, and robust stakeholder engagement are different from typical climate research programs and will be a significant undertaking; but such efforts will enable the research to proceed in an effective, societally responsive manner.

Matthew Watson, a reader in natural hazards at the University of Bristol, discussed a similar issue in conversation with Bellamy in 2018, including an appeal to our moral responsibilities, in the same way ‘geoengineers’ must be expected to look out for transnational and subnational effects:

Do you remember the film 127 Hours? It tells the (true) story of a young climber who, pinned under a boulder in the middle of nowhere, eventually ends up amputating his arm, without anaesthetic, with a pen knife. In the end, he had little choice. Circumstances dictate decisions. So if you believe climate change is going to be severe, you have no option but to research the options (I am not advocating deployment) as broadly as possible. Because there may well come a point in the future where it would be immoral not to intervene.

The Wire Science
March 30, 2021

Lord of the Rings Day

Here’s wishing you a Happy Lord of the Rings Day! (Previous editions: 2020, 2019, 2018, 2017, 2016, 2014.) On this day in the book, Frodo, Sam and Sméagol (with help from Gandalf, Aragorn, Gimli, Legolas, Faramir, Éowyn, Théoden, Éomer, Treebeard and the Ents, Meriadoc, Peregrin, Galadriel, Arwen and many, many others) destroyed the One Ring in the fires of Orodruin, throwing down Barad-dûr, bringing about the end of Sauron the Deceiver, forestalling the Age of Orcs and making way for peace on Middle-earth.

Even though my – rather our – awareness of the different ways in which Lord of the Rings, and J.R.R. Tolkien’s literature more broadly, is flawed increases every year, in the last year in particular I’ve come back to the trilogy more than before. It’s entwined in messy ways with various events in my life, having been the sole piece of fantasy I read between 1998 and 2005. More importantly, because Lord of the Rings was more expansive than most similar work of its time, I often can’t help but see that much of what came after is responding to it in some way. (I know I’ve made this point before but, as in journalism, what stories we have available to tell doesn’t change just because we’re repeating ourselves. :D)

This said, I don’t know what Lord of the Rings means today, in 2021, simply because the last 15 months or so have been a lousy time for replenishing my creative energy. I haven’t been very able to think about stories, leave alone write them – but on the flip side, I’ve been very grateful for the work and energy of story writers and tellers, irrespective of how much of it they’ve been able to summon, whether one sentence or one book, or the forms in which they’ve been able to summon it, whether as a Wikipedia page, a blog post, a D&D quest or a user’s manual. I’m thankful for all the stories that keep us going just as I’m mindful that everything, even the alt text of images, is fiction. More power to anyone thinking of something and putting it down in words – and also to your readers.

Defending philosophy of science

From Carl Bergstrom’s Twitter thread about a new book called How Irrationality Created Modern Science, by Michael Strevens:

The Iron Rule from the book is, in Bergstrom’s retelling, “no use of philosophical reasoning in the mode of Aristotle; no leveraging theological or scriptural understanding in the mode of Descartes. Formal scientific arguments must be sterilised, to use Strevens’s word, of subjectivity and non-empirical content.” I was particularly taken by Bergstrom’s description of philosophical argumentation as an “individual” technique – a point that is important and often understated.

There are some personal techniques we use to discern some truths but which we don’t publicise. But the more we read and converse with others doing the same things, the more we may find that everyone has many of the same stand-ins – tools or methods that we haven’t empirically verified to be true and/or legitimate but which we have discerned, based on our experiences, to be suitably good guiding lights.

I discovered this issue first when I read Paul Feyerabend’s Against Method many years ago, and then in practice when I found, while reporting some stories, that scientists in different situations often developed similar proxies for processes that couldn’t be performed in full due to resource constraints. But they seldom spoke to each other (especially across institutes), thus allowing an idealised view of how to do something to ossify even as almost everyone actually did that something in similar, non-ideal ways.

A very common example of this is scientists evaluating papers based on the ‘prestigiousness’ and/or impact factors of the journals the papers are published in, instead of based on their contents – often simply for lack of time and proper incentives. As a result, ideas like “science is self-correcting” and “science is objective” persist as ideals because they’re products of applying the Iron Rule to the process of disseminating the products of one’s research.

But “by turning a lens on the practice of science itself,” to borrow Bergstrom’s words, philosophies of science allow us to spot deviations from the prescribed normal – prescriptions originating from “Iron Rule Ecclesiastics” like Richard Dawkins – and, particularly valuable to me, reveal how we really, actually do science and how we can become better at it. Or as Bergstrom put it: “By understanding how norms and institutions create incentives to which scientists respond …, we can find ways to nudge the current system toward greater efficiency.”

(It is also a bit gratifying to see both the book and Bergstrom pick on Lawrence Krauss. The book goes straight into my reading list.)