COVID-19, AMR and India

Maybe it’s not a coincidence that India is today the site of the world’s largest COVID-19 outbreak and the world’s most prominent source of antimicrobial-resistant (AMR) pathogens, a.k.a. ‘superbugs’. The former fiasco is the product of failures on multiple fronts – including policy, infrastructure, logistics, politics and even ideology – even before we consider the faster-spreading variants of the novel coronavirus. I’m not sure of all the factors that have contributed to AMR’s burgeoning in India; some of them are the irrational use of broad-spectrum antibiotics, poor public hygiene, laws that disprivilege ecological health and subpar regulation of hospital practices.

But all this said, both the second COVID-19 wave and the rise of AMR have been helped along by pathogens being allowed to linger in the national population for longer. The longer the novel coronavirus keeps circulating in the population, the more opportunities there are for new variants to appear; the longer pathogens are exposed repeatedly to antimicrobial agents in different environments, the more opportunities they have to develop resistance. And once these things happen, their effects on their respective crises are exacerbated by the less-than-ideal social, political and economic contexts in which they manifest.

Again, I should emphasise that if these afflictions have been assailing India for such a long time, and with increasing strength, it’s because of many distinct, and some overlapping, forces – but I think it’s also true that the resulting permission for pathogens to persist, at scale to boot, makes India more vulnerable than other countries might be to problems of the emergent variety. And given the failures that give rise to this vulnerability, this can be one hell of a vicious cycle.

The constructionist hypothesis and expertise during the pandemic

Now that COVID-19 cases are rising again in the country, the trash talk against journalists has been rising in tandem. The Indian government was unprepared and hapless last year, and it is this year as well, if only in different ways. In this environment, journalists have come under criticism along two equally unreasonable lines. First, many people, typically supporters of the establishment, either don’t or can’t see the difference between good journalism and contrarianism, and don’t or can’t acknowledge the need for expertise in the practice of journalism.

Second, the recognition of expertise itself has been sorely lacking across the board. Just like last year, when lots of scientists dropped what they were doing and started churning out disease transmission models, each one more ridiculous than the last, this time – in response to a more complex ‘playing field’ involving new and more variants, intricate immunity-related mechanisms and labyrinthine clinical trial protocols – too many people have been shooting their mouths off, and getting most of it wrong. All of these misfires have reminded us, again and again, of two things: that expertise matters, and that unless you’re an expert on something, you’re unlikely to know how deep it runs. The latter isn’t trivial.

There’s what you know you don’t know, and what you don’t know you don’t know. The former is the birthplace of learning. It’s the perfect place from which to ask questions and fill gaps in your knowledge. The latter is the verge of presumptuousness — a very good place from which to make a fool of yourself. Of course, this depends on your attitude: you can always be mindful of the Great Unknown, such as it is, and keep quiet.

As these tropes have played out in the last few months, I have been reminded of an article by the physicist Philip Warren Anderson, titled ‘More Is Different’ and published in 1972. His idea is simple: that the statement “if everything obeys the same fundamental laws, then the only scientists who are studying anything really fundamental are those who are working on those laws” is false. He goes on to explain:

“The main fallacy in this kind of thinking is that the reductionist hypothesis does not by any means imply a ‘constructionist’ one: The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe. … The constructionist hypothesis breaks down when confronted with the twin difficulties of scale and complexity. The behaviour of large and complex aggregates of elementary particles, it turns out, is not to be understood in terms of a simple extrapolation of the properties of a few particles. Instead, at each level of complexity entirely new properties appear, and the understanding of the new behaviours requires research which I think is as fundamental in its nature as any other.”

The seemingly endless intricacies that beset the interaction of a virus, a human body and a vaccine are proof enough that the “twin difficulties of scale and complexity” are present in epidemiology, immunology and biochemistry as well – and testament to the foolishness of any claims that the laws of conservation, thermodynamics or motion can help us say, for example, whether a particular variant infects people ‘better’ because it escapes the immune system better or because the immune system’s protection is fading.

But closer to my point: not even all epidemiologists, immunologists and/or biochemists can meaningfully comment on every form or type of these interactions at all times. I’m not 100% certain, but at least from what I’ve learnt reporting topics in physics (and conceding happily that covering biology seems more complex), scale and complexity work not just across fields but within them as well. A cardiologist may be able to comment meaningfully on COVID-19’s effects on the heart in some patients, or a neurologist on the brain, but they may not know how the infection reaches those organs even though all these organs are part of the same body. A structural biologist may have deciphered why different mutations change the virus’s spike protein the way they do, but she can’t be expected to comment meaningfully on how epidemiological models will have to be modified for each variant.

To people who don’t know better, a doctor is a doctor and a scientist is a scientist, but as journalists plumb the deeper, more involved depths of a new yet specific disease, we bear from time to time a secret responsibility to be constructive and not reductive, and this is difficult. It becomes crucial for us to draw on the wisdom of the right experts, who wield the right expertise, so that we’re moving as much and as often as possible away from the position of what we don’t know we don’t know, even as we ensure we’re not caught in the traps of what experts don’t know they don’t know. The march away from complete uncertainty, towards uncertainties we can at least name, is precarious.

Equally, at this time, to make our own jobs that much easier – or at least less acerbic – everyone else needs to know this as well: that more is vastly different.

The Government Project

Considering how much the Government of India has failed to anticipate – the rise of a second wave of COVID-19 infections, the crippling medical oxygen shortage, the circulation of new variants of concern – I have been wondering about why we assemble giant institutions like governments: among other things, they exist to weather uncertainty as best as our resources and constitutional moralities will allow. Does this mean the bigger the institution, the farther into the future it will be able to see? (I’m assuming here a heuristic that we normally are able to see, say, a day into the future with 51% confidence – slightly better than chance – for each event in this period.)

Imagine behemoth structures like the revamped Central Vista in New Delhi and other stonier buildings in other cities and towns, the tentacles of state control dictating terms in every conceivable niche of daily life, and a prodigious bureaucracy manifested as tens of thousands of civil servants, most of whom do nothing more than play musical chairs with The Paperwork.

Can such a super-institution see farther into the future? It should be able to, I’d expect, considering the future – in one telling – is mostly history filtered through our knowledge, imagination, priorities and memories in the present. A larger government should be able to achieve this feat by amassing the talents of more people in its employ, labouring in more and more fields of study and experiment, effectively shining millions of tiny torchlights into the great dark of what’s to come.

Imagine one day that the Super Government’s structures grow so big, so vast that all the ministers determine to float it off into space, to give it as much room as it needs to expand, so that it may perform its mysterious duties better – something like the City of a Thousand Planets.

The people of Earth watch as the extraterrestrial body grows bigger and bigger, heavier and heavier. It attracts the attention of aliens, who are bemused and write in their notebooks: “One could, in principle, imagine ‘creatures’ that are far larger. If we draw on Landauer’s principle describing the minimum energy for computation, and if we assume that the energy resources of an ultra-massive, ultra-slothful, multi-cellular organism are devoted only to slowly reproducing its cells, we find that problems of mechanical support outstrip heat transport as the ultimate limiting factor to growth. At these scales, though, it becomes unclear what such a creature would do, or how it might have evolved.”

One day, after many years of attaching thousands of additional rooms, corridors, cabinets and canteens to its corpus, the government emits a gigantic creaking sound, and collapses into a black hole. On the outside, black holes are dull: they just pull things towards them. That the pulled things undergo mind-boggling distortions and eventual disintegration is a triviality. The fun part is what happens on the inside – where spacetime, instead of being an infinite fabric, is curved in on itself. Here, time moves sideways, perpendicular to the direction in which it flows on the outside, in a state of “perpetual freefall”. The torch-wielding scientists, managers, IAS officers, teachers, thinkers are all trapped on the inner surface of a relentless sphere, running round and round, shining their lights to look not into the actual future but to find their way within the government itself.

None of them can turn around to see who it is that’s chasing them, or whom they’re chasing. The future is lost to them. Their knowledge of history is only marginally better: they have books to tell them what happened, according to a few historians at one point of time; they can’t know what the future can teach them about history. And what they already know they constantly mix and remix until, someday, like the progeny of generations of incest, what emerges is a disgusting object of fascination.

The government project is complete: it is so big that it can no longer see past itself.

Exporting risk

I’m torn between admitting that our cynicism about scientists’ solutions for the pandemic is warranted and taking comfort in this Reuters report, about what is seemingly nothing more than the benevolence of richer nations choosing not to waste their vaccine doses.

Apart from all the other transgressions – or rather, business-as-usual practices – that have transpired thus far, this is one more testament to how every insistence that “we’re all in this together” was just a platitude uttered to move things along. And as if it weren’t enough already that poorer nations must make do with the leftovers of their richer counterparts – which ordered not as many doses as they needed but as many as would reassure their egos (a form of pseudoscience not new to the western world) – the doses they’re going to give away have been rejected over fears that they could cause rare but life-threatening blood clots. To end the pandemic, what might kill you can be given away?

US experiments find hint of a break in the laws of physics

At 9 pm India time on April 7, physicists at an American research facility delivered a shot in the arm to efforts to find flaws in a powerful theory that explains how the building blocks of the universe work.

Physicists are looking for flaws in it because the theory doesn’t have answers to some questions – like “what is dark matter?”. They hope to find a crack or a hole that might reveal the presence of a deeper, more powerful theory of physics that can lay unsolved problems to rest.

The story begins in 2001, when physicists performing an experiment at Brookhaven National Laboratory, New York, found that fundamental particles called muons weren’t behaving the way they were supposed to in the presence of a magnetic field. This came to be called the g-2 anomaly (g being a number called the gyromagnetic factor).

An incomplete model

Muons are subatomic and can’t be seen with the naked eye, so it could’ve been that the instruments the physicists were using to study the muons indirectly were glitching. Or it could’ve been that the physicists had made a mistake in their calculations. Or, finally, it could’ve been that what the physicists thought they knew about the behaviour of muons in a magnetic field was wrong.

In most stories we hear about scientists, the first two possibilities are true more often: they didn’t do something right, so the results weren’t what they expected. But in this case, the physicists were hoping they were wrong. This unusual wish was the product of working with the Standard Model of particle physics.

According to physicist Paul Kyberd, the fundamental particles in the universe “are classified in the Standard Model of particle physics, which theorises how the basic building blocks of matter interact, governed by fundamental forces.” The Standard Model has successfully predicted the numerous properties and behaviours of these particles. However, it’s also been clearly wrong about some things. For example, Kyberd has written:

When we collide two fundamental particles together, a number of outcomes are possible. Our theory allows us to calculate the probability that any particular outcome can occur, but at energies beyond which we have so far achieved, it predicts that some of these outcomes occur with a probability of greater than 100% – clearly nonsense.

The Standard Model also can’t explain what dark matter is, what dark energy could be or if gravity has a corresponding fundamental particle. It predicted the existence of the Higgs boson but was off about the particle’s mass by a factor of 100 quadrillion.

All these issues together imply that the Standard Model is incomplete, that it could be just one piece of a much larger ‘super-theory’ that works with more particles and forces than we currently know. To look for these theories, physicists have taken two broad approaches: to look for something new, and to find a mistake with something old.

For the former, physicists use particle accelerators, colliders and sophisticated detectors to look for heavier particles thought to exist at higher energies, and whose discovery would prove the existence of a physics beyond the Standard Model. For the latter, physicists take some prediction the Standard Model has made with a great degree of accuracy and test it rigorously to see if it holds up. Studies of muons in a magnetic field are examples of this.

According to the Standard Model, a number associated with the way a muon swivels in a magnetic field – the fractional amount by which its g-factor exceeds 2, called its anomalous magnetic moment – should be 0.00116591804 (with some give or take). This minuscule quantity is the handiwork of fleeting quantum effects in the muon’s immediate neighbourhood, which make it wobble. (For a glimpse of how hard these calculations can be, see this description.)
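
To make the numbers a little more concrete, here is a minimal sketch – using only the figures quoted in this article, not the collaborations’ actual analyses – of the quantity at stake, the anomalous magnetic moment a = (g − 2)/2, and of how small the disputed deviation is relative to it:

```python
# The anomalous magnetic moment a = (g - 2)/2 and the g-factor it implies.
# The value of a is the Standard Model prediction quoted in this article;
# the deviation is the Fermilab figure quoted further down. Illustrative only.

a_predicted = 0.00116591804          # Standard Model prediction for (g - 2)/2
g_predicted = 2 * (1 + a_predicted)  # the corresponding g-factor, roughly 2.00233

deviation = 0.00000000251            # measured minus predicted a (Fermilab figure)
relative_size = deviation / a_predicted

print(f"Predicted g-factor: {g_predicted:.11f}")
print(f"Deviation as a fraction of the predicted a: {relative_size:.1e}")
```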

Fermilab result

In the early 2000s, the Brookhaven experiment measured this number to be slightly higher than the model’s prediction. Though the difference was small – about 0.00000000346 – the context made it a big deal. Scientists know that the Standard Model has a habit of being really right, so when it’s wrong, the wrongness becomes very important. And because we already know the model is wrong about other things, there’s a possibility that the two things could be linked. It’s a potential portal into ‘new physics’.

“It’s a very high-precision measurement – the value is unequivocal. But the Standard Model itself is unequivocal,” Thomas Kirk, an associate lab director at Brookhaven, had told Science in 2001. The disagreement between the values implied “that there must be physics beyond the Standard Model.”

This is why the results physicists announced today are important.

The Brookhaven experiment that ascertained the g-2 anomaly wasn’t sensitive enough to say with a meaningful amount of confidence that its measurement was really different from the Standard Model prediction, or whether the two values could still overlap within their margins of error.

Science writer Brianna Barbu has likened the mystery to “a single hair found at a crime scene with DNA that didn’t seem to match anyone connected to the case. The question was – and still is – whether the presence of the hair is just a coincidence, or whether it is actually an important clue.”

So to go from ‘maybe’ to ‘definitely’, physicists shipped the 50-foot-wide, 15-tonne magnet that the Brookhaven facility used in its Muon g-2 experiment to Fermilab, the US’s premier high-energy physics research facility in Illinois, and built a more sensitive experiment there.

The new result comes from tests at this facility: the observed value differs from the Standard Model’s prediction by 0.00000000251 (give or take a bit).

The Fermilab results are expected to become a lot better in the coming years, but even now they represent an important contribution. The statistical significance of the Brookhaven result on its own fell short of the five-sigma threshold at which physicists claim a discovery; the combined significance of the two results, about 4.2 sigma, is still short of that mark but well above the three-sigma level usually described as ‘evidence’.
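
To get a feel for what a combined significance means, the sketch below – a toy calculation, not the experiments’ statistical analysis – combines two independent measurements of the same quantity by inverse-variance weighting and expresses the combined value’s tension with the prediction in standard deviations. The deviations are the ones quoted above; the uncertainties are placeholders chosen for illustration, not the published error bars.

```python
# Toy illustration of combining two independent measurements and quoting the
# tension with a prediction in "sigma". The uncertainties are placeholders.
import math

def combine(measurements):
    """Inverse-variance weighted average of (value, uncertainty) pairs."""
    weights = [1 / sigma ** 2 for _, sigma in measurements]
    value = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    return value, math.sqrt(1 / sum(weights))

prediction = 0.00116591804                   # Standard Model value quoted above
brookhaven = (prediction + 3.46e-9, 1.0e-9)  # (value, hypothetical uncertainty)
fermilab = (prediction + 2.51e-9, 1.0e-9)    # (value, hypothetical uncertainty)

combined_value, combined_uncertainty = combine([brookhaven, fermilab])
tension = (combined_value - prediction) / combined_uncertainty
print(f"Combined deviation from the prediction: about {tension:.1f} sigma")
```

With these made-up error bars, neither measurement alone reaches the discovery threshold but the combination climbs past four sigma – roughly the shape of the real situation, even though the actual analysis is far more involved.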

Potential dampener

So for now, the g-2 anomaly seems to be real. It’s not easy to say if it will continue to be real as physicists further upgrade the Fermilab experiment’s performance.

In fact, there appears to be another potential dampener on the horizon. An independent group of physicists published a paper today saying that the Fermilab g-2 result is actually in line with the Standard Model’s prediction and that there’s no deviation at all.

This group, called BMW, used a different method to calculate the Standard Model’s value of the number in question than the one the Fermilab team compared its measurement against. Aida El-Khadra, a theoretical physicist at the University of Illinois, told Quanta that the Fermilab team had yet to check BMW’s approach, but if it was found to be valid, the team would “integrate it into its next assessment”.

The approach it was compared against is something physicists have worked with for many decades, so it’s unlikely to be wrong. But if the BMW approach checks out, then, according to Quanta, the mere fact that two methods lead to different predictions of the number’s value could become a new mystery in its own right.

But physicists are excited for now. “It’s almost the best possible case scenario for speculators like us,” Gordan Krnjaic, a theoretical physicist at Fermilab who wasn’t involved in the research, told Scientific American. “I’m thinking much more that it’s possibly new physics, and it has implications for future experiments and for possible connections to dark matter.”

The current result is also important because the other way to look for physics beyond the Standard Model – by looking for heavier or rarer particles – can be harder.

This isn’t simply a matter of building a larger particle collider, powering it up, smashing particles and looking for other particles in the debris. For one, there is a very large number of energy levels at which a particle might form. For another, there are thousands of other particle interactions happening at the same time, generating a tremendous amount of noise. So without knowing what to look for and where, a particle hunt can be like looking for a very small needle in a very large haystack.

The ‘what’ and ‘where’ instead come from the different theories that physicists have worked out based on what we already know; experiments are then designed depending on which theory needs to be tested.
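
As a crude illustration of why the ‘what’ and ‘where’ matter, here is a toy ‘bump hunt’ – entirely made-up numbers, not any real detector’s data – in which a small excess of events at a hypothetical mass is easy to miss in a coarse scan of the full spectrum but stands out once you know roughly where, and how finely, to look:

```python
# A toy "bump hunt": a hypothetical new particle shows up as a small excess
# (a bump) over a smoothly falling background spectrum, but only if you examine
# the right narrow window at fine resolution. All numbers here are invented.
import numpy as np

rng = np.random.default_rng(seed=1)

# Smoothly falling "background": ordinary collision products.
background = rng.exponential(scale=200.0, size=1_000_000)  # e.g. masses in GeV

# A tiny hypothetical signal clustered around an unknown mass, say 750 GeV.
signal = rng.normal(loc=750.0, scale=3.0, size=300)
events = np.concatenate([background, signal])

def largest_excess(lo, hi, bins):
    """Largest bin-by-bin excess over the known background shape, in sigma."""
    counts, edges = np.histogram(events, bins=bins, range=(lo, hi))
    expected = len(background) * (np.exp(-edges[:-1] / 200.0) - np.exp(-edges[1:] / 200.0))
    return ((counts - expected) / np.sqrt(expected)).max()

print(f"Coarse scan of the whole spectrum: {largest_excess(0, 1000, 20):.1f} sigma")
print(f"Fine scan around the 'right' mass: {largest_excess(700, 800, 50):.1f} sigma")
```

In this toy run, the same excess that barely registers in the wide bins shows up clearly in the narrow ones – which is why theoretical guidance about where to look matters so much.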

Into the hospital

One popular theory is called supersymmetry: it predicts that every elementary particle in the Standard Model framework has a heavier partner particle, called a supersymmetric partner. It also predicts the energy ranges in which these particles might be found. The Large Hadron Collider (LHC) at CERN, near Geneva, was powerful enough to access some of these energies, so physicists used it and went looking last decade. They didn’t find anything.

A table showing searches for particles associated with different post-standard-model theories (orange labels on the left). The bars show the energy levels up to which the ATLAS detector at the Large Hadron Collider has not found the particles. Table: ATLAS Collaboration/CERN

Other groups of physicists have also tried to look for rarer particles: ones that occur at an accessible energy but only once in a very large number of collisions. The LHC is a machine at the energy frontier: it probes higher and higher energies. To look for extremely rare particles, physicists explore the intensity frontier – using machines specialised in generating very large numbers of collisions.

The third and last is the cosmic frontier, in which scientists look for unusual particles coming from outer space. For example, early last month, researchers reported that they had detected an energetic anti-neutrino (a kind of fundamental particle) coming from outside the Milky Way and participating in a rare event that scientists had predicted in 1959 would occur if the Standard Model is right. The discovery, in effect, further cemented the validity of the Standard Model and ruled out one potential avenue to find ‘new physics’.

This event also recalls an interesting difference between the 2001 and 2021 announcements. The late British scientist Francis J.M. Farley wrote in 2001, after the Brookhaven result:

… the new muon (g-2) result from Brookhaven cannot at present be explained by the established theory. A more accurate measurement … should be available by the end of the year. Meanwhile theorists are looking for flaws in the argument and more measurements … are underway. If all this fails, supersymmetry can explain the data, but we would need other experiments to show that the postulated particles can exist in the real world, as well as in the evanescent quantum soup around the muon.

Since then, the LHC and other physics experiments have sent supersymmetry ‘to the hospital’ on more than one occasion. If the anomaly continues to hold up, scientists will have to find other explanations. Or, if the anomaly whimpers out, like so many others of our time, we’ll just have to put up with the Standard Model.

Featured image: A storage-ring magnet at Fermilab whose geometry allows for a very uniform magnetic field to be established in the ring. Credit: Glukicov/Wikimedia Commons, CC BY-SA 4.0.

The Wire Science
April 8, 2021

13 years

I realised some time ago that I completed 13 years of blogging around January or March (archives on this blog go back to March 2012, and the older posts are just awful to read today; the month depends on which post I consider to be my first). Regardless of how bad my writing in this period has been, I consider the unlikely duration of this habit to be one of the few things that I can be, and enjoy being, unabashedly proud of. I’m grateful at this point for two particular groups of people: readers who email notes (of appreciation or criticism) in response to posts and reviewers who go through many of my posts before they’re published. Let me thank the latter by name: Dhiya, Thomas, Madhusudhan, Jahnavi, Nehmat and Shankar. Thomas in particular has been of tremendous help – an engaged interlocutor of the sort that’s hard to find on any day. Thank you all very much!

On the NASEM report on solar geoengineering

A top scientific body in the US has asked the government to fund solar geoengineering research, in a bid to help researchers and policymakers understand the full range of options available to the US to deal with climate change.

Solar geoengineering refers to techniques to reflect sunlight away from Earth’s rapidly warming surface – most prominently by pumping sunlight-reflecting aerosols into the upper atmosphere to reduce the amount of solar energy reaching the ground.

The technique is controversial because the resulting solar dimming is likely to affect ecosystems in a detrimental way and because, without the right policy safeguards, its use could allow polluting industries to continue polluting.

The US National Academies of Sciences, Engineering and Medicine (NASEM) released its report on March 25. It describes three solar geoengineering strategies: stratospheric aerosol injection (described above), marine cloud brightening and cirrus cloud thinning.

“Although scientific agencies in the US and abroad have funded solar-geoengineering research in the past, governments have shied away from launching formal programmes in the controversial field,” Nature News reported. In addition, “Previous recommendations on the subject by elite scientific panels in the US and abroad have gone largely unheeded” – including NASEM’s own 2015 recommendations.

To offset potential roadblocks, the new report asks the US government to set up a transparent research administration framework, including a code of conduct, an open registry of researchers’ proposals for studies and a fixed process by which the government will grant permits for “outdoor experiments”. And to achieve these goals, it recommends a dedicated allocation of $100-200 million (Rs 728-1,456 crore).

According to experts who spoke to Nature News, Joe Biden being in the Oval Office instead of Donald Trump is crucial: “many scientists say that Biden’s administration has the credibility to advance geoengineering research without rousing fears that doing so will merely displace regulations and other efforts to curb greenhouse gases, and give industry a free pass.”

This is a significant concern for many reasons – including, notably, countries’ differentiated commitments to ensuring outcomes specified in the Paris Agreement and the fact that climate is a global, not local, phenomenon.

Data from 1900 to 2017 indicates that US residents had the world’s ninth highest carbon dioxide emissions per capita; Indians were 116th. This disparity, which holds between the group of large developed countries and of large developing countries in general, has given rise to demands by the latter that the former should do more to tackle climate change.

The global nature of climate is a problem particularly for countries with industries that depend on natural resources like solar energy and seasonal rainfall. One potential outcome of geoengineering is that climatic changes induced in one part of the planet could affect outcomes in a faraway part.

For example, the US government sowed the first major seeds of its climate research programme in the late 1950s after the erstwhile Soviet Union set off three nuclear explosions underground to divert the flow of a river. American officials were alarmed because they were concerned that changes to the quality and temperature of water entering the Arctic Ocean could affect climate patterns.

For another, a study published in 2007 found that when Mt Pinatubo in the Philippines erupted in 1991, it spewed 20 million tonnes of sulphur dioxide that cooled the whole planet by 0.5° C. As a result, the amount of rainfall dropped around the world as well.

In a 2018 article, Rob Bellamy, a Presidential Fellow in Environment at the University of Manchester, had also explained why stratospheric aerosol injection is “a particularly divisive idea”:

For example, as well as threatening to disrupt regional weather patterns, it, and the related idea of brightening clouds at sea, would require regular “top-ups” to maintain cooling effects. Because of this, both methods would suffer from the risk of a “termination effect”: where any cessation of cooling would result in a sudden rise in global temperature in line with the level of greenhouse gases in the atmosphere. If we hadn’t been reducing our greenhouse gas emissions in the background, this could be a very sharp rise indeed.

A study published in 2018 had sought to quantify the extent of this effect – a likely outcome of, say, projects losing political favour or funding. The researchers created a model in which humans pumped five million tonnes of sulphur dioxide a year into the stratosphere for 50 years, and suddenly stopped. One of the paper’s authors told The Wire Science at the time: “This would lead to a rapid increase in temperature, two- to four-times more rapid than climate change without geoengineering. This increase would be dangerous for biodiversity and ecosystems.”
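
To make the termination effect concrete, here is a toy zero-dimensional energy-balance sketch – with made-up parameter values, and not the model the 2018 study used – in which aerosol ‘top-ups’ exactly offset rising greenhouse forcing for 50 years and then stop abruptly:

```python
# A toy zero-dimensional energy-balance sketch of the "termination effect":
# aerosol "top-ups" mask a steadily rising greenhouse forcing, and when they
# stop abruptly the temperature climbs rapidly towards the level the
# accumulated greenhouse gases dictate. Every parameter value is invented
# for illustration; this is not the 2018 study's model.

HEAT_CAPACITY = 8.0   # W*yr/m^2/K, effective heat capacity of the system (assumed)
FEEDBACK = 1.2        # W/m^2/K, climate feedback parameter (assumed)
GHG_RAMP = 0.04       # W/m^2/yr, steadily rising greenhouse-gas forcing (assumed)
STOP_YEAR = 50        # aerosol injection runs for 50 years, then stops abruptly

temperature = 0.0     # K, anomaly relative to the start of the run
for year in range(1, 101):
    ghg_forcing = GHG_RAMP * year
    aerosol_forcing = -ghg_forcing if year <= STOP_YEAR else 0.0  # regular "top-ups"
    net_forcing = ghg_forcing + aerosol_forcing
    # forward-Euler step of C * dT/dt = F - lambda * T, with a 1-year step
    temperature += (net_forcing - FEEDBACK * temperature) / HEAT_CAPACITY
    if year in (STOP_YEAR, STOP_YEAR + 5, STOP_YEAR + 10):
        print(f"year {year}: temperature anomaly {temperature:+.2f} K")
```

In this toy run, the warming that would otherwise have accumulated gradually over decades shows up within a few years of the cut-off – the qualitative point Bellamy and the 2018 study are making.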

Prakash Kashwan, a political scientist at the University of Connecticut and a senior research fellow of the Earth System Governance Project, has also written for The Wire Science about the oft-ignored political and social dimensions of geoengineering.

He told the New York Times on March 25, “Once these kinds of projects get into the political process, the scientists who are adding all of these qualifiers and all of these cautionary notes” – such as “the steps urged in the report to protect the interests of poorer countries” – “aren’t in control”. In December 2018, Kashwan also advised caution in the face of scientific pronouncements:

The community of climate engineering scientists tends to frame geoengineering in certain ways over other equally valid alternatives. This includes considering the global average surface temperature as the central climate impact indicator and ignoring vested interests linked to capital-intensive geoengineering infrastructure. This could bias future R&D trajectories in this area. And these priorities, together with the assessments produced by eminent scientific bodies, have contributed to the rise of a de facto form of governance. In other words, some ‘high-level’ scientific pronouncements have assumed stewardship of climate geoengineering in the absence of other agents. Such technocratic modes of governance don’t enjoy broad-based social or political legitimacy.

For now, the NASEM report “does not in any way advocate deploying the technology, but says research is needed to understand the options if the climate crisis becomes even more serious,” according to Nature News. The report itself concludes thus:

The recommendations in this report focus on an initial, exploratory phase of a research program. The program might be continued or expand over a longer term, but may also shrink over time, with some or all elements eventually terminated, if early research suggests strong reasons why solar geoengineering should not be pursued. The proposed approaches to transdisciplinary research, research governance, and robust stakeholder engagement are different from typical climate research programs and will be a significant undertaking; but such efforts will enable the research to proceed in an effective, societally responsive manner.

Matthew Watson, a reader in natural hazards at the University of Bristol, had discussed a similar issue in conversation with Bellamy in 2018, appealing to our moral responsibilities in much the same way ‘geoengineers’ must be expected to look out for transnational and subnational effects:

Do you remember the film 127 Hours? It tells the (true) story of a young climber who, pinned under a boulder in the middle of nowhere, eventually ends up amputating his arm, without anaesthetic, with a pen knife. In the end, he had little choice. Circumstances dictate decisions. So if you believe climate change is going to be severe, you have no option but to research the options (I am not advocating deployment) as broadly as possible. Because there may well come a point in the future where it would be immoral not to intervene.

The Wire Science
March 30, 2021