Broken clocks during the pandemic

Proponents of conspiracy theories during the pandemic, at least in India, appear to be like broken clocks: they are right by coincidence, without the right body of evidence to back their claims. Two of the most-read articles published by The Wire Science in the last 15 months have been the fact-checks of Luc Montagnier’s comments on the two occasions he spoke up in the French press. On the first occasion, he said the novel coronavirus couldn’t have evolved naturally; on the second, he insisted mass vaccination was a big mistake. The context in which Montagnier made his remarks evolved considerably between the two events, and it tells an important story.

When Montagnier said in April 2020 that the virus was lab-made, the virus’s spread was just beginning to accelerate in India, Europe and the US, and the proponents of the lab-leak hypothesis to explain the virus’s origins had few listeners and were consigned firmly to the margins of popular discourse on the subject. In this environment, Montagnier’s comments stuck out like a sore thumb, and were easily dismissed.

But when Montagnier said in May 2021 that mass vaccination was a mistake, the context was quite different: in the intervening period, Nicholas Wade had published his article on why we couldn’t dismiss the lab-leak hypothesis so quickly; the WHO’s missteps were more widely known; China’s COVID-19 outbreak had come completely under control (actually or for all appearances); many vaccine-manufacturers’ immoral and/or unethical business practices had come to light; more people were familiar with the concept and properties of viral strains; the WHO had filed its controversial report on the possible circumstances of the virus’s origins in China; etc. As a result, speaking now, Montagnier wasn’t so quickly dismissed. Instead, to many observers, he was the man who had got it right the first time, who had been brave enough to stick his neck out in support of an unpopular idea, and who was speaking up yet again.

The problem here is that Luc Montagnier is a broken clock – in the way even broken clocks are right twice a day: not because they actually tell the time but because the time coincidentally matches what the clock face is stuck at. On both occasions, the conclusions of Montagnier’s comments coincided with what conspiracists have been going on about since the pandemic’s start, but on both occasions, his reasoning was wrong. The same has been true of many other claims made during the pandemic. People have said things that have turned out to be true, but they themselves have still been wrong, because their particular reasons for those things to be true were wrong.

That is, unless you can say why you’re right, you’re not right. Unless you can explain why the time is what it is, you’re not a clock!

Montagnier’s case also illuminates a problem with soothsaying: if you wish to be a prophet, it is in your best interests to make as many predictions as possible – to increase the odds of reality coinciding with at least one prediction in time. And when such a coincidence does happen, it doesn’t mean the prophet was right; it means they weren’t wrong. There is a big difference between these positions, one that becomes pronounced when the conspiratorially-minded start incorporating every article published anywhere, from The Wire Science to The Daily Guardian, into their narratives of choice.

As the lab-leak hypothesis moved from the fringes of society to the centre, and as possibility came to be mistakenly conflated with likelihood (zoonotic spillover and lab-leak are both valid hypotheses for the virus’s origins, but they aren’t equally likely to be true), the conspiratorial proponents of the lab-leak hypothesis (the ones given to claiming Chinese scientists engineered the pathogen as a weapon, etc.) have steadily woven imaginary threads between the hypothesis and Indian scientists who opposed Covaxin’s approval, the Congress leaders who “mooted” vaccine hesitancy in their constituencies, scientists who made predictions that turned out to be wrong, even vaccines that were later found to have rare side-effects restricted to certain demographic groups.

The passage of time is notable here. I think adherents of lab-leak conspiracies are motivated by an overarching theory born entirely of speculation, not evidence, who then pick and choose from events to build the case that the theory is true. I say ‘overarching’ because, to the adherents, the theory is already fully formed and true, and pieces of it become visible to observers as and when the corresponding events play out. This could explain why time is immaterial to them. You and I know that Shahid Jameel and Gagandeep Kang cast doubt on Covaxin’s approval (and not Covaxin itself) after we had learnt that Covaxin’s phase 3 clinical trials were only just getting started in December, and before Covishield’s side-effects in Europe and the US came to light (with the attendant misreporting). We know that when Luc Montagnier said the novel coronavirus was made in a lab, last year, we didn’t know nearly enough about the structural biology underlying the virus’s behaviour; we do now.

The order of events matters: we went from ignorance to knowledge, from knowing to knowing more, from thinking one thing to – in the face of new information – thinking another. But the conspiracy-theorists and their ideas lie outside of time: the order of events doesn’t matter; instead, to these people, 2021, 2022, 2023, etc. are preordained. They seem to be simply waiting for the coincidences to roll around.

An awareness of the time dimension (so to speak), or more accurately of the arrow of time, leads straightforwardly to the proper practice of science in our day-to-day affairs as well. As I said, unless you can say why you’re right, you’re not right. This is why effects lie in the future of causes, and why theories lie in the causal future of evidence. What we can say to be true at this moment depends entirely on what we know at this moment. If we presume that what we can say to be true at this moment will always be true, we become guilty of dragging our theory into the causal history of the evidence – because we are then saying that the theory will come true given enough time in which evidence can accrue.

This protocol (of sorts) to verify the truth of claims isn’t restricted to the philosophy of science, even if it finds powerful articulation there: a scientific theory isn’t true if it isn’t falsifiable outside its domain of application. It is equally legitimate and necessary in the daily practice of science and its methods, on Twitter and Facebook, in WhatsApp groups, every time your father, your cousin or your grand-uncle begins a question with “If the lab-leak hypothesis isn’t true…”.

‘Surface of last screaming’

This has nothing to do with anything in the news. I was reading up about the Big Bang for a blog post when I came across this lucid explanation – so good it’s worth sharing for that reason alone – for the surface of last scattering, the site of an important event in the history of the universe. A lot happens by this moment, even if it happens only 379,000 years after the bang, and it’s easy to get lost in the details. But as the excerpt below shows, coming at it from the PoV of phase transitions considerably simplifies the picture (assuming of course that you’re comfortable with phase transitions).

To visualise how this effect arises, imagine that you are in a large field filled with people screaming. You are screaming too. At some time t = 0 everyone stops screaming simultaneously. What will you hear? After 1 second you will still be able to hear the distant screaming of people more than 330 metres away (the speed of sound in air, v, is about 330 m/s). After 3 seconds you will be able to hear distant screams from people more than 1 kilometre away (even though those distant people stopped screaming when you did). At any time t, assuming a suitably heightened sense of hearing, you will hear some faint screams, but the closest and loudest will be coming from people a distance v*t away. This distance defines the ‘surface of last screaming’ and this surface is receding from you at the speed of sound. …

When something is hot and cools down it can undergo a phase transition. For example, hot steam cools down to become water, and when cooled further it becomes ice. The Universe went through similar phase transitions as it expanded and cooled. One such phase transition … produced the last scattering surface. When the Universe was cool enough to allow the electrons and protons to fall together, they ‘recombined’ to form neutral hydrogen. […] photons do not interact with neutral hydrogen, so they were free to travel through the Universe without being scattered. They decoupled from matter. The opaque Universe then became transparent.

Imagine you are living 15 billion years ago. You would be surrounded by a very hot opaque plasma of electrons and protons. The Universe is expanding and cooling. When the Universe cools down below a critical temperature, the fog clears instantaneously everywhere. But you would not be able to see that it has cleared everywhere because, as you look into the far distance, you would be seeing into the opaque past of distant parts of the Universe. As the Universe continues to expand and cool you would be able to see farther, but you would always see the bright opaque fog in the distance, in the past. That bright fog is the surface of last scattering. It is the boundary between a transparent and an opaque universe and you can still see it today, 15 billion years later.
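
To put numbers on the analogy: the screaming surface’s radius is just v*t. Here’s a minimal sketch (in Python, using the excerpt’s round figure of 330 m/s for the speed of sound; the function name is mine, for illustration):

```python
# The closest, loudest screams at time t come from a distance v*t away;
# that distance is the radius of the 'surface of last screaming'.

SPEED_OF_SOUND = 330.0  # m/s, the round figure used in the excerpt

def last_screaming_radius(t_seconds: float) -> float:
    """Distance (in metres) to the surface of last screaming at time t."""
    return SPEED_OF_SOUND * t_seconds

for t in (1, 3, 10, 60):
    print(f"t = {t:2d} s -> {last_screaming_radius(t):8,.0f} m")
# t =  1 s ->      330 m
# t =  3 s ->      990 m  (the 'more than 1 km' of the excerpt, roughly)
# t = 10 s ->    3,300 m
# t = 60 s ->   19,800 m
```

The surface of last scattering works the same way, with the speed of light in place of the speed of sound.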

On the PSA’s new COVID-19 advisory

The Office of the Principal Scientific Adviser (PSA) to the Government of India, K. VijayRaghavan, has issued a new advisory emphasising the roles of “masks, distance, sanitation and ventilation” to end the country’s COVID-19 epidemic.

Over the last few weeks, VijayRaghavan has been sharing similar messages from his official Twitter account, most recently on May 15. The advisory reflects many of his suggestions, including following COVID-appropriate behaviour, maintaining distances and ventilating rooms.

It’s notable that this advisory has shown up in the middle of the country’s second wave – instead of before the first wave, which began around February 2020.

What to do, but not what not to do

The advisory begins with a recap of how the virus is transmitted: “Even one infected person showing no symptoms can release enough droplets to create a ‘viral load’ that can infect many others,” it says. “Symptoms can take up to two weeks to appear in an infected person, during which time they may continue to transmit the virus to others. Some people may never show symptoms and still transmit the virus.”

Next, it briefly discusses the mechanics of aerosol versus droplet transmission, starting with: “Aerosols and droplets are the key transmission mode (sic) of the virus.”

Both aerosols and droplets describe fluid particles; aerosols are just smaller and lighter, thus less susceptible to being pulled down by gravity and more likely to be blown around by winds. All persons release both aerosols and droplets when they breathe, talk, cough, sneeze, etc. If a person is infected with the novel coronavirus, the aerosols and droplets will contain viral particles.

Early last year, when the pandemic was just getting underway, the WHO refused to admit that particles of the novel coronavirus could be transmitted through aerosols.

Because droplets are bigger, they typically settle down to the ground within six feet, or two metres – a point that the advisory also makes. Fluid dynamics expert Ronak Gupta wrote for The Wire Science in May 2020 that this figure is based on a study conducted with tuberculosis patients in the 1930s. This is also where the suggestion to maintain a distance of six feet from people around you comes from.

The WHO didn’t change its mind until 200 scientists expressed their concerns in an “unusually public outcry”, and forced the international body to reconsider the evidence for aerosol transmission.

The advisory also reminds readers of the reality of transmission via surfaces. “Virus-laden droplets can survive on non-porous surfaces such as glass, plastic and stainless steel for a fairly long time,” it reads, and recommends that people regularly clean surfaces they touch often, like door-knobs and light switches, with bleach or phenyl.

Note that the US Centers for Disease Control and Prevention (CDC) said last month that the chance of a person getting infected after touching surfaces is “1 in 10,000”. The PSA’s advisory doesn’t mention the relative unlikelihood of this mode of transmission, suggesting that it is just as likely as the other two (droplets and aerosols).

The advisory also doesn’t advise against unnecessarily disinfecting certain surfaces. For example, Sumi Krishna has written about civic officials in Bengaluru spraying bleach on trees, roads and vehicle tyres, echoing reports of similar activities in other parts of the country. In the face of uncertainty about what to do, people have often done whatever they can – leading to what some have called ‘hygiene theatre’.

In one infamous incident in March last year, municipal officers in Bareilly forced a group of migrant workers to squat on the road and hosed them with a sodium hypochlorite solution.

Masking strategies

Next, the advisory discusses masks and the risks of different masking strategies in different situations.

Wear a surgical mask, then wear another tight fitting cloth mask over it. If you do not have a surgical mask, wear two cotton masks together. Ideally surgical mask should be used only once, but when pairing, you can use it up to 5 times by leaving it in a dry place for 7 days after one use (ideally give it some sun exposure) and then reuse as double layer.

The next five pages are devoted to ventilation. First, the advisory describes having windows and doors shut as “poor ventilation”, having doors and windows open as “good ventilation”, and having doors/windows open together with an exhaust system as “ideal ventilation”. Second, it describes what people living in hutments can do to improve ventilation, including requesting gram panchayats to install small windows to improve air flow.

Third, its recommendation for work spaces is the same as in the first case, with the addition of air conditioners, thus ensuring both directed inflow and directed outflow.

Fourth, the advisory recommends that “offices, auditoriums, shopping malls, etc.” install “roof ventilators and HEPA/regular filters” and that the people in charge be mindful of the filters’ service lives and replacement schedules. High-efficiency particulate air (HEPA) filters are designed to remove at least 99.95% of particles that are 0.3 µm wide.

Finally, it makes a similar recommendation for people travelling in crowded vehicles: passengers should have as many opportunities as possible for fresh air to flow in a direction away from them.

The last part of the advisory deals with “community-level testing and isolation” in rural and semi-urban areas.

Get rapid antigen testing done for people entering the area. ASHA/anganwadi/health workers must be trained and protected for conducting the rapid antigen test. These health workers must be given a certified N95 mask even if they are vaccinated. ASHA/anganwadi/health workers to also be provided oximeters to monitor infected person (sic).

It also asks that “every person who tests positive should be given a certified N95 mask, or a surgical mask if this is not feasible, and advised isolated (sic) as per ICMR guidelines.”

Other communication events

Many behavioural economists have said that clear, simple and authoritative communication that encourages good behaviour vis-à-vis controlling the epidemic is always welcome. The Office of the PSA also released an advisory early last year stressing the importance of wearing masks, including a widely appreciated guidance (PDF) on how to stitch one’s own masks.

This said, the advisory’s timing is interesting because it coincides with some other significant pandemic-related communication events.

First, Tamil TV channels, especially those affiliated with the Dravida Munnetra Kazhagam, have been airing a two-minute-long video in which Tamil Nadu’s new chief minister M.K. Stalin describes the proper way to wear a mask and wash one’s hands, as well as the importance of staying indoors to the extent possible and of getting vaccinated as soon as possible.

Second, the CDC recently updated its guidelines to say people in the US who had received both doses of their vaccines needn’t wear masks in public. The update stoked some confusion among experts, but CDC director Rochelle Walensky said the agency’s decision was based on early reports suggesting the Pfizer-BioNTech and Moderna vaccines also significantly cut transmission. That is, people who have received both doses of either vaccine become highly unlikely to be able to transmit the virus if they get infected.

However, any similar data for the vaccines in use in India – mostly Covishield and Covaxin – are lacking. We don’t know, provably at least, if Covishield and Covaxin cut down transmission and, if so, to what extent.

Conflicting aims

Third, as a document that sticks to the ‘physical’ characteristics of the epidemic, the advisory doesn’t address what people who lack the resources it presumes – like room enough to maintain a gap of six feet, exhaust fans that open onto meaningful air-streams or clean running water – can do to avoid getting infected.

Even if this criticism can’t be laid at the PSA office’s doorstep alone, these issues mark a significant disconnect between the government’s poor communication thus far and the lived realities of many lakhs of Indians, especially in rural parts, where the second wave is expected to surge next.

By not discussing what the government could have done better, differently or not at all, the advisory gives the impression that the pandemic’s future is in the people’s hands. However, the Indian and many state governments are already out of step with many of the recommendations.

For example, the advisory spends five pages on ventilating rooms properly – but many vaccination centres and hospitals around the country have become potential sites of new infections themselves: the queues are long, the rooms often crowded; in some instances, overcrowding forced healthcare workers to accommodate two people on each bed, sharing oxygen supplies.

For another example, the advisory suggests that air-conditioned trains and buses install HEPA filters. This demand is a far cry from the conditions in which many of these vehicles, but especially buses, currently operate – with torn seat covers, broken handles and guardrails and grime covering most surfaces.

There is no indication that VijayRaghavan or his colleagues have spoken up against these shortcomings before. VijayRaghavan himself has been silent in the face of many questions about his role in the government’s actions. For example, as Karan Thapar asked: “when Assam health minister Himanta Biswa Sarma said there was no need to wear masks in his state or when Uttarakhand Chief Minister Tirath Singh Rawat said faith in god and the power of the Ganga river would protect people from COVID-19”, what did VijayRaghavan say to them?

Prem Shankar Jha has pointed out that the government has maintained “two conflicting aims”, each undermining the other, since the pandemic began: one to avert a second wave and the other to extract political mileage. The PSA is a high office in the government: articulating the bare minimum of what needs to be done is necessary to further one set of aims. But what happens when he doesn’t push back against the other?

The Wire Science
May 21, 2021

On The Lancet editorial

On May 8, The Lancet published an editorial criticising the Narendra Modi government’s response to India’s second COVID-19 outbreak, which has been redefining the meaning of ‘snafu’. All hell broke loose. Of course, hell has been breaking loose for quite some time in India now, but the latest episode was in one specific sense also gratifying to behold.

There were the usual rumbles in the week following the editorial’s appearance, until on May 17 India’s health minister Dr Harsh Vardhan shared a blog post penned by a Pankaj Chaturvedi deriding The Lancet‘s choice of arguments. (I’m fond of emboldening the honorific: it shows doctors can be stupid, too.) The post is mostly whataboutery, studded with a few gems about how people who liked the editorial aren’t pissed enough that favipiravir and hydroxychloroquine were approved for use – as Dr Vardhan’s ministry did. More importantly, it seems Dr Vardhan and his colleagues threw themselves into the barrel looking for anything with fully formed sentences that said The Lancet was wrong – a sign that their government still gives a damn about what foreign journals, and perhaps magazines and newspapers too, say about it.

We need to use this to the fullest extent, and I daresay it’s the sort of resource the government is going to find difficult to duplicate as well. There was recently an article about Modi doing a great job during India’s second wave, published in an outlet called The Daily Guardian. There was enough confusion for the UK’s The Guardian to step forward and clarify that the two publications were unaffiliated – but no amount of confusion can supplant an institution, no matter how illiberal. Aakar Patel wrote in 2018: “The fact is that intelligent and intellectual bigotry is very difficult. There are very few people who can pull that off and that is why we can count the major ones on our fingers.” This is also why the government has twitched every time the New York Times, the Washington Post, BBC, The Lancet, Science and The BMJ have published articles critical of India, even if this isn’t the full picture.

It’s doubly interesting that, the sophistry of the rejoinders aside, Dr Vardhan, his colleagues in government and his party’s supporters have all been antagonised by what they perceive to be a political act by a medical journal. This is an untenable distinction, of course – one that fantasises about a clear divide between the Watchers, who look out, and the Watched, who dare not know what the Watchers see. More pertinently, it’s a reflection of what they desperately expect from their own compatriots: to ignore how bad political leadership could help a virus ravage hundreds of thousands of families.

Featured image credit: Kunj Parekh/Unsplash.

On the International Day of Light, remembering darkness

Today is the International Day of Light. According to a UNESCO note:

The International Day of Light is celebrated on 16 May each year, the anniversary of the first successful operation of the laser in 1960 by physicist and engineer, Theodore Maiman. This day is a call to strengthen scientific cooperation and harness its potential to foster peace and sustainable development.

While there are natural lasers, the advent of the laser in Maiman’s hands portended an age of manipulating light to make big advances in a variety of fields. Some applications that come immediately to mind are communications, laser-guided missiles, laser cooling and astronomy. I’m not sure why “the first successful operation of the laser” came to be commemorated as a ‘day of light’, but since it has, its association with astronomy is interesting.

Astronomers have found themselves coming together to protest the launch and operation of satellite constellations, notably SpaceX’s Starlink and Amazon’s upcoming Project Kuiper, after the first few Starlink satellites interfered with astronomical observations. SpaceX has since acknowledged the problem and said it will reduce the reflectance of the satellites it launches, but I don’t think the problem has been resolved. Further, the constellation isn’t complete: thousands of additional satellites will be launched in the coming years, to be joined by other constellations as well, and the full magnitude of the problem may become apparent only then.

Nonetheless, astronomers’ opposition to such projects brought the idea of the night sky as a shared commons into the public spotlight. Just like arid lands, butterfly colonies and dense jungles are part of our ecological commons, and plateaus, shelves and valleys make up our geological commons – all from which the human species draws many benefits – an unobstructed view of the night sky and the cosmic objects embedded therein characterises the night sky as a commons. And as we draw tangible health and environmental benefits from terrestrial commons, the view of the night sky has, over millennia, offered humans many cultural benefits as well.

However, this conflict between SpaceX, etc. on the one hand and the community of astronomers on the other operates at a higher level, so to speak: its resolution in favour of astronomers, for example, still only means operating fewer satellites or satellites at a higher altitude, avoiding major telescopes’ fields of view, painting the satellites’ undersides with a light-absorbing substance, etc. The dispute is unlikely to have implications for the night sky as a commons of significant cultural value. For it to be relevant in that sense, the issue needs to become deep enough to accommodate – and continue to draw the attention and support of academics and corporations for – the non-rivalrous enjoyment of the night sky with the naked eye, for nothing other than to write better poems, have moonlit dinners and marvel at the stars.

As our fight to preserve our ecological commons has hardened in the face of a state bent on destroying them to line the pockets of its capital cronies, I think we have also started to focus on the economic and other tangible benefits these commons offer us – at the cost of downplaying a transcendental right to their sensual enjoyment. Similarly, we shouldn’t have to justify the importance of the night sky as a commons beyond saying we need to be able to enjoy it.

Of course, such an argument is bound to be accused of being disconnected from reality – the internet coverage Starlink offers will be useful for people living in as-yet unconnected or poorly connected areas – and I agree. We can’t afford to fight all our battles at once if we also expect to reap meaningful rewards in a reasonably short span of time, so let me invoke a reminder that the night sky is an environmental resource as well: “Let us be reminded, as we light the world to suit our needs and whims,” a 2005 book put it, “that doing so may come at the expense of other living beings, some of whom detect subtle gradations of light to which we are blind, and for whom the night is home.”

More relevant to our original point – of the International Day of Light, astronomy and the night sky as a commons – a study published in 2016 reported data on the fraction of each country’s population living under different levels of artificial night-sky brightness.

According to the study paper:

The sky brightness levels are those used in the tables and indicate the following: up to 1% above the natural light (0 to 1.7 μcd/m²; black); from 1 to 8% above the natural light (1.7 to 14 μcd/m²; blue); from 8 to 50% above natural nighttime brightness (14 to 87 μcd/m²; green); from 50% above natural to the level of light under which the Milky Way is no longer visible (87 to 688 μcd/m²; yellow); from Milky Way loss to estimated cone stimulation (688 to 3000 μcd/m²; red); and very high nighttime light intensities, with no dark adaption for human eyes (>3000 μcd/m²; white).

That is, in India, ‘only’ a fifth of the population experiences a level of light pollution that obscures the faintest view of the Milky Way – but in Saudi Arabia, at the other end of the spectrum, nearly 92% of the population is correspondingly unfortunate (not that I presume they care).

DOI: 10.1126/sciadv.1600377
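
To make the quoted scheme concrete, here’s a minimal sketch (in Python; the thresholds are copied from the excerpt above, and the function is mine, for illustration) that buckets an artificial sky-brightness value into the study’s colour bands:

```python
# Bucket an artificial night-sky brightness value (in μcd/m²) into the
# colour bands used in the 2016 study, per the excerpt above.

BANDS = [
    (1.7, "black"),    # up to 1% above the natural light
    (14, "blue"),      # 1-8% above the natural light
    (87, "green"),     # 8-50% above natural nighttime brightness
    (688, "yellow"),   # 50% above natural, up to loss of the Milky Way
    (3000, "red"),     # Milky Way loss to estimated cone stimulation
]

def colour_band(brightness: float) -> str:
    """Return the map colour for an artificial sky brightness (μcd/m²)."""
    for upper_limit, colour in BANDS:
        if brightness <= upper_limit:
            return colour
    return "white"  # >3000 μcd/m²: no dark adaptation for human eyes

print(colour_band(50))   # 'green': the Milky Way is still visible
print(colour_band(700))  # 'red': the Milky Way is no longer visible
```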

While India has a few red dots, it is green and blue nearly everywhere else – lest we get carried away. Why, in March this year, Dorje Angchuk, an engineer at the Indian Astronomical Observatory in Hanle who has come to be celebrated for his beautiful photographs of the night sky over Ladakh, tweeted images demonstrating how even highly localised light pollution, which may not be well-represented on global maps, can affect the forms and hues in which the night sky is available to us.

The distribution of colours also reinforces our understanding of cities as economic engines – where more lights shine brighter and, although this map doesn’t show it, more pollutants hang in the air. The red dots over India coincide roughly with the country’s major urban centres: New Delhi, Mumbai, Kolkata, Guwahati, Hyderabad, Bangalore and Chennai. Photographs of winter mornings in New Delhi show the sky as an orange-brown mass through which even the Sun is barely visible; other stars are out of the question, even after astronomical twilight.

But again, we’re not going to have much luck if our demands to reduce urban emissions are premised on our inability to have an unobstructed view of the night sky. Yet this is a victory we must achieve at the same time: there’s no reason our street lamps and other public lighting facilities need to throw light upwards, or that our billboards need to be visible from above, and perhaps every reason for human settlements – even if they aren’t erected around or near optical telescopes – to turn off as many lights as they can between 10 pm and 6 am. The regulation of light needs to be part of our governance. And the International Day of Light should be a reminder that our light isn’t the only light we need, that darkness is a virtue as well.

The constructionist hypothesis and expertise during the pandemic

Now that COVID-19 cases are rising again in the country, the trash talk against journalists has been rising in tandem. The Indian government was unprepared and hapless last year, and it is this year as well, if only in different ways. In this environment, journalists have come under criticism along two equally unreasonable lines. First, many people, typically supporters of the establishment, either don’t or can’t see the difference between good journalism and contrarianism, and don’t or can’t acknowledge the need for expertise in the practice of journalism.

Second, the recognition of expertise itself has been sorely lacking across the board. Just like last year, when lots of scientists dropped what they were doing and started churning out disease transmission models, each one more ridiculous than the last, this time – in response to a more complex ‘playing field’ involving new and more numerous variants, intricate immunity-related mechanisms and labyrinthine clinical trial protocols – too many people have been shooting their mouths off, and getting most of it wrong. All of these misfires have reminded us, again and again, of two things: that expertise matters, and that unless you’re an expert on something, you’re unlikely to know how deep it runs. The latter isn’t trivial.

There’s what you know you don’t know, and what you don’t know you don’t know. The former is the birthplace of learning. It’s the perfect place from which to ask questions and fill gaps in your knowledge. The latter is the verge of presumptuousness — a very good place from which to make a fool of yourself. Of course, this depends on your attitude: you can always be mindful of the Great Unknown, such as it is, and keep quiet.

As these tropes have played out in the last few months, I have been reminded of an article written by the physicist Philip Warren Anderson, called ‘More is Different’, and published in 1972. His idea here is simple: that the statement “if everything obeys the same fundamental laws, then the only scientists who are studying anything really fundamental are those who are working on those laws” is false. He goes on to explain:

“The main fallacy in this kind of thinking is that the reductionist hypothesis does not by any means imply a ‘constructionist’ one: The ability to reduce everything to simple fundamental laws does not imply the ability to start from those laws and reconstruct the universe. … The constructionist hypothesis breaks down when confronted with the twin difficulties of scale and complexity. The behaviour of large and complex aggregates of elementary particles, it turns out, is not to be understood in terms of a simple extrapolation of the properties of a few particles. Instead, at each level of complexity entirely new properties appear, and the understanding of the new behaviours requires research which I think is as fundamental in its nature as any other.”

The seemingly endless intricacies that beset the interaction of a virus, a human body and a vaccine are proof enough that the “twin difficulties of scale and complexity” are present in epidemiology, immunology and biochemistry as well – and testament to the foolishness of any claims that the laws of conservation, thermodynamics or motion can help us say, for example, whether a particular variant infects people ‘better’ because it escapes the immune system better or because the immune system’s protection is fading.

But closer to my point: not even all epidemiologists, immunologists and/or biochemists can meaningfully comment on every form or type of these interactions at all times. I’m not 100% certain, but at least from what I’ve learnt reporting topics in physics (and conceding happily that covering biology seems more complex), scale and complexity work not just across but within fields as well. A cardiologist may be able to comment meaningfully on COVID-19’s effects on the heart in some patients, or a neurologist on the brain, but they may not know how the infection got there even if all these organs are part of the same body. A structural biologist may have deciphered why different mutations change the virus’s spike protein the way they do, but she can’t be expected to comment meaningfully on how epidemiological models will have to be modified for each variant.

To people who don’t know better, a doctor is a doctor and a scientist is a scientist, but as journalists plumb the deeper, more involved depths of a new yet specific disease, we bear from time to time a secret responsibility to be constructive and not reductive, and this is difficult. It becomes crucial for us to draw on the wisdom of the right experts, who wield the right expertise, so that we’re moving as much and as often as possible away from the position of what we don’t know we don’t know even as we ensure we’re not caught in the traps of what experts don’t know they don’t know. The march away from complete uncertainty and towards the names of uncertainty is precarious.

Equally importantly, at this time, to make our own jobs that much easier, or at least less acerbic, it’s important for everyone else to know this as well – that more is vastly different.

US experiments find hint of a break in the laws of physics

At 9 pm India time on April 7, physicists at an American research facility delivered a shot in the arm to efforts to find flaws in a powerful theory that explains how the building blocks of the universe work.

Physicists are looking for flaws in it because the theory doesn’t have answers to some questions – like “what is dark matter?”. They hope to find a crack or a hole that might reveal the presence of a deeper, more powerful theory of physics that can lay unsolved problems to rest.

The story begins in 2001, when physicists performing an experiment in Brookhaven National Lab, New York, found that fundamental particles called muons weren’t behaving the way they were supposed to in the presence of a magnetic field. This was called the g-2 anomaly (after a number called the gyromagnetic factor).

An incomplete model

Muons are subatomic and can’t be seen with the naked eye, so it could’ve been that the instruments the physicists were using to study the muons indirectly were glitching. Or it could’ve been that the physicists had made a mistake in their calculations. Or, finally, what the physicists thought they knew about the behaviour of muons in a magnetic field was wrong.

In most stories we hear about scientists, the first two possibilities are true more often: they didn’t do something right, so the results weren’t what they expected. But in this case, the physicists were hoping they were wrong. This unusual wish was the product of working with the Standard Model of particle physics.

According to physicist Paul Kyberd, the fundamental particles in the universe “are classified in the Standard Model of particle physics, which theorises how the basic building blocks of matter interact, governed by fundamental forces.” The Standard Model has successfully predicted the numerous properties and behaviours of these particles. However, it’s also been clearly wrong about some things. For example, Kyberd has written:

When we collide two fundamental particles together, a number of outcomes are possible. Our theory allows us to calculate the probability that any particular outcome can occur, but at energies beyond which we have so far achieved, it predicts that some of these outcomes occur with a probability of greater than 100% – clearly nonsense.

The Standard Model also can’t explain what dark matter is, what dark energy could be or if gravity has a corresponding fundamental particle. It predicted the existence of the Higgs boson but was off about the particle’s mass by a factor of 100 quadrillion.

All these issues together imply that the Standard Model is incomplete, that it could be just one piece of a much larger ‘super-theory’ that works with more particles and forces than we currently know. To look for these theories, physicists have taken two broad approaches: to look for something new, and to find a mistake with something old.

For the former, physicists use particle accelerators, colliders and sophisticated detectors to look for heavier particles thought to exist at higher energies, and whose discovery would prove the existence of a physics beyond the Standard Model. For the latter, physicists take some prediction the Standard Model has made with a great degree of accuracy and test it rigorously to see if it holds up. Studies of muons in a magnetic field are examples of this.

According to the Standard Model, a number associated with the way a muon swivels in a magnetic field is equal to 2 plus 0.00116591804 (with some give or take). This minuscule addition is the handiwork of fleeting quantum effects in the muon’s immediate neighbourhood, and which make it wobble. (For a glimpse of how hard these calculations can be, see this description.)

Fermilab result

In the early 2000s, the Brookhaven experiment measured the deviation to be slightly higher than the model’s prediction. Though it was small – off by about 0.00000000346 – the context made it a big deal. Scientists know that the Standard Model has a habit of being really right, so when it’s wrong, the wrongness becomes very important. And because we already know the model is wrong about other things, there’s a possibility that the two things could be linked. It’s a potential portal into ‘new physics’.

“It’s a very high-precision measurement – the value is unequivocal. But the Standard Model itself is unequivocal,” Thomas Kirk, an associate lab director at Brookhaven, had told Science in 2001. The disagreement between the values implied “that there must be physics beyond the Standard Model.”

This is why the results physicists announced today are important.

The Brookhaven experiment that ascertained the g-2 anomaly wasn’t sensitive enough to say with a meaningful amount of confidence that its measurement was really different from the Standard Model prediction, or if there could be a small overlap.

Science writer Brianna Barbu has likened the mystery to “a single hair found at a crime scene with DNA that didn’t seem to match anyone connected to the case. The question was – and still is – whether the presence of the hair is just a coincidence, or whether it is actually an important clue.”

So to go from ‘maybe’ to ‘definitely’, physicists shipped the 50-foot-wide, 15-tonne magnet that the Brookhaven facility used in its Muon g-2 experiment to Fermilab, the US’s premier high-energy physics research facility in Illinois, and built a more sensitive experiment there.

The new result is from tests at this facility: that the observation differs from the Standard Model’s predicted value by 0.00000000251 (give or take a bit).
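
To keep those very small numbers straight, here’s a quick back-of-the-envelope sketch (in Python, using only the figures quoted in this article; the real analyses track uncertainties, which this ignores):

```python
# The Standard Model's prediction for the muon's g-factor, and the
# measured deviations reported by the two experiments, as quoted above.

g_predicted = 2 + 0.00116591804

brookhaven_deviation = 0.00000000346  # Brookhaven, 2001
fermilab_deviation   = 0.00000000251  # Fermilab, 2021

print(f"Predicted:  {g_predicted:.11f}")
print(f"Brookhaven: {g_predicted + brookhaven_deviation:.11f}")
print(f"Fermilab:   {g_predicted + fermilab_deviation:.11f}")
# Predicted:  2.00116591804
# Brookhaven: 2.00116592150
# Fermilab:   2.00116592055
```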

The Fermilab results are expected to become a lot better in the coming years, but even now they represent an important contribution. The statistical significance of the Brookhaven result was just below the threshold at which scientists could claim evidence, but the combined significance of the two results is well above it.

Potential dampener

So for now, the g-2 anomaly seems to be real. It’s not easy to say if it will continue to be real as physicists further upgrade the Fermilab g-2’s performance.

In fact there appears to be another potential dampener on the horizon. An independent group of physicists has had a paper published today saying that the Fermilab g-2 result is actually in line with the Standard Model’s prediction and that there’s no deviation at all.

This group, called BMW, used a different way to calculate the Standard Model’s value of the number in question than the Fermilab folks did. Aida El-Khadra, a theoretical physicist at the University of Illinois, told Quanta that the Fermilab team had yet to check BMW’s approach, but if it was found to be valid, the team would “integrate it into its next assessment”.

The ‘Fermilab approach’ itself is something physicists have worked with for many decades, so it’s unlikely to be wrong. If the BMW approach checks out, then, according to Quanta, the fact that two approaches lead to different predictions of the number’s value is likely to become a new mystery in its own right.

But physicists are excited for now. “It’s almost the best possible case scenario for speculators like us,” Gordan Krnjaic, a theoretical physicist at Fermilab who wasn’t involved in the research, told Scientific American. “I’m thinking much more that it’s possibly new physics, and it has implications for future experiments and for possible connections to dark matter.”

The current result is also important because the other way to look for physics beyond the Standard Model – by looking for heavier or rarer particles – can be harder.

This isn’t simply a matter of building a larger particle collider, powering it up, smashing particles and looking for other particles in the debris. For one, there is a very large number of energy levels at which a particle might form. For another, there are thousands of other particle interactions happening at the same time, generating a tremendous amount of noise. So without knowing what to look for and where, a particle hunt can be like looking for a very small needle in a very large haystack.

The ‘what’ and ‘where’ instead come from the different theories physicists have worked out based on what we know already; experiments are then designed depending on which theory needs to be tested.

Into the hospital

One popular theory is called supersymmetry: it predicts that every elementary particle in the Standard Model framework has a heavier partner particle, called a supersymmetric partner. It also predicts the energy ranges in which these particles might be found. The Large Hadron Collider (LHC) in CERN, near Geneva, was powerful enough to access some of these energies, so physicists used it and went looking last decade. They didn’t find anything.

A table showing searches for particles associated with different post-standard-model theories (orange labels on the left). The bars show the energy levels up to which the ATLAS detector at the Large Hadron Collider has not found the particles. Table: ATLAS Collaboration/CERN

Other groups of physicists have also tried to look for rarer particles: ones that occur at an accessible energy but only once in a very large number of collisions. The LHC is a machine at the energy frontier: it probes higher and higher energies. To look for extremely rare particles, physicists explore the intensity frontier – using machines specialised in generating very large numbers of collisions.

The third and last is the cosmic frontier, in which scientists look for unusual particles coming from outer space. For example, early last month, researchers reported that they had detected an energetic anti-neutrino (a kind of fundamental particle) coming from outside the Milky Way participating in a rare event that scientists predicted in 1959 would occur if the Standard Model is right. The discovery, in effect, further cemented the validity of the Standard Model and ruled out one potential avenue to find ‘new physics’.

This event also recalls an interesting difference between the 2001 and 2021 announcements. The late British scientist Francis J.M. Farley wrote in 2001, after the Brookhaven result:

… the new muon (g-2) result from Brookhaven cannot at present be explained by the established theory. A more accurate measurement … should be available by the end of the year. Meanwhile theorists are looking for flaws in the argument and more measurements … are underway. If all this fails, supersymmetry can explain the data, but we would need other experiments to show that the postulated particles can exist in the real world, as well as in the evanescent quantum soup around the muon.

Since then, the LHC and other physics experiments have sent supersymmetry ‘to the hospital’ on more than one occasion. If the anomaly continues to hold up, scientists will have to find other explanations. Or, if the anomaly whimpers out, like so many others of our time, we’ll just have to put up with the Standard Model.

Featured image: A storage-ring magnet at Fermilab whose geometry allows for a very uniform magnetic field to be established in the ring. Credit: Glukicov/Wikimedia Commons, CC BY-SA 4.0.

The Wire Science
April 8, 2021

On the NASEM report on solar geoengineering

A top scientific body in the US has asked the government to fund solar geoengineering research in a bid to help researchers and policymakers know the fullest extent of their options to help the US deal with climate change.

Solar geoengineering is a technique in which sunlight-reflecting aerosols are pumped into the air, to subtract the contribution of solar energy to Earth’s rapidly warming surface.

The technique is controversial because the resulting solar dimming is likely to affect ecosystems in a detrimental way and because, without the right policy safeguards, its use could allow polluting industries to continue polluting.

The US National Academies of Sciences, Engineering and Medicine (NASEM) released its report on March 25. It describes three solar geoengineering strategies: stratospheric aerosol injection (described above), marine cloud brightening and cirrus cloud thinning.

“Although scientific agencies in the US and abroad have funded solar-geoengineering research in the past, governments have shied away from launching formal programmes in the controversial field,” Nature News reported. In addition, “Previous recommendations on the subject by elite scientific panels in the US and abroad have gone largely unheeded” – including NASEM’s own 2015 recommendations.

To offset potential roadblocks, the new report requests the US government to set up a transparent research administration framework, including a code of conduct, an open registry of researchers’ proposals for studies and a fixed process by which the government will grant permits for “outdoor experiments”. And to achieve these goals, it recommends a dedicated allocation of $100-200 million (Rs 728-1,456 crore).

According to experts who spoke to Nature News, Joe Biden being in the Oval Office instead of Donald Trump is crucial: “many scientists say that Biden’s administration has the credibility to advance geoengineering research without rousing fears that doing so will merely displace regulations and other efforts to curb greenhouse gases, and give industry a free pass.”

This is a significant concern for many reasons – including, notably, countries’ differentiated commitments to ensuring outcomes specified in the Paris Agreement and the fact that climate is a global, not local, phenomenon.

Data from 1900 to 2017 indicates that US residents had the world’s ninth highest carbon dioxide emissions per capita; Indians were 116th. This disparity, which holds between the group of large developed countries and of large developing countries in general, has given rise to demands by the latter that the former should do more to tackle climate change.

The global nature of climate is a problem particularly for countries with industries that depend on natural resources like solar energy and seasonal rainfall. One potential outcome of geoengineering is that climatic changes induced in one part of the planet could affect outcomes in a faraway part.

For example, the US government sowed the first major seeds of its climate research programme in the late 1950s after the erstwhile Soviet Union set off three nuclear explosions underground to divert the flow of a river. American officials were alarmed because they were concerned that changes to the quality and temperature of water entering the Arctic Ocean could affect climate patterns.

For another, a study published in 2007 found that when Mt Pinatubo in the Philippines erupted in 1991, it spewed 20 million tonnes of sulphur dioxide that cooled the whole planet by 0.5° C. As a result, the amount of rainfall dropped around the world as well.

In a 2018 article, Rob Bellamy, a Presidential Fellow in Environment at the University of Manchester, had also explained why stratospheric aerosol injection is “a particularly divisive idea”:

For example, as well as threatening to disrupt regional weather patterns, it, and the related idea of brightening clouds at sea, would require regular “top-ups” to maintain cooling effects. Because of this, both methods would suffer from the risk of a “termination effect”: where any cessation of cooling would result in a sudden rise in global temperature in line with the level of greenhouse gases in the atmosphere. If we hadn’t been reducing our greenhouse gas emissions in the background, this could be a very sharp rise indeed.

A study published in 2018 had sought to quantify the extent of this effect – a likely outcome of, say, projects losing political favour or funding. The researchers created a model in which humans pumped five million tonnes of sulphur dioxide a year into the stratosphere for 50 years, and suddenly stopped. One of the paper’s authors told The Wire Science at the time: “This would lead to a rapid increase in temperature, two- to four-times more rapid than climate change without geoengineering. This increase would be dangerous for biodiversity and ecosystems.”

Prakash Kashwan, a political scientist at the University of Connecticut and a senior research fellow of the Earth System Governance Project, has also written for The Wire Science about the oft-ignored political and social dimensions of geoengineering.

He told the New York Times on March 25, “Once these kinds of projects get into the political process, the scientists who are adding all of these qualifiers and all of these cautionary notes” – such as “the steps urged in the report to protect the interests of poorer countries” – “aren’t in control”. In December 2018, Kashwan also advised caution in the face of scientific pronouncements:

The community of climate engineering scientists tends to frame geoengineering in certain ways over other equally valid alternatives. This includes considering the global average surface temperature as the central climate impact indicator and ignoring vested interests linked to capital-intensive geoengineering infrastructure. This could bias future R&D trajectories in this area. And these priorities, together with the assessments produced by eminent scientific bodies, have contributed to the rise of a de facto form of governance. In other words, some ‘high-level’ scientific pronouncements have assumed stewardship of climate geoengineering in the absence of other agents. Such technocratic modes of governance don’t enjoy broad-based social or political legitimacy.

For now, the NASEM report “does not in any way advocate deploying the technology, but says research is needed to understand the options if the climate crisis becomes even more serious,” according to Nature News. The report itself concludes thus:

The recommendations in this report focus on an initial, exploratory phase of a research program. The program might be continued or expand over a longer term, but may also shrink over time, with some or all elements eventually terminated, if early research suggests strong reasons why solar geoengineering should not be pursued. The proposed approaches to transdisciplinary research, research governance, and robust stakeholder engagement are different from typical climate research programs and will be a significant undertaking; but such efforts will enable the research to proceed in an effective, societally responsive manner.

Matthew Watson, a reader in natural hazards at the University of Bristol, discussed a similar issue in conversation with Bellamy in 2018, appealing to our moral responsibilities in the same way ‘geoengineers’ must be expected to look out for transnational and subnational effects:

Do you remember the film 127 Hours? It tells the (true) story of a young climber who, pinned under a boulder in the middle of nowhere, eventually ends up amputating his arm, without anaesthetic, with a pen knife. In the end, he had little choice. Circumstances dictate decisions. So if you believe climate change is going to be severe, you have no option but to research the options (I am not advocating deployment) as broadly as possible. Because there may well come a point in the future where it would be immoral not to intervene.

The Wire Science
March 30, 2021

COVID-19, due process and an SNR problem

At a press conference streamed live on March 18, the head of the European Medicines Agency (EMA) announced that the body – which serves as the European Union’s drug and vaccine regulator – had concluded that the AstraZeneca COVID-19 vaccine was not associated with unusual blood clots that some vaccine recipients had reported in multiple countries. The pronouncement marked yet another twist in the roller-coaster ride the embattled shot has experienced over the past few months. But it has also left bioethicists debating how it is that governments should respond to a perceived crisis over vaccines during a pandemic.

Over the last two weeks or so, a fierce debate raged after a relatively small number of vaccine recipients complained of developing blood clots related to potentially life-threatening conditions. AstraZeneca, a British-Swedish company, didn’t respond to the concerns at first, even as the EMA and the WHO held their ground: the vaccine’s benefits outweighed its risks, so people should continue to take it. However, a string of national governments, including those of Germany, France and Spain, responded by pausing its rollout while scientists assessed the risks of receiving the vaccine.

Aside from allegations that AstraZeneca tried to dress up a significant mistake during its clinical trials of the vaccine as a ‘discovery’ and cherry-picked data from the trials to have the shot approved in different countries, the company has also been grappling with the fact that the shot was less efficacious than is ideal against infections by new, more contagious variants of the novel coronavirus.

But at the same time, the AstraZeneca vaccine is also one of the more affordable ones that scientists around the world have developed to quell the COVID-19 pandemic – more so than the Pfizer and Moderna mRNA vaccines. AstraZeneca’s candidate is also easier to store and transport, and is therefore in high demand in developing and under-developed nations around the world. Its doses are being manufactured by two companies, in India and South Korea, although geographically asymmetric demand has forced an accelerating vaccination drive in one country to come at the cost of deceleration in another.

Shot in the arm

Now that the EMA has reached its verdict, most of the 20 countries that had hit the pause button have announced they will resume use of the vaccine. However, the incident has spotlighted a not-unlikely problem with the global vaccination campaign – one that could recur if scientists, ethicists, medical workers and government officials don’t get together to decide where to draw the line between abundant precaution and harm.

In fact, there are two versions of this problem: one in countries that have a functional surveillance system that responds to adverse events following immunisation (AEFIs) and one in countries that don’t. An example of the former is Germany, which, according to the New York Times, decided to pause the rollout based on seven reports of rare blood clots from a pool of 1.6 million recipients – a naïve incidence rate of 0.0004375%. But as rare disorders go, this isn’t a negligible figure.
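For the record, the arithmetic behind that naïve rate: 7 cases out of 1.6 million recipients is 7 / 1,600,000 ≈ 4.4 × 10⁻⁶ – about 4.4 cases per million recipients, which is the 0.0004375% figure above.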

One component of the post-AEFI response protocol is causality assessment, and one part of this is for experts to check whether certain purported side-effects are clustered in time, and then to compare that clustering with the illness's distribution over a long period before the pandemic. It's possible that such clustering prompted health officials in Germany and other countries to suspend the rollout.

The Times quoted a German health ministry statement saying, "The state provides the vaccine and therefore has special duties of care". These care considerations include what the ministry understands to be the purpose of the rollout (to reduce deaths? To keep as many people healthy as possible?) read together with the fact that vaccines are like drugs except in one important way: they're given to healthy – not sick – people. To quote Stephan Lewandowsky, an expert in risk communication at the University of Bristol, from Science:

“You’ve got to keep the public on board. And if the public is risk-averse, as it is in Europe … it may have been the right decision to stop, examine this carefully and then say, ‘The evidence, when considered transnationally, clearly indicates it is safe to go forward.’”

On the other hand is the simpler, opposing calculus: how many people didn't develop blood clots after taking the vaccine, how many more people the virus is likely to have infected in the time the state withheld the vaccine, how many of them were at greater risk of developing complications due to COVID-19 – topped off by the fact that vaccination is voluntary. On this side of the argument, the state's carefulness is smothering: it is imposing a top-down policy without accounting for local realities or its citizens' freedom to access or refuse the vaccine during a pandemic.

Ultimately there appears to be no one right answer, at least in a country where there's a baseline level of trust that the decision-making process included a post-vaccination surveillance system that's doing its job. Experts have also said governments should consider 'mixed responses' – like continuing rollouts while also continuing to examine the vaccines, given the possibility that a short-term review may have missed something a longer-term exercise could find. One group of experts in India has even offered a potential explanation.

The background rate

In countries where such a system doesn't exist – or exists but is broken, as in India – there is actually one clear answer: to be transparent and accountable instead of opaque and intractable. For example, N.K. Arora, a member of India's National COVID-19 Task Force, told The Hindu recently that while the body would consider post-vaccination data of AstraZeneca's vaccine, it also believed the fraction of worrying cases to be "very, very low". Herein lies the rub: how does it know?

As of early March, according to Arora, the Union health ministry had recorded "50-60" cases of AEFIs that may or may not be related to receiving either of the two vaccines in India's drive, Covaxin and Covishield. (The latter is the name of AstraZeneca's shot in India.) Reading this alongside Arora's other statements and some additional facts of the case, four issues become pertinent.

First is the deceptively simple problem of the background rate. Journalist Priyanka Pulla's tweets prompt multiple immediate concerns on this front. If India had reported 10 cases of disease X in 20 years, but 10 more cases show up within two weeks of people receiving one dose of a vaccine, should we assume the vaccine caused them? No – but it's a signal that we should check for the existence of a causal link.

Experts will need to answer a variety of questions here: How many people have disease X in India? How many people of a certain age-group and gender have disease X? How many people of different religious and/or ethnic groups have disease X? How many cases of disease X are we likely to have missed (considering disease-underreporting is a hallmark of Indian healthcare)? How many cases of disease X should we expect to find in the population being vaccinated in the absence of a vaccine? Do the 10 new cases, or any subset of them, have a common but invisible cause unrelated to the vaccine? Do we have the data for all these considerations?
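To make the observed-versus-expected logic behind these questions concrete, here is a minimal sketch, with entirely hypothetical numbers – the background rate, cohort size and risk window below are placeholders, not real Indian data:

    # Minimal sketch of an observed-vs-expected check for an AEFI signal.
    # All numbers are hypothetical placeholders, not real data.
    from scipy.stats import poisson

    background_per_100k_py = 0.04  # assumed background incidence of disease X
                                   # (cases per 100,000 person-years)
    vaccinated = 5_000_000         # assumed number of people observed post-dose
    window_years = 14 / 365.25     # two-week risk window per person

    person_years = vaccinated * window_years

    # Cases we'd expect in the cohort even if the vaccine did nothing
    expected = background_per_100k_py * person_years / 100_000

    observed = 10

    # Probability of seeing at least `observed` cases by chance alone,
    # modelling case counts as a Poisson process at the background rate
    p_value = poisson.sf(observed - 1, expected)
    print(f"expected ~ {expected:.2f}, observed = {observed}, p ~ {p_value:.1e}")

A small p-value here is only the 'signal' mentioned above – a reason to investigate a causal link, not proof of one – and the calculation is only as good as its inputs, which is exactly what the questions about underreporting, demographics and common unrelated causes are probing.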

Cornelia Betsch, a psychologist at the University of Erfurt, told Science that “most of the cases of rare blood disorders were among young women, the group where vaccine hesitancy already runs highest”. Can India confirm or deny that this trend is reflected in its domestic data as well? This seems doubtful. Sarah Iqbal reported for The Wire Science in September 2020 that “unequal access to health”, unequal exposure to potentially disease-causing situations, unequal representation in healthcare data and unequal understanding of diseases in non-cis-male bodies together already render statements like ‘women have better resistance to COVID-19’ ignorant at best. Being able to reliably determine and tackle sex-wise vaccine hesitancy seems like a tall order.

The second issue is easy to capture in one question, which also makes it harder to ignore: why hasn’t the government released reports or data about AEFIs in India’s COVID-19 vaccination drive after February 26, 2021?

On March 16, a group of 29 experts from around the country – including virologist T. Jacob John, who has worked with the Indian Council of Medical Research on seroprevalence surveys and has said sceptics of the Indian drug regulator's Covaxin approval were "prejudiced against Indian science/product" – wrote to government officials asking for AEFI data. They said in their letter:

We note with concern that critical updates to the fact sheets recommended by the CDSCO’s Subject Expert Committee have not been issued, even though they are meant to provide additional guidance and clarify use of the vaccines in persons such as those with allergies, who are immunocompromised or using immunosuppressants, or using blood thinners/anticoagulants. There are gaps in AEFI investigations at the local level, affecting the quality of evidence submitted to State and National AEFI Committees who depend on these findings for making causality assessments. The National AEFI Committee also has a critical role in assessing cases that present as a cluster and to explore potential common pathways. In our letter dated January 31, 2021, we asked for details of all investigations into deaths and other serious AEFIs, as well as the minutes of AEFI monitoring committees, and details of all AEFI committee members and other experts overseeing the vaccine rollout. We have not received any response.

City of Omelas

The third issue is India’s compliance with AEFI protocols – which, when read together with Pulla’s investigation of Bharat Biotech’s response to a severe adverse event in its phase 3 trials for Covaxin, doesn’t inspire much confidence. For example, media reports suggest that medical workers around the country aren’t treating all post-vaccination complaints of ill-health, but especially deaths, on equal footing. “Currently, we are observing gaps in how serious adverse events are being investigated at the district level,” New Delhi-based health activist Malini Aisola told IndiaSpend on March 9. “In many instances local authorities have been quick to make public statements that there is no link to the vaccine, even before investigations and post mortem have taken place. In some cases there is a post mortem, in some cases there isn’t.”

Some news reports of people who died of heart-related issues at some point after taking Covishield also include quotes from doctors saying the victims were known to have heart ailments – as if to say their deaths were unrelated to the vaccine.

But in the early days of India’s COVID-19 epidemic, experts told The Wire that even when people with comorbidities, like impaired kidney function, died due to renal failure and tested positive for COVID-19 at the time of death, their passing could be excluded from the official deaths tally only if experts had made sure the two conditions were unrelated – and this is difficult. Having a life-threatening illness doesn’t automatically make it the cause of death, especially since COVID-19 is also known to affect or exacerbate some existing ailments, and vice versa.

Similarly, today, is the National AEFI Committee for the COVID-19 vaccination drive writing off deaths as unrelated to the vaccine, or is it treating them as potential AEFIs? And is the committee deliberating on these possibilities before making a decision? The body needs to be transparent on this front a.s.a.p. – especially since the government has been gifting AstraZeneca's shots to other countries, and there's a real possibility of it suppressing information about potential problems with the vaccine to secure its "can do no wrong" position.

Finally, there’s the ‘trolley problem’, as the Times also reported – an ethical dilemma that applies in India as well as other countries: if you do nothing, three people will get hit by a train and die; if you pull a lever, the train will switch tracks and kill one person. What do you do?

But in India specifically, this dilemma is modified by the fact that due process is missing; this changes the problem to one that finds better, more evocative expression in Ursula K. Le Guin’s short story The Ones Who Walk Away from Omelas (1973). Omelas is a fictitious place, like paradise on Earth, where everyone is happy and content. But by some magic, this is only possible if the city can keep a child absolutely miserable, wretched, with no hope of a better life whatsoever. The story ends by contemplating the fate of those who discover the city’s gory secret and decide to leave.

The child in distress is someone – even just one person – who has reported an AEFI that could be related to the vaccine they took. When due process plays truant, when a twisted magic that promises bliss in return for ignorance takes shape, would you walk away from Omelas? And can you freely blame those who hesitate to stay back? Because this is how vaccine hesitancy takes root.

The Wire
March 20, 2021

A tale of vortices, skyrmions, paths and shapes

There are many types of superconductors. Some of them can be explained by an early theory of superconductivity called Bardeen-Cooper-Schrieffer (BCS) theory.

In these materials, vibrations in the atomic lattice force the electrons in the material to overcome their mutual repulsion and team up in pairs, if the material's temperature is below a particular (very low) threshold. These pairs of electrons, called Cooper pairs, have some properties that individual electrons can't have. One of them is that all the Cooper pairs together form an exotic state of matter called a Bose-Einstein condensate, which can flow through the material with much less resistance than individual electrons experience. This is the gist of BCS theory.

When the Cooper pairs are involved in the transmission of an electric current through the material, the material is an electrical superconductor.

Some of the properties of the two electrons in each Cooper pair can influence the overall superconductivity itself. One of them is the pair's relative orbital angular momentum. If the two electrons' orbital angular momenta are equal in magnitude but opposite in direction, the relative orbital angular momentum is 0. Such materials are called s-wave superconductors.

Sometimes, in s-wave superconductors, some of the electric current – or supercurrent – starts flowing in a vortex within the material. If these vortices can be coupled with a magnetic structure called a skyrmion, physicists believe they can give rise to new behaviours previously not seen in materials, some with important applications in quantum computing. Coupling here implies that a change in the properties of the vortex should induce changes in the skyrmion, and vice versa.

However, physicists have had a tough time creating a vortex-skyrmion coupling that they can control. As Gustav Bihlmayer, a staff scientist at the Jülich Research Centre, Germany, wrote for APS Physics, “experimental studies of these systems are still rare. Both parts” of the structures bearing these features “must stay within specific ranges of temperature and magnetic-field strength to realise the desired … phase, and the length scales of skyrmions and vortices must be similar in order to study their coupling.”

In a new paper, a research team from Nanyang Technological University, Singapore, has reported achieving just such a coupling: they created a skyrmion in a chiral magnet and used it to induce the formation of a supercurrent vortex in an s-wave superconductor. In their observations, they found this coupling to be stable and controllable – important attributes to have if the setup is to find practical application.

A chiral magnet is a material whose internal magnetic field “typically” has a spiral or swirling pattern. A supercurrent vortex in an electrical superconductor is analogous to a skyrmion in a chiral magnet; a skyrmion is a “knot of twisting magnetic field lines” (source).

The researchers sandwiched an s-wave superconductor and a chiral magnet together. When the magnetic field of a skyrmion in the chiral magnet interacted with the superconductor at the interface, it induced a spin-polarised supercurrent (i.e. the participating electrons' spins are aligned along a certain direction). This phenomenon is called the Rashba-Edelstein effect, and it essentially converts electric charge to electron spin and vice versa. To do so, the effect requires the two materials to be in contact, and it depends, among other things, on the properties of the skyrmion's magnetic field.

There’s another mechanism of interaction in which the chiral magnet and the superconductor don’t have to be in touch, and which the researchers successfully attempted to recreate. They preferred this mechanism, called stray-field coupling, to demonstrate a skyrmion-vortex system for a variety of practical reasons. For example, the chiral magnet is placed in an external magnetic field during the experiment. Taking the Rashba-Edelstein route means to achieve “stable skyrmions at low temperatures in thin films”, the field needs to be stronger than 1 T. (Earth’s magnetic field measures 25-65 µT.) Such a field could damage the s-wave superconductor.

For the stray-field coupling mechanism, the researchers inserted an insulator between the chiral magnet and the superconductor. Then, when they applied a small magnetic field, Bihlmayer wrote, the field “nucleated” skyrmions in the structure. “Stray magnetic fields from the skyrmions [then] induced vortices in the [superconducting] film, which were observed with scanning tunnelling spectroscopy.”


Experiments like this one reside at the cutting edge of modern condensed-matter physics. A lot of their complexity lies in scientists being able to closely control the conditions in which different quantum effects play out, in using similarly advanced tools and techniques to understand what could be going on inside the materials, and in picking the right combination of materials to use.

For example, the heterostructure the physicists used to manifest the stray-field coupling mechanism had the following composition, from top to bottom:

  • Platinum, 2 nm (layer thickness)
  • Niobium, 25 nm
  • Magnesium oxide, 5 nm
  • Platinum, 2 nm

The next four layers are repeated 10 times in this order:

  • Platinum, 1 nm
  • Cobalt, 0.5 nm
  • Iron, 0.5 nm
  • Iridium, 1 nm

Back to the overall stack:

  • Platinum, 10 nm
  • Tantalum, 2 nm
  • Silicon dioxide (substrate)

The layers above the magnesium oxide make up the superconductor, the magnesium oxide is the insulator, and the rest (except the substrate) make up the chiral magnet.

It's possible to erect a stack like this through trial and error, with no deeper understanding dictating the choice of materials. But when the universe of possibilities – of elements, compounds and alloys, their shapes and dimensions, and the ambient conditions in which they interact – is so vast, the exercise could take many decades. Instead, here we are, at a time when scientists have explored various properties of materials and their interactions, and are able to engineer novel behaviours into existence, blurring the line between discovery and invention. Even in the absence of applications, such observations are nothing short of fascinating.

Applications aren’t wanting, however.


A quasiparticle is a packet of energy that behaves like a particle in a specific context even though it isn't actually one. For example, in this loose sense the proton is a quasiparticle: it's really a clump of smaller particles (quarks and gluons) that together behave in a fixed, predictable way. A phonon is a quasiparticle that represents some vibrational (or sound) energy being transmitted through a material. A magnon is a quasiparticle that represents some magnetic energy being transmitted through a material.

On the other hand, an electron is said to be a particle, not a quasiparticle – as are neutrinos, photons, Higgs bosons, etc.

Now and then physicists abstract packets of energy as particles in order to simplify their calculations.

(Aside: I’m aware of the blurred line between particles and quasiparticles. For a technical but – if you’re prepared to Google a few things – fascinating interview with condensed-matter physicist Vijay Shenoy on this topic, see here.)

We understand how these quasiparticles behave in three-dimensional space – the space we ourselves occupy. Their properties are likely to change if we study them in lower or higher dimensions. (Even if directly studying them in such conditions is hard, we know their behaviour will change because the theory describing it predicts as much.) But there is one type of quasiparticle that exists in two dimensions and is, in a strange way, quite different from the others: the anyon.

Say you have two electrons in an atom orbiting the nucleus. If you exchanged their positions with each other, the measurable properties of the atom would stay the same. If you swapped the electrons once more, bringing them back to their original positions, the properties would still remain unchanged. However, if you switched the positions of two anyons in a quantum system, something about the system would change. More broadly, if you started with a bunch of anyons in a system and successively exchanged their positions until they reached a specific final arrangement, the system's properties would have changed differently depending on the sequence of exchanges.

This is called path dependence, and anyons that possess this property are, in technical language, non-Abelian quasiparticles. They're interesting for many reasons, but one application stands out. Quantum computers are devices that use the quantum mechanical properties of particles, or quasiparticles, to execute logical decisions (the same way 'classical' computers use semiconductors). Anyons' path dependence is useful here. Arranging anyons in one sequence to achieve a final arrangement can be mapped to one piece of information (e.g. 1), and arranging them in a different sequence to achieve the same final arrangement can be mapped to different information (e.g. 0). This way, the information that can be encoded depends on the availability of different paths to a common final state.
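A toy way to see why the order of exchanges matters is to model each exchange as a unitary matrix acting on the system's state. The matrices below are arbitrary stand-ins chosen only because they don't commute – the braiding matrices of real non-Abelian anyons are fixed by the specific anyon model – but they show how the same two exchanges, applied in different orders, leave the system in measurably different states:

    # Toy illustration of path dependence: two 'exchange' operations modelled
    # as non-commuting unitary matrices. These are arbitrary stand-ins, not
    # the braiding matrices of any real anyon model.
    import numpy as np

    exchange_A = np.array([[1, 0], [0, 1j]])               # unitary, diagonal
    exchange_B = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # unitary, Hadamard-like

    state = np.array([1, 0], dtype=complex)  # initial state of the system

    path_1 = exchange_B @ (exchange_A @ state)  # apply A first, then B
    path_2 = exchange_A @ (exchange_B @ state)  # apply B first, then A

    # False: the same two exchanges, in a different order, leave the
    # system in a different final state
    print(np.allclose(path_1, path_2))

If you replaced both matrices with plain phase factors (numbers of the form e^iθ), the order would stop mattering: that is the Abelian case, and it's precisely the non-commuting, matrix-valued exchanges that earn these anyons the label 'non-Abelian'.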

In addition, an important issue with existing quantum computers is that they are too fragile: even a slight interaction with the environment can cause the devices to malfunction. Using anyons for the qubits could overcome this problem because the stored information doesn't depend on the qubits' existing states but on the paths they took to get there. So as long as the paths have been executed properly, environmental interactions that may disturb the anyons' final states won't matter.

However, creating such anyons isn’t easy.

Now, recall that s-wave superconductors are characterised by the relative orbital angular momentum of the electrons in their Cooper pairs being 0 (i.e. equal in magnitude but opposite in direction). In some other materials, the relative value can be 1. These are p-wave superconductors. And at the centre of a supercurrent vortex in a p-wave superconductor, physicists expect to find non-Abelian anyons.

So the ability to create and manipulate these vortices in superconductors, as well as, more broadly, explore and understand how magnet-superconductor heterostructures work, is bound to be handy.


The Nanyang team’s paper calls the vortices and skyrmions “topological excitations”. An ‘excitation’ here is an accumulation of energy in a system over and above what the system has in its ground state. Ergo, it’s excited. A topological excitation refers to energy manifested in changes to the system’s topology.

On this subject, one of my favourite bits of science is topological phase transitions.

I usually don’t quote from Wikipedia but communicating condensed-matter physics is exacting. According to Wikipedia, “topology is concerned with the properties of a geometric object that are preserved under continuous deformations, such as stretching, twisting, crumpling and bending”. For example, no matter how much you squeeze or stretch a donut (without breaking it), it’s going to be a ring with one hole. Going one step further, your coffee mug and a donut are topologically similar: they’re both objects with one hole.

I also don’t like the Nobel Prizes but some of the research that they spotlight is nonetheless awe-inspiring. In 2016, the prize was awarded to Duncan Haldane, John Kosterlitz and David Thouless for “theoretical discoveries of topological phase transitions and topological phases of matter”.

David Thouless in 1995. Credit: Mary Levin/University of Washington

Quoting myself from 2016:

There are four popularly known phases of matter: plasma, gas, liquid and solid. If you cooled plasma, its phase would transit to that of a gas; if you cooled gases, you’d get a liquid; if you cooled liquids, you’d get a solid. If you kept cooling a solid until you were almost at absolute zero, you’d find substances behaving strangely because, suddenly, quantum mechanical effects show up. These phases of matter are broadly called quantum phases. And their phase transitions are different from when plasma becomes a gas, a gas becomes a liquid, and so on.

A Kosterlitz-Thouless transition describes a type of quantum phase transition. A substance in the quantum phase, like all substances, tries to have as little energy as possible. When it gains some extra energy, it sheds it. And how it sheds it depends on what the laws of physics allow. Kosterlitz and Thouless found that, at times, the surface of a flat quantum phase – like the surface of liquid helium – develops vortices, akin to flattened tornadoes. These vortices always formed in pairs, so the surface always had an even number of vortices. And at very low temperatures, the vortices were always tightly coupled: they remained close to each other even when they moved across the surface.

The bigger discovery came next. When Kosterlitz and Thouless raised the temperature of the surface, the vortices moved apart and roamed freely, as if they no longer belonged to each other. In terms of thermodynamics alone, whether the vortices were paired or free shouldn't have depended on the temperature, so something else was at play. The duo had found a kind of phase transition – it did involve a change in temperature – that didn't change the substance itself but only effected a topological shift in how it behaved. In other words, the substance was able to shed energy by coupling the vortices.

Reality is so wonderfully weird. It's also curious that some concepts that loomed large when I was learning science in school (like invention versus discovery) and in college (like particle versus quasiparticle) – concepts that seemed meaningful and necessary to understand what was really going on – don't really matter in the larger scheme of things.