In 1935, officials in the British Air Ministry were trying to figure out whether it was possible to shoot down enemy aircraft with a death ray. Reader, they did not succeed.

Fortunately, the effort spawned something much more useful. Robert Watson-Watt and Arnold “Skip” Wilkins of the Radio Research Station suggested a better use for radio beams: spotting incoming bombers when the beams reflected off them.

The resulting radar system was indispensable in fending off the Luftwaffe five years later. When a threat approaches, it helps to be able to see exactly where it is.

The pandemic has taught us the same lesson, the hard way. Weaknesses in our information systems have been telling. The tragic failure to produce enough accurate Covid-19 tests swiftly — particularly shocking in the US — is well known.

Subtler failures have received too little attention. Consider this paragraph about social care, from a new report by the UK fact checkers Full Fact: “Basic information, such as the number of people receiving care in each area, was not known to central government departments, and local authorities only knew about those people whose care they paid for.” Patchy data cannot have made protecting care homes any easier — nor, more recently, vaccinating their residents.

Alexis Madrigal, co-founder of The Covid Tracking Project in the US, attests that the UK is not alone. At the beginning of the crisis, he says, “We didn’t even know how many hospitals there were in the United States.”

That may seem surprising. Yet useful statistics do not simply arrange themselves neatly in a spreadsheet somewhere, waiting to be downloaded. They must be collected: someone must set the standards, link up the systems, hire the personnel.

If not, there are gaps. Before the pandemic, Caroline Criado Perez’s book Invisible Women highlighted that many data sets fail to distinguish between men and women. Yet in the early days of Covid-19 neither the UK nor the US disaggregated cases by gender, although we now know Covid-19 is more dangerous for men.

What about data on ethnicity? “Incredibly uneven,” says Madrigal. It is not just a matter of clinical care or vulnerability, important though these are. Are police using their new powers — for example to fine or arrest those in breach of social-distancing rules — in an even-handed way with regard to race? We don’t have the data to know.

Food banks have been a vital resource for some households during the crisis, but nobody really knows how many. In the UK, no systematic data exist.

Then there are basic gaps in information management systems. Nobody is likely to forget the moment last autumn when Public Health England mislaid nearly 16,000 positive cases, reportedly because an Excel spreadsheet ran out of rows. We don’t know exactly what went wrong because PHE has not fully explained it. That does not bode well for preventing a repeat.

Why does this matter? When our information systems fail, we are flying blind. There are the basics: we cannot find cases, we cannot run an efficient test-and-trace programme, we cannot easily see which hospitals are most in need of help. There are the big strategic calls: when to impose or lift restrictions based on prevalence. And there are the long-term issues: how can we target assistance to help rebuild the economy? Which children have fallen behind their peers? How is the criminal justice system bearing up? We know less than we should.

Yet there have been impressive successes. “The UK statistical system responded,” says Rebecca Hill, lead author of the Full Fact report. “It is a real testament to their ability to innovate, contrary to their reputation.”

The Office for National Statistics, for example, rapidly set up a large representative survey of the prevalence of infection in the population. Economists have quickly looked to unconventional sources, such as mobility data from Google, to understand the shape of the crisis. Data on US hospitals were successfully rebuilt from the ground up over the summer. And non-governmental operations ranging from Our World in Data to The Covid Tracking Project to Johns Hopkins University have performed heroic efforts in assembling clear, usable information from a messy patchwork of primary sources.

We take good data for granted until something goes wrong — at which point it is too late. Instead we should make it a priority. For example, UK border infrastructure is being redesigned for a post-Brexit world. Decisions made now about how the IT systems work will lock in knowledge, or ignorance, for a generation. More broadly, Full Fact argues that statisticians should be doing regular horizon scanning — teaming up with experts from all fields to ask: what gaps exist now? What data should we be gathering with the future in mind?

Robust information systems are not free. They require time, attention and money — but they can pay for themselves over and over again in better decisions taken, and better democratic accountability after the fact.

When the British showed the Americans their cutting-edge radar equipment in 1940, the US response was to pour resources into developing every possible application. Ten Nobel laureates emerged from the project — as did the radar technologies that did so much to win the war.

It isn’t cheap to build the systems that show you what’s coming at you. But failing to build them? That’s far more expensive.

Tim Harford’s new book is “How to Make the World Add Up”