Kate Rogers: Data can be useful, but it must be tamed first

Investment managers rely on data, but inherent flaws have to be ironed out

Kate Rogers

I’m an investment manager, so I’ve always thought of data as my friend. How else do I make informed judgements? How can I test my theories and appraise my theses?

So the huge boom in the amount of data available over the past two decades must surely present a great opportunity for investors. But how to tame the proliferation of information available, translating it from "big data" to investment insights?

Many of the large fund managers have tried to answer this challenge by investing in teams of data scientists. Our own team, at Schroders, includes two members hired from the McLaren Formula 1 team. And they’ve delivered some fascinating insights.

One example used natural language processing, a machine-learning technique, to analyse content and cluster similar semantics, allowing our global equity team to spot patterns in Mark Zuckerberg’s public comments and reduce exposure to Facebook ahead of the Cambridge Analytica scandal.
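The clustering idea can be sketched in miniature. The toy below is illustrative only, not Schroders' actual pipeline (which would use proper NLP embeddings rather than raw word counts): it represents each comment as a bag-of-words vector and greedily groups comments whose cosine similarity exceeds a threshold, so near-paraphrases land in the same cluster. All texts and the threshold are invented for the example.

```python
from collections import Counter
import math

def vectorize(text):
    # Bag-of-words term frequencies: a crude stand-in for the
    # semantic embeddings a real NLP system would produce.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-frequency vectors.
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def cluster(texts, threshold=0.5):
    # Greedy clustering: attach each text to the first cluster whose
    # representative (first member) is similar enough, else start a
    # new cluster. Real systems would use k-means or similar.
    clusters = []
    for t in texts:
        v = vectorize(t)
        for c in clusters:
            if cosine(v, c[0][1]) >= threshold:
                c.append((t, v))
                break
        else:
            clusters.append([(t, v)])
    return [[t for t, _ in c] for c in clusters]

# Hypothetical public comments, chosen so two are near-paraphrases.
comments = [
    "we take user privacy very seriously",
    "user privacy is something we take seriously",
    "our advertising revenue grew this quarter",
]
groups = cluster(comments)
```

Run on the three hypothetical comments above, the two privacy statements share five words and cluster together, while the revenue comment shares none and forms its own group. Spotting a shift in which cluster dominates over time is the kind of pattern the global equity team was looking for.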

Another example used geo-data to estimate how many William Hill betting shops would need to close should betting regulation change: the analysis predicted 929; the company eventually announced 900. These insights can clearly deliver real value to investors; but context is key.

I have two examples of the dangers of over-reliance on data. The first comes from my summer reading, Invisible Women by Caroline Criado Perez. She talks about the increasing use of algorithms and machine learning based on big data, but finds that many of the large datasets are gender-biased.

For example, a University of Washington study found Google Images under-represented women across the 45 professions that were tested, with the largest discrepancy at chief executive (women made up 11 per cent of the image results, but 27 per cent of chief executives in reality).

Google Translate will also convert gender-neutral Turkish sentences such as "O bir doktor" (s/he is a doctor) into the gender-stereotypical "he is a doctor", with the reverse effect for a nurse. Algorithms often amplify this bias, and it is not difficult to imagine how this could disadvantage women. Unconscious bias in artificial intelligence could well prove to be as prolific as in humans.

My second example of flawed data is the use of environmental, social and governance ratings to construct responsible investment portfolios.

These ratings, compiled by agencies, are backward-looking and based on tick-box questionnaires. Our analysis, however, suggests ESG ratings have no clear predictive value, with better-rated companies slightly more likely to experience controversies than worse-rated ones.

There is also a lack of consistency between agencies, with less than a 15 per cent chance that the same company would get the same rating across the three major ESG rating providers.

So although I believe passionately that data provides opportunities for better decision-making, I come back to the need to own the analysis and fully understand the data.

The old adage of "garbage in, garbage out" holds true: we are unlikely to get great insight if our data inputs are flawed. And I believe that good investment decisions come from a depth of understanding supported by rigorous, forward-looking research.

Kate Rogers is head of policy at Cazenove Charities
