This Expert Comment follows Privacy International's participation in 'The Future of Financial Information-Sharing Partnerships in Europe' discussion roundtable in Brussels, held on 9 January 2018 - please find the FFIS discussion summary of that event here.
As with any sector, the detection and investigation of criminality must protect the human rights of us all. As new ways of fighting money laundering and terrorist financing continue to develop through, for example, public/private financial information sharing partnerships (or FISPs), we have to ensure that the right to privacy is protected.
Anti-money laundering regulations in the financial sector require private entities to play an almost unique role in identifying and reporting suspected criminality. The financial sector is thus an interesting area for Privacy International: financial data can reveal sensitive information about our lives, and new technologies mean that the amount of data gathered and used by the sector is increasing. This has potential implications for developments in surveillance more broadly, forming part of what the United Nations High Commissioner for Human Rights has described as the "growing reliance by Governments on the private sector to conduct and facilitate digital surveillance."
Yet public understanding, and that of civil society organisations, of the wide-ranging forms of financial surveillance that exist, including information-sharing partnerships, is limited. For those not specialised in the field, the financial sector can often be opaque and complex. Combined with the added opacity of the practices of intelligence and law enforcement agencies, this leaves significant gaps in the public's awareness of how personal data is being used. It risks leaving the existing system, and proposed developments such as FISPs, unchallenged by public scrutiny.
It is thus particularly important that obligations to respect and protect privacy are met. Under international human rights law, and in the UK, the Human Rights Act, interferences with our right to privacy have to be both “in accordance with the law” and “necessary and proportionate”.
Banks have access to a huge amount of data about our lives: not only where we go, what we buy, and how we earn our livelihoods, but also who we're connected to; they may even gather photos of us every time we use their ATMs. As well as being used to make financial decisions about us – and to decide what products to try to sell us – this data is increasingly being used to detect criminality, and it can be shared through these public/private partnerships.
This is all occurring even as the amount of data generated by and accessible to the financial industry is growing dramatically. Financial institutions and the regtech industry are now seeking access to data that resides outside of our financial relationships with institutions, in the form of publicly available data and social media. The mosaic of someone's life that emerges from the collection and analysis of this data at volume, even when it is ostensibly "publicly available", means that its use has to be closely regulated. Indeed, the European Court of Human Rights has long held that "there is […] a zone of interaction of a person with others, even in a public context, which may fall within the scope of 'private life'", particularly when this data is systematically or permanently recorded.
With the vast data accessible to the financial industry, the obligation to conduct surveillance becomes even more important to critically assess and question. In theory, many of these surveillance mechanisms, including public/private partnerships in financial crime, are meant to help the private sector identify 'suspicious' behaviour and report it to the authorities. However, despite the anti-money laundering system having been in operation for 25 years, there are still major unresolved questions about what 'suspicious' means in the context of financial behaviour. Human judgements and broad indicators of suspicion can result in individuals or transactions being flagged as 'suspicious', with damaging implications for individuals who may, after all, be entirely innocent.
New forms of data about people's lives may come to be treated by our surveillance systems as indicators of criminality without any assessment of whether this is legal, fair, or effective. This problem of proportionality and fairness could be compounded by public/private partnerships working from very large combined data sets and using technologies like artificial intelligence to spot patterns and detect criminality. As the field of predictive policing shows, using artificial intelligence and algorithms to make decisions on a limited data set can result in deeply prejudicial outcomes.
We should remember that the starting point in financial crime is that, typically, 80-90% of Suspicious Activity Reports from the private sector are not relevant to law enforcement investigations, and that only 1% of laundered money is estimated to be recovered. The danger is that using data and analytics in this context may reinforce existing bias in historical data while ignoring genuine criminality that doesn't 'fit the mould'.
If new approaches to public/private information-sharing to detect financial crime, money laundering and terrorist financing do not address these underlying concerns, there is a danger that they will fail to meet key tests of proportionality and necessity, as well as broader public tests of fairness and justice. To counter this risk, we need to make sure that there is sufficient independent oversight to protect our right to privacy.
The consequences, if financial surveillance developments in this sector, including information-sharing partnerships, are handled badly, would be severe. The risk is not only to the confidentiality of an individual's personal data and finances: such developments have the potential to adversely affect people's lives in ways that are unfair. The accumulation of data from an increasingly broad range of sources, shared across organisations, increases the risk of breaches and other security concerns. As risks turn into 'incidents', including within these public/private partnerships, these factors can undermine the confidence that people and communities have in financial professionals, and indeed trust in the entire formal financial sector.