This is What a Cyber Security Expert Thinks About the New Government Contact Tracing App
[Article updated 18th May 2020 following a government announcement that suggested the contact tracing app rollout would be delayed until June.]
Some people collect stamps, coins or comics. Matthew Geyman, MD of IT support business Intersys, meanwhile, collects stories about cyber security failures. Some of these are of such Laurel and Hardy-level ineptitude that they would provoke laughter – if they weren’t so serious, and didn’t see private citizens’ data spilling all over the internet or, worse, surreptitiously falling into criminals’ inboxes.
‘NHS coding error sees details of 150,000 patients shared online.’ Matthew’s got that one.
‘Fraudsters withdraw cash from 20,000 Tesco Bank current accounts.’ He’s got that one too.
You get the idea. It’s part of his job to collect this stuff. After all, cyber security is a core part of his service and he and his team have written about data security extensively. So when the government unveiled plans for a centrally managed contact tracing app to help deal with the coronavirus pandemic, he was interested. To put it mildly.
Contact tracing apps are a great idea. They record details of our movements and alert us if we have been in sustained contact with someone who later self-declares Covid-19 symptoms. Then, we can protect others by self-isolating.
There are two types: centralised apps save our data to a central system. Decentralised apps, such as the Exposure Notification API developed in a joint project from Apple and Google, will alert users directly without storing information on a central server.
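To make the decentralised flow concrete, here is a toy Python sketch. It is not the actual Apple/Google protocol, which derives rotating identifiers from cryptographic daily keys; the class and method names are invented for illustration. The key idea is that phones exchange random tokens and compute exposure matches locally, so no central server ever learns who met whom.

```python
import secrets


class Phone:
    """Toy model of a decentralised contact tracing client."""

    def __init__(self):
        self.my_tokens = []      # random identifiers this phone broadcast
        self.heard_tokens = []   # identifiers heard from nearby phones

    def broadcast(self):
        # A fresh, meaningless random token: no name, number or location.
        token = secrets.token_hex(16)
        self.my_tokens.append(token)
        return token

    def hear(self, token):
        self.heard_tokens.append(token)

    def report_positive(self):
        # Only the patient's own tokens go to the server, nothing about
        # who they met or where they were.
        return list(self.my_tokens)

    def check_exposure(self, published_tokens):
        # The match is computed locally, on the handset.
        return any(t in self.heard_tokens for t in published_tokens)


# Two phones near each other exchange tokens over Bluetooth.
alice, bob = Phone(), Phone()
bob.hear(alice.broadcast())
alice.hear(bob.broadcast())

# Alice later tests positive; the server publishes her tokens only.
published = alice.report_positive()
print(bob.check_exposure(published))   # True: Bob learns he was exposed
```

Note the asymmetry: only a patient’s own broadcast tokens are ever uploaded, and the matching happens on each phone, which is what removes the central store of movements that worries critics of the centralised design.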
The former, centralised type is, according to Matthew and others in the industry, a problem from a cyber security and government surveillance point of view. A big problem.
The government may rethink its centrally managed model. At the date we put this story together (15 May 2020) there were rumours they may be entertaining a different approach. However the story unfolds, the decisions the government makes now are likely to have ramifications – in terms of the public’s trust in the government’s use, or potential misuse, of data – for years to come.
Here’s the interview.
Matthew, can you explain why the government collecting mass data in this centralised way is such a concern for you?
First off, there are some advantages to a centralised model – for example, the ability to analyse a greater amount of detailed and granular data. But it’s the justifiable fears around centralised data collection that are on many people’s minds.
There are many, many examples of ineptitude at the highest level. To give you a recent one: Sheffield City Council’s automatic number-plate recognition (ANPR) system was inadvertently exposed to the internet. Millions of road journeys, involving thousands of people, were published for the world to see.
So you’re saying something similar could happen with our data in the contact tracing app?
Yes, and it’s not just me. One hundred and seventy-five UK academics wrote to the government recently criticising a design choice that centrally stores sensitive health and travel data. It’s a hugely valuable target for hackers. The academics also suggested the information could lead to government surveillance.
But isn’t this all just shop talk from academics? Will the public really care?
The decision to use a big data model, relying on a vast centralised data store rather than distributed or mesh models, will put quite a few people’s ‘Big Brother’ antennae up. If overall uptake is in any way correlated with the responses I’ve seen on social media, a significant portion of the population will refuse to run the app on principle. Although this scepticism has come from non-technical users, who are less able to scrutinise its data protection principles, it will set the tone of the discussion.
And technical users?
The scepticism from technical users about the scope creep of this app is, if anything, even greater.
What are the consequences of this scepticism, from technical and non-technical people?
Considering the government is asking us to ‘do our duty’ and download the app (Health Secretary Matt Hancock’s words), it must be very wary of a long-term loss of public faith in data protection, especially given some previous examples of poor practice.
What could the government do to make the situation better?
If the contact tracing app’s encryption algorithms and its methods for data anonymisation and pseudonymisation are made public and judged trustworthy by the security and cryptography community – in particular, that re-identification and de-anonymisation are not feasible – it may go a long way towards assuaging the security and privacy concerns of technical, and therefore non-technical, users.
It’s worth noting the distinction: anonymisation is irreversible, so properly anonymised data cannot be re-identified; pseudonymisation (typically through data masking), however, does allow some reversal.
However, given this app’s ‘Big Brother’ associations, concerns will naturally remain that someone may have slurped or otherwise accessed the data prior to its anonymisation.
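The difference between the two techniques can be shown in a few lines of illustrative Python (the function names and token format here are invented for the sketch): pseudonymisation keeps a mapping somewhere that permits reversal, while anonymisation keeps nothing that maps back.

```python
import hashlib

# --- Pseudonymisation: reversible, because a mapping is kept somewhere ---
pseudonym_table = {}   # whoever holds this table can re-identify people


def pseudonymise(name):
    token = f"user-{len(pseudonym_table) + 1:04d}"
    pseudonym_table[token] = name   # the reversal key
    return token


def re_identify(token):
    return pseudonym_table[token]


# --- Anonymisation: irreversible, because nothing is kept to map back ---
def anonymise(name):
    # A one-way hash with no stored mapping; nobody (including us)
    # can recover the name from the digest alone.
    return hashlib.sha256(name.encode()).hexdigest()[:12]


token = pseudonymise("Ada Lovelace")
print(token)                      # 'user-0001'
print(re_identify(token))         # 'Ada Lovelace': reversible via the table
print(anonymise("Ada Lovelace"))  # opaque digest, no stored way back
```

Even the ‘anonymised’ hash hints at the re-identification risk described above: because names come from a small space, an attacker could hash candidate names and compare digests. That is exactly why publishing the precise methods for community scrutiny matters.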
You mention ‘assuaging the security and privacy concerns of technical and therefore non-technical users’. Do you mean that if we get the former on board, the latter will naturally follow?
Yes, but this scenario is a moot point – at least right now. There’s not much sign of transparency in its design just yet. Even if that changes, it may not be enough to persuade enough people, if the seeds of mistrust have already germinated.
GCHQ’s words aren’t exactly reassuring: ‘The back end is built to be as secure as is practical, but remember it holds only anonymous data and communicates out to other NHS systems through privacy preserving gateways, so data in the app can’t be linked to other data the NHS holds.’
We’re guessing it’s the ‘as secure as is practical’ bit that is worrying you? What could that even mean?
The contact tracing app can only truly work when location data and identifying information are combined in a centralised place. There are commercial organisations whose business is to gather immense amounts of seemingly innocuous data fragments, which they can combine to identify people and to infer things about them.
The government’s National Cyber Security Centre says ‘The app asks the user [for] the first part of their postcode (LS1 or SW1A, for example, for NHS resource planning, mainly)’ as well as a unique ID and your phone’s model. It also says that the random elliptic curve key pair encrypts your ID ‘in a way that only the NHS server can recover’. Well, there’s some linguistic sleight of hand and ambiguity here which raises concern. We know that commercial organisations aren’t above using illegally harvested information, and that there are hackers who’d sell it to them.
This is also pertinent given that the USA’s National Security Agency (NSA) previously ‘hamstrung’ the mathematical integrity of the iPhone’s Dual Elliptic Curve Deterministic Random Bit Generator. This matters because cryptographically secure pseudorandom number generation is critical to non-reversible encryption, yet the NSA managed to introduce a flaw which allowed it to decrypt the output later. What’s more, as revealed by Edward Snowden’s whistleblowing, the NSA even paid $10m to the security company responsible for the encryption products.
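The point about generators can be demonstrated in a few lines of Python (a toy illustration of the general principle, not a model of Dual EC DRBG itself): a deterministic, seedable generator yields the same ‘random’ keys to anyone who knows or can influence the seed, whereas a cryptographically secure generator draws on operating system entropy and has no seed to steal.

```python
import random
import secrets

# A general-purpose PRNG is deterministic: the same seed reproduces the
# entire output stream. If an attacker knows or can influence the seed
# (the essence of a backdoored generator), every "random" key is theirs.
seed = 1234
victim = random.Random(seed)
attacker = random.Random(seed)
victim_key = victim.getrandbits(128)
attacker_key = attacker.getrandbits(128)
print(victim_key == attacker_key)   # True: the key is fully predictable

# A cryptographically secure generator draws from OS entropy instead;
# there is no internal state an attacker can reconstruct from a seed.
key = secrets.token_bytes(16)
print(len(key))   # 16 unpredictable bytes, suitable for key material
```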
Furthermore, we know from Parliament’s Human Rights Committee that the data may never be deleted and will remain the property of GCHQ (the security services). We must simply trust that it will not be misused.
To put you on the spot now: given your own concerns, will you be downloading the contact tracing app and would you recommend others to?
Yes, with huge reservations, because it feels like the right thing to do if we are to tackle and defeat coronavirus.
But many people will have far greater scepticism and suspicion than I do. Given that there are prominent people who attribute Covid-19 to 5G masts, and who refuse to accept that one of mankind’s great achievements of the past 50 years is the millions of lives saved through mass vaccination, the challenge of getting people to trust the app may be greater than expected. Making the app compulsory would be a travesty for everyone, when a different approach to design could make it willingly accepted by the majority of the UK’s population.
Meanwhile, there are reports that the government is considering using the Apple/Google approach after all. Bear in mind Australia, which originally planned to use a centralised system, is switching to a decentralised one after technical problems.
To sum up, this is an utterly fascinating technical and anthropological challenge. It has the potential to do great good and, if misused or abused, to do great harm – in terms of efficacy in fighting coronavirus and to the UK public’s relationship with its system of government.
[Postscript: On 18 May, the UK government announced the phone app would appear only ‘in the coming weeks’, which many believe is later than originally suggested. This followed reports the previous day that one of the private sector companies looking after recruitment for track and trace trackers had sent emails ‘that said the process was on hold while the government considered an alternative app’.]