Technology has undoubtedly made life more bearable during a global pandemic. From drones delivering medicines and essential items across Ireland to video conferencing platforms such as Zoom creating a more connected community of remote workers, the affordances of our increasingly digital age have made isolation more tolerable by providing interconnectivity and entertainment. However, there is a darker side to the rapid uptake of technology during this international pandemic, and it concerns the security of our personal data.
The UK is currently trialling a new COVID-19 tracing service that uses Bluetooth data from mobile phones to alert users to possible contact with infected individuals. The mobile application, ‘NHS COVID-19’, is currently being rolled out on the Isle of Wight, with the prospect of a wider rollout across mainland Britain in the coming weeks. Despite the security connotations of a tracking application, the majority of Britons are in favour of allowing the government to use mobile phone data to track the spread of coronavirus. While in theory this may prove beneficial for tracking the spread of the virus, granting the government access to our personal devices is a slippery slope with several security concerns.
According to the NHS website, the goal of this application is to “reduce the transmission of the virus by alerting people who may have been exposed to the infection”. The NHS claims that “the app has been designed with privacy in mind”, going on to stipulate that it does not collect personally identifiable information (PII) from users. However, this does not put to rest the several security issues that have been raised. Despite the positive implications of a service that proposes to keep the British population aware of possible contagion, this application has a more troubling side, rooted in state surveillance and personal data security. Indeed, the ethical implications of the service were only briefly touched upon in the closing paragraphs of the statement, suggesting that ethics are not at the forefront of the debate.
Furthermore, Ian Levy of the National Cyber Security Centre wrote a lengthy blog post detailing the security parameters of the application. However, the topic of security does not fully arise until after ten paragraphs of general discussion, several of which are devoted to explaining how contact tracing has been used for more than 500 years. The fact that the app’s security is not discussed until after a drawn-out history lesson suggests that there is more here than meets the eye. Levy goes on to state multiple times that the data collected is “anonymous” and can only be used to provide health information. However, this does not mean that users are not at risk.
In fact, Levy went on to state that “the cyber security monitoring of the system keeps logs which include IP address, but they’re strictly access controlled and are only accessible to the cyber security team looking after the app system”. Nevertheless, this does not necessarily put users’ anxieties at ease, as any stored information has the potential to be exfiltrated, be it by skilled cybercriminals, an unsecured access point, or even a disgruntled insider looking to make a quick profit by flogging valuable information on the dark web.
One key issue with this application is the speed at which it is being developed. With time-sensitive projects like this, there is often a trade-off between development speed and security. Tim Erlin, VP at Tripwire, outlines this concern, stating: “it can be tough to push back on security and privacy when there’s a strong sense of urgency around any project”. Similarly, Hugo van den Toorn, manager of offensive security at Outpost24, highlighted further cause for concern, noting: “if an app is rapidly developed it is easy to make mistakes regarding user privacy and information security”. With such sensitive information on the line, it is clear that we should not rush to download any application that may put our sensitive data at risk until it has been properly vetted.
Indeed, Joshua Berry, Associate Principal Security Consultant at Synopsys, explains the potential implications of surrendering such intimate details to third parties, even under the veil of anonymity: “Contact tracing applications use Bluetooth Low Energy (BLE) advertisements to send and collect messages to identify contacts made with other users.” Because these applications require Bluetooth to remain permanently enabled, they widen the attack surface: a nearby attacker monitoring a phone’s Bluetooth traffic could infer details about the other devices it talks to, from the car a user drives to the headphones they listen to music on.
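To make the anonymity claim concrete, here is a minimal Python sketch of how a contact tracing app might derive short-lived, unlinkable identifiers to broadcast over BLE. This is loosely modelled on the Apple/Google exposure-notification approach, not the NHS implementation; the key sizes, labels and function names are illustrative assumptions.

```python
import hmac
import hashlib
import secrets

def daily_tracing_key() -> bytes:
    # A fresh random key generated on-device each day (hypothetical scheme).
    return secrets.token_bytes(16)

def rolling_proximity_id(daily_key: bytes, interval: int) -> bytes:
    # Derive a short-lived identifier for one ~10-minute broadcast window.
    # HMAC means observers cannot link two broadcasts, or recover the
    # daily key, from the identifiers alone.
    msg = b"EN-RPI" + interval.to_bytes(4, "big")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

key = daily_tracing_key()
ids = [rolling_proximity_id(key, i) for i in range(3)]
# Each interval yields a different opaque 16-byte identifier; only a party
# holding the daily key can regenerate and match them.
```

Under a design like this, the server never needs names or phone numbers; matching exposed contacts is done by re-deriving identifiers from uploaded daily keys.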
If contact tracing services are to be widely accepted, we must be assured that the data we surrender is entirely secure. This poses several questions about where sensitive data will be stored and who may have access to it. For example, the app will harvest only limited biographic information such as age, sex and postcode. Even so, users must be assured as to who will have access to this data. Will it be used solely by healthcare services, or will it be sold to data-analytics companies, or even used by the Home Office to triangulate user movement through Bluetooth handshakes?
Indeed, as more people abide by lockdown and work from home, cybercriminals are leveraging fear to lure users into clicking malicious links. Securonix revealed in a COVID-19 Cyber Threat Update that the number of malicious domains containing the words “corona” or “covid19” has increased exponentially. In the context of the NHS COVID-19 application, this matters: users expecting an update from the app may be more likely to open an email or text message containing the keywords “Coronavirus” or “COVID-19”, and therefore more likely to fall victim to medically disguised harmful content.
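The keyword pattern Securonix describes can be checked mechanically. Below is a minimal Python sketch of such a heuristic; the allowlist and keyword set are illustrative assumptions, not a real filter.

```python
# Illustrative allowlist of genuine domains (an assumption, not exhaustive).
OFFICIAL_DOMAINS = {"nhs.uk", "gov.uk"}

def is_suspicious(domain: str) -> bool:
    """Flag pandemic-themed domains that are not on the allowlist."""
    d = domain.lower().rstrip(".")
    # Subdomains of an official domain are trusted.
    if any(d == good or d.endswith("." + good) for good in OFFICIAL_DOMAINS):
        return False
    # Otherwise, pandemic keywords in the name are a red flag.
    return any(keyword in d for keyword in ("corona", "covid"))

print(is_suspicious("nhs-covid19-update.example"))  # True
print(is_suspicious("www.nhs.uk"))                  # False
```

A real mail or DNS filter would weigh many more signals (registration age, TLS certificate, reputation feeds), but even this crude check captures the trend the researchers measured.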
In fact, security experts expect the proliferation of this application to bring a significant rise in dangerously disguised mobile apps with similar or typo-squatted names. Jonathan Martin, Partner Director EMEA at Anomali, warned that “we can expect to see a whole raft of bogus apps appearing, purporting to be the official NHS COVID-19 app that people will be tempted into downloading. As soon as that rogue app is installed on the phone, it will be compromised leading to the potential theft of a whole range of private and personal information such as bank details etc”. Unlike the official app, which promises to anonymise user data, bogus shadow apps will actively seek to steal sensitive information.
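One simple defence against the lookalike apps Martin describes is fuzzy name matching before install. The sketch below uses Python's standard-library `SequenceMatcher`; the 0.8 threshold is an illustrative assumption, not a vetted value.

```python
from difflib import SequenceMatcher

OFFICIAL_NAME = "NHS COVID-19"

def similarity(candidate: str) -> float:
    # Ratio in [0, 1]; 1.0 means identical strings (case-insensitive).
    return SequenceMatcher(None, OFFICIAL_NAME.lower(), candidate.lower()).ratio()

def flag_lookalike(candidate: str, threshold: float = 0.8) -> bool:
    """Near-identical but not exact names suggest typo-squatting."""
    if candidate == OFFICIAL_NAME:
        return False  # exact match: the genuine app
    return similarity(candidate) >= threshold

print(flag_lookalike("NHS C0VID-19"))  # True: one character swapped
print(flag_lookalike("Weather App"))   # False: clearly unrelated
```

App stores apply far more sophisticated checks (publisher identity, signing certificates), but name similarity is the signal an end user can eyeball for themselves before tapping install.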
While there is certainly room for improvement, hopefully the debate surrounding this application will lead to a more security-conscious population, sceptical of any attempt to harvest vast swathes of personal information. It is one thing to be up in arms against the NHS COVID-19 app, though perhaps I am being optimistic in thinking that the government has our best interests at heart. We should take this time not just to question the government’s intentions with this amount of personal data, but also, perhaps more pertinently, to question how other services use and process our data. One must agree with Tom Davison, Technical Director at Lookout, who told us: “it is vitally important, that the public takes the time to understand the personal information they will be sharing and how it will be used, both now and in the future”.
I will leave you with this final question: do you know what permissions you have granted to the apps on your phone? If not, then do some research. After all, there is no point in lamenting the end of individual freedom when your flashlight app is tracking your geolocation…