A Pandemic Of Data Sharing?

As the COVID-19 pandemic continues, apps and other track and trace technologies have not delivered on their promise. Strong personal data regulation, coupled with public suspicion about how personal data is collected and then used, is driving the debate over how technology can be used to combat the coronavirus and its impact.

How personal data is being used has once again come under the spotlight. COVID-19 contact tracing apps, which rely upon personal data, have generally not performed as expected, and one of the critical reasons for this lack of uptake is concern over data security. How that data is secured during the pandemic continues to be hotly debated.

Global research covering 16,000 mobile users, conducted by adtech company Ogury, reveals that in the UK, where the centralized app model has been abandoned, only 41% of citizens are willing to share data with the government to combat COVID-19, and 60% don’t trust it to protect any data they share.

In France, where the app has been downloaded by just 2% of the population, only 33% of respondents are willing to share any data to combat COVID-19, the lowest proportion of any country surveyed. Even in Germany, where downloads topped 6.5 million in 24 hours, only 36% of citizens are willing to share data, and 60% don’t trust the government to protect any data they share.

Speaking to Silicon UK, Elie Kanaan, Chief Strategy and Marketing Officer, Ogury explained: “While the UK’s COVID-19 tracing app is still yet to launch, it’s already faced heavy scrutiny for its data privacy practices, both for the centralized and decentralized model. Concerns include the access of data given to third parties, the long-term storage of data and the option of additional location tracking being introduced in the future.”

Elie Kanaan, Chief Strategy and Marketing Officer, Ogury.

Kanaan continued: “Consumers are willing to share data when the value-exchange is fair. Spotify is an excellent example of this as users can either pay a subscription and access musical content without ads, or they can access musical content for free and receive audio ads in between songs. This is a clear and fair value exchange, which consumers understand.

“The value exchange for contact tracing apps is sharing personal information to reduce the risk of spreading and catching COVID-19. While many citizens will believe this is a fair exchange, concerns still remain about how this data will be used and stored, especially in light of the fact the UK’s test and trace system may store personal data for over 20 years. These concerns need to be addressed by giving citizens clear information on exactly how data will be used and protected.”

Mounting concern about how personal data is being used has been quantified by Genesys, which specializes in cloud customer experience. According to its research, the vast majority of UK consumers (92%) say they are concerned about how companies use their data. Despite this, the study found that over half (55%) of UK consumers will continue to use a company that has accidentally leaked their personal data.

The reason? Over a quarter (26%) say they will reluctantly carry on doing business with a company after a breach if they feel there isn’t a convenient alternative. When it comes to penalties for the business itself, more than a third (34%) said that reparations to those affected are the most appropriate enforcement action. Slightly fewer (32%) believe that a government fine proportionate to the size of the data breach, such as those issued under GDPR, would be a suitable consequence.

As competing COVID-19 tracing apps continue to be developed, consumers’ attitudes towards how their data is being used in a range of scenarios from shopping to health need to be taken into consideration.

Without more widespread adoption of tracing apps and a relaxing of anxieties about how data might be misused, any future tracing app could be fatally handicapped, even if decentralized data storage and management can be achieved on local devices.
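To make the decentralized model concrete, the minimal Python sketch below illustrates the principle: devices broadcast random, rotating identifiers, contact records never leave the handset, and exposure matching happens on the device once the anonymous identifiers of confirmed cases are published. The names and structure are invented for illustration and do not reflect the real Apple/Google Exposure Notification API, which uses rotating cryptographic keys exchanged over Bluetooth LE.

```python
# A deliberately simplified sketch of the decentralised contact-tracing model:
# identifiers are random, contact records stay on the handset, and exposure
# matching happens entirely on the device. All names here are illustrative.

import os
import time


class LocalTracingStore:
    """Keeps everything on the device: our own broadcast IDs and the IDs we heard."""

    def __init__(self):
        self.my_ids = []        # identifiers this device has broadcast
        self.heard_ids = []     # (identifier, timestamp) pairs observed nearby

    def new_broadcast_id(self) -> bytes:
        """Generate a fresh random identifier; rotated frequently to prevent tracking."""
        rolling_id = os.urandom(16)
        self.my_ids.append(rolling_id)
        return rolling_id

    def record_contact(self, observed_id: bytes) -> None:
        """Store an identifier overheard from a nearby device, locally only."""
        self.heard_ids.append((observed_id, time.time()))

    def check_exposure(self, published_positive_ids: set[bytes]) -> bool:
        """Match published IDs of confirmed cases against the local contact history.

        Only the anonymous IDs of people who test positive are ever uploaded;
        the comparison runs on the handset, so contact graphs stay private.
        """
        return any(heard in published_positive_ids for heard, _ in self.heard_ids)


# Usage: two devices exchange rolling IDs; later, one user's IDs are published
# after a positive test and the other device detects the exposure locally.
alice, bob = LocalTracingStore(), LocalTracingStore()
bob.record_contact(alice.new_broadcast_id())
positive_ids = set(alice.my_ids)            # uploaded only after a positive test
print(bob.check_exposure(positive_ids))     # True -> show an exposure alert
```

The design choice at stake in the UK debate is visible in the last few lines: under the abandoned centralized model, the equivalent of the local contact list would have been uploaded to a government server for matching, which is precisely what raised the storage and third-party-access concerns quoted above.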

Health or privacy?

To gain a more detailed insight into personal data collection, COVID-19 and the future of information management, Silicon UK spoke with several leading experts.

[GT] Gus Tomlinson, General Manager, Identity Fraud, Europe, GBG.

[AB] Amanda Brock, CEO, OpenUK.

[ZR] Dr. Zulfikar Ramzan, CTO, RSA Security.

[BE] Bruce Esposito, Global IAM strategist, One Identity.

[RJ] Rahim Jina, COO, Edgescan.

[NW] Nick Watson, Commercial Partner, Keystone Law.

Is a lack of confidence in personal data security at the core of why the UK’s COVID-19 tracing app project failed, at least in its first iteration?

[GT] “Contact tracing apps are getting a lot of flak right now, but I think the conversation is moving in the right direction. Rather than people debating the morality of such tools, conversations are now moving on to more practical debates – specifically, concerns around lack of GDPR compliance, data privacy, and how best to implement these solutions so as not to break public trust. It’s all about balance and, ultimately, if end users deem that the benefits to themselves outweigh the benefits to the company or entity collecting their information, it’s generally found to be an acceptable use case for personal data.

“GDPR and similar EU regulations place the consumer at the heart of privacy, aiming to give full control to the end-user. Any company using personal data should also be doing this as best practice, and ultimately, if organizations are transparent with users, allow users to exercise their rights over their data when applicable, and use data in the best interests of the subject, then tracing apps can be created safely.

“As COVID-19 continues to drive the evolution of the digital economy, the way consumers, companies, and governments are interacting with personal data has created tension in the way people interact with their own personal information. This means achieving a balance of security with user experience – or frictionless trust – will be key. What is promising is that consumers have been clear: they want an experience that is as easy as possible, and as safe as possible.

“That means the exchange of data needs to be based on mutual advantage. In the case of track and trace, that means handing over their personal data – securely and privately – in exchange for the hopes of increased protection and lowering of the infection rate. Despite ongoing trust issues around data privacy, consumers will broadly still share their information when the benefits of doing so are clear – in this case, eradicating COVID-19.”

[BE] “I think a lack of confidence did play a role. A major issue for contact tracing apps is getting people to use them. One study claimed that at least 60% of citizens would need to download an app for it to be effective. A lack of trust in the app discourages its use. Not only a lack of trust that one’s personal data is secure but also a lack of trust of how the data will be used. If a person identifies themselves as infected what happens next? Could this affect how I am viewed by others? Could this adversely affect my employment? What personal harm could sharing this information cause?”

[NW] “Marginally, perhaps, but not substantially, although there are all sorts of tensions between GDPR and global commerce, as the recent Schrems II judgment reveals. The lawful bases in reliance on which personal data can be processed contemplate scenarios where personal data can be processed without the need for unnecessary bureaucracy, such as consent. Health data, which is a special category and subject to more stringent safeguards under GDPR, can still be processed without consent on health and public interest grounds, for example, so long as appropriate safeguards are in place.

“The critical success factors are (1) trust in the guardian(s) of the personal data collected, (2) knowledge that the collection is necessary and proportionate, (3) confidence that the data will be properly secured and so not vulnerable to hacking or other abuse/misuse, and (4) ease of use – for example, installation, battery life impact and so on. The natural inertia of laziness and selfishness will play a major part in preventing success even if the other three conditions are satisfied.”

[AB] “It appears that confidence in personal data security is playing a large part in the media response to the withdrawal of the first iteration of the track and trace app. Whether that’s the actual basis for the withdrawal is, of course, another thing.

“The involvement of Google and Apple is also likely to come under further scrutiny over the coming weeks following this week’s European Court decision in Schrems II. This decision invalidated the EU-US Privacy Shield, focusing minds on the adequacy of any data processing in the US. US data privacy laws are, of course, not deemed adequate by Europe.”

We all freely give sometimes highly personal and sensitive information to companies in exchange for goods or services we want. Why is the data a COVID-19 tracing app would use seen as so different?

[AB] “Our health data falls into a special category of data called Sensitive Personal Data, which is given special protection under privacy laws including GDPR. That matters to most people, and while we see an inherent trust in the NHS, people are concerned that their data may be shared with companies. They want to know not only what is being held and how long for, but who it is going to be used by and for what purposes.

“Personally, I think that people’s concerns go beyond that. It’s the element of being tracked that causes the concerns. The Big Brother feeling of the State knowing who you have been in contact with is real, even though it is for a quite legitimate purpose. Many people are unaware of the settings on their smartphones and just how much data is already tracked by various apps and providers. The public discussion on the track and trace app has brought this back into the public consciousness.”

[BE] “I am not sure it is seen as that much different. I think the tide is changing regarding sharing our personal information with companies. More people are becoming aware and sensitive to their information being shared. In fact, many now regularly just provide false information to companies. COVID-19 tracing is considered even more sensitive since a person is sharing personal health information and often location data.”

GDPR has just passed its second anniversary. Has the EU, in particular, created a data regulatory regime that makes it difficult for tracing apps to be created?

[ZR] “For many years there has been a fundamental misalignment of incentives. If an organization collects consumer data, and that data is either abused or inadvertently exposed, then the harm falls upon the consumer. Regulatory compliance initiatives in the data privacy domain help incentivize organizations to take at least basic measures that protect the information they collect.

“These measures do not inhibit the creation of contact tracing apps but encourage their designers to approach data protection thoughtfully. If anything, these measures should improve the underlying application. After all, contact tracing is only as useful as the data collected. The integrity and availability of the data collected are central to the correct functioning of contact tracing applications. Both aspects (integrity and availability) are core pillars of adequate data security.

“It’s important to stress that COVID-19 is the single greatest accelerant of digital transformation in recent times. Digital interactions now permeate nearly every aspect of our lives, including how we buy groceries, collaborate, learn, celebrate, and even mourn the loss of loved ones. Every time we go online, we leave a trail of digital breadcrumbs.

“When looked at individually, these digital traces seem inconsequential, but when considered in aggregate, they create a comprehensive digital dossier that stands to expose the most intimate parts of our lives. Against this backdrop, it’s more important than ever before to double down on efforts to pass legislation that incentivizes organizations to adequately care for and protect the data they collect about consumers. The stakes have never been higher.”
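Ramzan’s point about integrity can be made concrete with a short, purely illustrative Python sketch: before a device matches its local contact history against a published list of positive identifiers, it checks that the list has not been tampered with. A plain SHA-256 digest delivered over a trusted channel is used here only to keep the example short; a real system would rely on digital signatures from the health authority, and none of the function names below correspond to any actual contact tracing API.

```python
# Illustrative integrity check: the device refuses to use a published list of
# positive identifiers unless its digest matches the one it was given over a
# trusted channel. A real deployment would use proper digital signatures.

import hashlib
import json


def publish(positive_ids: list[str]) -> tuple[bytes, str]:
    """Server side: serialise the list and compute its SHA-256 digest."""
    payload = json.dumps(sorted(positive_ids)).encode("utf-8")
    digest = hashlib.sha256(payload).hexdigest()
    return payload, digest


def verify_and_load(payload: bytes, expected_digest: str) -> list[str]:
    """Device side: reject the list if its digest does not match."""
    actual = hashlib.sha256(payload).hexdigest()
    if actual != expected_digest:
        raise ValueError("published list failed integrity check")
    return json.loads(payload)


# Usage: a tampered or corrupted download would raise instead of being matched.
payload, digest = publish(["id-42", "id-17"])
print(verify_and_load(payload, digest))   # ['id-17', 'id-42']
```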

[AB] “GDPR doesn’t do so much more than the original data protection regime, and IMHO was really forced by two things: updating privacy protection to recognize that the world has changed and fixing where a large section of business had not implemented the steps. It’s like asking everyone to wear a mask on public transport, and then when they don’t comply, feeling forced to make a law requiring it whilst shaking your head in disappointment at having to do this.”

Are citizens waking up to the value their personal data holds? Is this awareness of the intrinsic value of their data influencing UK citizens’ suspicion of tracing apps, even though the apps could have a positive impact on their health?

[ZR] “There has been a rising tide of awareness about the personal data being collected about each individual. This level of increased awareness is certainly a contributing factor in the adoption of contact tracing applications. It’s also important to bear in mind that there has been an increasing chasm of ignorance regarding the massive nature of the underlying collection infrastructure and the extent to which it is highly interconnected. Even though people are more aware, in general, that the data they consider sensitive has the potential for exposure, few realize the true magnitude of the risks.”

[BE] “Yes, citizens are becoming savvy in the value and use of their personal information, but I think their suspicion of tracing apps is more around the risk of sharing their information versus the value of it. Citizens are often concerned about themselves first. They all want to be protected from others who may be infected, but they fear what might happen to them if others know they are infected. How could this impact their reputation, their employment, and their social and family life?”

[NW] “I do not believe people understand the value of their personal data. The UK population likes to think of itself as too clever to be tricked. Hence scepticism and conspiracy theories dressed up as ‘you can’t pull the wool over my eyes; I’m too smart to be conned.’

“This can manifest itself in a reluctance to adopt such admittedly slightly intrusive technologies where the benefit to the individual is not apparent; it is only indirect and remote (individuals only benefit because of the general societal benefit). The trade-off between personal data and commerce is more readily accepted because people see the direct return on their ‘investment’ – not that many see it that way; they tend to rationalize the trade of personal data for a discount or site access, etc., (inaccurately) as getting something for giving nothing.”

It’s clear that across the EU, attitudes to data exchange differ. Why is this? Is it culture, or, where tracing apps have been more widely accepted, has a lack of high-profile data breaches helped with the apps’ acceptance?

[RJ] “Culture definitely plays a huge part here. Data breaches happen all the time, everywhere. What’s important is how they are dealt with, and how seriously a government and the society of that country take data privacy. Trust versus suspicion of a government will usually trump any wider independent effort to deploy any type of app that implies any kind of tracking. Both of these have deep historical roots which are difficult to shake in the short term.

“Countries where citizens maintain a higher level of trust in their government will likely fare better in the adoption of a contact tracing app than countries where citizens are suspicious of their government. Take Ireland, for example, where a good deal of its citizens maintain a certain level of trust in their government. It recently rolled out its contact tracing app, which had a huge adoption rate in its first week. The process was handled with radical openness and transparency, which helped to snuff out any residual suspicion. Ireland suffers the same level of breaches as other countries. Trust takes time to build and is easily knocked!”

[NW] “Culture plays a large part in the social and political approach to data protection laws. In Germany, for example, where those in the East lived in a sophisticated surveillance state until 1989, the privacy of one’s personal data is particularly sacrosanct, since it represents freedom – not just from commercial exploitation, but also oppression.

“By contrast, the concept of whistleblowing, something which many in Britain and elsewhere may accept as a sensible safeguard against bullying and corruption, is hard to accept there because it carries strong connotations of the old world of Stasi surveillance. States with a stronger socialist emphasis in their politics and national culture – such as France or the Nordic countries, with their belief in the importance of the state in social welfare – have also achieved greater support for the concept, as it involves some (minor, I would argue) personal sacrifice for the greater good.”

[BE] “I do think the varied attitudes are reflective of cultural differences. A lot is influenced by how citizens view their governments. A government’s history of corruption or misuse of power can affect one’s trust. I think this trust in one’s government has a bigger impact than high-profile data breaches. These high-profile breaches are now becoming so common that society is becoming callous to them. Often, we don’t see them as affecting us personally; instead, we view them as something that affects other people.”

Are tech companies – to a degree at least – more trusted by consumers than the technologies developed by their governments?

[AB] “The company versus State dichotomy is an interesting one. I do thoroughly believe that the NHS is trusted by most of us and their bravery and commitment (despite every difficulty we have seen publicly) has only increased our trust and belief in it.

“We now need to give the UK a few weeks to see where it gets to on this. Having withdrawn that app, it’s hard to see the government coming back easily from the testing failure on the Isle of Wight. Matt Hancock took such a significant risk being so public about testing with the inevitable press reaction to failure when, of course, everyone across tech knows you test to find flaws.

“Openness will of course help that, and we have seen Germany come back from their April app withdrawal in seven weeks. Thanks to their ongoing release of code via GitHub, the German Government benefited from open source collaboration. They appeased the privacy campaigners, and it allowed over 7000 bugs in the code to be identified and fixed before it was launched. As one friend said, they did just enough to keep the privacy campaigners happy.

“Unlike Governments, tech companies have had a load of our data for years. There’s no such thing as a free lunch and no such thing as a free service. Are they more trusted? I don’t know that I would put it quite that way but sharing data with them is more normalized.”

[ZR] “Trust is a highly multifaceted notion in its own right. An entity may be considered trustworthy in one context, but not in another. For example, I may trust a particular surgeon to perform an operation on a child, but I would not necessarily trust that same surgeon to babysit that same child.

“A number of tech companies have a track record of building complex digital infrastructures, so may be more trusted by consumers in that regard. At the same time, some tech companies have business models that are predicated on monetizing data collected from their customers, and may be less trusted. For each person, the decision on whether to trust a technology company versus a government entity will be personal and based upon their perception of the relative trust trade-offs.”

The NHS App will now use tech developed by Apple and Google, both major data aggregators. Is there fundamental hypocrisy with consumer attitudes regarding the use of their personal data?

[BE] “I don’t think so. Consumers have only recently begun to realize how tech companies have been using, and at times misusing, their personal data. This lack of trust is only amplified when these same companies are now helping governments with their use of personal data. The personal damage a government can do through the misuse of personal data is far greater than what a tech company can do. India’s Aadhaar program [https://uidai.gov.in/my-aadhaar/about-your-aadhaar.html], the world’s largest digital ID program, was shown to be at risk of fraud as well as system failure. This not only resulted in citizens having money stolen, it also prevented people from accessing needed government resources, a tragedy that resulted in the death of at least 15 people.”

[ZR] “The challenge in trying to answer this question unequivocally is that trust decisions are very personal, involving the trade-offs each person makes about their data, the ability of an organization to provide a reliable service, and the value obtained for that service. The NHS may trust vendors like Apple and Google to build technology that works effectively on operating systems like iOS and Android, respectively.

“Given their knowledge of relevant technical details, tech companies may be better positioned to develop applications that can collect more reliable contact tracing results. Consumers can potentially trust that the app will function accurately. At the same time, consumers may be more wary about how their data will eventually be utilized. Ultimately, whether a consumer is willing to make the leap of faith will depend on very personal circumstances.”

[RJ] “In a strange way, people are more comfortable with big tech driving, or being part of, these initiatives than with a government-only approach. Given the involvement of Apple and Google, this technology will now be used across the globe, which gives people a sense of strength in numbers. Users can rely on the fact that there will be a high level of scrutiny on every aspect of these apps. An issue in the core tech found in one country could have potential benefits for the rest of the world.”

What do you think app developers can learn from how the COVID tracing apps have been received across the EU?

[GT] “As the government works on a sustainable solution to track and eradicate COVID-19, the critical thing to remember is that contact tracing apps are linked to real people, which means that identity verification and anti-fraud measures must be taken to validate and protect consumer data. The reasons for this are twofold: firstly, to protect consumer data privacy from bad actors, but also to ensure that the data collected is accurate and able to be used for anti-COVID purposes. These measures will only become more important as contact tracing apps are developed, and as the public continues to question the application of technology.

“Ultimately, the truth is this: technology like contact tracing will likely become standard in life post-COVID, and rather than focusing on whether the application is ‘good’ or ‘bad’, we now must turn our attention to making sure it’s a sustainable solution for businesses, governments, and most importantly, consumers. As we move forward, our collective health and security rely on building frictionless trust with app providers – whether public or private.”

[ZR] “One of the most important lessons is that COVID-19 contact tracing apps are one element of a much more complex system. Even traditional contact tracing is not carried out in isolation. Physical contact tracers follow up with potentially impacted people, provide them with advice regarding appropriate next steps (such as self-quarantine), and even help arrange resources for that person to be successful (such as delivering groceries and medications). App developers should be cognizant of the overall end-to-end ecosystem and how their piece fits into the bigger puzzle. The challenge with COVID-19 is that our understanding is changing on a minute-by-minute basis as new data is examined. It is like trying to solve a puzzle, but where the pieces are constantly changing shape.”

It looks likely that some form of trace and track app will eventually appear, probably built on technology from the tech giants consumers know well. Taking a decentralized approach to data collection may alleviate – to a degree, at least – the anxiety that is palpable across the UK.

Ogury’s Elie Kanaan concluded: “Several countries in the EU have launched contact tracing apps, but the common theme is that trust is key to achieving widespread adoption amongst citizens. Whether offering a contact tracing app or a food delivery app, consumers are more aware than ever that their data has value and that they have rights over its use. From this reception, app developers can learn how essential it is to build trust by giving citizens full control over their data. App developers should ensure they use robust consent and preference management solutions which adhere to the highest data privacy standards and therefore use data safely.”
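What a “robust consent and preference management solution” might actually record is easier to picture with a small example. The Python sketch below is an assumption-laden illustration rather than any vendor’s real schema: each consent is tied to an explicit purpose, carries a bounded retention period, and can be withdrawn at any time – the kind of control over data use that Kanaan argues citizens need.

```python
# Illustrative consent record: one purpose per consent, a bounded retention
# period, and immediate withdrawal. Field names are invented for this sketch.

from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional


@dataclass
class ConsentRecord:
    user_id: str
    purpose: str                      # e.g. "exposure-notification" or "analytics"
    granted_at: datetime
    retention: timedelta              # how long data for this purpose may be kept
    withdrawn_at: Optional[datetime] = None

    def is_active(self, now: datetime) -> bool:
        """Consent only counts if it has not been withdrawn and has not expired."""
        if self.withdrawn_at is not None:
            return False
        return now < self.granted_at + self.retention

    def withdraw(self, now: datetime) -> None:
        """The citizen keeps control: withdrawal takes effect immediately."""
        self.withdrawn_at = now


# Usage: consent is granted for one narrow purpose and a bounded period only.
consent = ConsentRecord(
    user_id="user-123",
    purpose="exposure-notification",
    granted_at=datetime(2020, 7, 20),
    retention=timedelta(days=14),     # contact data kept no longer than needed
)
print(consent.is_active(datetime(2020, 7, 25)))   # True
consent.withdraw(datetime(2020, 7, 26))
print(consent.is_active(datetime(2020, 7, 27)))   # False
```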