The Swiss population recently rejected a proposed digital ID in a referendum, driven in large part by privacy concerns over the involvement of private companies in providing the ID. This follows a notable reluctance to adopt the Swiss COVID contact tracing app, also on privacy grounds.
This trend is not limited to Switzerland, and suggests a significant lack of digital trust. This lack of trust is not always evident day to day, but these rejections show how it can limit our digital future if not addressed.
A general ‘techlash’ is taking place, which began with the realisation that free services come in exchange for personal data, and that this bargain has a flip side relating to the privacy and security of that data.
As our use of the Internet increased in response to COVID-19 pandemic restrictions, these fears have grown. They are then expressed when we are confronted with a discrete choice about our data, such as with the digital ID.
The digital ID result was interesting, because we do not typically get to measure the level – or absence – of digital trust. That is, in part, because there seems to be a difference between existing and new services.
Specifically, while digital trust has been falling, there is little evidence of reduced usage of existing services. That may be because of familiarity with existing services and a comfort with the perceived risks; an unwillingness to give up the benefits of popular services; and a general herd mentality to use what everyone else is using.
On the other hand, there is no familiarity with new services, and no immediate benefits to give up if they are not used. However, if new services are not adopted, their long-run benefits are never realised. Perhaps nothing illustrates this better than the recent experience with COVID-19 contact tracing apps.
Contact tracing is a traditional way of cutting down transmission by isolating those with whom a sick person had contact while infectious. While the spread of COVID quickly overwhelmed manual contact tracing, the prevalence of smartphones provided a new tool to automate contact tracing.
An early study by the University of Oxford claimed that if 60% of citizens used a contact tracing app, the pandemic would be stopped – and initial surveys showed a high willingness to use the apps.
Needless to say, it did not work out that way.
The uptake of the apps was well below 60% in all countries – around 20% in Switzerland – and the key reason given was worry about privacy. However, many of the apps are based on a privacy-protecting Exposure Notification feature developed by Apple and Google for their phones, which identified users only by random ID, did not track location, and provided no information to any central authority.
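The privacy model behind that design can be sketched in a few lines. The following is a deliberately simplified illustration, not the actual Apple/Google protocol: phones broadcast short-lived random identifiers, remember the identifiers they hear nearby, and match them locally against identifiers voluntarily published by users who test positive. All names here are hypothetical.

```python
import secrets

class Phone:
    """Hypothetical sketch of an Exposure Notification-style phone."""

    def __init__(self):
        self.own_ids = []       # random IDs this phone has broadcast
        self.heard_ids = set()  # IDs received from nearby phones

    def new_broadcast_id(self):
        # A fresh random identifier: it reveals nothing about the
        # owner and carries no location information.
        rid = secrets.token_hex(16)
        self.own_ids.append(rid)
        return rid

    def hear(self, rid):
        # Store an ID heard from a nearby phone, locally only.
        self.heard_ids.add(rid)

    def check_exposure(self, published_ids):
        # Matching happens on the device; no central authority
        # ever learns who met whom.
        return bool(self.heard_ids & set(published_ids))

# Two phones meet: each records the other's rotating random ID.
alice, bob = Phone(), Phone()
bob.hear(alice.new_broadcast_id())
alice.hear(bob.new_broadcast_id())

# Alice tests positive and consents to publishing her random IDs.
published = alice.own_ids

print(bob.check_exposure(published))      # Bob was nearby: True
print(Phone().check_exposure(published))  # A stranger was not: False
```

The point of the design is visible in the code: the only data that ever leaves a phone is a list of meaningless random strings, and only with the user's consent.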
While studies showed that use of the contact tracing apps did reduce transmission, it was clearly not enough to stop the pandemic.
One thread running through the digital ID vote and contact tracing app adoption is a lack of trust in the companies providing the services, and in whether people’s data will be secure. Yet every day we trust companies with our valuables and our lives. We put our money into banks, where we cannot see it, but we are confident it will be there when we need it; we put food and pharmaceuticals into our bodies that we cannot test but we trust; and we put our bodies into cars, elevators, and airplanes generally without a second thought. It was not always so – just think how car safety has evolved over the past 50 years.
Each of these industries built trust through continual oversight. Bank deposits are guaranteed to protect depositors, and banks must comply with regulations. Cars must meet safety standards; they are tested and rated, and manufacturers face liability if they fall short. The same is of course true for pharmaceuticals – as we are acutely aware as we desperately follow vaccines through testing and certification.
While government has the ultimate regulatory and enforcement role, other organisations may play a third-party role in developing standards, certifications, and ratings.
The next test of digital trust is likely to come with the adoption of digital vaccine passports, which could finally help restore our lives, but are already raising privacy concerns. This lack of adoption of new digital services should be a wake-up call for all stakeholders – governments, companies, researchers, and civil society – who might otherwise be lulled by the continued use of existing digital services in spite of the techlash.
This lack of trust is not unique to the tech sector, however, and the way other industries overcame it can provide lessons in how to build the trust that will be needed for our digital future.
The Graduate Institute will organise a conference on these digital trust issues with the EPFL on 15 October. Please save the date; more information on the agenda and invited speakers will follow.