Matthew Smith (left) and his team in Bonn study how and when people place their trust in digital systems.

Common-sense surveillance

Today’s digital surveillance capabilities are incredibly powerful. And while they can help catch criminals, they also pose a threat to privacy. Last year, Matthew Smith and his team at the Centre for Cyber Trust conducted a study to learn more about how people weigh up the pros and cons of the technology.

Surveillance—both physical and digital—is everywhere. Video cameras are installed at airports, in city centres and in parks. Smartphones record our communications and browsing habits. The police rely on surveillance to track down and profile criminals. Digital surveillance capabilities in particular are expanding rapidly—and the authorities are increasingly using the data for a variety of purposes.

But what does society think about digital surveillance? When does technology become an invasion of privacy and what uses are acceptable? These questions are a central aspect of research conducted at the Centre for Cyber Trust at ETH Zurich and the University of Bonn. Matthew Smith, professor of computer science at the University of Bonn and the Fraunhofer Institute for Communication, Information Processing and Ergonomics in Bonn, says it’s ultimately about building digital systems that people will—and can—trust.

Last year, Smith and his colleagues Lisa Geierhaas and Charlotte Mädler conducted a study to learn more about public trust in connection with digital and physical surveillance systems. A survey of roughly one thousand people was conducted in the US at the end of May and beginning of June. One key question asked survey participants to indicate how much they believe certain surveillance methods represent an invasion of privacy.

Trust is central

The results show that respondents consider digital and physical surveillance methods to be equally invasive. A large majority believe that digital surveillance technology can collect more information than traditional systems, and roughly eighty-three percent believe digital methods are misused. The corresponding figure for physical surveillance is sixty-nine percent.

An important, albeit unsurprising, finding is that surveillance systems are mainly accepted when they are used to protect people and prevent crimes such as terrorism or child abuse. “We’re talking about clear, but by no means unanimous majorities,” Smith says. Interestingly, disapproval of surveillance in such cases isn’t because the respondents think stopping crime is unimportant. Rather, such opinions stem from doubts about the effectiveness of the methods and fears that they could be misused.

This result indicates that when people mistrust surveillance systems or the operators behind them, they also object to their use—even for generally accepted purposes. “This means trust is central,” Smith says. At present, however, digital methods in particular are opaque and used somewhat indeterminately. “When the police physically execute a search warrant, the process is usually transparent and carried out for a specific purpose. It’s possible to assess whether an operation is justifiable and to criticise any oversteps. But digital surveillance systems operate largely invisibly, and ordinary, law-abiding citizens may be entirely unaware they’re being observed. This can lead to mistrust.”

Finding the right balance

One way to foster trust would be to create digital surveillance technologies that work in stages: for example, systems that can automatically identify which data are needed to help solve a crime, and that can be verified to collect no more information than is strictly necessary. “We need well-balanced systems that are broadly accepted in society,” Smith says.

This is easier said than done. The study revealed a variety of viewpoints, including nuanced and debatable perspectives alongside indecision and contradictions. For instance, most respondents believe digital measures are common—and also commonly used for questionable purposes. “But only fifteen percent are concerned that their own privacy might be at risk.” In addition, Smith notes that many survey participants selected the first answer in the questionnaire more often than the second. He explains that this pattern is also sometimes seen in elections and could suggest that respondents don’t have a strong opinion.

Attitudes of respondents are also filtered by their political leanings. The team discovered that affiliation with a political party is a decisive factor in whether or not surveillance is seen as intrusive. People who support the government believe significantly more surveillance measures are necessary, whereas the opposition is much more critical. “Without trust in the real world, there can be no trust in the digital world either,” Smith says.

Are there ways to transfer trust in the real world to the digital realm? Smith says new, purpose-oriented digital tools are one part of the solution. Another part is more transparent communication. The authorities often withhold details about surveillance measures, generally on the grounds that disclosure would jeopardise their effectiveness. “Which is true, to an extent,” Matthew Smith concedes. “But transparency could help build trust in controversial methods. I think the authorities would be doing themselves a favour by laying their cards on the table a little more.”