CORDIS - EU research results

Privacy and Usability


New tools enable data sharing in good conscience

Our most private and sensitive information will never be entirely safe unless we have the means to make appropriate data sharing decisions. The PrivacyUs project not only provides such tools, it built them on the basis of extensive, unprecedented behavioural research.

Digital Economy

Data breaches and leaks of private information have made us increasingly wary of websites and cloud platforms asking too many questions. Sure, SSL certificates, incognito windows and the GDPR have somewhat eased these concerns. But does this mean we’re finally using the internet as we should? The PrivacyUs (Privacy and Usability) project, undertaken with the support of the Marie Skłodowska-Curie programme, investigated common behaviours and developed new tools to make the web safer for us all.

As Leonardo Martucci, associate professor of Computer Science at Karlstad University and coordinator of PrivacyUs, explains: “The greatest threat to individual data owners remains the unintended/unexpected use or disclosure of their personal data. The GDPR was certainly a step in the right direction, and so is the increased public awareness resulting from reports of personal data breaches in newspapers. But even today, individuals continue to share their personal data inadequately.”

To find out why this happens, Martucci and his team studied our attitudes and behaviour as we make privacy-related decisions online. They examined how visual cues and graphical representations influence decisions to share or not to share personal data, as well as how the choice not to share may negatively influence how others perceive us.

How we see and how we are seen

“The results of our experiments on users’ emotional state, visual cues and graphical representations of privacy policies show that individuals can be nudged towards deciding to share or not to share their data depending on how requests are displayed. This is important because it demonstrates that it is possible to influence user decisions,” Martucci adds. Such results also reinforce the idea that privacy-related decision-making is neither purely rational nor purely irrational. Many factors influence such decisions: some relate directly to our personality, while others – such as the context of the interaction – are purely extrinsic.

Once developers and designers know exactly which factors are at play, they can develop usable security and privacy tools with user-centred designs. This is precisely what the PrivacyUs team did, looking into very specific cases to identify measures that make users more aware of their data privacy. In one instance, they took the case of gay men engaging on dating applications. Are they willing to disclose their HIV status? How does disclosing (or not disclosing) this highly sensitive information affect how other platform users perceive them? Based on its findings, the team proposed a number of design considerations to mitigate the stigmatisation of users based on their choice.

Another issue the project focused on is NFC payment. “We have redesigned the NFC payment experience to improve its usability, security and privacy. The end result is an improved screen design and sensory feedback for NFC payment terminals. Besides, we looked into personal data leaks in mobile applications and studied the positive impact of the GDPR on reducing the number of personal data leaks,” Martucci notes.

A third, very contemporary example that the team touches upon is the overall lack of privacy-related graphical interfaces in IoT devices.
“We suggest the use of nutrition-like privacy labels to be printed on the package of those devices, so that users can easily compare IoT devices and decide beforehand on conditions for sharing their personal data. These are similar to the labels that will be adopted by Apple’s App Store starting in December 2020,” says Martucci. No matter the application, the PrivacyUs mathematical models can represent the interacting and mutually reinforcing factors involved when users decide whether or not to share personal data. The project’s security and privacy tools, its legal analysis of unfair data practices and its proposed legal measures to preserve the privacy autonomy of individual users are all bound to help design better IT tools and applications that we can all use with confidence.
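To see why such labels make comparison easy, it helps to think of each one as a small structured record. The sketch below is purely illustrative – the field names and values are hypothetical and not taken from the PrivacyUs project – but it shows how two devices carrying nutrition-style privacy labels could be compared at a glance, or programmatically:

```python
from dataclasses import dataclass

@dataclass
class PrivacyLabel:
    """A hypothetical nutrition-style privacy label for an IoT device."""
    device: str
    data_collected: frozenset      # categories of personal data gathered
    shared_with_third_parties: bool
    retention_days: int            # how long the data is kept

def less_invasive(a: PrivacyLabel, b: PrivacyLabel) -> str:
    """Return the name of the device whose label collects fewer data categories."""
    return a.device if len(a.data_collected) <= len(b.data_collected) else b.device

# Two made-up devices with very different privacy footprints
cam = PrivacyLabel("SmartCam", frozenset({"video", "audio", "location"}), True, 365)
bulb = PrivacyLabel("SmartBulb", frozenset({"usage"}), False, 30)

print(less_invasive(cam, bulb))  # prints "SmartBulb"
```

A standardised label format like this would let users decide on sharing conditions before purchase, which is exactly the kind of informed, up-front choice the project advocates.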


PrivacyUs, data sharing, user behaviour, privacy, GDPR, HIV, IoT, NFC
