Addressing Privacy Challenges in Social Media

Final Report Summary - SOCIALPRIVACY (Addressing Privacy Challenges in Social Media)

The SocialPrivacy project aims to understand the issues of privacy at the level of users, commercial enterprises and society (Diagnostic Phase), and then to propose solutions (Generative Phase) that help manage the trade-offs inherent in allowing companies to access users’ personal data while providing sufficient privacy safeguards.

In this project we have shown that people’s personal data is collected on a large scale [1]; however, people are often unaware of this kind of access even though they must grant the required permissions upon installation [1], [5] or provide explicit consent. In particular, it is often difficult or impossible for people to understand the reasons behind this access and the intended use (purpose) of the data itself [1], [5]. Even if there is a legitimate need to access the information, the data may be reused for other purposes completely unrelated to the original one, and it is often impossible to determine whether such secondary uses occur.
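To make this gap concrete, the sketch below (Python; the permission names are hypothetical, modelled on Android's) shows what an install-time prompt typically conveys: the type of access, with no statement of purpose or of possible secondary uses.

    # Illustrative sketch: an install-time prompt lists *what* an app may
    # access, but attaches no statement of *why* (purpose) or of possible
    # secondary uses. Permission names are hypothetical, modelled on Android's.
    requested_permissions = {
        "ACCESS_FINE_LOCATION": None,  # no purpose string exists to show
        "READ_CONTACTS": None,
        "INTERNET": None,
    }

    def render_prompt(permissions: dict) -> None:
        """Print what users actually see: access types, with purposes missing."""
        for name, purpose in permissions.items():
            print(f"{name}: {purpose or 'purpose not disclosed'}")

    render_prompt(requested_permissions)
    # ACCESS_FINE_LOCATION: purpose not disclosed
    # READ_CONTACTS: purpose not disclosed
    # INTERNET: purpose not disclosed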

People’s personal data is often accessed precisely because they are not aware of it. After testing small improvements aimed at explaining the type of access [2], [5], we have shown that when people understand it, they share less and make choices geared towards safeguarding their privacy [5]. Preserving people’s privacy can be technically challenging, but we have shown it is attainable [3] with little to no effect on the performance of the tool itself (compared to similar, non-privacy-preserving tools). Current technologies and mechanisms are inadequate: they do not show the information people require to make informed choices about granting access to their data. Large and small corporations are able to collect this data because such easy and clear transparency mechanisms are lacking.
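One general way a tool can preserve privacy at negligible performance cost is to coarsen sensitive data on the device before any service sees it. The following is a minimal Python sketch of that technique, with hypothetical values; it is illustrative only and not the specific architecture used in [3].

    # Minimal sketch of client-side coarsening for a location-based app:
    # the service only ever receives an approximate position, and the
    # extra computation (a rounding step) is negligible.
    def coarsen(lat: float, lon: float, decimals: int = 2) -> tuple:
        """Round coordinates to roughly 1 km precision (2 decimal places)."""
        return (round(lat, decimals), round(lon, decimals))

    exact = (42.361145, -71.092003)   # hypothetical precise GPS fix
    shared = coarsen(*exact)
    print(shared)                     # (42.36, -71.09): still useful for,
                                      # e.g., local weather or nearby venues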

Personal data is extremely profitable and can generate enormous revenues; for example, Google’s main source of revenue is targeted advertising, which uses personal information to match users with companies selling a product. The lack of extensive, detailed privacy regulations, the difficulty of measuring compliance [4] and the difficulty of implementing and managing accountability [7] are enabling large-scale (and unsupervised) access to people’s personal data, which erodes the bedrock principles of privacy.

Companies may need users’ personal data for revenue, market research and other purposes; to balance the importance of user privacy against the needs of these companies, privacy settings and/or mechanisms need to convey, in a simple manner, the information users require to make informed decisions.
While we have found the purpose of data access to be the predominant factor affecting users’ choices, we have shown that people’s privacy profiles differ and can also depend on their trust in the service accessing the data and the context in which the data is accessed [6]. People should be informed about and understand all the implications of sharing their data; we have shown that they are often willing to share their personal data, especially if they can see a benefit to themselves (for example an improved service, or targeted ads).
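The sketch below illustrates, in Python, how such a multi-factor sharing decision could be modelled, with purpose weighted most heavily in line with the findings above; the weights and threshold are hypothetical, and this is not the model from [6].

    # Illustrative decision model: purpose, trust and context each score
    # in [0, 1]; purpose dominates, reflecting the project's findings.
    # All weights and the threshold are hypothetical.
    WEIGHTS = {"purpose": 0.5, "trust": 0.3, "context": 0.2}

    def willing_to_share(purpose: float, trust: float, context: float,
                         threshold: float = 0.6) -> bool:
        score = (WEIGHTS["purpose"] * purpose
                 + WEIGHTS["trust"] * trust
                 + WEIGHTS["context"] * context)
        return score >= threshold

    # A clearly beneficial purpose (e.g. improving the service) can tip
    # the decision even with middling trust and context:
    print(willing_to_share(purpose=0.9, trust=0.5, context=0.5))  # True
    print(willing_to_share(purpose=0.2, trust=0.5, context=0.5))  # False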

The ability to visualize people’s data can lead to personal privacy disclosures [8], which could cause very tangible harm to people [9]. Visualizing the possible implications of data access [10] (who can access it, what can be inferred, etc.) and/or implementing smart privacy-preserving technologies [11] can guide people and help them understand the possible benefits and risks associated with sharing their information publicly, or even only with semi-controlled groups of individuals.
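As a toy example of the kind of inference such visualizations can surface (in the spirit of [8], though not its actual method), late-night posts in publicly shared location data tend to cluster around a likely home; all data points below are hypothetical.

    # Toy inference from publicly shared, timestamped locations: late-night
    # posts cluster around a likely home. All data points are hypothetical.
    from collections import Counter

    # (hour_of_day, rounded_lat, rounded_lon) extracted from public posts
    posts = [
        (23, 42.37, -71.11), (1, 42.37, -71.11), (2, 42.37, -71.11),
        (13, 42.36, -71.09), (18, 42.35, -71.06), (0, 42.37, -71.11),
    ]

    night = [(lat, lon) for hour, lat, lon in posts if hour >= 22 or hour <= 5]
    likely_home, count = Counter(night).most_common(1)[0]
    print(f"Likely home near {likely_home} ({count} late-night posts)")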

References
[1] Liccardi I., Pato J. and Weitzner D. J., “Improving User Choice Through Better Mobile Apps Transparency and Permissions Analysis”, Journal of Privacy and Confidentiality, Vol. 5, Iss. 2, Article 1, 2014.
[2] Paradesi S., Liccardi I., Kagal L. and Pato J., “A Semantic Framework for Content-Based Access Controls”, in Proceedings of IEEE PASSAT, pp. 624-629.
[3] Sweatt B., Paradesi S., Liccardi I., Kagal L. and Pentland A., “Building Privacy-Preserving Location-Based Apps”, in Proceedings of IEEE Privacy Security and Trust 2014, pp. 27-30.
[4] Liccardi I., Bulger M., Abelson H., Weitzner D. J. and Mackay W., “Can Apps Play by the COPPA Rules?”, in Proceedings of IEEE Privacy Security and Trust 2014, pp. 1-9.
[5] Liccardi I., Bulger M., Abelson H., Weitzner D. J. and de Roure D., “No technical understanding required: Helping users make informed choices about access to their personal data”, in Proceedings of the ACM 11th International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services (MobiQuitous), pp. 140-150.
[6] Shih F., Liccardi I. and Weitzner D. J., “Privacy Tipping Points in Smartphones Privacy Preferences”, in Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (ACM CHI ’15), 2015, pp. 807-816.
[7] Rahman A., Liccardi I. and Weitzner D. J., “Overview of the accountability area of research”, draft report, pp. 1-90.
[8] Liccardi I., Abdul-Rahman A. and Chen M., “I know where you live: Inferring details of people’s lives by visualizing publicly shared location data”, under submission to the 34th Annual ACM Conference on Human Factors in Computing Systems (ACM CHI ’16), 2016, pp. 1-11.
[9] Liccardi I., Abdul-Rahman A. and Chen M., “Measuring people’s attitudes towards differential pricing using Game Theory”, final draft, to be submitted to ACM Transactions on Computer-Human Interaction (TOCHI).
[10] Liccardi I., de Roure D. and Chen M., “Social Footprints: Visualizing people’s personal data”, draft, to be submitted to ACM Transactions on Computer-Human Interaction (TOCHI).
[11] Patton E. W. and Liccardi I., “Linked Data and Mobile Application Privacy”, International Semantic Web Conference (ISWC), Workshop on Mobile Deployment of Semantic Technologies, 2015, pp. 1-6.