Many users are reluctant to disclose personal information online because of privacy concerns. Personal data has become an economic asset, but it is not the owners, i.e. the users, who control or monetize it. That control lies with the service providers, whose business case often includes the use of the data they collect (e.g. social networks, search engines, online retailers, and cloud hosting services).
Data protection and privacy frameworks in Member States and Associated Countries need to be implemented in a transparent and user-friendly way to help users understand how their personal data might be used, including the economic value of their data. Such knowledge will enable them to exercise choice and to know and assert their rights. As the economic value of their data is not known to the average user, users cannot weigh the value of their data against the value they assign to a "free" service. Moreover, users have no control over what happens with their data, e.g. they cannot verify that the data is not passed on to third parties. This situation may influence individuals' notion of privacy, which may come to be perceived as a non-valuable asset.
Data protection principles need to be visibly respected for the delivery of personalised public services, to increase trust in public administrations. Transparency is particularly important in an open government context, where personal data may be shared between different departments and administrations or across borders and where third parties can engage in the creation and delivery of personalised services for citizens and businesses.
The focus is on demonstrating solutions that protect individuals' privacy by default while empowering users to set their desired level of privacy, based on a simple-to-understand visualisation of that level. Such solutions should give users control over how their data will be used by service providers (including public authorities) and make it easier for them to verify both whether their online rights are respected and whether they get a reasonable bargain. The activities may also cover tools that inform individuals about the processing of their personal data. Systems will either have to detect the privacy settings automatically, or the data will have its privacy settings permanently associated with it by the user.
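The idea of privacy settings permanently associated with the data itself is often referred to as a "sticky policy". The following is only a minimal illustrative sketch of that idea; all names (`PrivacyPolicy`, `StickyData`, the purpose strings) are hypothetical, not part of any prescribed architecture:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PrivacyPolicy:
    """User-chosen settings that travel with the data (illustrative only)."""
    allow_third_party_sharing: bool = False
    purposes: frozenset = frozenset({"service_delivery"})

@dataclass(frozen=True)
class StickyData:
    """Wraps a payload so its policy cannot be separated from it."""
    payload: str
    policy: PrivacyPolicy

    def permits(self, purpose: str) -> bool:
        # A processor must consult the attached policy before each use.
        return purpose in self.policy.purposes

record = StickyData("user@example.org", PrivacyPolicy())
print(record.permits("service_delivery"))  # True
print(record.permits("marketing"))         # False
```

In a real deployment the policy would be cryptographically bound to the data and enforced by the processing infrastructure, not merely carried alongside it as in this sketch.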
Activities can include the investigation of measures to safeguard privacy in the context of mass data handling, for example in services exploiting big data, cloud services, data sharing by interconnected devices in the Internet of Things, and data handling in the highly sensitive context of criminal investigations.
Where relevant, actions can be proposed to apply privacy-by-design frameworks to a range of different applications, to promote the use of privacy-enhancing technologies.
The Commission considers that proposals requesting a contribution from the EU of between EUR 2 and 5 million would allow this topic to be addressed appropriately. Nonetheless, this does not preclude submission and selection of proposals requesting other amounts.
Proposals have to address the specific needs of end-users, private and public security end-users alike, and are encouraged to include public security end-users and/or private end-users.
The actions supported under this objective are to provide a practical, user-friendly and economically viable implementation of the legal obligations related to personal data processing and the legal obligation of prior consent. The actions will not only identify but, more importantly, implement privacy-by-design architectures. It is expected that the actions will lead to increased user trust online, resulting in a higher uptake of online services. Actions should generate positive business cases for online privacy.
Type of action: Innovation actions