
Privacy - Appraising Challenges to Technologies and Ethics

Final Report Summary - PRACTIS (Privacy - Appraising Challenges to Technologies and Ethics)

Executive Summary:
PRACTIS started in January 2010 under the European Commission’s Seventh Framework Programme. The PRACTIS consortium comprised research centres from six countries: Austria, Belgium, Finland, Germany, Israel, and Poland.

The impact of technology on privacy is not a new phenomenon, as can be exemplified by the invention of photography in the 19th century, which triggered the first technology-driven legal debates on "the right to privacy." Recent concerns stemming from the widespread use of Internet-based and cellular (smartphone) services strongly support the case of privacy-technology interaction. Future technologies are bound to pose new privacy-related challenges.

There are three main types of potential impacts of new technologies on privacy: new threats to privacy, enhancement of privacy (through new privacy-enhancing technologies), and changes in our perceptions of privacy. Certain emerging technologies have the potential both to pose new threats and to enhance protection (sometimes indirectly), depending on the specific application. The third kind of impact, the change of perception, is the most complex. The most common perception changes relate to individuals "getting used" to a certain technology (hence becoming less sensitive to its privacy aspects) and to the readiness of users to "sacrifice" some of their privacy for concrete benefits (e.g. improved security or better health services). Despite the current attention focused on ICT (Internet, smartphones), it is important to look forward "beyond ICT", into important fields like nanoscience or medical research. Future challenges to privacy stemming from these fields could reach far beyond the classical issues of personal data protection.

Thus, PRACTIS's mission was to increase readiness for, and awareness of, the impact of emerging technologies on privacy among citizens, policy makers and stakeholders. The main goals were to identify and assess evolving impacts on privacy that might result from various emerging technologies and new scientific knowledge, and to propose means to cope with potential future risks to privacy in both the legal and social spheres while maximising the benefits of these new technologies. Moreover, PRACTIS strove to formulate a framework for thinking about the ethical and legal issues related to privacy in a future where emerging technologies prevail, and to explore novel policy options that address the needs of citizens in a world of new technologies while maintaining privacy.

PRACTIS (www.practis.org) presents novel results on the potential impacts of various emerging technologies in terms of privacy threats, privacy enhancement and changing perceptions. Scenarios that reflect these evolving changes have been constructed. Changes in attitudes towards privacy among the “Web Generation” were explored by confronting high school students in several countries with potential situations that emerging technologies could enable in their everyday lives. Revisiting ethical and legal principles for a technologically evolving society and suggesting guidelines that promote the value-sensitive design of potentially privacy-affecting technologies are among the project’s results.

Project Context and Objectives:
PRACTIS is a project launched by the European Commission under the FP7 Science in Society (SiS) programme. It is one of a series of projects and studies that address the future of privacy in Europe in the face of emerging technologies, identifying potential threats to privacy and seeking ways, mainly legal and ethical, to reduce their impact.
PRACTIS’ main goals are:
• To identify and assess evolving impacts on privacy that might result from various emerging technologies and new scientific knowledge and to propose means to cope with potential future risks to privacy in both the legal and social spheres, while maximising the benefits of these new technologies.
• To formulate a framework for thinking about the ethical and legal issues related to privacy in the future when the emerging technologies will prevail.
A further objective of PRACTIS is to explore novel policy options to address the needs of citizens in a world of new technologies while maintaining privacy.
PRACTIS is a forward-looking activity. Analysing emerging technological as well as societal developments over the next twenty years and beyond, this foresight study has identified and assessed privacy threats that could stem from future technologies. Fields like nanotechnology, biotechnology, robotics and information technology are some of the key S&T areas that were scanned for this purpose. In addition, the change of privacy perception is analysed in light of emerging technologies as well as the generational gap.
PRACTIS also sees its mission as contributing to the improvement of privacy protection and data security as an effective policy tool in Europe.

PRACTIS complements other research projects and studies, focusing extensively on changing perceptions of privacy due to future emerging technologies and the implications for ethical and legal issues. The detailed objectives of PRACTIS are as follows:
• Identification of new technologies and fields of scientific research that might impact on privacy, focusing on areas of Science & Technology (S&T) such as Information and Communication Technologies (ICT), biotechnologies (life sciences, genomics and emerging medical technologies), and cognitive science. Attention will also be paid to technologies that may emerge from the convergence of different S&T fields (e.g. NBIC convergence).
• Assessment of possible impacts on privacy that might emerge from the identified technologies and research areas. These impacts may be negative or positive, explicit or hidden, local or global.
• Analysis of changing perceptions of privacy over the last two decades and exploration of trends in perceptions of privacy as a generational phenomenon.
• Elaboration of scenarios that reflect emerging and evolving threats to privacy – and the means to counter them – taking into account changes in perceptions of privacy.
• Development of ethical frameworks for dealing with the future impacts on privacy of emerging technologies. Such ethical frameworks will help individuals to grapple with the privacy implications of new technologies.
• Assessment of current legal frameworks regulating privacy and personal data so as to expose their hidden assumptions. These assumptions will then be evaluated in light of the project’s findings, resulting in legal, regulatory and other policy recommendations.
• Raising awareness about possible trends of privacy in the future among different groups in society.

The overall strategy of PRACTIS comprised two main phases: a first phase of collecting information and data (WP2, WP3), and a second phase of analysing that information, extracting its implications and deriving possible recommendations (WP4, WP5, WP6).
Several corresponding activities were carried out to achieve the PRACTIS goals:
• Horizon scanning of emerging technologies and techno-scientific developments which might impact on privacy in the future. State of the art review which included technologies that have the potential to invade privacy as well as those that aim to preserve it (Privacy Enhancing Technologies, or PETs);
• State of the art review on privacy perception as a generational phenomenon;
• A school survey among adolescents to analyse their privacy perceptions in light of emerging technologies, at present and in the future;
• Elaboration of scenarios that reflect emerging and evolving threats to privacy – and the means to counter them – taking into account changes in perceptions of privacy;
• Development of ethical frameworks for dealing with the future impacts on privacy of emerging technologies;
• Assessment of current legal frameworks regulating privacy and personal data so as to explore their hidden assumptions;
• A policy design and formulation process resulting in policy recommendations for policy makers.

Project Results:
Several corresponding activities were carried out to achieve the PRACTIS goal and objectives. The main scientific and technological results are described as follows:

1. Horizon scanning of emerging technologies and techno-scientific developments which might impact on privacy in the future (WP2):
The main objective of WP2 was to scan the technology horizon in order to identify emerging technologies with potential impacts on privacy, and to assess these impacts. A special effort was made to look beyond ICT (the "usual suspect") and to discuss the privacy aspects of new technologies emerging from other fields, such as nanotechnology, robotics or cognition.

We assumed three main types of impacts of new technologies on privacy: threats to privacy, enhancement of privacy (better protection), and changes in our perceptions of privacy. The first type is the most straightforward and includes, for instance, technologies that make it easier to collect information about people. The second type refers mainly to Privacy Enhancing Technologies (PETs). Certain technologies have the potential both to pose new threats to privacy and to enhance it (sometimes indirectly), depending on the specific application. The third kind of impact, the change of perception, is the most complex. The most common perception change relates to individuals "getting used" to a certain technology and their readiness to "sacrifice" some of their privacy for concrete benefits.

A state-of-the-art review was conducted that included technologies with the potential to invade privacy as well as those that aim to preserve it (Privacy Enhancing Technologies, or PETs). Five broad "families" of technologies were scanned: Nanotechnology and New Materials; Medicine, Biology and Biometrics; Robotics; Cognition-related technologies; and Information and Communications Technologies (ICT). This effort resulted in a list of 79 selected technologies representing the above-mentioned families. In addition, a worldwide online expert survey was conducted, for which 39 technologies were selected from the 79 described in D2.1. For each technology the respondents were asked to assess the foreseen time-frame of its widespread use, its threat to privacy, its influence on changing people's sensitivity about their privacy, and its possible ability to contribute to privacy enhancement. They were also asked to rank the three most threatening technologies in each of the five technology families.
In total, 266 experts responded to the survey (i.e. answered at least part of the questions). Table 1 presents the main survey results, classified by technology field.
Table 1: Consolidated results of the Expert Survey
Threat levels: 0 = no threat, 1 = very low, 5 = very high
Sensitivity change: 1 = less sensitive, 3 = no change, 5 = more sensitive
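To make the consolidation step concrete, the following is a minimal sketch of how per-technology means on these scales could be computed from individual expert responses. The record layout and sample values are illustrative assumptions, not the project's actual data format.

```python
# Minimal sketch: consolidating expert responses into per-technology means,
# as in Table 1. The (technology, threat, sensitivity) record layout and the
# sample values are illustrative assumptions only.
from statistics import mean

responses = [
    ("Facial recognition", 5, 4),
    ("Facial recognition", 4, 4),
    ("Cloud Computing", 4, 4),
    ("Cloud Computing", 3, 3),
]

by_tech = {}
for tech, threat, sens in responses:
    by_tech.setdefault(tech, []).append((threat, sens))

for tech, scores in sorted(by_tech.items()):
    threats, sensitivities = zip(*scores)
    print(f"{tech}: mean threat {mean(threats):.2f}, "
          f"mean sensitivity change {mean(sensitivities):.2f}")
```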
Time-frames of widespread use: almost 85% of the technologies under consideration are expected to be in widespread use before 2025. The technologies can be divided into four time-frames of widespread use, as exhibited in Table 2 (a simple binning sketch follows the table):


Table 2: Time-frames of widespread use
Early (Now – 2015): e-Health; Identifying bloggers’ emotions; Cloud Computing; Voice-driven search; RFID; License plate recognition; Online behavioral targeted advertising; Mobile phone tracking; Capturing info through “side channels”; SIGINT with COTS hardware; Facial recognition; Augmented Reality
Near (2016 – 2020): Nano-enabled personalized medicine; Authentication using activity-related biometrics; Photonic sensing by low cost components; Sensors for robots; Deception Detection Techniques; Advanced speech recognition; Internet of Things; Reality Mining; Smart meters; Wireless tire pressure monitoring systems; Sense Thru The Wall; AI-based Conversation
Medium (2021 – 2025): Nano-based surveillance; Molecular Nanosensors; Portable Full Genome Sequencing; Intelligent Medical Implants; Advanced Artificial Intelligence; Small toy/household robots; Robots as Social Actors; Integrated surveillance by robots ("cyborg insects"); Mind Reading commercial gadgets
Far (2026 – 2035 and beyond): Invisibility Cloaking; Medical Nanorobots; Brain-to-brain communication; Transcranial Magnetic Stimulation; Advanced fMRI; Self-replicating nanoassemblers
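As a small illustration of the binning behind Table 2, the sketch below maps a forecast year of widespread use to one of the four time-frames. The bin boundaries follow the table headers; the mapping function is our own construction, and the example years are those quoted in the report's conclusions.

```python
# Illustrative sketch of the Table 2 time-frame bins. Boundaries follow the
# table headers; the function itself is an assumption, not project code.
def timeframe(year: int) -> str:
    if year <= 2015:
        return "Early (Now - 2015)"
    if year <= 2020:
        return "Near (2016 - 2020)"
    if year <= 2025:
        return "Medium (2021 - 2025)"
    return "Far (2026 - 2035 and beyond)"

# Forecast years quoted in the report's conclusions:
for tech, year in [("Facial Recognition", 2012), ("Reality Mining", 2018),
                   ("Mind Reading commercial gadgets", 2023),
                   ("Brain-to-brain communication", 2028)]:
    print(f"{tech}: widespread ~{year} -> {timeframe(year)}")
```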

Table 3 presents the technologies considered by the experts as posing the highest threats and/or sensitivity changes (values higher than 3.5 on a scale of 1 to 5; values in parentheses fall below that threshold). The technologies in the table are ordered by their threat level.

Table 3: Levels of threat and sensitivity change
Technology / Threat / Sensitivity change
1. Brain-to-Brain Communication 4.73 4.10
2. Integrated surveillance by robots; "cyborg insects" 4.64 4.50
3. Facial Recognition 4.47 3.83
4. Mobile Phone Tracking 4.33 3.78
5. Mind Reading commercial gadgets 4.31 4.00
6. Reality Mining 4.08 3.74
7. Internet of Things 3.89 3.66
8. Portable Full Genome Sequencing 3.89 4.24
9. Nano-based Surveillance 3.80 4.24
10. SIGINT with COTS hardware 3.80 3.60
11. Online behavioral targeted advertising 3.70 3.53
12. e-Health 3.69 4.11
13. Authentication by activity-related biometrics 3.58 (3.17)
14. RFID 3.53 3.68
15. License Plate recognition 3.52 3.72
16. Cloud Computing 3.51 3.65
17. Sensors for robots (3.50) 4.00
18. "Sense Thru the Wall" technologies (3.42) 3.58
19. Advanced fMRI (3.40) 3.60
20. Advanced speech recognition (3.40) 4.00
21. Deception detection techniques (3.33) 4.00
22. Identifying bloggers' emotions (3.20) 3.70
23. Invisibility Cloaking (3.13) 4.36
24. Advanced AI (2.94) 4.07
25. AI-based conversation (2.91) 3.82
26. Nano-enabled Personalized Medicine (2.90) 3.78
27. Transcranial Magnetic Stimulation (2.75) 4.00
28. Molecular Nanosensors (2.70) 3.60
29. Self-replicating Nanoassemblers (2.20) 3.88

An important observation is that several technologies posing a relatively high threat to privacy are likely to be widely used in the near future (by 2018), e.g. facial recognition, mobile phone tracking, reality mining, and the Internet of Things.
The threats to privacy posed by different technologies vary more than the associated changes in people's sensitivity about their privacy: the threat levels range between 1.86 and 4.73, while the changes in sensitivity range between 2.86 and 4.5.
Ranking of most threatening technologies in each field: The experts were asked to rank the three most threatening technologies in each of the five technology fields. Fig. 1 shows the top three threatening technologies in each field.
In general, the experts tend to associate higher threats with larger increases in sensitivity (a small sketch quantifying this association follows). Exceptions are, for example, self-replicating nano-assemblers and medical nano-robots, which pose a relatively low threat to privacy (2.2 and 2.36 respectively) while producing a relatively high increase in sensitivity (3.88 and 3.33 respectively). A possible explanation lies in the relatively distant future in which these technologies are expected to be widely used (2036 and 2028). For some technologies that will only be widely used in the long term, the perceived threat is rather low (from today's point of view), but the foreseen sensitivity change (once the technology is actually widespread) is high.
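As a rough quantification of this tendency, one can compute the correlation between the two columns of Table 3. The sketch below does this for the 29 technologies listed there (parenthesised values included); it is an illustration layered on the published numbers, not part of the original analysis.

```python
# Sketch: Pearson correlation between threat level and sensitivity change
# across the 29 Table 3 technologies (values copied from the table).
from statistics import correlation  # available from Python 3.10

threat = [4.73, 4.64, 4.47, 4.33, 4.31, 4.08, 3.89, 3.89, 3.80, 3.80,
          3.70, 3.69, 3.58, 3.53, 3.52, 3.51, 3.50, 3.42, 3.40, 3.40,
          3.33, 3.20, 3.13, 2.94, 2.91, 2.90, 2.75, 2.70, 2.20]
sens = [4.10, 4.50, 3.83, 3.78, 4.00, 3.74, 3.66, 4.24, 4.24, 3.60,
        3.53, 4.11, 3.17, 3.68, 3.72, 3.65, 4.00, 3.58, 3.60, 4.00,
        4.00, 3.70, 4.36, 4.07, 3.82, 3.78, 4.00, 3.60, 3.88]

print(f"Pearson r = {correlation(threat, sens):.2f}")
```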


Fig. 1: Ranking of top three threatening technologies within fields
Technologies that enhance privacy: some of the technologies presented in the survey could potentially help people enhance their privacy, not only threaten it. Typical explanations were: helping users to detect intruders or surveillance devices/sensors, the possibility of incorporating appropriate privacy settings or protection means, and the reduction of arbitrary information collection about many people by focusing on specific targets.
Highlights of respondents' comments: according to some experts, the combination of several technologies poses more severe threats than any single technology, for example the combination of advanced sensors with wireless networks and sophisticated software: "Cyborg implants, memory chips, augmented reality, wireless networking, cloud computing, etc. in conjunction with brain-computer interfaces (BCI) offer amazing post-human capabilities… the potential is there to hack…create false memories, provide false behavior-changing information, perhaps even record and make public individuals thoughts, emotions and past experiences".
Several respondents asserted that in a modern society the benefits of new technologies inherently come at a price in terms of personal privacy, and that absolute prevention of privacy intrusion is impossible. As one expert wrote, society "will simply have to get used to a reduction in personal privacy resulting from increased use of technology". Another expert viewed this as a rather positive trend, arguing that less privacy means more decent behaviour and self-control: "To improve the future of all it may be better to get over our recent modern obsession with privacy and accept that any and every action and thought that we have may be open to investigation and to public exposure… The abolition of privacy (or perhaps a return to a very limited degree of privacy, similar to the levels in which human social behavior and morality developed) might make for a better, fairer world".
Several experts held the opinion that privacy intrusions can be minimized by user awareness, appropriate use of technologies, and the incorporation of privacy protection means. Technologies can enhance or compromise privacy, depending on how people choose to use them, and these considerations should be applied in the early stages of technology development. As one expert put it, "Often technology is developed without a parallel development of suitable regulation to protect privacy, and only after first incidents of privacy violation a discussion in the society starts. It would be very helpful to start this discussion much earlier without negative events as triggers. Then, a more objective and less emotional view on the topic is possible".
Conclusions
The outcome of the horizon scanning in WP2 was the identification, description and assessment of selected technologies with potential privacy implications. Despite the current attention focused on ICT (e.g. Internet, smartphones), a special effort was intentionally made to look forward "beyond ICT", into important fields like nanotechnology, robotics, biology and cognition. In the new world of technologies enabled by research in these areas, future challenges to privacy could reach far beyond the "classical" issues of personal data protection if this new world includes unprecedented capabilities like "mind reading", molecular sensing or seeing through walls.
Three main types of impacts were considered: threats to privacy, enhancement of privacy (better protection), and changing people's sensitivity about their privacy.
An important change in the perception of privacy, supported by several experts in the PRACTIS expert survey, relates to individuals "getting used" to technologies and their willingness to accept certain privacy violations (or even not to perceive them as violations at all) if the benefits (e.g. better health services or improved security) are perceived as worthwhile. In other words, there is a trade-off between (perceived) privacy and (perceived) benefits.
In the final WP2 report (D2.2), the 39 selected privacy-challenging technologies were described in detail, followed by the expert survey results for each technology. Experts' assessments and opinions were elicited concerning the following aspects of each technology: the foreseen time-frame of widespread use, the level of threat to privacy, the influence on people's sensitivity about their privacy, and (if applicable) its relevance to privacy enhancement.
All the technologies examined in the survey pose threats to privacy to some degree. The highest threat levels are attributed to the following technologies (the years in brackets indicate the likely time-frames of widespread use): Integrated surveillance by robots: cyborg insects (2023), Brain-to-brain communication (2028), Facial Recognition (2012), Mobile Phone Tracking (2012), Mind Reading commercial gadgets (2023) and Reality Mining (2018). Hence, evidently some of the most threatening technologies will be widely used in the relatively near future (2012 – 2018).
Most technologies are likely to make people more sensitive about their privacy. Top increases in sensitivity are attributed to Cyborg Insects (2023), Invisibility Cloaking (2028), Nano-based Surveillance (2023), Portable Full Genome Sequencing (2023), e-Health (2012), Brain-to-brain communication (2028), and Advanced Artificial Intelligence (2023). Important impacts (threats as well as sensitivity changes) are attributed to several technologies in the domains of Cognition, Biology, Nanotechnology and Robotics.
The general tendency is to associate increased sensitivity with high threat, but there are exceptions with regard to some far-future technologies. In some cases, far-future technologies are probably perceived as “less threatening” from today's standpoint, yet likely to make people more sensitive once widely used. Several potentially threatening technologies also have aspects that may contribute to privacy enhancement, depending on the context of their actual use.
Some respondents' comments reflect the opinion that combinations of several technologies may pose more severe threats to privacy than single technologies, and that in a modern society absolute prevention of privacy intrusion is impossible because benefits come with some privacy sacrifice. Nevertheless, privacy intrusions can be minimized if the implications are considered in the early stages of technology development, which underlines the high importance of the notion of "Privacy by Design".
2. Privacy perception as a generational phenomenon:
A state-of-the-art review of privacy perception was conducted. The main results show that the perception of privacy is not universal: it is shaped not only by individual differences (age, gender, internet experience) but also by macro-level factors like nationality and national culture (Cho et al. 2009). The empirical data show that a change in privacy perceptions is underway. However, awareness of privacy intrusions and support for privacy rights have not diminished over the past 20 years. Users are aware of possible risks and have developed different strategies to handle them, and their ability to respond to these new risks improves as risk awareness rises. Awareness raising therefore seems to be one of the most important measures for minimizing the existing risks of privacy intrusion and abuse of personal data. Such education must include general information about the long-term risks of posting personal information as well as more detailed information on how to use the existing privacy options/tools on social network sites (SNS). Raising children's awareness cannot be done by their parents alone; it must also become a task for teachers and data protection officers.

Based on the literature review, the main objectives of the first part were to analyse and describe the changing perceptions of privacy in European countries and to identify trends in the interpretation of privacy as a generational phenomenon.
The second objective was to analyse the future role of governments and data protection agencies in the protection of privacy and to examine how possible privacy threats posed by business and state data collection are evaluated.
To accomplish these objectives, two surveys were conducted: (1) an exploratory survey on privacy perceptions among 1,428 high-school students from all six partner countries of the PRACTIS project (Austria, Belgium, Finland, Germany, Israel, and Poland). The purpose of the exploratory survey was to shed light on the privacy perceptions of the younger generation. Additionally, a control group of 125 adults was surveyed to identify possible differences in the privacy perceptions of so-called "Digital Natives" and "Digital Immigrants". (2) A worldwide online expert survey that focused on changing privacy perceptions, changing support for data protection supervisors, and the shift in importance from governmental to business data collection. In preparation for the online expert survey, 54 expert interviews were carried out and used to develop the questionnaire. The main results and conclusions of the school survey and the online expert survey are summarized in the following sections:
Our results show that adolescents consider privacy and data security as fairly important when asked explicitly, yet still report privacy-threatening behavior, which can be observed when the actual activities of students are examined.
The conclusions drawn from this pattern of inconsistency are: first, adolescents perceive social network sites as part of their private sphere, where they exchange private information with their peers; second, they handle private data in a differentiated way, trying to explicitly manage who gets which information, and context seems to matter for this decision; finally, they are ready to trade off privacy for benefits such as discounts or increased convenience.
Generational differences are confirmed by the PRACTIS school survey. Adults show a higher awareness of privacy and a lower usage of SNS and future technologies. However, they too are willing to trade off privacy for benefits, especially for security.
Another result shows that political, societal, and cultural factors influence privacy perceptions. Interesting differences are found between students from different countries and with regard to age, gender and daily online time.
The complex findings of the study lead to the conclusion that adolescents’ sensitivity for privacy seems to change towards a more flexible concept of privacy rather than diminish due to future technologies.
New and emerging business models not only impose new threats to privacy but also have an impact on privacy perceptions.
Selling personal information to third parties is, according to the experts responding to the PRACTIS survey, a profitable business model, and these new and emerging practices impose new threats to privacy as well as affecting privacy perceptions. The experts identify the threats to privacy induced by firms and companies as a problematic lack of transparency for consumers/citizens about which data are collected and what purpose they serve. Interestingly, our results show that experts are convinced that these data collection efforts do not happen by chance but by intention.
We further conclude that these results have major implications at the policy level. Experts are convinced that privacy branding is good for business reputation, a result which points to possible new business models to be strengthened. The claim for a governmental role in regulating privacy and protecting citizens in privacy matters is also palpable for this question: the majority of the experts are convinced that ‘privacy by design’ should be regulated for businesses. This could be done by imposing minimal standards on services and products, or by implementing other process-oriented privacy assessments for technologies.
Governments are still ascribed a major role in privacy protection despite identified threats imposed by, for example, the integration of public databases.
The role of the government is central to both surveys, in terms of both protection of privacy and threats to privacy. 49.8% of the experts in the online survey identify the state as an actor imposing threats to privacy, and in this respect the majority of respondents consider the integration of public databases at the international level the main threat. Asking for a balancing of privacy as a relative right against e.g. security leaves a high number of experts, as well as students, undecided. The expert survey does not include a subgroup analysis by country, but the school survey does, and this analysis displays how culturally sensitive the question of trust/mistrust in the state might be, as well as the understanding of privacy as an absolute or relative right. Israeli students show a higher level of agreement with state/governmental control of Internet traffic, in contrast to Austrian and German students, who show more concern about the citizens’ right to privacy.
The majority of the experts consider the state the stakeholder mainly responsible for protecting privacy. In contrast, the idea of firms and companies enforcing privacy through self-regulation is only exceptionally considered very important. Still, awareness-raising measures are considered essential complements to public control. Interestingly, the results suggest that governments are neither sufficiently informed about new technological developments nor pro-actively taking measures for privacy protection.
The working conditions of Data Protection Agencies (DPAs) seem to be tough, and this result of the exploratory survey points to the need for further research on this question.
The questions about the working conditions of DPAs in Europe yield no clearly positive assessment. In particular, the statements that (1) the DPAs have enough autonomy to fulfil their tasks and that (2) their financial situation is good enough meet with considerable disagreement. Experts consider the control of governmental and private business practices in data collection and handling to be major tasks and concerns for DPAs, and think that national data protection agencies have not yet sufficiently defined their tasks. Public authorities' handling of personal data is ranked first among the important tasks and concerns for DPAs, followed by the collection and analysis of data by private businesses. Other tasks mentioned range from the control of new and emerging technologies that gather and analyse personal data (e.g. Ambient Assisted Living technologies, e-Health), privacy on the Internet, and CCTV systems installed by private companies (such as in shopping malls), to awareness-raising measures (e.g. media education). There is little knowledge among the expert community about non-governmental actors and/or institutions active in privacy protection; interestingly, the privacy experts responding to the survey have little knowledge about other data protection supervisors in their own country. The few responses range from consumer protection organizations and national research ethics boards to private life commissions.

3. Elaboration of scenarios that reflect emerging and evolving threats to privacy – and the means to counter them – taking into account changes in perceptions of privacy:
PRACTIS presents five different scenarios on the future of privacy, based on the results of the technology foresight analysis, the surveys on privacy perception, and the ethical, legal and social analysis conducted during the project. These scenarios should be considered by policy makers as routes for planning and for defining the ethical, legal and social norms and regulations needed to protect privacy:

Scenario 1: “Privacy has faded away” - people have given up privacy voluntarily; there is no privacy left. Society values the improved goods and services provided by emerging technologies much more than privacy. Privacy is perceived as a commodity by most people, and they are eager to trade it off against other goods. Privacy still exists, of course, and if there were a will, people could start to act differently and perceive privacy differently. At this stage, however, it is not important and not valued by society or individuals. On the contrary, people want to expose their private lives and distribute their private information freely. In this sense, the society described in scenario 1 is very different from the contemporary world. Additionally, human mentality might have changed through the process in which people gave up their privacy piece by piece, ending up in a society with a complete absence of privacy. For example, in the contemporary world a person whose curtains are open might feel anxious and irritated if a passer-by sees inside his or her living room, feeling that this private space is exposed and intruded upon by an unknown stroller. The same situation in the society described in scenario 1 might cause totally different reactions and feelings, since people's perception of privacy is totally different and even their mentality might have changed.
Scenario 2: “People want to maintain as much privacy as possible” - people believe in privacy. In the beginning, most people perceived privacy as something less important and were therefore willing to expose their private lives and information to the public. Additionally, most people were so inspired by emerging technologies and the new possibilities they offered that they did not even realize how much privacy they were losing in exchange. However, due to a progressive change in the prevailing social norms, people over time started to perceive privacy differently. Both physical and mental privacy became important, and most people started to protect their private lives and information. Additionally, the state started to encourage people to maintain their privacy through various measures and actions. As a result, the state of privacy became stronger and more equally distributed.
Scenario 3: “People have lost control of their privacy” - people have lost control over their privacy. Private companies or the state constantly monitor and gather information about everybody. People therefore have no opportunity to protect their privacy, since there is none left. Privacy has been taken from them, and most people did not notice the development path until it was too late: they practically “sleepwalked” into the world without privacy and suddenly have no choice but to live with it. Most people in the society described in scenario 3 perceive privacy wholly differently from people living in the contemporary world. For example, they do not mind or pay attention to the fact that somebody is constantly gathering their private information or monitoring them. Due to the complete absence of privacy, people's state of mind or mentality might have changed. Moreover, if people have not confronted negative situations caused by the private data and information gathered about them, they might even have blocked out the fact that there is no privacy.
Scenario 4: "Segmented privacy” - the state of privacy is two-fold in scenario 4: on the one hand it is generally available, but on the other hand only wealthy people can afford to buy it. Privacy is not for sale as an item in itself, but it is for sale via emerging technologies in which privacy settings are considered. People perceive privacy as desirable and highly valued since it has become a market value. In scenario 4, social norms and values have developed such that it is socially preferable to have privacy rather than live without it. In this sense, it is likely that people’s actual perceptions of privacy vary widely. There may be someone who considers privacy important but cannot afford to buy it; on the other hand, a person with the newest privacy-enhancing technologies might actually perceive privacy as less important, but buy it anyway because it is socially preferable. It is then possible that actual perceptions are buried under the fact that the value of privacy is dictated by the markets. There is also another possible development path: long-term use of emerging technologies with high privacy settings, or with no privacy settings at all, might have shaped people’s perception of privacy. The society described in this scenario is therefore divided in its perceptions of privacy according to the social class a person belongs to.
Scenario 5: "Tailor-made privacy” - the state of privacy is very strong in scenario 5: privacy is available to each individual in the form he or she prefers. Privacy is highly valued in society, and this shows in the development process of emerging technologies. Privacy is perceived differently by each individual: one person prefers to have it, while another ignores it completely. In the society described in scenario 5, there are as many perceptions of privacy as there are individuals. This is possible because people have genuine awareness and knowledge of the possibilities and disadvantages of emerging technologies and can therefore make informed decisions regarding their privacy. Additionally, people are not oppressed by the private sector or the state, and therefore they have actual freedom of choice regarding their privacy.

A SWOT analysis was used, as a content analysis method, to identify the most significant potential impacts of the scenarios on privacy. On this basis, privacy climates were described and their potential impacts on ethical principles were analysed. The SWOT analysis was thus used as a tool rather than as an objective in itself.
Privacy Climates
In the privacy climate of scenario 1, the future citizens seem to be intimidated by the prospect of being publicly humiliated. While this may be true for now, a new norm may develop in which public humiliation and its open documentation become commonplace. People may start looking suspiciously at persons who have not subjected themselves to publicity with their bad behaviour and vices. While such social developments may take place, there are other, more technical aspects that can have societal impacts. For instance, total transparency may put minorities at risk, as access to personal data combined with intolerance may leave them in very vulnerable positions. This risk is further increased by the possibility of social movements that suddenly advocate intolerance in the name of the common good. These issues negatively affect the autonomy of individuals. Additionally, in terms of democratic values, people have a harder time exercising their political freedoms and rights. However, a more transparent society and better access to political decision-making should be seen as positive attributes with regard to the state of democracy in society.
If privacy climate 2 were to become reality, society might not change that much in the European context. Values such as human dignity, equality and democracy would be respected as they are today. The rise of nationalist movements can also be witnessed already today, as right-wing nationalist parties gain popularity across Europe. Additionally, the concept of the paternalist state that attempts to regulate all fields of life is already present to some extent in the welfare-state context. However, social norms are created more democratically, in an organic way. As a result of the growth of a “civic” society, individuals are not only equal amongst themselves but are also on a more equal footing in relation to the lawmakers. Additionally, autonomy is strong, as individuals are allowed and even encouraged to make up their own minds.
In the privacy climate of scenario 3, the conscious common individual may feel that his or her autonomy and human dignity are compromised, as machines have gained a certain supremacy over human beings. Democratic values are seen as less important, and stakeholders have only limited possibilities to engage in decision-making, which may result in a poor execution of social justice. However, there is a possibility that political and social apathy spreads through society, with the result that citizens are quite content with the way things are. Hence, what may seem a dystopia at the moment may later be perceived as the natural state of things. The future citizens may well be very happy with the decreased workload and improved services that machine supremacy has brought along.
The findings for the privacy climate of scenario 4 are very interesting. Generally speaking, the ethical principles of human dignity, autonomy, equality and democracy have often been seen as interdependent. However, while the value of human dignity is revered in this privacy climate, it is also more dependent on the social and economic status of the individual. People are free to act with great autonomy as long as they possess enough resources. The impacts on equality and democracy can be described as negative to a certain extent, since there is a great division between the more privileged and the underprivileged segments of society. Additionally, the state of democratic values is worrisome, as solidarity among the different segments of society is low and the execution of social justice is unconvincing.
The privacy climate of scenario 5 is also very revealing. The high level of human dignity, equality and autonomy fosters the realization of democratic values, and the level of social justice delivered is very high. While it may appear to be the most utopian alternative available, it also possesses some inherent weaknesses. For instance, despite its many positive attributes (see figures 5a and 5b), there are also some challenges and problems, especially regarding human dignity and democracy. A situation may arise in which citizens are oppressed not by the state, but by criminal groups or terrorists taking advantage of an absolute right to privacy. In contrast, in the contemporary world the right to privacy protected by international treaties is highly protected but not absolute. As today, there is an ongoing societal debate on whether individual freedom should in some cases be limited in favour of public security and “the greater good”.
From the examples discussed above, it seems likely that technological development will have considerable impacts on societies on at least two levels. On the one hand, it will change people’s living environment, affording new opportunities but also creating new vulnerabilities. On the other hand, the way people see social norms and value systems will change. People exposed to different technologies and to the public eye in new ways may become new digital natives who perceive privacy, publicity and other developments in a new manner. What today seems deeply intimidating may one day be perceived as nothing more than a minor nuisance; total publicity may be seen as the natural state, and the supremacy of robots may be considered a very comfortable way of producing services and managing society. Regardless of the circumstances we face in the future, maintaining some level of self-dignity and sense of empowerment would be advisable.

Impacts on the Four Ethical Principles
Table 4 presents the four ethical principles within the different scenarios, in order to grasp the changes that have happened over twenty years. A full description of how the principles appear in each scenario is therefore not given here; instead, a conclusion is drawn as to whether the state of each ethical principle moves in a more positive or negative direction. This kind of explicit categorisation is not fully justified statistically; it is an attempt to capture the essence of the impacts on the four ethical principles within each alternative future.


Table 4: Impacts on the four ethical principles within each scenario
Scenarios: (1) “Privacy has faded away”; (2) “People want to maintain as much privacy as possible”; (3) “People have lost control of their privacy”; (4) “Segmented privacy”; (5) “Tailor-made privacy”.
Human dignity – Scenario 1: Compromised. Scenario 2: Cherished. Scenario 3: Compromised. Scenario 4: Respected, but dignity is dependent on the social and economic status of the individual. Scenario 5: Cherished.
Autonomy – Scenario 1: Threatened. Scenario 2: Respected, but there is some tension between the individuals’ right to autonomy and the government’s strive for universal well-being. Scenario 3: Compromised. Scenario 4: Cherished among those people who have enough resources. Scenario 5: Cherished; there is, however, an ongoing debate about whether individual freedom should in some cases be limited for the sake of security and the “greater good”.
Equality – Scenario 1: Somewhat increased, but can also turn into inequality. Scenario 2: Increased. Scenario 3: Two-fold: theoretically people are in an equal position as there is no privacy left, but in practice it may affect each person differently. Scenario 4: Compromised. Scenario 5: Cherished.
Democracy – Scenario 1: Threatened, but at the same time there are a number of positive attributes. Scenario 2: Cherished. Scenario 3: Threatened. Scenario 4: In a worrisome state. Scenario 5: Respected.

The analysis has shown that many alternative privacy climates are possible in the future, and that their impacts on ethical principles vary. The results presented in this report deepen our own and decision makers' understanding of these alternative privacy climates and their impacts on ethical principles, and provide valuable input for stakeholders and policymakers in Europe regarding the potential implications for privacy.

4. Development of ethical and legal frameworks for dealing with the future impacts on privacy of emerging technologies:
The main objective of this part was to delve into the legal and ethical aspects of the connection between emerging technologies and privacy. It also discusses the relations between people's attitudes, experiences and feelings and the legal and ethical approaches to privacy and technology. The work was based on an analytic discussion of legal, social and ethical issues, as well as on a qualitative/comparative analysis of focus group meetings with participants of heterogeneous backgrounds and characteristics. The focus groups were held in most of the PRACTIS partner countries.

The privacy experience does matter. For K. Rambo (2008), “abstract privacy protects abstract autonomy, without inquiring into whose freedom of action is being sanctioned, at whose expense.” This does not mean that the legal and ethical frames have to follow ‘opinions’, but that those two frames have to listen to people's experiences concerning their understanding of privacy and their difficulties in protecting it. Even if people find it quite difficult to define what privacy is, a majority of them consider it a central element of our social life and democratic organization. At the same time, a very large majority of them testify that its protection is somehow fictional. The findings below help to frame this important tension, or paradox, that people experience. Two critical challenges raised by emerging technologies for the ethical, legal and social frames should be emphasized, beginning with the social challenge: the difficulty of “closing the door”.

Finding 1: People still consider privacy very important, but what they value most when speaking about privacy is tranquility, or non-‘reachability’: to put it in metaphorical terms, the right to close the door. This consideration no longer has a spatial form; it is rather a personal moment that people want to preserve. But this tranquility appears ever more difficult to maintain, due to the complete blurring between private and public spheres on the one hand and the pervasive presence of technologies on the other. The paradox raised by some is that this tranquility sometimes urges the individual to give away more data in order to be well profiled and less disturbed.
Finding 2: People are very aware of the business value of their personal data and experience the ‘commodification’ of their privacy daily. This also means that people are less and less reluctant to trade their data if they receive a substantial benefit in return (services, discounts, security…), if they trust the providers and if the data are not sensitive.
Finding 3: People consider that some very sensitive data, such as those regarding sexuality, opinions, religion and health, should be strongly protected by law. These data are considered the basis of diversity and should therefore not be used to classify or discriminate against people. This view of sensitive data is strongly shared across the different focus groups.
Finding 4: The frontiers between personal data and profiles are no longer clear and visible to people. For instance, one female participant (aged 30) explained that for five months she had received daily personalized messages regarding a future pregnancy. She considers this an intrusion into her privacy, since she suspects, without knowing exactly, that those messages are related to Google searches she made in the past on this matter. For her, there is clearly a privacy breach regarding her personal sensitive data.
Finding 5: There is a clear divide between “privacy haves” and “privacy have-nots”. Some people, due to their dependencies (financial, medical, familial…), experience having low or no privacy rights; many examples were given to illustrate such situations. This divide should not be confused with a second-degree divide concerning the capability to protect one's privacy.
Finding 6: People experience the protection of their privacy as a very isolated, and not really comfortable, experience. People gave many examples of tricks and artifices they use to protect themselves, which testify to their difficulties in finding their way with privacy protection. Most of them value the legal protection but think at the same time that it is not sufficient to protect them, since it lacks enforcement.
Finding 7: Consent is a critical key to privacy protection, yet at the same time it is its weakest piece. It should therefore be reinforced and made more innovative in order to make it truly informed. Some proposals were made:
• First, technological innovations should support better transparency of the flows generated by personal data;
• Second, consent should be given for a limited time: people can change their minds;
• Third, consent should be given for categories or classes of service providers according to their concern for privacy;
• Fourth, labels could help this categorization and at least provide people with better information regarding to whom, and for what, they are giving their consent.

Ethical challenges: Privacy as a precondition for the development of the ‘self’ and of democracy
Let us recall: privacy is the “precondition to the enjoyment of most other fundamental rights and freedoms”. These values (dignity, self-determination, social justice), worthy of legal protection and serving as fundamental building blocks of the liberal democratic society, either stem from the same ethical source, i.e. human dignity, are positively affected by privacy, or sit on a similar normative level. Privacy enables, as an important by-product, the exercise of other fundamental human rights and the fulfilment of other ethical values. On a personal level, privacy is tightly related to people's capability to develop their own narratives, choices, preferences and ways of life without being influenced or constrained by opaque socio-technological agents shaping their epistemology and their relationship to themselves. These capabilities of self-determination have to be combined with the preservation of tranquillity, a requirement which clearly emerges from the focus groups. The preservation of tranquillity has to be related to the bounded attention, or bounded rationality, of human beings: a human asset that has to be protected, since it is at the very root of human intelligence and reflexivity.

Legal challenges: transparency and proportionality
D. Solove, in his recent book, describes the evolution of relationships in our Information Society using two paradigms drawn from two novels: Kafka's “The Trial” and Orwell's “1984” with its Big Brother. He denounces the radical and increasing opacity of the data capture and data flows permitted by the growing use of ICTs and their ubiquitous character. The increasing asymmetry of informational power is also due to the huge amount of data collected and processed by data controllers; this enables them to define profiles and to take the “appropriate” decisions on the basis of the data they capture about our behaviours, movements, facial emotions and clicking habits: in other words, on the basis of many instantaneous slices of our lives that we never expected to be of any significance. In addition, information systems may keep a memory of all these events by storing them in the long term. Information systems have a memory that an individual does not.
This phenomenon comes together with the emergence of certain applications linked to the technologies of ubiquitous computing, inducing what we might call the “Observation Society” paradigm. Under this paradigm, the data controller combines multimodal capture of data “extracted” from human bodies with an implicit understanding and interpretation of these data as a valid and privileged source of “truth” about the persons, their preferences, intentions, etc., following the assumption that the ‘body does not lie’. Decisions are taken a priori on the basis of these data and profiles rather than on information provided by the data subjects. Since the data subjects are not aware of this, they are faced with decisions they are unable to understand and, ultimately, to contest.
The two main principles enacted in the texts dealing with privacy and data protection – transparency and proportionality – have to be complied with, and the emerging technologies undoubtedly challenge both. On the one hand, our computers function to a large extent without any possibility for us to know exactly what they are exchanging, receiving and processing. This lack of transparency, the fact that data subjects are not aware of what data are processed and for which purposes, raises many challenges. On the other hand, the fact that data capture is so easy, that data processing capacities have grown to an unexpected level, and that data, even when they concern instantaneous slices of one's life, might be kept for an unlimited period, raises difficulties in complying with the proportionality principle. Today, economic efficiency, including in the interest of consumers or citizens (see the e-government efficiency myth), and private or public security are presented as justifications for the processing.

Political recommendations
Consent versus accountability:
Consent is one of the central keys of, but also one of the central problems with, the current protection frame. Let us recall some of the problems:
• The first problem is that consent leaves people alone to take on a responsibility that is a collective one (because of the externalities of individual choices) and should be collectively assumed (because of the asymmetry of information between the individual and the provider).
• The second problem is that the consent asked of people is very fragmented, with each service provider asking for a singular consent. This situation fragments not only people's privacy consciousness but also their true capability to manage it.
• The third problem related to consent, as it is currently experienced by people, is that it is distorted.
• The fourth and most critical problem is that consent is given for personal data to be processed, but the process itself (retrieval, transfer, storage…) remains highly obscure or opaque to a very large majority of people.

Recommendation 1: The first recommendation concerns accountability – not ex-post but ex-ante accountability, which shows the ‘user’ clearly what circuit his or her data will follow and what processing will be performed on it. This corresponds to the meaning people attach to enlightened consent.

Accountability versus reputation
The main criticisms people address to the current protection of privacy are, first of all, the opacity of consent, but also its very individualistic approach, which leaves it up to each individual to carry the burden of his or her own protection. For the participants, privacy protection should be managed on a more transparent and more collective basis. In a world dominated by self-image and reputation, the two following recommendations could help privacy protection.
Recommendation 2: In order to limit the isolation of people dealing individually with their consent, we suggest exploring the added value that could be brought by a form of social regulation based on a system of reputation ratings given by users.
Recommendation 3: Labelling appears to be a promising path. By labelling, we mean a scheme analogous to the one developed by the EU for sustainability, the Eco-Label. Labelling requires the definition of privacy criteria in order to rank service providers according to the care they take of privacy. It also requires independent experts to evaluate and rank the providers against the defined criteria. Finally, it needs permanent follow-up in order to guarantee the trustworthiness of the labelling system.

Privacy versus attention
Besides the privacy concern, and closely related to it, another concern was raised during the focus groups: attention. Controlling, capturing, affecting, channelling and manipulating people’s attention has become, as many observers underline, one of the critical factors of business competitiveness. As H. Simon underlined in the early 1970s, our attention is a limited (bounded) asset. One can therefore observe a true blossoming of technological innovations that aim at economising people’s attention, such as profiling, facial recognition and emotional systems. Yet the same systems can be seen as highly problematic on the privacy side and, more generally, for our epistemological relation to the world, which could lead towards a “loss of social and personal orientation”.
Attention is thus a critical concern which should not be reduced to a cognitive issue but should be treated as a major political concern.


5. A policy design and formulation process resulting in policy recommendations for policy makers:

The last part of the project was dedicated to policy design and recommendations based on the previous results. The following are the main results for future privacy in light of emerging technologies:

Privacy-oriented technology assessment: privacy by design
PbD is a relatively new concept, and is described as a process of "building fair information practice principles (FIPs) into information technology, business practices, and physical design and infrastructures".

Privacy-Enhancing Technologies (PETs) may be regarded as one of the outcomes/enablers of PbD. There are several types of PETs; one helpful classification distinguishes between substitute and complementary PETs. Substitute PETs aim for zero disclosure of personal data, whereas complementary PETs aim to provide control over personal data or offer provable guarantees of privacy.

PbD and PETs have had a limited success so far. The problem is not that the cost of privacy protection is too high, but rather that the cost of privacy loss is currently too low.

There are certain disagreements between experts regarding PbD implementation, which seem to indicate that there is a gap between the legal, social and technological conceptions of privacy, and hence of PbD. It seems that different players in the privacy field have different ideas of what privacy is when they refer to PbD. PbD can be an effective measure to protect data subjects' privacy, as has already been demonstrated in several test cases. PETs are essential to PbD and are an important enabler, especially where future technologies are concerned. New research such as the emerging "Smart Data" concept may lead to more effective PETs, in particular as solutions for privacy challenges posed by new and emerging technologies. Companies and organizations should be convinced that actual implementation of PbD, including adoption or development of effective PETs, will build consumer confidence, trust and loyalty – which are necessary for business success. The current situation is, however, that many companies may feel that the business incentives to use private information whenever possible outweigh the incentive to protect privacy.

The term PbD is fundamental to the protection of privacy. Regulation and enforcement of PbD practices are instrumental to the success of privacy protection. This principle is derived from the analysis of technological threats and from the discussion of legal and ethical guidelines. At present, however, PbD is not a widely recognized concept, as the surveys of students, adults and experts show.

Analysis of the legal implications of emerging and future technologies
The objective was to examine whether the data protection principles enshrined in the EU Directive and other global and regional legal instruments are able to cope with new technologies. The primary conclusion of the task is that many technologies have become, or will become, a privacy challenge. A privacy challenge is a situation in which an emerging or future technology is likely to have a negative implication for privacy, and where this negative implication is likely to be perceived as such in the future, in light of the analysis of the changing perceptions of privacy.
Technologies might pose a threat to privacy, enhance it, or change our perceptions thereof. On the threatening side: complexity, which limits technological transparency and data subjects' ability to comprehend the meaning of data processing and to control it; expansion of data processing in all possible dimensions (quantity, scope, quality, ubiquity, and simplicity of use on the data controller's side); an increase in covert collection and processing of data; non-linear processing which does not squarely fit within the current legal perception; and, at the same time, an increase in privacy-enhancing technologies.
On the enhancing side, some of the emerging and future technologies will act upon data immediately, without collecting or storing it. These technologies might have implications for our liberty, but from the data protection perspective, taken on their own, they are an improvement. Some of the emerging technologies enhance the protection of identity against theft, enhance data security and enable more user control. However, the general trend seems to be that these features are incidental to the surveyed technologies rather than their main purpose. Other PETs might be developed, following either market demand or governmental encouragement. Arguably, many of these trends are already noticeable today: more data is being collected and processed, of more and more kinds that until now were beyond the reach of others, in more complex ways, in a more accurate form, and with less visibility. The emerging and future technologies indicate that all of these trends are likely to be substantially reinforced.
Privacy is a fundamental human right, protected by law, but, no less importantly, it is a social norm. The level of privacy we will have in the future will affect our lives as individuals and as members of social communities. Technologies, as they will be used, will affect our levels of privacy, but they are not a given fact. Technologies and their use can be regulated, directly or indirectly. It is a matter of choice.

Policy recommendations
PRACTIS offers a set of policy recommendations based on in-depth analysis along three main tracks: the technology track, the legislative and ethical track, and the behavioural and perceptual track.

In general, the project suggests a basket of solutions which includes, in addition to some legal proposals, empowerment of individuals by way of raising awareness and education, the use of Privacy Enhancing Technologies (PETs), concepts such as Privacy by Design (PbD), as well as organizational suggestions, such as adopting the procedures of Privacy Impact Assessments (PIA) and the appointment of Privacy Officers within organizations (public and private). In more detail, the following are the major recommendations, which could be implemented and used mainly by decision makers in governmental institutions as well as other social and industrial organisations:

1. Data Subject's Control: The fundamental principle is that data subjects should maintain control of their personal data: it is for the data subject to decide whether he or she wishes to share the data, with whom, under what circumstances, when and how.
2. Privacy by Design (PbD): It is imperative to regulate that privacy considerations be examined at an early stage of the development of a new technology, namely during the design performed at the very beginning of the life cycle.
3. Encourage the development and exploitation of Privacy Enhancing Technologies (PETs).
4. Education program towards "safe use of the Internet"
5. Define the roles and broaden the scope of the duties of Data Protection Agencies
6. Consent: The consent of the person submitting data to an organization is a critical key to privacy protection.
7. Define by law and regulations the right to close the door and the right to be forgotten: the ability of an individual to enter into a state of non-‘reachability’.
8. Legislation targeting individuals who breach privacy.
Meeting points: to create, by legal and technological means, additional meeting points between the individual data subject and the data controller, so that he or she can (re)gain control over their personal data.
9. Define by law and regulations the requirements for transparency and proportionality.
10. Data categorization: divide personal data into two categories: informative data (e.g. name, address, academic degrees) and evaluative data (e.g. the opinion of a boss, what a teacher thinks about a student). Evaluative data should be treated in a more sensitive way.
11. Labelling: instituting the requirement to label each pertinent IT product (e.g. smartphone, SNS, computer application) with a label stating its compliance with privacy protection.
12. Initiation of a 'grey ecology': Explore the potential of initiating a “grey ecology” concept. The grey ecology would function like the current 'green ecology', namely as a set of values and standards that maintain a sufficient level of privacy.
13. Reduce privacy divides by: raising awareness of privacy issues, so as to address divides that result from ignorance; providing assistance to those in need (people with disabilities, the elderly and children); simplifying enforcement means so that they are more accessible to people; adding legal avenues for enforcement on behalf of those who cannot access the judicial system; regulating data controllers' behaviour by requiring them to use simple, easy-to-understand ways of conveying information about their data practices; mandating PETs; and regulating the prices of technologies.

Potential Impact:
The rapid penetration of emerging technologies into our daily lives requires advance preparation and raised awareness of potential threats to privacy, especially among citizens in general and adolescents in particular.

The project outlines policy recommendations for the "clients" of PRACTIS, namely the governing bodies of the EU. It is important to note that the recommendations are aimed at facilitating policy devising, legislative processes and the implementation of privacy control measures within the EU. They are not meant to be a basket of advice to citizens; had we attempted that, we would have ended up with dozens of commandments and guidelines telling the ordinary person "thou shalt do this" and "thou shalt not do that", which was not the intention of the project. The Goals and Objectives specified in the DoW state: "… resulting in formulating legal, regulatory and other policy recommendations." Therefore, all the recommendations that follow relate to policy and its derivatives: legislation, regulation and other measures.
The following will address the importance of having a long-term vision concerning privacy preservation. This we name a "grand policy". We later articulate a list of recommendations categorized by classes. At the end of the chapter, we suggest some avenues for further research.
The "Grand Policy"
By the term "grand policy" we mean the overall direction towards which the EU wishes to strive, namely, how the EU envisions the level of privacy wished to be preserved among its citizens. Some examples of such directions are illustrated in the scenarios, but there could be others.
The grand policy of the EU and its members should be decided upon before specific recommendations are adopted and implemented.
Section 3.4.2 of this report portrays five possible privacy related scenarios towards which our society can proceed. These are: 1. “Privacy has faded away”; 2. “People want to maintain as much privacy as possible”; 3. “People have lost control of privacy”; 4. “Segmented privacy”; 5. “Tailor-made privacy”.
The list may not be exhaustive, but each of these scenarios is feasible and can be reached, particularly once the "natives" of the Internet era replace the "immigrants" in the front seats of the governing bodies of our society. This will take place in about 10 to 15 years. The EU should explore and elaborate on those avenues and select the preferred direction. The fulfilment of the recommendations that follow should answer the major question: where do we want to be in terms of privacy 10-15 years from today?

For example, if the aim is to achieve scenario 5, “Tailor-made privacy”, some parts of the current development trends of technologies need to change. For instance, considerable investment is needed to make technologies more transparent. This means that the whole production chain needs to be reshaped, and more resources need to be invested so that technologies are made transparent at an early stage of the development process. Another obvious conclusion is that societies need to invest more in education. That is one truth, but the other truth is that if the technologies are not transparent, education loses most of its benefit. An immediate conclusion is that governments need to regulate the development life cycle of an emerging product whenever it is suspected to be relevant to privacy. In order to obtain a future where technology enables people to achieve tailor-made privacy, consumers have to be able to make more informed decisions about whether or not to purchase certain kinds of products, on the basis of prior knowledge about the possibilities and disadvantages of emerging technologies. However, the solution is not so simple. Even if technologies such as smartphones were to become more transparent, this alone would not achieve tailor-made privacy, because people are sometimes effectively “forced” to use such devices in contemporary societies. For example, at the moment it is impossible to do most kinds of qualified work without using a mobile phone or a computer. One conclusion is to require the inclusion of privacy and security settings during the development phase of different technology applications, so that people can really choose tailor-made privacy even though they are “forced” to use these technologies. This is the principle guiding the idea of Privacy by Design.
The above discussion was just an example of measures required to proceed towards Scenario 5. Similar considerations should be made if another scenario is selected as the goal for our privacy status.

Policy recommendations
The need for a Basket of Solutions: There is no single magic solution for privacy issues. The complexity of the technologies, the unresolved obscurity of the concept of privacy, and its interdependence with social norms and technological developments mean that this is a dynamic field, and one that is unlikely to settle down in the near future. Taken together, this renders a panacea impossible. Accordingly, the project suggests a basket of solutions which includes, in addition to some legal proposals, empowerment of individuals by way of raising awareness and education, the use of Privacy Enhancing Technologies (PETs), concepts such as Privacy by Design (PbD), as well as organizational suggestions, such as adopting the procedures of Privacy Impact Assessments (PIA) and the appointment of Privacy Officers within organizations (public and private). Note that none of these avenues stands alone.
In order to present the recommendations in an orderly fashion, we have clustered them into a number of categories and list them by category. The categories chosen are: Technology; Law and regulation; Organizational issues; Education; and Social issues.
Technology
1. Encourage the development and exploitation of Privacy Enhancing Technologies (PETs):
Privacy Enhancing Technologies (PETs) may be regarded as one of the outcomes (or enablers) of PbD. The EC Communication on PETs provides a definition derived from the PISA project: "PET stands for a coherent system of ICT measures that protects privacy by eliminating or reducing personal data or by preventing unnecessary and/or undesired processing of personal data, all without losing the functionality of the information system". PETs can be classified by functionality:
• PETs for anonymization (e.g. TOR software for anonymous web surfing)
• PETs to protect against network invasion (e.g. Latent Semantic Indexing to identify standard users)
• PETs for identity management (credential systems providing authentication without identification)
• PETs for data processing (privacy-preserving data mining)
• Policy-checking PETs (e.g. EPAL, OASIS XACML – policy specification, organization and verification tools).
There are not very many technologies that can be defined as PETs, and some technologies can be exploited both ways – for privacy protection as well as against privacy. If the EU adopts the encouragement of PETs as a dominant policy, it can also benefit economically by exporting such technologies to foreign countries.
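To make the PET categories above more concrete, the following minimal Python sketch illustrates one simple data-processing PET technique: pseudonymization of direct identifiers with a keyed hash, so that records can be analysed or linked without exposing the raw identity. It is an illustrative sketch only; the function and field names are our own assumptions and are not drawn from any PRACTIS deliverable.

```python
import hashlib
import hmac
import secrets

# A secret salt held by the data controller (or a trusted third party);
# without it, pseudonyms cannot be reversed by simple brute-force guessing.
SALT = secrets.token_bytes(32)

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed, practically irreversible pseudonym."""
    return hmac.new(SALT, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"name": "Alice Example", "zip": "1000", "diagnosis": "flu"}

# Only the pseudonym leaves the collection point; the name itself does not.
safe_record = {
    "id": pseudonymize(record["name"]),
    "zip": record["zip"],
    "diagnosis": record["diagnosis"],
}
print(safe_record)
```

This is a complementary PET in the sense used above: the data remain usable for processing, while the disclosure of the direct identifier is reduced.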
2. Data Subject's Control:
The fundamental principle is that data subjects should maintain control of their personal data and of its flow from one digital location to another: it is for the data subject to decide whether he or she wishes to share the data, with whom, under what circumstances, when and how. The notion of control reflects the underlying theories of privacy and its ethical basis of human dignity and autonomy.
A possible technical solution for increasing the data subject's control over his or her data could be to invert the roles of the data subject and the data collector: suppose each individual uploads his or her personal data to a cloud service that can be accessed by everyone, subject to authorization. Each organization that wishes to retrieve data about an individual must obtain permission from the data subject to access his or her confined profile, and the permission designates which data items may be fetched from the depository. Consequently, control shifts from the collector to the data subject.
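A minimal sketch of the access-control logic such a personal data depository might apply is given below, assuming a simple in-memory permission table; all class, method and field names are hypothetical illustrations of the inverted architecture, not a specification.

```python
from typing import Dict, Optional, Set

class PersonalDataVault:
    """Toy model of the inverted architecture: the data subject stores data
    and grants per-organization, per-item access permissions."""

    def __init__(self) -> None:
        self._data: Dict[str, str] = {}
        # permissions: organization -> set of data items it may fetch
        self._permissions: Dict[str, Set[str]] = {}

    def store(self, item: str, value: str) -> None:
        self._data[item] = value

    def grant(self, organization: str, items: Set[str]) -> None:
        # The data subject designates exactly which items may be fetched.
        self._permissions.setdefault(organization, set()).update(items)

    def fetch(self, organization: str, item: str) -> Optional[str]:
        # Access is denied unless the data subject explicitly granted it.
        if item in self._permissions.get(organization, set()):
            return self._data.get(item)
        return None

vault = PersonalDataVault()
vault.store("address", "Example Street 1")
vault.store("medical_record", "confidential")
vault.grant("insurance_co", {"address"})
print(vault.fetch("insurance_co", "address"))         # allowed by the data subject
print(vault.fetch("insurance_co", "medical_record"))  # None: never granted
```

The point of the sketch is the direction of control: the organization queries the subject's depository instead of accumulating its own copy of the data.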

Law and regulation
3. Privacy by Design (PbD): PbD is described as a process of "building fair information practice principles (FIPs) into information technology, business practices, and physical design and infrastructures". In simple words, each engineering product and each ICT application has to undergo a development life cycle before being put on the shelf. The life cycle is strictly defined by standard-setting organizations in various fields, such as ISO, PMI and the like, and examines many characteristics of the product, such as its design, reliability, security and user-friendliness. So far, very little has been done to incorporate aspects of privacy into the life cycle of a technological development. Consequently, the threat to privacy arises as a "surprise" after the product has been distributed to the market or the application has been installed on the computer/tablet/smartphone. It is imperative to regulate that privacy considerations be examined at an early stage of the design, namely at the very beginning of the life cycle. This regulation should hold for every field of technology that might have an impact on privacy. In addition to regulation, education and the voluntary adoption of PbD should be encouraged.
4. Consent: The consent of the person submitting data to an organization is a critical key to privacy protection. It is, however, at the same time the weakest link. It should therefore be made more innovative and better reinforced in order to become really effective. Some proposals:
• First, technological innovations should support better transparency of the flows generated by personal data;
• Second, consent should be given for a limited time, since people can change their minds (a sketch of such a time-limited consent record follows this list);
• Third, consent should be given for categories or classes of service providers according to their concern for privacy; providers that do not comply with generally accepted privacy requirements should not receive the individual's consent;
• Fourth, labels could support this categorization and at least provide people with better information about to whom and for what they are giving their consent.
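As a hedged illustration of the second proposal, the following Python sketch models a consent record that expires automatically unless renewed; the class and field names are our own assumptions, chosen only to make the idea tangible.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class ConsentRecord:
    """Consent given for a class of providers, valid only for a limited time."""
    provider_category: str   # per-category consent, as in the third proposal
    granted_at: datetime
    valid_for: timedelta

    def is_valid(self, now: Optional[datetime] = None) -> bool:
        # Consent lapses automatically; the person must actively renew it.
        now = now or datetime.utcnow()
        return now < self.granted_at + self.valid_for

consent = ConsentRecord(provider_category="labelled e-health providers",
                        granted_at=datetime.utcnow(),
                        valid_for=timedelta(days=365))
print(consent.is_valid())  # True for one year, after which renewal is required
```

The design choice is that validity is checked at every use, so a lapsed consent cannot silently keep authorizing processing.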

5. Define by law and regulation the right to close the door: The right to close the door is the ability of an individual to enter into a state of non-‘reachability’. At a certain moment, an individual may say "leave me alone: I give up your favours and benefits in return for keeping my privacy". This might prevail for a limited time, for a certain set of activities, or until a change notice is announced. It relates to the right to be forgotten, namely to be wiped from SNSs and other voluntary depositories of data.
6. Define by law and regulation the right to be forgotten: The requirement for consent and the right to close the door are necessary for privacy protection, but they are not enough. There should be an easy and "friendly" procedure to withdraw from every SNS and voluntary database on which an individual's data are recorded. Today, in some SNSs and marketing databases, it is almost impossible or very complicated to delete one's own information. It should be enacted that the withdrawal procedure be clear and easily accessible.
7. Legislation targeting individuals who breach privacy: Most privacy-related legislation is directed at limiting governmental and business organizations, which collect, process and disseminate data about individuals. But what about individuals who collect data, distribute it without permission, and open their webpages, lists of friends, photograph albums and the like to the public or to a circle of friends, when the subjects of those photos, addresses, personal stories and anecdotes never granted consent? The area of friends revealing private information about other friends is hardly regulated or legislated. It should be deliberated because, with the rapid growth of SNS use, the phenomenon poses a strong threat to privacy.
8. Meeting Points: Today, the data subject and the data collector/controller usually have a meeting point at the first stage, if the data is collected directly from him or her and the subject understands the process. However, once the data is collected, the subject's power to control its use is limited. It is processed by the data controller, by those who work for the controller (employees) or with the controller (outsourcing), and by those who receive data from the controller, wherever they are located. The duties imposed on the data controller aim to assure that the data subject's rights and interests are not breached later on. The general policy recommendation is to create, by legal and technological means, additional meeting points between the individual data subject and the data controller, so as to re-empower the data subject and enable him or her to (re)gain control over their personal data. These additional meeting points should give the data subject a second and third choice: to make an informed, free decision, or to reverse a prior decision.
9. Define by law and regulations the fundamental principles of individual rights that prevail in a democratic society: The fundamental values (principles) are dignity, self-determination and social justice. These values are threatened by the aforementioned emerging technologies. Therefore, they should be protected by a legal "umbrella".

10. Define by law and regulations the requirements for transparency and proportionality: The two main principles – transparency and proportionality – have to be complied with. Undoubtedly, the emerging technologies challenge these two principles. Therefore, the definition of both should be presented in laws, and incorporated into the PbD procedures.
11. Define the roles and broaden the scope of the duties of Data Protection Agencies: DPAs have been established in a number of countries within the EU and beyond. However, their functions, scope of authorization and enforcement powers have not been commonly defined and harmonized across countries. Their domains of activity, rights and authorization should be clarified and enshrined in law. Governments and the EU should ensure that: (1) the DPAs have enough autonomy to fulfil their tasks; and (2) the DPAs are financed well enough to fulfil their tasks. This will also enable coordination among DPAs in various countries, so as to cope better with the absence of national borders when it comes to Internet crimes.
12. Labeling: The term labeling refers to instituting the requirement to label each pertinent IT product (e.g. smartphone, SNS, computer application) with a label stating its compliance with privacy protection (analogous to the Eco-Label developed by the EU for sustainability). Labeling requires the definition of privacy criteria in order to rank service providers according to the care they take of privacy. It also demands independent experts to evaluate and rank the providers against the defined criteria. After the initial labeling, permanent follow-up is needed in order to guarantee the trustworthiness of the labelling system.
Organizational issues
13. Technology scouting: Usually, our attention to privacy threats focuses on ICT. However, the report has identified a large variety of technologies that might pose potential threats to privacy, including nanotechnology and new material development; medicine, biology and biometrics; robotics and cyborg development; cognition ("mind reading"); and ICT. The only way to explore the threats to privacy posed by these technologies (and perhaps others) is to establish a permanent scouting unit whose main duty would be to identify threatening technologies far in advance of their implementation and proliferation. Such a unit should be composed of a small number of experts specializing in the aforementioned technologies, capable of tracing new R&D projects, forecasting their outcomes and ringing the alarm when needed. The scouting should be independent and unaffected by commercial considerations (it might be a department within the DPA). Just as an example: imagine what smartphone technology would look like if, ten years ago, it had been examined for its threat to privacy.
14. Data categorization: Not every data item requires the same degree of privacy protection. For instance, name and address are less sensitive than political affiliation or medical records, and the opinion of a boss about an employee is more sensitive than the employee's job title. Generally speaking, personal data can be divided into two categories: informative data (e.g. name, address, academic degrees) and evaluative data (e.g. the opinion of a boss, what a teacher thinks about a student). Evaluative data, in principle, should be treated in a more sensitive way. However, this categorization is too rough and should be refined by dividing organizational and governmental data into layers, where each layer requires a different degree of privacy protection.
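A minimal Python sketch of how such a layered categorization might drive protection requirements is shown below; the categories, rule table and field names are illustrative assumptions, not a prescribed scheme.

```python
from enum import Enum

class Sensitivity(Enum):
    INFORMATIVE = 1   # e.g. name, address, academic degrees
    EVALUATIVE = 2    # e.g. a boss's opinion, a teacher's assessment

# Hypothetical policy table: each layer carries its own protection requirements;
# a refined scheme would add more layers, as the recommendation suggests.
PROTECTION_RULES = {
    Sensitivity.INFORMATIVE: {"encrypt_at_rest": True, "explicit_consent": False},
    Sensitivity.EVALUATIVE:  {"encrypt_at_rest": True, "explicit_consent": True},
}

def required_protection(item_sensitivity: Sensitivity) -> dict:
    """Look up the protection measures a data item of this layer must receive."""
    return PROTECTION_RULES[item_sensitivity]

print(required_protection(Sensitivity.EVALUATIVE))
```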
15. Chief Privacy Officer (CPO): Appoint a CPO in each organization in the business, NGO and public sectors. The CPO will be in charge of privacy preservation and compliance with privacy laws and regulations. This person will also handle complaints regarding breaches of privacy and advise decision makers on privacy-related issues. It should be assured that the CPO is independent within the organization (similar to an internal auditor or an ombudsman).
Education
16. Education programs towards "safe use of the Internet": The findings of the student survey indicate a clear and risky tendency of adolescents to disregard privacy considerations when they interact online with so-called "friends" over SNSs. The "natives'" perception of privacy is significantly different from that of the "immigrants". We must assume that the amount of time young people spend on SNSs (be it via a PC, a laptop, a tablet or a smartphone) will not decrease in the future, but rather grow significantly. Therefore, the most effective way to prevent the "privacy fading out unconsciously" scenario is education. Law and regulation can complement education but cannot replace it. Appropriate education will encourage the demand for satisfactory privacy bottom-up, in addition to top-down measures (namely, regulation).
17. Education of adults in the spirit of "yes we can": The focus groups show that even those who value privacy, as most of the non-native focus group participants do, find protecting it an almost impossible challenge in the digital economy (personal data in exchange for access to services). The weakness of 'informed consent' was pointed out as one of the most critical difficulties they experience in everyday life. It is important to educate adults that they should insist on maintaining their privacy when they wish to, that it is possible to obtain the privacy policies of various businesses and organizations and to select privacy preferences, and that they should not surrender to privacy threats believing there is nothing they can do. They do not have to grant consent when they do not accept the privacy terms.
Social issues
18. Initiation of a 'grey ecology': Explore the potential of initiating a "grey ecology" concept (suggested by Paul Virilio). The grey ecology would function like the current 'green ecology', namely as a set of values and standards that maintain a sufficient level of privacy. It would orient political and industrial authorities towards actions and research that promote 'clean technologies', i.e. technologies that are sustainable with regard to privacy protection.
19. Reduce privacy divides: Identify the many causes of privacy divides and study them so as to determine which need to be addressed and in what priority. Based on the findings of such further research, possible policy responses include:
20. Raising awareness of privacy issues, so as to address divides that result from ignorance. Awareness raising can range from "soft" avenues, such as general campaigns, to compulsory forms, such as including the issue in school curricula.
21. Providing assistance to those in need, such as people with disabilities, the elderly, and children.
22. Simplifying enforcement means so that they are more accessible to more people; adding legal avenues for enforcement on behalf of those who cannot access the judicial system, for example by allowing NGOs to sue or permitting class actions in appropriate cases.
23. Regulating data controllers' behavior, by requiring them to use simple, easy-to-understand ways of conveying information about their data practices. The data collector and controller should invest in additional ways to convey information (such as the use of images and other non-textual means).
24. Mandatory PETs; regulation of prices of technologies. More specifically, the regulation could also require that whenever a traceless technology alternative exists, it must be offered at the same price as the tracing technology.
PRACTIS findings have been published in several public reports:
• Horizon scanning report on emerging technologies and techno-scientific developments which might impact on privacy in the future. (D2.2)
• Changing perceptions of privacy and the changing role of the state – results from the school surveys and expert survey (D3.4)
• Scenario descriptions of future privacy (D4.1)
• Potential changes in privacy climates and their impacts on ethical approaches (D4.2)
• Future ethical, legal and social frameworks for the privacy/technology interface (D5.2)
• Privacy-Oriented Technology Assessment: A dynamic Life Cycle Assessment approach for privacy (D6.1)
• The legal implications on privacy of future emerging technologies (D6.2)
• Privacy implications of emerging technologies for stakeholders – Policy recommendations (D6.3)


PRACTIS has worked in close cooperation with all relevant members of society: policymakers in government and municipalities, representatives of social organizations, academic experts in the fields of technology and the social sciences (privacy, law, etc.), and citizen representatives. All relevant stakeholders have been involved at various stages of the project through:
• Interviews with experts in the field of privacy and data protection from academia, industry, government and social organizations.
• Surveys with experts and policy makers from relevant fields in technology and social sciences.
• Surveys with adolescents in high schools.
• Focus groups with citizens.
• Workshops with policy makers in government and social organisations as well as industry.

A comprehensive dissemination plan has been implemented to increase awareness of the project results, to exploit them, and to make them sustainable among a wide range of target audiences. The following objectives were accomplished:
1. Mainstreaming the European privacy-protecting frameworks, including ethical aspects of national policies;
2. Increasing awareness at stakeholder and potential individual user levels, working towards a new mind-set based on learning and collaboration in ways that will contribute to a knowledge society;
3. Increasing sensitivity to the relations between privacy and the deployment of emerging technologies, especially in the context of their impact on ethical, even aesthetic, values;
4. Exposing the relevant groups to the ethical and legal debate on privacy and technology, including relevant frameworks to cope with possible future societal problems;
5. Exploiting the project’s materials, findings and results among stakeholders and ensuring their sustainability after the end of the project.

Dissemination of the project results was carried out through various channels:
1. Project website (http://www.practis.org). The site includes information on the project objectives and the activities and events conducted during the project. All public reports of the project's results were regularly uploaded. It was also used for the day-to-day communication of the project partners.
2. Electronic newsletter - Three issues of an electronic newsletter (accompanied by a paper version) were produced during the project, in January 2011, 2012 and 2013. The issues contained articles, written by project participants, on privacy-related issues, results of surveys carried out in the project, and information on PRACTIS conferences and other events. The electronic version of each issue was sent out to the addresses on the mailing list.
3. Flyers and printed materials were produced for the needs of the project. They were widely distributed at conferences, workshops, seminars and other events organized by the project.
4. National and international conferences as well as workshops were conducted during the lifetime of the project. Each partner conducted a final workshop in its country with the relevant stakeholders from academia, industry and government institutions to present the project results. A final conference of the project was held as part of the EDBT/ICDT 2013 International Conference (March 20th-23rd).
5. Publications – several papers and publications have been published by the partners during the project (five of them were peer-reviewed).
6. Seminars and lectures were conducted during the project by the partners to present the project results to various audiences (students, policy makers, pupils, etc.).
The series of PRACTIS reports provides novel information and data on future privacy in light of emerging technologies. EU and national authorities and institutions need to follow and implement PRACTIS's recommendations to ensure the sustainability and success of this process. Furthermore, PRACTIS encourages additional research to deepen the understanding of generational differences in privacy perception under future technologies, as well as of cultural differences and the ethical, social and legal challenges. Further research is also needed to explore the notion of privacy divides, identified in the project. Such divides mean that different people enjoy different levels of privacy for external reasons rather than by their own choice. In this sense, privacy is also a matter of equal participation in democratic civic societies.

List of Websites:
www.practis.org