
Detection Technologies, Terrorism, Ethics and Human Rights

Final Report Summary - DETECTER (Detection Technologies, Terrorism, Ethics and Human Rights)

Executive Summary:
DETECTER was a three-year Collaborative Research Project under the European Union Framework 7 Security Programme that ran from 2008 to 2011.
DETECTER identified human rights and other legal and moral standards that detection technologies in counter-terrorism must meet.
It surveyed current and foreseeable applications of detection technologies in counter-terrorism, and conducted cutting-edge legal and philosophical research into the implications of human rights and ethics for counter-terrorism in general and detection technologies in particular.

DETECTER also successfully pioneered methods of discussing ethics and human rights issues with counter-terrorism professionals using detection technologies, and with technology developers in private meetings. Its research was constantly informed by these stakeholder interactions.

DETECTER research examined:
- the ethical wrongs of terrorism and the ethical risks of preventive counter-terrorism policing, including through use of privacy-invading technologies and profiling
- the human rights implications of unilateral exceptions to international law, especially international law on privacy, a theory of which is also constructed and defended
- the legal implications of data-mining in counter-terrorism, by reviewing data-mining programmes, critically assessing methods of evaluation for such programmes, and finally drawing conclusions about their compatibility with international law
- the human rights implications of pre-screening immigration controls involving detection technologies, proposing a model for the issuance of humanitarian visas for safe travel to the EU
- the legal possibilities for better regulation of surplus information gathered in the context of Internet monitoring for counter-terrorism purposes
- the strengths and weaknesses of current monitoring mechanisms for counter-terrorism including technology use
- the human rights risks of selected detection technologies, in particular location-tracking technologies, the privacy implications of which are analysed in detail

Project Context and Objectives:
Motivation
Since 9/11 and the terrorist bombings in Madrid (11 March 2004) and London (07 July 2005), policing and intelligence activity have increasingly focused on methods of preventing future attacks, and not just on identifying the perpetrators of offences already committed. Preventive police work includes the use of detection technologies. These range from CCTV camera-surveillance of suspicious behaviour in public places to secret Internet monitoring and data-mining. Technological advances such as these make possible new kinds of illegality and immorality, as well as new kinds of protection of life and liberty. Surveillance, detention or denial of entry at a border may be based on misidentification or may be unjustified for other reasons, such as their invasion of privacy. Yet the moral and human rights risks of current detection technology have not been comprehensively listed, let alone studied, by governments in Europe or elsewhere. DETECTER aims to rectify this, by identifying the distinctive moral and legal risks of uses of different detection technologies, within the context of the moral and legal risks of counter-terrorism in general.



Objectives and strategy
The overall objective of DETECTER is to identify human rights and other legal and moral standards that detection technologies in counter-terrorism must meet, taking into account the effectiveness of these technologies as judged by law enforcement bodies responsible for counter-terrorism, and other relevant authorities.

DETECTER achieves this objective in two ways: by surveying current and foreseeable applications of detection technologies in counter-terrorism (Work Package 2), and by conducting cutting-edge legal and philosophical research into the implications of human rights and ethics for counter-terrorism and for the use of such technologies in counter-terrorism (Work Packages 3-9). Regular project research meetings, attended by all consortium partners, ensured that the research conducted under each work package was informed by that of others. Regular meetings with technology developers and end users ensured that the human rights and ethical aspects of such technologies were communicated to those who make and use them, and that the research produced by DETECTER remained outward-looking and thus relevant to the real world of counter-terrorism.


The overall objective of the DETECTER project is:

(O) To identify human rights and other legal and moral standards that detection technologies in counter-terrorism must meet, while taking into account the effectiveness of these technologies as judged by law enforcement bodies responsible for counter-terrorism, and other relevant authorities

This will be achieved by meeting the following, more specific, objectives:

O1. To survey detection technology products and identify their relative moral risks as means of preventive counter-terrorism
O2. To identify the relative moral risks of preventive counter-terrorism measures in general vs. reactive counter-terrorism measures
O3. To assess the moral justification for taking the risks examined under O1 and O2, given an analysis of what is wrong with terrorism in various forms
O4. To survey substantive norms of international law governing counter-terrorism, with an emphasis on human rights law
O5. To survey developments in the declaration of exceptions, claims of inapplicability, or unilateral modification in respect of international law for the sake of counter-terrorism
O6. To assess the developments in O5 with special reference to
a. Data mining
b. Electronic surveillance of Internet traffic
c. The use of pre-entry screening measures for migrants, including asylum-seekers
O7. To survey official and publicly acknowledged methods of monitoring of secret counter-terrorist activity, including the use of detection technologies, by international, regional and national legal institutions
O8. To identify types of product in current detection technology development that are high-risk or low-risk when judged by moral standards or those of international human rights law
O9. To disseminate to policy makers, manufacturers and law-enforcement officials assessments of both desirable and undesirable features of detection technology products, as well as general standards for products to meet

Project Results:
Significant Results
This section outlines the research results produced by the project over its lifetime. It also summarises the proceedings of the periodic research and stakeholder-related meetings. These results are presented by Work Package.

Work Package 02 Detection Technology Review and Liaison

This pivotal work package put consortium partners in touch with technology developers, commissioners and users, both aiding the targeted dissemination of DETECTER research and informing that research. By tracking the rapid development and changing use of detection technologies, many of which had not previously been subjected to ethical and human rights analysis, this WP also enabled DETECTER to break new ground. It met objective O1: to survey detection technology products and identify their relative moral risks as means of preventive counter-terrorism; and O9: to disseminate to policy makers, manufacturers and law-enforcement officials assessments of both desirable and undesirable features of detection technology products, as well as general standards for products to meet. It met DETECTER's aim of delivering impact by engaging the theoretical expertise of the partners with the judgements of law-enforcement officials about what technology and law is effective against terrorism, and with developers' understanding of what is technically possible and marketable. In practical terms, it provided technology updates and assisted with the organisation of those parts of the project meetings that brought together academic partners and technology stakeholders.

The work package leader was Tom Sorell, and the research fellow producing the deliverables associated with this WP was John Guelke.


2.2.2.3 Significant Results
D10.1 Report on first meeting with technology developers and end users. The report first summarises the presentation from Tom Ormerod of the University of Lancaster on D-SCENT, a project focussed on using ‘scent trail’ (the informational data given off by an individual’s activity) to combat terrorism in public places. A presentation from Iain Darker of the University of Leicester on MEDUSA followed. MEDUSA combines psychological and technological approaches to construct an image processing algorithm to automate the detection of firearms and alert CCTV operators. James Ferryman of the University of Reading presented on SAFEE, a completed FP7 project which aimed to use audiovisual surveillance of passenger behaviour to identify potential terrorist intentions. The next presentation, from Colin Edwards and Ian Vickers of Scotland Yard, dealt with a database used to log and cross-reference information from seized computers relevant to terrorist offences. The final presentation, from Rory Doyle of Smiths Detection on a particular Smiths scanner product, was followed by a response from Professor Martin Scheinin of EUI, who suggested that automating the viewing of body-scan images would avoid many of the threats to privacy such technology poses.
The presentations in this meeting outlined the aims, basis in research, and technological and other challenges faced by those developing different detection technologies. The question and answer sessions that followed drew attention to the relevant ethical and human rights principles with which the use of such technologies might be in conflict, in particular the value of and right to privacy. Recurring queries were raised about the validity of the psychological assumptions underpinning the research presented, their effectiveness, and the extent to which developers’ research succeeded in demonstrating public acceptance of the interference with privacy involved in their application.
D10.2 Report on 2nd meeting with technology developers and end users. This report summarises an introductory talk and four presentations with question and answer sessions. The introduction, by Marco Malacarne, Head of Security Research and Development EC, outlined the particular challenges facing contemporary security research in the EU. These include the inadequacy of absolving technology researchers of the responsibility to consider ethical issues, the conflict between economic interests and ethical requirements, and the increasing complexity of the technological landscape. All four of the presentations were given by representatives of FP7 security projects: Jan Derkacz of FP7 project INDECT (Intelligent information system supporting observation, searching and detection for security of citizens in urban environment); Chief Inspector Dave Fortune of FP7 project ODYSSEY (Strategic Pan-European Ballistics Intelligence Platform for Combating Organised Crime and Terrorism); Jorgen Ahlberg of FP7 project ADABTS (Automatic Detection of Abnormal Behaviour and Threats in Crowded Places); and Vittorio Murino of SAMURAI (Suspicious and Abnormal behaviour Monitoring Using a Network of Cameras for Situation Awareness Enhancement). All four presenters summarised the objectives, methodology, and expected results of their projects. Jan Derkacz highlighted in particular the attention paid by INDECT to security and privacy issues in the architecture design. The question and answer sessions raised concerns about technological vulnerabilities, including risk of employment for illegitimate purposes, risk of discrimination, and risks related to sharing across jurisdictions with differing interpretations of the value of privacy. The challenge and imperative of trying to build human rights compliant technology was also discussed.

D10.3 Report on 3rd meeting with technology developers and end users. The first presentation was by Christopher Westphal of Visual Analytics. The first half of the presentation explained the work of Visual Analytics, which involves employing software to search multiple databases simultaneously to find anomalous entities, and anomalous relationships between different entities, that could indicate suspicious behaviour. The second half examined a particular law enforcement application of the technology: analysing wire remittance activity for the purposes of investigating human smuggling. The second presentation, by Coleen McCue, introduced the work of Geospatial Predictive Analytics, which is rooted in the principle that the occurrence of events (e.g. crimes) is neither uniform nor random in distribution. Geospatial Predictive Analytics applies advanced statistical analysis and modelling techniques to spatial relationships. It aims to help police move beyond reactive activity to predictive policing, for example by predicting where and when shooting incidents are likely to occur. The basis is not merely a record of which areas have been past crime sites, but the environmental features that make a particular spatial area likely to become a site. These features are assigned probability density functions in order to rank potential future sites of incidents accurately. The third presentation, by Professor Emil W. Plywaczewski and Dr Wojciech Filipkowski of the University of Bialystok, Poland, described the work of the Polish Platform for Homeland Security. The activities of the PPHS, which include participation in the FP7 project INDECT, are aimed at creating integrated computer tools that support police and other law enforcement services with the most modern technology in their efforts to improve public security.
The fourth presentation, by Carlos Gacimartín of XplicoAlerts, discussed the research completed for WP03 of the FP7 project INDECT. Xplico is a Linux-based software tool for the interception and interpretation of telecommunications information and internet traffic. Traffic is intercepted by placing a black box directly on the connection – often a Digital Subscriber Line. The software analyses phone call, internet, email and picture file information sent over a protocol, and can already interpret most common protocols: web traffic, email, VoIP and FTP file transfer. It is made available to anyone as open source software. The question and answer session saw concerns raised about the risks to privacy associated with making the highly invasive software developed by Xplico available to everyone as open source. Concerns were also raised about the proportionality of the use of some of the technologies presented, especially when they are used speculatively as opposed to in the pursuit of individuals or information linked to a particular crime.
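The scoring idea behind geospatial predictive analytics, as described above, can be illustrated with a minimal sketch. The grid cells, feature names and weights below are invented for illustration and do not come from McCue's presentation; real systems fit probability density functions to observed incident data rather than using fixed hand-picked weights.

```python
# Hypothetical sketch of geospatial predictive scoring: each grid cell is
# described by environmental features, and a simple weighted score ranks
# cells as potential future incident sites. All names and numbers are
# illustrative assumptions, not DETECTER or presenter data.
from dataclasses import dataclass

@dataclass
class Cell:
    cell_id: str
    features: dict  # feature name -> intensity in 0.0..1.0

# Illustrative weights: how strongly each environmental feature is assumed
# to raise the likelihood of an incident in a cell.
WEIGHTS = {"past_incidents": 0.5, "poor_lighting": 0.2,
           "near_transport_hub": 0.2, "vacant_buildings": 0.1}

def risk_score(cell: Cell) -> float:
    """Weighted sum of feature intensities; stands in for a fitted density."""
    return sum(WEIGHTS.get(f, 0.0) * v for f, v in cell.features.items())

def rank_cells(cells):
    """Return cells ordered from highest to lowest estimated risk."""
    return sorted(cells, key=risk_score, reverse=True)

cells = [
    Cell("A1", {"past_incidents": 0.9, "poor_lighting": 0.3}),
    Cell("B2", {"near_transport_hub": 0.8, "vacant_buildings": 0.5}),
    Cell("C3", {"poor_lighting": 0.1}),
]
for c in rank_cells(cells):
    print(c.cell_id, round(risk_score(c), 3))
```

The key design point the presentation made is visible even in this toy: ranking is driven by environmental features of a location, not only by its record as a past crime site.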

D10.4 Meeting four of six with technology developers and end users took place in Lund on the 12th and 13th of November, with presentations from six speakers: Radu Mares of the Raoul Wallenberg Institute, Janine Hillier from Virginia Tech, Erik Bladh from Axis, Bosse Norhem from the Lindholmen Science Park, Caspar Bowden, the Privacy Advisor to Microsoft, and Daniel Lindahl from the Swedish Research Defense Industry. Radu Mares gave a presentation on Business and Human Rights, explaining the work and principles in this area of Professor John Ruggie, former Special Representative of the UN Secretary-General on human rights and transnational corporations and other business enterprises. Janine Hillier defended the claim that privacy is a fundamental right valued by individuals and that technology should be designed to protect this right. Erik Bladh reported the advances being made by Axis in the field of intelligent video. Bosse Norhem gave three examples of important projects being pursued at the Lindholmen Science Park: ‘Quickwins’, a single interoperable system enabling real-time collaboration between different emergency and security services; a study on the transport of dangerous goods; and a project to obtain live video from an accident scene for a better common operational picture (COP). Caspar Bowden argued against the common data protection orthodoxy that information processing systems can all be equally well regulated by the principles of data protection. Finally, Daniel Lindahl argued that serious security and privacy risks derive from individual technology users’ lack of understanding of how the technologies work, and that a substantial cultural shift is needed if this is to be remedied. John Guelke produced the report of the meeting.

D10.5 Meeting five of six with technology developers and end users took place in Oslo in February 2011. There were presentations from four speakers: Jorgen Ahlberg (an academic researching the development of new detection technology), Jurgen Schwarz (a former policeman with experience providing security in Afghanistan, now working for UNIVAL), and Colin Tansley and Grant Moss (both counter-terrorism professionals, with 31 and 30 years of service in policing respectively). Jorgen Ahlberg presented results from two FP7-funded projects on intelligent video, PROMETHEUS and ADABTS. He argued that traditional CCTV surveillance has a significant limitation – the limited attention of the viewer – to which intelligent video offers a solution. However, automatically detecting even predefined events is difficult, and other challenges such as the detection of ‘anomalous behaviour’ raise difficult ethical questions. Jurgen Schwarz gave a presentation on ‘HEDD1’, a product that claims to simultaneously detect all types of commercial and military explosives, including liquids. Colin Tansley’s presentation addressed the use of technology in counter-terrorism from an intelligence practitioner’s perspective, including the uses and limitations of CCTV, intrusive audio surveillance, substance detectors and forensics. Grant Moss, now a Security Management Specialist in a UK National Health Service Trust, explained how technology was being used to secure large hospitals. These are potential targets for terrorism and, as large organisations employing many people and treating even more, they have to consider the possibility that they may treat or even employ people involved in terrorism. Hospital security raises particular difficulties, as the duty to protect the security of the hospital and the wider public comes into tension with the duty to protect patient privacy and dignity. John Guelke produced the report of the meeting.

D10.6 The sixth meeting between technology developers and end users was held in Åbo on the 10th of May 2011. Tom Sorell introduced the first session, which had three speakers. Olov Fäst, General Manager of the Swedish Space Corporation, presented the company’s Airborne Maritime Surveillance package, a multi-sensor aircraft used for detecting oil spills and, more controversially, illegal migrants. Thomas Anderson from Carl Zeiss electronics presented their FP7-funded AMASS project, which had created autonomous powered buoys mounted with sensors for the transmission of audio and visual surveillance. Ari Virtanen from the VTT Technical Research Centre of Finland presented the FP7-funded TALOS project, which was developing a series of autonomous, driverless vehicles, again mounted with audio and visual surveillance capabilities. Martin Scheinin introduced the second session, on biometrics, which had two speakers. Ricardo Vieira of Vision Box in Portugal spoke on the range of possible uses for biometrics in airports, and on how the Vision Box company is innovating in areas where biometrics have not so far been employed. Gillian Ormiston, Global Market Manager for Safran Morpho in France, also spoke on the use of biometrics in airports, comparing the state of development of fingerprinting, facial recognition and iris scanners. She also identified a range of dangers to secure identity management at airports, including the weakness of identity security in certain jurisdictions such as the UK. Tom Sorell closed the day’s proceedings with reflections on the moral riskiness of a number of the technologies presented. John Guelke produced the report of the meeting.

D12.1 John Guelke compiled an annotated bibliography, including sources identified for the quarterly technology reviews, the blog, and the research for WP03. It also includes sources from DETECTER WP bibliographies and additional monthly contributions solicited from consortium partners. Drafts of the bibliography were updated regularly and made available on the secure area of the DETECTER website. The completed document summarises some 60 books, journal articles and government and NGO reports.

D12.2.1 The first technology review begins by providing a list of the technologies surveyed, including scanners, cameras, bugs, malware, trackers and data-mining. Examples are given of novel technologies, and links to product and developer websites are provided. A note on the taxonomy of dangers associated with these technologies follows. For each type of technology, the taxonomy of dangers categorises along two axes: the possibilities or risks that come with the use of a particular technology (eight kinds of use are listed, from ‘normal’ to ‘risk creep’ and ‘abuse’) and the distinctive harms these may lead to (five harms are listed: invasion of privacy, miscarriage of justice, discrimination, chill and non-criminal disadvantage to those targeted).
D12.2.2 The second technology review begins by explaining the rationale for the format chosen, which is distinct from alternative approaches, in particular that of the PRISE project, insofar as it is continuous with categorisations used by industry and therefore more user-friendly to industry audiences. The taxonomy of dangers has been expanded to include fifteen possible misuses (including use in preventive policing, the transparency of the process to the target, and the risk of unauthorised access) and eight dangers (including indirect intrusiveness, which follows not from the initial scrutiny but from consequences arising from it, and the violation of existing norms). A product categorisation follows, including audio surveillance, scanners, vehicle trackers, phone monitoring, computer monitoring, data mining and biometrics.
D12.2.3 The third technology review includes, in addition to the categories listed above, a technology news section, including Europe-wide debates over SWIFT and Intelligent Transport System as well as news relating to technology developments in Italy, UK, Romania and Belgium. This is followed by a description of new products, which are listed by the following categories: scanners, audio surveillance, cameras, phone monitoring, biometrics, databases and data-mining. A product categorisation and taxonomy of dangers completes the review.
D12.2.4 The fourth technology review maintains the format of the third. The news section includes entries on Europe-wide debates, and updates from UK, Italy, Germany, Bulgaria, the Netherlands, Austria, Ireland and Slovakia. Product news includes an overview of recent developments. A product categorisation and taxonomy of dangers completes the review.
D12.2.5 In response to suggestions from James Thurman of UZH, this fifth technology review experiments with an alternative, more user-friendly format for the taxonomy of dangers. Four tables introduced in this survey are designed to visualise some of the analysis of the relative risks of detection technology from DETECTER Deliverable D05.2. Each table compares the moral risks incurred by CCTV in public places, full body scanners, substance detectors, covert cameras, bugging, phone monitoring, location tracking, internet monitoring, and databases and data-mining. Three tables deal with the three key moral risks identified in D05.2 – intrusion, error leading to arrest or detention, and chill – and the final table summarises this information. The tables indicate the severity of moral risks on the basis of a colour code: green for the least risk, yellow for an intermediate risk, and red for the greatest risk. The remainder of the review includes technology news from the EU, Romania, France, Germany, the UK and Italy, and a product categorisation as above.
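The structure of these colour-coded tables can be modelled as a small data structure, as in the sketch below. The specific green/yellow/red ratings shown are invented placeholders for illustration, not the project's actual assessments from D05.2.

```python
# Minimal sketch of the colour-coded risk tables: each technology receives
# a rating (green / yellow / red) for each of the three key moral risks.
# The ratings below are invented placeholders, not DETECTER's findings.
RISKS = ["intrusion", "error leading to arrest or detention", "chill"]
LEVELS = {"green": "least risk", "yellow": "intermediate risk",
          "red": "greatest risk"}

# technology -> {risk -> colour}; entries are illustrative only.
ratings = {
    "CCTV in public places": {"intrusion": "yellow",
                              "error leading to arrest or detention": "yellow",
                              "chill": "yellow"},
    "internet monitoring":   {"intrusion": "red",
                              "error leading to arrest or detention": "yellow",
                              "chill": "red"},
    "substance detectors":   {"intrusion": "green",
                              "error leading to arrest or detention": "green",
                              "chill": "green"},
}

def summary_row(tech: str) -> str:
    """One line of the summary table: technology plus its three ratings."""
    cells = " | ".join(ratings[tech][r] for r in RISKS)
    return f"{tech}: {cells}"

for tech in ratings:
    print(summary_row(tech))
```

Keeping the three risk dimensions separate before producing the summary row mirrors the review's own layout: three per-risk tables followed by one combined table.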

D12.2.6 The sixth Technology Quarterly Update retained the format established in the fifth, comprising a section on news items concerning the use of detection technology covered in the European press (with stories specific to Macedonia, the UK, Belgium and Italy), a section detailing new products, and four tables visualising the analysis of relative moral risks carried out in D05.2 and, where appropriate, explaining the implications with specific examples of technology products.

D12.2.7 The seventh Technology Quarterly Update continued the format established in the fifth. It reports a number of important news stories from across Europe. Stories reported from France include further developments in the controversial Hadopi file-sharing legislation and the use of fingerprinting technology in expulsions of Roma. From Germany it reports developments in privacy protections responding to the use of social networking and Google Street View. In Romania, the quarterly covers a draft ordinance on ID cards. In Switzerland, measures targeting music uploaders were found to be in breach of data protection law. Finally, in the UK the quarterly reports on complaints against the PHORM targeted advertising system and mounting pressure on the Prime Minister in the wake of the News of the World phone hacking scandal.

D12.2.8 The eighth Technology Quarterly Update begins with reports on news at the European level of the continuing Google Street View controversy and of the Article 29 Working Party’s criticisms of EU/US data-sharing arrangements. It moves on to report on the passing of the ‘LOPPSI 2’ law in France, the use of punitive fines by the UK Information Commissioner’s Office, claims that Google search facilities are incompatible with Spanish privacy law, proposals to allow tax authorities to copy hard drives without court orders in Denmark, the referral of Austria to the European Court of Justice for failure to keep data protection authorities independent, and the rolling out of cameras in Portuguese traffic police cars. It surveys trends among new products coming to market, including the increasing use of biometric voice identification over mobile phones, access to private CCTV networks via mobile devices, and the increasing prevalence of audio surveillance functionality integrated into new CCTV systems. As with the fifth update onwards, it includes the survey of the ethical risks incurred by the use of different detection technologies, based on the analysis in D05.2 and distinguishing between moderate, intermediate and severe risks in a table with a visual key.

D12.2.9 The ninth Technology Quarterly Update reported, amongst other things, trends towards wider use of facial recognition technology and of off-the-shelf products for large database use, often with a facility for retrospective auditing of use. News stories mentioned included the continuing controversy at the European Parliament over US access to the SWIFT banking database, Viviane Reding’s March speech at the European Parliament, and a number of judicial decisions made against Google at European and Member State level. It includes the survey of the ethical risks incurred by the use of different detection technologies, based on the analysis in D05.2 and distinguishing between moderate, intermediate and severe risks in a table with a visual key.

D12.2.10 The tenth Technology Quarterly Update featured a report on the continuing emergence of products based on facial recognition technology and of handheld and portable ANPR reading systems. News items mentioned included ongoing negotiations over US access to European Passenger Name Records data, criticism of Facebook for enabling the tracking of users even when logged out, and the phone hacking scandal at the News of the World, which led to the newspaper’s closure. It includes the survey of the ethical risks incurred by the use of different detection technologies, based on the analysis in D05.2 and distinguishing between moderate, intermediate and severe risks in a table with a visual key.


2.2.3 Work Package 03. Ethical Norms of Counter-Terrorism

This project was driven in part by a conviction that counter-terrorism policy in the EU can be made more just if those responsible for formulating it are aware of its ethical implications. These implications were investigated by WP03, which identifies the moral risks of counter-terrorism measures and detection technology and analyses them against the background of the wrongs of terrorism and the presuppositions of liberal politics. WP03 served as the ethical basis for the research in WPs 08 and 09, and both informed and was informed by the WPs that study different technologies. In addition to the deliverables produced, final results include policy recommendations relating to the regulation of the use of counter-terrorism detection technologies.

The WP leader was Prof. Tom Sorell at UoB; Dr John Guelke and Dr Katerina Hadjimatheou were Research Fellows.

2.2.3.3 Significant Results
D1: The report of this first research meeting on Detection Technologies, Ethics, and Counter-terrorism includes an extended abstract and list of points raised in discussion for each of the four papers that were presented. The first paper, by Tom Sorell, analysed the value of privacy in liberal theory and in relation to the aims of counter-terrorism. The second, by John Guelke, identified and analysed two potentially conflicting aims within liberal democracy: the aim of improving intelligence through the use of surveillance, and the aim of ensuring genuinely free association between citizens. The third, by Tobias Feakin (from the London security think tank, the Royal United Services Institute), identified a range of risks that preventive counter-terrorism measures pose to the liberal value of privacy. The fourth, by David Banisar (from the NGO Privacy International), provided a critical legal analysis of the practices of counter-terror data mining and internet surveillance. The ideas and discussion that emerged from this meeting, and that are presented in this report, together reflect a conviction that the value of privacy may, according to liberal justice theory, deserve greater protection than it is currently afforded in EU counter-terrorism policy.
D05.1 Tom Sorell’s Research Paper, entitled ‘Moral Risks of Preventative Policing in Counter-Terrorism’, both represents the completion of Task 1 and makes significant progress towards project objectives O2 and O3. The paper establishes a version of Kantian theory as the liberal theoretical framework from which ethical norms regulating counter-terrorism are drawn. It then argues that, consistent with this framework, the profound moral problem with terrorism lies in its repudiation of any ideal of non-violence, and of any political order. Some forms of profiling may be justified in pursuit of terrorist criminals, provided they are evidence-based and proportionate. Some interference with normal protections of privacy may also be justified, although not interference that proscribes the expression of pro-terrorist thoughts. Provisional policy recommendations emerging from this paper include: the proposed repeal of the glorification of terrorism provisions in the UK Terrorism Act (2006) and comparable legislation elsewhere, e.g. in Holland; funding of social science research into radicalization and terror networks, with the aim of making counter-terrorism maximally evidence-based; funding of research into the limitations of profiling; and funding to encourage participation by Muslims in local and national democratic institutions.
D05.2 The research paper entitled ‘The Relative Moral Risks of Detection Technologies’, by John Guelke and Tom Sorell, both represents the completion of Task 2 and makes significant progress towards project objective O1. The paper argues that the right to privacy is both an important value in itself and a protection of the liberal values of freedom of association and conscience, as well as of political participation and social solidarity. It also suggests that the ethical risk posed to those values is greater when detection technologies penetrate “home” spaces where intimate functions or activities tend to be performed, or penetrate a privacy zone associated with the human body. Another source of risk is the justification of surveillance by appeal to ethnic or cultural traits or behaviour. Examples of provisional policy recommendations suggested by the analysis include: the proposed modification of UK practice on the use of body scanners at airports to offer alternatives (e.g. a pat-down) that can be used if consent for the scan is withheld (thus bringing the UK into line with other European countries and the USA); the introduction of amendments to legislation regulating the installation and activation of wiretapping software on smartphones, bringing them under the same kind of restrictions as phone-tapping and bugging; the reflection, in guidance to judges approving surveillance applications, of the greater risk to privacy of Internet surveillance compared to scrutiny of telephone records; and the proposed retention of records of both the grounds given for the decision to include an individual on a no-fly list and the identity of the officer responsible for the decision, together with the establishment of procedures enabling appeal against such inclusion.

D05.3 Taking Moral Risks Given Analysis of What's Wrong with Terrorism
The third deliverable of WP03 begins with ethical analysis of the various morally bad features of terrorism. The outputs of this analysis are relevant to the question of what kind of obligations states have in preventing terrorist attacks. States must reconcile duties to prevent attacks with moral standards. This is often challenging because counter-terrorism is inherently difficult to carry out without invading spaces protected by norms of privacy, falsely identifying innocents as suspects or damaging valuable relations of trust. These risks can be justified, but only on the basis of sufficient evidence. Thus state use or funding of research into technology whose effectiveness is questionable is impermissible.

D05.4 Moral Risks of Profiling in Counter-Terrorism
This paper draws on contemporary ethical analysis of profiling to respond to EU Regulation of Profiling, in particular the EU Parliament’s 2009 Recommendation on the Problem of Profiling. The paper defines profiling, and claims that profiling in counter-terrorism differs from targeted policing in four main ways: i) it relies primarily on associations and generalisations rather than specific evidence; ii) it relies on ethnic generalisations as evidence of criminality; iii) it is used primarily as a preventive or ‘predictive’ measure; iv) it is used by a range of security agents including border guards, airport and transport security officers, and police officers. The paper proposes that profiling raises two main categories of ethical risk: risks to privacy and risks to equality. The sources of the privacy and equality risks arising from profiling are its tendency to rely on generalisations and associations and its tendency to profile for ethnicity. The fact that a profile is used predictively is less indicative of ethical risk than the fact that it is based on generalised, rather than specific evidence. The risks to equality of the profiling of ethnicity can be reduced in the following four ways: i) Profiles including ethnic traits should always be ‘written’ or ‘formal’, rather than left to the discretion of individual officers. ii) Profiles that select discriminated-against individuals for suspicion should be applied as far as possible in non-visible or discreet ways, to minimise the risk that the measure will humiliate those profiled and encourage prejudice towards them. iii) Profiling should, as far as possible, be used in ways that provide a net benefit, in this case in terms of security, to the ethnic group profiled. iv) When profiling is used against some ethnic groups it should be used against all ethnic groups similarly situated.
Finally, the paper argues that the profiling of things raises very few of the ethical and human rights risks relating to privacy and equality that are raised by the profiling of people, and that the profiling of places (as targets of terrorist attack) is less likely to raise ethical concerns than the profiling of people who might carry out these attacks. Admittedly, the monitoring of profiled places might yield possible suspects, but this can be more evidence-based than other kinds of profiling.


2.2.4 Work Package 4. Unilateral Exceptions to International Law
Counter-terrorism is being conceptualized by some government justice ministries as a war, and military activity is subject to fewer human rights and ethical constraints than civilian activity. It is an open question whether counter-terrorism is best seen as a war, legally speaking, and whether there are ways of maintaining civilian human rights protections while counter-terrorist policies are being pursued. This work package addresses this question by analysing the broader human rights implications of developments in international law and national jurisdictions concerning measures against terrorism. More specifically, it develops a systematic critique of legal constructions and doctrines that deny or reduce the applicability of human rights law in countering terrorism. In addition to the deliverables produced, this WP provides policy makers with easy-to-use tools applying the above analysis, specifically in respect of detection technologies.

The WP leader is Prof. Martin Scheinin at the EUI. Mathias Vermeulen is Research Fellow.


2.2.4.3 Significant results
D02: This research meeting on Detection Technologies, Human Rights, and Counter-Terrorism was held at the EUI in Florence on 18 Feb 2010. The first session of the day focused on the identification of terrorist suspects through detection technologies. Ehud Givon (from WeCU technologies, Israel) spoke about a new technology his company had developed which allegedly could detect individuals with hidden criminal intent. Jacques Verraes gave a presentation on data protection issues related to the identification of terrorist suspects. A lively discussion focused mainly on the perceived flaws of Givon’s new technology from a human rights and end-user perspective. The second session saw consortium member Rozemarijn van der Hilst presenting a DETECTER Deliverable (Human Rights Risks of Selected Detection Technologies - Sample Uses by Governments). Two external academic experts, Emilio Mordini from FP7 project HIDE and Robert Leenes from the University of Tilburg, discussed, respectively, the human rights aspects of body scanners and new location-based surveillance technologies. In the afternoon Martin Scheinin presented a paper on ‘Developments in the Declaration of Exceptions, Claims of Inapplicability, or Unilateral Modifications in Respect of International Law for the Sake of Counter-Terrorism’. Jonas Christoffersen of DIHR acted as official commentator to the deliverable.

D06.1 The first research paper, entitled ‘Unilateral Exceptions to International Law: Systematic Legal Analysis and Critique of Doctrines that Seek to Deny or Reduce the Applicability of Human Rights Norms in the Fight Against Terrorism’ (D06.1) both represents the completion of Task 1 and makes significant progress towards objectives 4 and 5 and final project objective O9.

D06.1 concludes that counter-terrorism measures may qualify as permissible limitations on human rights, when they are properly construed. This calls for a rigorous test for permissible limitations, rather than an all-encompassing act of 'balancing'. Martin Scheinin developed such a permissible limitations test for the right to privacy within the International Covenant on Civil and Political Rights, which was presented in a new report to states represented at the United Nations Human Rights Council in March 2010. This report was positively received by a large number of States during the discussion in the plenary.

D06.2 The second paper, entitled ‘EU Law and Policy Approaches to the Applicability of Human Rights Norms in the Fight against Terrorism’, both represents the completion of Task 3 and makes significant progress towards project objectives 4 and 5 and final project objective O9, as it applies the analysis presented in the first deliverable of WP4 to European Union (EU) law and policies. D06.2 builds upon D06.1 and concludes that a number of inadequacies and deficiencies in the European Union’s ‘area of freedom, justice and security’, especially relating to the collection, processing and exchange of personal data, have led to an overly broad use of permissible limitations and restrictions to the right of data protection in the context of the fight against terrorism. The paper closes with eight concrete policy recommendations which together attempt to remedy the inadequacies and deficiencies of the EU’s counter-terrorism policy.

Deliverable D4.3 advised that public authorities should undertake a privacy impact assessment before deciding on the use and deployment of a specific detection technology. Such an analysis should identify potential interferences with the core of the right to privacy, and propose (technical) measures to counteract these, since it is the core of the right to privacy that ultimately should determine the limit for the use of these detection technologies. Further, it was argued that the necessity of introducing or continuing to use detection technologies, and retaining the data they produce, must be established and supported by concrete evidence, and each technology should then be evaluated for its degree of intrusion into the private life of individuals in order to ensure a proportionate and least-invasive outcome. It is advisable that the effectiveness of such technologies be assessed on a regular basis.

While ‘traditional’ detection technologies, which are used secretly in private places in a law enforcement context, are relatively well regulated, this deliverable found that the regulation of the use and deployment of relatively new detection technologies in public places, such as body scanners, ANPR and CCTV cameras, and GPS trackers, is considerably less developed. It was argued, however, that the use of such detection technologies must also be regulated by (European or national) law that is sufficiently precise. Such law has to define the exact purposes of, and places where, these detection technologies can be used, and should include clear rules on who can use the technology and access the data that can be obtained by these mechanisms.

The paper argued further that detection technologies which are used in secret constitute the gravest interference with the core of the right to privacy. The distinction between detecting movements and expressions should only come in as a secondary step to determine the gravity of the interference with the right to privacy. Hence, every detection technology which is used in secret should be regulated by a statute law that defines the following six safeguards in order to avoid abuses of power: (1) the nature of the offences which may give rise to the use of the technology; (2) the categories of people which the technology may be used against; (3) the limit on the duration of the use of the secret measure; (4) the procedure to be followed for examining, using and storing the data obtained by the technology; (5) the precautions to be taken when communicating the data to other parties; and (6) the circumstances in which such data may or must be erased.

Work Package 5. Counter-terrorism, detection technologies, and pre-entry screening in immigration controls.
Illegal immigration is seen as a security problem in the European Union, but while border control, including pre-entry screening, has been continuously expanded, little has been done to mitigate the human rights risks of such procedures. Work Package 5 contributed towards rectifying this by articulating legal norms regulating counter-terrorism, detection technologies and pre-entry screening. It also assessed the human rights implications of migrant screening policies in the European Union, the use of detection technologies in pre-screening policies, and carrier liability arrangements. Work package 5 meets objective O6a, which is to assess the developments in O5 with specific reference to the use of pre-entry screening measures for migrants with an emphasis on refugee law.

Ms. Kristina Stenman was the WP leader and functioned as a resource person for researchers working on the various deliverables. Professor Elina Pirjatanniemi was responsible for the overall management of the DETECTER project within Åbo Akademi University. In addition, she provided academic supervision and advice to the whole team of researchers. Ph.D. student Audelina Ahumada-Jaidi was a key member of the research team from the start, and was responsible for completing research on the deliverables as well as for the day-to-day running of the project.

Significant Results

D14.1. Deliverable 14.1 is a legal analysis of pre-entry screening measures in the European Union, entitled Border control and internal security in the European Union – information, technology and human rights implications for third-country nationals. This research paper seeks to map the human rights and protection problems encountered as a result of the integration of counter-terrorism, including through the use of detection technologies, in EU common border policy. The deliverable shows that the relevant legal issues concern primarily the right to non-refoulement and the right to privacy, but also particular data protection principles. It examines the development of a common border policy in the EU with particular regard to the increased emphasis on counter-terrorism and internal security considerations, including through the use of detection technologies. In addition, it examines the use of centralized electronic EU-operated immigration databases (SIS/SIS II, VIS and EURODAC) and the introduction of biometric identifiers as a way of adding value to the fight against terrorism. It also elaborates on the increasing role of Frontex in border control activities and surveillance at the external borders of the EU. In relation to pre-entry screening measures, the paper includes a section on carrier sanctions and their extension to cover issues related to the fight against terrorism and other serious criminality.

D11 Report on research meeting on “Border security, technologies and human rights”
On 11-12 May 2011 the Institute for Human Rights held a research project meeting with a specific focus on the themes elaborated under Work Package 5. The report covering the thematic programme of the meeting summarizes the presentations and discussions between members of the DETECTER research teams and invited guests, including both experts from the academic field and representatives from relevant organizations and authorities (deliverable 11). Among the themes discussed were: the role of, and challenges connected to, the use of detection technology within the work of the Finnish Border Guard; the role of FRONTEX in the development of technology for border control purposes, including a presentation of specific research on the ethics of border security elaborating, inter alia, on ethical difficulties created by the technologies themselves; the approach to immigration policy and security taken by the Migration Department at the Finnish Ministry of the Interior; and fundamental rights challenges as elaborated within the framework of an FRA project on the treatment of third-country nationals at the external border of the European Union. Furthermore, there were presentations dealing with the role of the EU in securing access to protection within an increasingly security-based border control regime; the different legal regimes applicable to maritime border control measures and their compatibility with the international human and refugee rights framework; problems arising in the context of electronic data exchange within the EU; and, finally, challenges arising from the development of a legal system allowing the use of Passenger Name Records for law enforcement purposes.
Among the questions that came up for discussion were: the risks of “over-harmonization” of European systems, especially with regard to databases and the fact that definitions of crimes are not the same all over Europe; the issue of pre-border surveillance and member states’ extraterritorial obligations attached to asylum claims in this context; the fact that the potential (and risks) of detection and surveillance technology are often overstated, while in reality there are many challenges in making technological systems function properly and efficiently; the possibilities of using DNA testing at borders; and the possibilities and risks of relying on automated processes within the framework of data exchange.

D14.2 Preventing Irregular Immigration through Interception: Recommendations for a human rights compatible maritime border policy in the European Union. The report discusses extraterritorial border control by EU Member States at the southern maritime borders of the EU and focuses on issues related to the obligation to respect the principle of non-refoulement when intercepting or rescuing refugees and asylum-seekers at sea. The report begins by discussing official discourses within the EU that frame irregular immigration as a security threat and the integration of counter-terrorism in the development of a common EU border policy. It also argues that the use of terminology such as “illegal” immigration risks presenting undocumented immigrants and criminality as closely connected and, therefore, prevents a genuine recognition of irregular immigrants as holders of human rights. It identifies gaps in the applicable law governing maritime interception and discusses the confusion between interception and search and rescue missions. It also suggests that the EU project EUROSUR, through the development and establishment of extended and sophisticated maritime surveillance systems, be utilized to strengthen the enforcement of Member States’ search and rescue duties at sea. While recognizing that the range of different actors involved in maritime interception complicates responsibility sharing, it supports a human-rights-based understanding of the obligations and responsibilities that arise in this context. The report concludes by formulating a set of recommendations to promote the conduct of EU Member States’ maritime border policy in compliance with the principle of non-refoulement. Additional work on the report is underway with a view to publishing it as a peer-reviewed article.

D14.3 Paper on outline of pilot model for the issuance of humanitarian visas
D14.3 begins by briefly describing the major problems connected to the international protection of refugees, including the fact that access to Europe and European asylum procedures has become almost impossible, or at least expensive and risky, due to intensified and externalized border surveillance based on the objective of fighting illegal entry, criminal acts and terrorist attacks. From the perspective of the legal framework protecting human and refugee rights, it is argued that, in order for the EU and its member states to comply with their international obligations to protect refugees, the current situation at the borders must be counterbalanced by finding ways for legal entry into Europe for persons in need of protection. Taking into account that visa policies and practices form part of migration policies and serve as a tool for controlling access to Europe, the paper discusses recent developments of the common EU visa policy as well as national regulations and practices in order to explore the obstacles and possibilities for issuing humanitarian visas. Different proposals and statements concerning so-called protected entry procedures are presented, their relation to resettlement programs is discussed and, finally, as a result of the research, a model for issuing humanitarian visas is presented.



Work Package 6. Data Mining and Profiling
Work Package 6 contributed to the overall goals of the project by focusing on the legal and ethical implications of a particular type of detection technology and related methods—namely, data mining and profiling. Law enforcement and intelligence communities within EU Member States have shown an increased willingness to use data mining tools, the European Council has recently shown interest in terrorist profiling, and Member States and EU authorities have shown a determination to increase the effectiveness of coordinated European counter-terrorism action. In the light of these developments, an analysis of data mining methods and technologies is crucial in order to ensure that, if such tools are implemented, they are used in a way that is consistent with human rights guarantees. The overall objectives for the work package are to determine in what circumstances it is permissible under international and European human rights law to use data mining as a counter-terrorism tool and to articulate a set of principles and best practices for the use of data mining in the counter-terrorism context. In addition to the deliverables produced, the final results include policy recommendations relating to the regulation of the use of data mining and profiling as well as related data handling in general. Work package 6 meets O4, which is to survey substantive norms of international law governing counter-terrorism, with an emphasis on human rights law, and O6a, which is to assess the developments in the declaration of exceptions, claims of inapplicability, or unilateral modification in respect of international law for the sake of counter-terrorism, with special reference to data mining.

The WP leader was Dr. Daniel Moeckli at UZH, with James Thurman as research fellow.

Significant results
D03. Report on Research Meeting 3: Data Mining, Human Rights and Ethics. The report of this third research meeting on data mining, human rights and ethics provides summaries of the speakers’ presentations, comments, questions, replies from the moderated discussion and points raised in the open floor discussion for each of the four panels. The first panel sought to provide explanations of what data mining is as well as distinguish data mining from other forms of data analysis. Stephen Fienberg of Carnegie Mellon University began by presenting a definition of data mining and different forms of data mining functions. He then spoke about machine learning and the task of using data mining to discover terrorists. His presentation placed particular focus on behaviour-based detection methods. Colleen McCue of SPADAC, Inc. presented the application of predictive analytics to police work. Her presentation demonstrated how data processing may be used to improve resource allocation and in some instances may assist in the prevention of crime. She stressed the importance of analytic process over technology. The second panel was designed to provide the perspectives of law enforcement and intelligence agencies in the use of technology and data analysis in counter-terrorism. Sam Lincoln discussed the tasks carried out by intelligence agencies, the role of technology and the importance of adhering to the law in intelligence activities. Chris Westphal of Visual Analytics, Inc. spoke about law enforcement tasks from the perspective of an independent data analysis contractor. Some of the major themes of his presentation included the importance of having a sound business process in place and providing analysts with appropriate training. The third panel sought to explore European data protection law and its application in the realm of law enforcement and intelligence activities in particular. 
Hans-Peter Thür, the Swiss Federal Data Protection and Information Commissioner, began by explaining the situation in Switzerland and the circumstances under which data mining on the part of intelligence agencies is permitted under Swiss law. Bénédicte Havelange from the Office of the European Data Protection Supervisor spoke in detail about different principles of European data protection law and their implications for data mining and profiling as well as about developments with respect to data mining and data analysis in the law enforcement and counter-terrorism contexts at the EU-level. Herbert Burkert of the Research Center for Information Law at the University of St. Gallen, in addition to addressing legal issues, urged the meeting participants to take a global perspective of the interplay of law, technology, and politics in the security arena. He made a number of predictions about future developments with respect to the implementation of detection technologies and data mining in national security. Gus Hosein of the London School of Economics and NGO Privacy International spoke about the role of NGOs in the privacy debate and the push and pull of political opinion between “unreasonable” poles. Ramon Barquin of Barquin International and member of the US Homeland Security Advisory Committee provided an overview of the problems of data availability and privacy, examined government data mining with particular focus on the US, and pointed toward the development of privacy-preserving data mining methods. Lastly, the meeting concluded with an open discussion of four major themes of the presentations.

D08.1: The survey of counter-terrorism data mining programmes sought to uncover background descriptive information on past, existing, and planned data mining programmes for purposes of countering terrorism. The authors determined the nature of these programmes and their operation in order to provide factual input for deliverables D08.2 and D08.3. As noted above, the survey expanded to include related programmes. The report opened with an introduction providing the definition of data mining the authors had adopted for the survey, some brief background on data mining and data mining processes, and brief notes on the structure of the survey. Each description of a programme or database project attempted to include information about data sources, access, and functions to the extent these items could be determined from available sources. The published survey included some 29 programmes from the United States, Germany, NATO, and the EU. The survey also included entries on 4 US databases, 2 developments in EU counter-terrorism policy, and 3 FP7 security research projects of potential interest. Since publication of the survey, programmes and projects at the EU level have continued to be examined.

D08.2 Evaluation of Counter-terrorism Data Mining Programmes
Research performed in connection with D08.2 led to the conclusion that comprehensive studies of data mining performance in the context of counter-terrorism are generally not publicly available. The research team thus identified an unmet need for governments to provide documentation of the effectiveness of data mining programmes utilized in counter-terrorism efforts. Such documentation presupposes that thorough testing has been undertaken. It was additionally recommended that such testing be carried out by a competent, independent body.

In order to limit human rights impact, it was also recommended that governments avoid using data mining programmes that rely on personal data where possible. The use of data mining for investigatory purposes where known suspects were involved was also believed to pose less risk of unwarranted rights infringements.

D08.3 Counter-terrorism Data Mining: Legal Analysis and Best Practice. In order to provide optimal compliance with legal requirements and in the spirit of developing best practices for the use of data mining in the counter-terrorism context, D08.3 provided recommendations on three interrelated levels. The first level of recommendations concerned the development of a legal framework. It was suggested that authorization for the use of data mining must be explicitly provided by law. Additionally, the law should define the grounds for which the use of data mining would be permitted, the scope and default maximum duration of such activities, procedures for the authorization of data mining, data handling practices, and remedial procedures for instances of abuse or unintentional infringements. The second level of recommendations concerned the establishment of an effective and sufficiently independent oversight body as well as an institutional framework for the authorization of the use of data mining designed to provide sufficient safeguards against abuse. Lastly, measures on the ground-level are also needed to ensure compliance with the law. Such measures should include training for agents and officials concerned with the use of data mining in counter-terrorism, the development and implementation of internal procedural rules, and appropriate IT system design and architecture.

Work Package 7: Electronic Surveillance of the Internet
This work package is driven by the conviction that common European guidelines are needed in order to have efficient search criteria that are in accordance with human rights. The work package identifies search criteria using a categorization system that is both effective and compliant with human rights, identifies ways in which surplus information can be vetted to avoid human rights infringements, and proposes safeguards for the use of surplus information across the EU.
Significant results
D15.1 analyses how profiling and Internet searches can be carried out so as to minimize the risk of human rights abuses and protect the foundations of a pluralistic democracy. The paper describes profiling and Internet searches, lists their benefits and drawbacks, and suggests a classification system for search and profiling criteria that can be used to prevent human rights abuses. The criteria are classified according to their threat to human rights and ethical norms. Four categories are identified: necessary, useful, non-essential, and questionable/prohibited. This classification facilitates decisions about whether the criteria can be used in accordance with human rights law alone, in combination with other criteria, or perhaps not at all, by examining the consequences for the persons targeted by the search. Such a classification can also be used in post factum controls, i.e. after a search, to check for discriminatory criteria that have been used carelessly or in violation of human rights law and to vet out excess information that can be harmful to the person profiled. It therefore enhances innocent persons’ access to justice.
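The four-category scheme lends itself to a simple decision rule. The sketch below is only an illustration of how such a classification could be operationalized: the category names are taken from the deliverable, but the function, its parameters and the decision logic are assumptions, not the paper's implementation.

```python
from enum import Enum

class Category(Enum):
    """Risk categories for search/profiling criteria (names from D15.1)."""
    NECESSARY = 1
    USEFUL = 2
    NON_ESSENTIAL = 3
    QUESTIONABLE = 4  # "questionable/prohibited" in the paper

def may_use(category, combined_with_lower_risk=False):
    """Hypothetical decision rule: necessary and useful criteria may be
    used on their own; non-essential criteria only in combination with
    lower-risk criteria; questionable/prohibited criteria not at all."""
    if category is Category.QUESTIONABLE:
        return False
    if category is Category.NON_ESSENTIAL:
        return combined_with_lower_risk
    return True
```

The same table of categories could equally drive a post factum audit, flagging any search that relied on a questionable/prohibited criterion.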

D15.2 Vetting Surplus Information
This paper describes the problems with surplus information gathered on the Internet and suggests ways of minimizing it. Surplus information is an unavoidable by-product of collection carried out under the required authorization (such as a court order or warrant): material gathered alongside the targeted information but not itself covered by that authorization. It is a consequence of profiling on the Internet when searching for terrorists. There are two main problems with surplus information: it is gathered without proper permission, and it can be used in ways that may lead to harmful legal and extralegal consequences for individuals.

Harmful legal consequences may arise when an individual is prosecuted on the basis of surplus information that was collected without an authorizing permit. Extralegal consequences may arise when that information is used as a basis for non-judicial restrictions on an individual, such as travel bans, inclusion on a no-fly list and/or in the Terrorist Screening Database, or being denied employment (see chapter 1.2.2.1 on extralegal repercussions).
In order to minimize such harmful consequences it is recommended that the amount of surplus information be kept to a minimum. This can be achieved by:
• using carefully formulated search criteria,
• having scrupulous vetting procedures for both machines and human operators,
• establishing clear rules on how to deal with surplus information, taking into account the original permit and making sure that it is not undermined. This can help to prevent what are known as “fishing expeditions”: the use of information-gathering techniques to collect large quantities of information speculatively, in the hope that some of it may be significant.
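The second and third measures above can be sketched as a vetting step that checks every collected item against the scope of the original permit. The data model and field names below are hypothetical; the point is that anything outside the authorization is separated out as surplus rather than silently retained.

```python
# Minimal sketch of vetting collected items against the scope of the
# original authorization (warrant). Field names are assumptions.

def vet(items, warrant_subjects):
    """Split collected items into in-scope items and surplus information."""
    in_scope, surplus = [], []
    for item in items:
        if item["subject"] in warrant_subjects:
            in_scope.append(item)
        else:
            surplus.append(item)  # flagged for deletion or strict handling
    return in_scope, surplus

collected = [
    {"subject": "suspect_A", "content": "chat log"},
    {"subject": "bystander_B", "content": "email"},
]
kept, surplus = vet(collected, warrant_subjects={"suspect_A"})
print(len(kept), len(surplus))  # 1 1
```

A rule requiring surplus items to be deleted, rather than fed back into new searches, is what blocks the "fishing expedition" pattern described above.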

D07 Research meeting 4: Internet surveillance, human rights and ethics. Report on the meeting. This meeting was held in Lund on Friday 12 – Saturday 13 November 2010. Speakers on “Policy making and democratic oversight of intelligence services” were: Judge Krister Thelin, member of the UN Human Rights Committee; Sir Francis Richards, former Director of GCHQ; Anders Danielsson, Director General of SÄPO (Swedish Security Service); Dr. Ramon Barquin, President and CEO of Barquin International; Dr. Cecilia Ruthström-Ruin, Chief Terrorism Prevention Branch UNODC; Prof. Martin Scheinin, EUI, in his role as the UN Special Rapporteur on Human Rights and Counter-terrorism.

The speakers covered a wide range of subjects, such as the United Nations Human Rights Committee's working methods in setting standards in counter-terrorism; the dilemmas intelligence services face in gaining public trust, the risks linked to the secrecy of their work, and the importance of public scrutiny and access to remedies when violations occur; and the different influences on the policy-making process in the area. To aid transcription and to ensure that the most interesting parts of the presentations and discussions were captured in the report, the meeting was recorded and the recordings were used during the writing process.

Thanks to the meeting and report, we were able to cooperate with the European Union Agency for Fundamental Rights and other agencies, and we made one of the world's largest manufacturers of CCTV cameras aware of the implications of deploying high-tech equipment in countries with weak human rights records, thereby furthering human rights in counter-terrorism.

D15.3 The use of surplus information in a court of law
D15.3 shows that across Europe and the United States there are two main ways of minimizing surplus information and its effects in a court of law: reactive and proactive measures. It proposes that proactive measures generally be preferred, since they aim to prevent any transgression before damage is done. The best proactive measure is applied very early in the process, ensuring that no surplus information is collected at all. As this is very difficult to achieve, other measures are necessary, such as banning surplus information as evidence in court. This requires clear procedural rules governing the use of evidence that specify what surplus information is and what kinds of information are and are not admissible. There also needs to be a mechanism for excluding evidence, as in the United States, where the Fourth Amendment may require the exclusion of evidence obtained through a search or seizure that violated the Amendment.

In order to have effective legislation that takes into account both the human rights aspects of surveillance and the legitimate aim of security agencies to prevent crime and terrorism, a more elaborate study is needed that will highlight what is at stake and provide legislators as well as practitioners with tools to make the necessary decisions. A starting point would be to base such research on the German debate about evidence, using criteria such as violation of human dignity, protected in Article 1 of the Grundgesetz, or any other serious violation of a fundamental right. This would obviously be only a beginning, but it could nevertheless inform the development of EU legislation on the matter.

Another proposal is to establish a European ombudsman for surveillance and information-gathering. The ombudsman would be independent and focused solely on ensuring that searches and their results are used properly, in a way that instils public trust and prevents human rights violations. The ombudsman could also help to start and maintain an ethical discussion of what price it is reasonable to pay for surveillance in an open society.


Work Package 08. Monitoring of counter-terrorism activity including detection technology use
Ensuring that global and European institutions are able to monitor and control state counter-terrorism measures is essential for protection of international human rights standards, especially given recent informal co-operation agreements between European heads of government, which may conflict with pre-existing legal commitments on the part of the same governments to safeguard freedom of association, free expression and privacy. This work package examined the capabilities of global and European institutions to scrutinize and control secret state counter-terrorism measures, including those using detection technologies, and the interaction between the international institutions and domestic courts in this regard. In doing so it contributed to the fulfillment of project objective O7: To survey official and publicly acknowledged methods of monitoring of secret counter-terrorist activity, including the use of detection technologies, by international, regional and national legal institutions. In keeping with the technical scope and content of Activity 6.5.01 this work package helped DETECTER to meet the requirement of establishing a network to analyse the wider context of government security policies and responses to security threats, in particular the context of global legal institutions and standards. In addition to the deliverables produced, the final results included recommendations of improved international monitoring mechanisms. The WP leader is Prof. Geir Ulfstein, UiO.

Significant results

D.9 Meeting Report on Research meeting 5: Monitoring mechanisms, human rights and ethics
Meeting 5 was held in Oslo on 10 and 11 February 2011; 25 people participated. During the first day of the meeting, eight invited speakers presented their latest research findings. Each presentation was followed by lively discussion with the meeting participants.

Peter Clarke, former Head of the Counter Terrorism Command in London, opened the meeting with a presentation on the benefits of using technology in counter-terrorism efforts. Peter Burgess, researcher in philosophy, political science and cultural history, gave the second lecture and addressed the more philosophical question: what is security? In the discussion that followed, it was asked whether 'security' is a value based on acceptance rather than anxiety. Burgess replied that if a society did not feel threatened, terrorism would not be possible; that means that we, the people, are a necessary factor in 'enabling' terrorism.
Inger Marie Sunde, former Senior Public Prosecutor with ØKOKRIM, gave a presentation on the topic of Trojan Horses and discussed the difficulties arising from using such technology, both in terms of controlling the technology as well as issues related to jurisdiction.

Ingvild Bruce, co-author of the book Fighting Terrorism by Multilevel Criminal Legislation, presented on the criminalization of the preparation of terrorist acts and its consequences. Bruce argued that the implementation of multilevel criminal legislation (both international and regional) has resulted in an exceptional expansion of criminal liability for terrorism and terrorism-related acts in many European national legal systems over the past decade.

Dr. Ing. Asmund Skomedal, Research Director of the DART department at the Norwegian Computing Center, gave a presentation on 'Digital Forensic Readiness and Privacy'. The discussion asked whether there is a technical answer to the tendency to store ever more data, and whether Privacy Enhancing Technologies can be built in early in the design process. Skomedal argued that, owing to engineering culture, IT developers may not be inclined to build in privacy measures of their own accord; there is therefore a need for clear guidelines and rules on what types of data should be excluded and what should be anonymised.

Heidi Mork Lomell, PhD researcher at the University of Oslo and vice chair of the Living in Surveillance Societies COST Action, gave a presentation on the myths and facts of video surveillance. She argued that both proponents and opponents of CCTV often base their positions on incomplete information about the real capabilities of CCTV; a more practical understanding of how CCTV is operated is needed.

Prof. Charles Raab, Professor Emeritus and Honorary fellow at the University of Edinburgh, gave a presentation on the importance of privacy in a democratic society. In his presentation, Raab explained the importance of privacy by considering political and social values of privacy that go beyond the conventional understanding of privacy as an individual human right.

Jon Wessel Åas, practising lawyer in the fields of media law, civil liberties and litigation, gave a presentation on the risks posed by applying surveillance to large parts of the population. Åas discussed whether control mechanisms such as judicial oversight could compensate for the shift towards indiscriminate and blanket gathering of information, as envisaged by the Data Retention Directive.

D16.1: This paper was, according to the Description of Work, to ‘[s]urvey interactions between governments and international human rights reporting bodies in relation to secret detention centres, over-flights in Europe in 2006, invasions of personal integrity privacy through counter-terrorism’. The paper examined Concluding Observations by the Human Rights Committee on the basis of reports submitted by states parties, as well as Views adopted by the Committee on the basis of complaints from individuals. On the basis of these activities, the Committee has established the following guidelines for states’ implementation of Article 17 of the International Covenant on Civil and Political Rights on the right to privacy: interference with the right to privacy must be foreseeable; mechanisms should be established to prevent abuse of collected information and to ensure review, supervision and redress; and vulnerable groups should be protected. The Committee has, however, not established clear guidance on which substantive measures would be considered a violation of the right to privacy. The paper also examines the activities of the Parliamentary Assembly of the Council of Europe following reports by news media and NGOs about secret detention centres and over-flights in Europe as part of US counter-terrorism strategy. The inquiry uncovered suspicious patterns of flights by military and civilian aircraft and indications of secret detention centres. Political pressure was put on national governments, but the Council of Europe failed to follow up implementation by member states. The UN organs have generally been more reactive, but they have, within their mandates, addressed these matters.
A joint study on global practices in relation to secret detention in the context of countering terrorism, by the Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism, the Special Rapporteur on torture and other cruel, inhuman or degrading treatment or punishment, the Working Group on Arbitrary Detention and the Working Group on Enforced or Involuntary Disappearances, was released on 26 January 2010. D16.1 is not restricted to detection technology, but provides a basis for drawing more general conclusions about the effectiveness of global and European human rights monitoring and control in counter-terrorism.

D16.2: This paper examined the review of counter-terrorism measures in the light of human rights protection by the courts of the United Kingdom, Germany and the United States. The USA and the UK have both been scenes of terrorist attacks, and the German Constitutional Court has decided several cases relating to the right to privacy. The study includes, but is not limited to, cases dealing with detection technology. This country-specific and broadly inclusive approach has the advantage of examining relevant general principles and the usefulness of the national judiciary in upholding human rights standards in combating terrorism, including in cases involving the use of detection technologies. The German judgments are to a great extent concerned with detection technology and the right to privacy. The variety of the UK cases is striking, covering detention, access to evidence in court cases, and deportation. The US cases largely concern the rights of detainees at Guantanamo, but some lower-court cases deal specifically with surveillance of private citizens. Despite their substantive variations, the case law from these three jurisdictions yields some general principles about protecting international human rights while countering terrorism: interferences with individual freedoms should have a clear legal basis; individuals should have access to the evidence; measures should be non-discriminatory; measures should be proportional; interferences should be subject to judicial control and breaches should be sanctioned.
D16.3 Recommendations of improved monitoring mechanisms of secret counter-terrorism activities
This paper contains recommendations for improved monitoring mechanisms for secret counter-terrorism activities. As in the previous deliverables under this work package the focus is on national courts, and the European and global control mechanisms, and the interaction between the institutions at these three levels. This deliverable contains recommendations for improvements of these functions based on review of counter-terrorism by national courts; review of invasion of the right to privacy through counter-terrorism by the UN Human Rights Committee; and review of the use of secret detention centres and over-flights in Europe by European and global control mechanisms.



2.2.9 Work Package 09. Human Rights Risks of Selected Detection Technologies

The use of detection technology offers many advantages in the fight against terrorism. Yet at the same time, it poses a threat to the enjoyment of some human rights, like the right to privacy.
To minimise this threat, it is important that developers and manufacturers of such technologies are encouraged to take human rights and ethical considerations into account early in their planning, to avoid problems later. The overall objective of WP09 is to rank selected technologies not studied elsewhere in the DETECTER project according to the risk they pose to human rights. This contributes to project objective O8: to identify types of product in current detection technology development that are high-risk or low-risk when judged by moral standards or those of international human rights law. It was decided that the focus would be on technologies not studied elsewhere in the project, for example technologies capable of intercepting communications or tracking locations, as these are frequently used in terrorist investigations. The ranking of technologies is based on legal and ethical standards.

Objectives
The objectives of WP09 for period 2 were to gain better insight into how detection technologies are used in practice and how counter-terrorism professionals view the use of technologies in counter-terrorism in terms of efficiency, reliability, effectiveness, and human rights implications. To obtain this insight, semi-structured interviews with government officials working in counter-terrorism were conducted. Analysis of these interviews yielded a better understanding of the potential risks that frequently used technologies pose to human rights. These WP-specific objectives contributed to the fulfilment of project objective O8: to identify types of product in current detection technology development that are high-risk or low-risk when judged by moral standards or those of international human rights law.

Significant results
D17.1 This paper provided a rigorous analysis of the legal requirement that interferences with the right to privacy must be ‘proportionate’ and derived further obligations as logical consequences of this condition. The paper first analysed the importance of the right to privacy and concluded that privacy matters both for individual well-being and for the proper functioning of a democratic society. The right to privacy is enshrined in various national, European and international laws, which prescribe that it may only be limited by measures that have a sound legal basis and are necessary in a democratic society for, among other things, the protection of national security. From an analysis of the European Court of Human Rights case law it emerged that when using detection technologies in counter-terrorism, governments should take account of: legitimacy, proportionality, necessity, transparency, factors concerning the person targeted, the sensitivity of the data sought, effectiveness, the possibility of function creep, and the extent to which PETs are implemented.

The paper argued that, to improve the legal framework, legislators would be wise to ensure adequate procedural safeguards against abuse of the technology, by creating clear laws and providing for review when the detection technology is first applied, while it is in use, and afterwards. The first two stages of review may take place without the subject’s knowledge, as long as there is some form of judicial review. Ideally this should be the responsibility of the judiciary, but a combination of judicial and political bodies will also suffice, as long as the political body is not part of the executive. These procedures must be genuinely effective in protecting against abuse, and the government should be able to demonstrate this with statistical data. Lastly, the paper noted that it could be useful to make intelligence services’ non-compliance with privacy-protective legislation a criminal offence.

These legal requirements were translated into questions to discuss with technology developers at DETECTER meetings. These centre on ‘Privacy By Design’; the automatic deletion of data at a set interval; the anonymization of excessive data not necessary for the prevention of terrorist acts; and the security of data by limiting access and sharing.
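The measures named above, automatic deletion at a set interval and anonymization of data not needed for the investigation, can be sketched as a retention routine. The field names and the 180-day retention period below are illustrative assumptions, not requirements stated in the deliverable.

```python
# Sketch of two 'Privacy By Design' measures discussed with developers:
# automatic deletion after a set retention interval and anonymization
# of records not essential to preventing terrorist acts. Field names
# and the retention period are illustrative assumptions.

from datetime import datetime, timedelta

RETENTION = timedelta(days=180)  # hypothetical retention period

def apply_retention(records, now):
    """Delete records older than the retention period; anonymize
    records not flagged as essential."""
    kept = []
    for rec in records:
        if now - rec["collected"] > RETENTION:
            continue  # automatic deletion at the set interval
        if not rec["essential"]:
            rec = {**rec, "identity": None}  # anonymize excessive data
        kept.append(rec)
    return kept

now = datetime(2011, 9, 1)
records = [
    {"identity": "person_A", "essential": True,
     "collected": datetime(2011, 8, 1)},
    {"identity": "person_B", "essential": False,
     "collected": datetime(2011, 8, 1)},
    {"identity": "person_C", "essential": True,
     "collected": datetime(2010, 1, 1)},  # past retention: deleted
]
kept = apply_retention(records, now)
print([r["identity"] for r in kept])  # ['person_A', None]
```

Running such a routine periodically, rather than on demand, is what makes deletion "automatic" in the sense the deliverable intends.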

D.17.2 Interviews with technology users.
This deliverable presented the results from the interviews with counter-terrorism professionals. The study showed that the main reasons given for using technology differ between the police in the United Kingdom and the Norwegian Police Security Service (PST). The PST focuses mainly on preventing terrorist attacks, while police in the United Kingdom have the additional task of gathering evidence that will be accepted in court.
In choosing a particular technology, the main factor is reliability: the technology’s ability to provide accurate information. Since public trust in these services is of vital importance, one mishap due to technology (whether misuse or a false positive) can be costly for the relationship with the public. Another consideration is the amount of data a technology generates. Two further factors mentioned by the participants were cost and whether the technology is legally permissible.

Annex to D17.2
The hypothesis-testing study that formed the basis of Deliverable 17.2 was originally designed to include three countries (Norway, the Netherlands and the UK), with four semi-structured interviews conducted with counter-terrorism professionals in each country. It was anticipated that it would be difficult to recruit participants because of the understandable secrecy surrounding counter-terrorism work and the secrecy contracts signed by many professionals doing it, and this did indeed prove to be the case. Although we were not aiming for data saturation, the difficulties with recruitment were a serious limitation. Accordingly, we decided to run a focus group of primarily police or related personnel with experience in counter-terrorism from as many different EU countries as possible (including the countries targeted in the original study). Our aim was to look for confirming or disconfirming evidence of some of the key findings of the original study with a view to strengthening the results. A focus group was chosen as the vehicle for data collection, and it was convened in Brussels to coincide with the DETECTER final conference.

The topic guide covered four areas: types of detection technology used in counter-terrorism investigations; impact on the relationship with the public of the use of these kinds of technologies; technology and its use of police time; the role of ethics in decision-making around the use of detection technologies.

Within the confines of the topic guide for a single 90 minute focus group, we were able to gather confirming data for selected results of the first study. We are therefore confident that we are able to publish combined results that do accurately reflect at least some of the attitudes and concerns of CTPs about the use of detection technology. New themes emerged in the focus groups that we would like to explore further, along with themes from the interviews that could not be raised in this focus group.

A copy of the report that constitutes the Annex to D17.2 was sent to our EC Project officer in December 2011.


D.17.3 Characteristics and uses of selected detection technologies, including their potential human rights risks. Deliverable 17.3 sought to identify characteristics of detection technologies used in counter-terrorism and the potential human rights risks they pose. The choice to focus on location-tracking technologies was informed by the results of the interviews conducted for Deliverable 17.2: participants indicated that tracking the movements of a person or group of interest is usually the starting point of investigations aimed at preventing terrorism. The deliverable found that it is not so much the tracking itself that is problematic, but rather the creation of huge databases containing detailed personal information.

It further explained that aggregating the data in these databases reveals intimate details of a person’s private life, including habits, (future) whereabouts and religion, and can even reveal sexual preference or political views. This means that location data can fall within the ambit of ‘special category of data’ or ‘sensitive personal data’. The storing and processing of such data by governments is an interference with the private life of the individuals concerned.

The collection of data, especially mobile phone location data, is necessary for a number of legitimate reasons and in many ways cannot be avoided. General concerns about counter-terrorism technologies also apply to location-tracking devices. The de facto tracking of everyone’s movements, with location data stored for up to two years, can have a significant ‘chilling effect’ on society. People may be less willing to participate in public life when, for example, attending a demonstration or political gathering leaves the authorities with an electronic record of one’s presence. This is a loss for any well-functioning democracy.

The chilling effect and the blanket, indiscriminate storage of sensitive data mean that the proportionality of location-tracking technologies depends largely on the strength of the safeguards against abuse. The focus should therefore be on safeguards that restrict access to stored data, including measures to prevent data leaks. Imposing strict regulation on access to location data, by requiring judicial authorization and by building in a function that creates an ‘electronic trail’ of those accessing the data, might mitigate these concerns.
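The two safeguards just mentioned, mandatory judicial authorization and an electronic trail of every access, can be combined in one gateway to the stored data. The sketch below is an illustrative assumption about how such a gateway might look; all class, field and variable names are hypothetical.

```python
# Sketch of an 'electronic trail' for access to stored location data:
# reads are only possible through a gateway that checks a judicial
# authorization and logs every attempt, granted or not. All names
# here are illustrative assumptions.

import datetime

class LocationStore:
    def __init__(self, data):
        self._data = data      # subject -> location history
        self.audit_log = []    # would be tamper-evident in a real system

    def read(self, officer, subject, authorization=None):
        granted = authorization is not None
        self.audit_log.append({
            "time": datetime.datetime.utcnow().isoformat(),
            "officer": officer,
            "subject": subject,
            "granted": granted,
        })
        if not granted:
            raise PermissionError("judicial authorization required")
        return self._data.get(subject, [])

store = LocationStore({"subject_X": ["cell_17", "cell_42"]})
# store.read("officer_1", "subject_X") without authorization would raise,
# but the attempt would still appear in store.audit_log.
```

Because denied attempts are logged as well, the trail supports after-the-fact oversight of who tried to access which data, not only of successful reads.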

D.17.4 Ranking, in terms of their human rights risks, the detection technologies and uses surveyed in WP09.
This deliverable sought to rank technologies according to the risk they pose of violating human rights. It first set out the methodology used for the ranking and then identified the factors determining ‘risk’. The selection of technologies to be ranked was informed by the interviews and the focus group conducted for D17.2. Only technologies serving a similar purpose were selected: identifying at an early stage whether a person or group is likely to be involved in terrorism.

While all of these technologies extract personal data about the subjects they target, the data differ in how ‘sensitive’ they are. The ‘sensitivity’ of the data also depends on how easily ‘understandable’ they are: technologies that produce data that instantly (i.e. without the need for aggregation) reveal intimate personal information therefore carry a higher risk of violating the right to privacy.

In addition to differing in the sensitivity of the data produced, these technologies vary in how they are used. What has become clear is that technologies that require judicial authorization and are in use for shorter periods of time pose a lower risk of violating human rights. Also relevant is whether the use of the technology is overt or covert; covert use carries a higher risk of violating human rights. However, it remains questionable how to determine whether technologies are used in an overt or covert way, simply because it can be difficult to ascertain whether people have understood the implications of their choices, for example the choice to use a mobile phone.

Technologies that gather and subsequently store data carry a high risk of violating the right to privacy. If the data storage is in the form of electronic files, the risk increases because of the risk of ‘data leaks’. If the data is stored in a database shared between different agencies, the risk increases even more.
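The risk factors named in this section (sensitivity of the data, whether aggregation is needed before it becomes revealing, covert use, duration of use, storage, and inter-agency sharing) can be combined into a simple comparative score. The weights, scores and example technologies below are illustrative assumptions, not the project's actual metric.

```python
# Illustrative sketch of ranking detection technologies by the risk
# factors discussed in D17.4. Treating every factor as equally
# weighted, and the factor values themselves, are assumptions.

FACTORS = ["sensitive_data", "no_aggregation_needed", "covert",
           "long_duration", "stores_data", "shared_database"]

def risk_score(tech):
    """Count how many risk factors apply (booleans sum as 0/1)."""
    return sum(tech.get(f, False) for f in FACTORS)

technologies = {
    "CCTV (overt)":       {"stores_data": True},
    "phone interception": {"sensitive_data": True, "no_aggregation_needed": True,
                           "covert": True, "stores_data": True},
    "location tracking":  {"sensitive_data": True, "covert": True,
                           "long_duration": True, "stores_data": True,
                           "shared_database": True},
}
ranking = sorted(technologies, key=lambda t: risk_score(technologies[t]),
                 reverse=True)
print(ranking)  # highest-risk first
```

Even this crude count reproduces the qualitative conclusions above: covert, data-storing, database-shared technologies land at the top, overt ones at the bottom.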

Work Package 10: Dissemination
D19 Lessons learned document
This very short deliverable takes the form of reflections on DETECTER recommendations (attached in an Appendix), in the light of reactions of end-users. At project meetings and advisory board meetings throughout the project, members of the Consortium have had the opportunity to interact repeatedly with currently serving or recently retired police and intelligence officers, technology developers, and representatives from the NGO and policy-making community. Some of these interactions have led us to modify or at least concede the need to modify our recommendations. Others have alerted us to ethical and human rights risks and responses to these risks that our deliverables did not anticipate. D19 is a summary of the associated lessons learned. These relate to: thresholds for authorised surveillance; disclosure and secrecy; surveillance of things and places versus people; human in the loop; algorithms and abnormality; dual use of technology; detection technology export ethics; counter-terrorism and border issues; data-sharing and uneven human rights sensitivities.

Potential Impact:
Impact
DETECTER has produced a substantial body of cutting edge legal, empirical, and philosophical research, some of which has already been published in important academic journals. This research is relevant to security research programme makers and (mission-oriented) security research performers, as well as the law-enforcement consumers of security research and was disseminated to them throughout the project at project meetings and external events. The research also supports clearly articulated applied policy recommendations for security policy makers, offering them advice on how to take counter-terrorism measures that protect both the security of European citizens and their human rights. DETECTER research and policy recommendations were presented at the DETECTER final conference. A Lessons Learned document was drafted by Tom Sorell. It incorporates the policy recommendations emerging from DETECTER research and responses from DETECTER to criticisms of these made by counter-terrorism policy-makers as well as technology users and developers. It was disseminated to a broad mailing list as well as uploaded to the project website and posted on the LIFT blog.

Impact is achieved primarily through dissemination to policy makers, manufacturers and law-enforcement officials of assessments of both desirable and undesirable features of detection technology products, as well as general standards for products to meet.

During the lifetime of the project, presentations were made by and to technology developers and users at the DETECTER research and technology meetings. DETECTER partners presented and discussed the work of the project at conferences. Project deliverables were posted to the website, as were videos and other multimedia material disseminating project work, and dissemination events in the form of research meetings were organised by UoB. These included a technology user-group interaction day in September 2011, a one-day workshop on the themes of Terrorism, Ethics, and Technology in August 2011, and a one-day Police Meeting in January 2012. A lessons learned document incorporated the policy recommendations emerging from DETECTER research and DETECTER’s responses to criticisms of these made by counter-terrorism policy-makers as well as technology users and developers. It was disseminated to a broad mailing list, uploaded to the project website, and posted on the LIFT blog.

Videos presenting DETECTER are available on UoB’s Global Ethics youtube channel: http://www.youtube.com/user/GlobalEthicsUoB. All project deliverables have been posted on the DETECTER website.

The largest dissemination event of the project, the Final Conference (M34), was held in September 2011 in Brussels. Christiane Bernard of the EC introduced the event. Five speakers presented: Jeff Jonas (IBM) and David Pepper (ex-GCHQ) spoke in the morning, and three MEPs with relevant interests, Renate Weber, Rui Tavares, and Jan Albrecht, followed in the afternoon. There was a strong counter-terrorism presence, with many counter-terrorism police in attendance. Audience members included the INDECT team, academics from across the EU, campaigners for privacy, representatives from national intelligence services and national information commissioners’ offices, and representation from the office of the EU Counter-Terrorism Coordinator.

Impact on policymaking through the Lessons learned document
In order to respond to reactions to DETECTER research from end-users of detection technologies, a document was produced that drew out policy recommendations from the project. This Lessons Learned document (D19) and the stakeholder interactions that shaped it are described under Work Package 10 above; the lessons relate to thresholds for authorised surveillance; disclosure and secrecy; surveillance of things and places versus people; human in the loop; algorithms and abnormality; dual use of technology; detection technology export ethics; counter-terrorism and border issues; and data-sharing and uneven human rights sensitivities.

Dissemination at Conferences
Below is a chronological list of all the academic and public-policy events at which DETECTER partners disseminated information on, and research arising from, the project during this period. Media interviews and other exposure are also listed.


- 26/01/2012 A one-day meeting with 25 counter-terrorism police and academics was hosted by UoB academics in Birmingham. John Guelke, Tom Sorell, Kat Hadjimatheou and Rozemarijn van der Hilst presented DETECTER research and police responded.

- 6/12/2011 John Guelke gave a presentation and led a discussion on privacy at Mozilla headquarters in Mountain View, California. The talk, titled ‘Should Privacy Survive’, argued for robust protections for privacy rather than merely catering to consumer preference. It was live-streamed on the ‘Air Mozilla’ site and a video is archived at http://videos.mozilla.org/serv/air_mozilla/public/brownbags/2011_12_06_brownbag_Should_Privacy_Survive.flv

- 3/11/2011 Katerina Hadjimatheou presented DETECTER research on Profiling in Counter-terrorism to postgraduate students at Waseda University, Tokyo

- 2/11/2011 Tom Sorell, John Guelke, and Katerina Hadjimatheou met with the Defence Research Officer at the British Embassy in Tokyo to discuss DETECTER research.

- 1/11/2011 Tom Sorell Presented DETECTER Research on Privacy and Counter-terrorism to Postgraduate students at Waseda University, Tokyo.

- 1/11/2011 Tom Sorell Presented DETECTER Research on Privacy and Counter-terrorism to a British Council audience at British Council Tokyo.

- 29/10/2011 Tom Sorell, John Guelke, Katerina Hadjimatheou presented a panel of DETECTER research at the 6th Applied Ethics Conference in Sapporo, Japan.

- 4/10/2011 Rozemarijn van der Hilst gave a presentation on the DETECTER project at the Norwegian Research Center for Computers and Law, UiO. (This event was covered in a two-page article in Advokat Bladet Nr. 10-2011.)

- 10/09/2011 Martin Scheinin participated as a speaker in a 9/11 conference convened by the European Federation of Journalists, and spoke about DETECTER work. http://europe.ifj.org/assets/docs/041/211/4409429-dc365d3.pdf

- 8/9/2011 Martin Scheinin was filmed for a DG Enterprise video on the DETECTER Project.

- 7/09/2011 The DETECTER Final Conference was held in Brussels. Speakers Jeff Jonas (IBM) and Sir David Pepper (ex-GCHQ) responded to DETECTER policy recommendations and Tom Sorell presented the DETECTER project.

- 6/09/2011 Tom Sorell was filmed for a DG Enterprise video on the DETECTER project.

- 6/09/2011 Two focus groups with 13 counter-terrorism practitioners (mainly ex-CT police) from five EU countries were held in Brussels to support the work of WP2 and WP9. Participants received copies of DETECTER deliverables and responded to questions arising from DETECTER research.

- 1/09/2011 John Guelke, Katerina Hadjimatheou, and James Thurman presented DETECTER research on Privacy and on Profiling in Counter-terrorism at the DETECTER-organised workshop on Counter-Terrorism at the Manchester University Political Theory Conference.

- 28/07/2011 Rozemarijn van der Hilst gave a guest lecture at the International Summer School at the University of Oslo on the topic of the right to privacy in the counter-terrorism context.

- 28/06/2011 Rozemarijn van der Hilst represented DETECTER at the FP7 DESSI Workshop in Oslo.

- 01/06/2011 Karol Nowak met in Vienna with Dr. Cecilia Ruthström-Ruin, Chief of the Terrorism Prevention Branch of the United Nations Office on Drugs and Crime (UNODC), to discuss DETECTER and how the results of the project could be disseminated via UN channels.

- 3-5/05/2011 John Guelke presented the DETECTER project in a showcase on European research projects at the Living in Surveillance Societies ‘Ghosts of Surveillance’ conference. He also presented a paper entitled ‘Is Privacy a Social Value?’

- 29/04/2011 Audelina Ahumada presented deliverable D14.2, “Preventing Irregular Immigration through Interception: Recommendations for a human rights compatible maritime border policy in the European Union”, and its relation to the DETECTER project at one of the monthly in-house sessions for students and personnel, including guest researchers, at the Åbo Institute for Human Rights.

- 31/03/2011 Rozemarijn van der Hilst spoke at the ELSA (European Law Students’ Association) Annual Seminar in Bergen, titled ‘Human Rights in the Fight against Terrorism’, to an audience of 60 students on the topic of the DETECTER project and D17.2, Interviews with End Users.

- 21/03/2011 Tom Sorell presented DETECTER research at the University of Birmingham Philosophy Society.

- 25-27/01/2011 Rozemarijn van der Hilst presented D17.2, ‘Interviews with End Users’, at the Ph.D. evening of the Computers, Privacy and Data Protection Conference in Brussels.

- 25/01/2011 James Thurman presented at the 4th International Conference on Computers, Privacy & Data Protection in Brussels.

- 13/01/2011 Tom Sorell presented DETECTER research at the University of Edinburgh Philosophy Society.

- 14-15/10/2010 Rozemarijn van der Hilst presented the DETECTER project at the SAMRISK Conference in Stavanger.

- 14/10/2010 Tom Sorell presented a paper on Privacy, Surveillance and Counter-Terrorism to the University of Hertfordshire Philosophical Society.

- 10/2010 Kat Hadjimatheou took part in a television discussion on terrorism for the ‘Real Talk’ programme on Brit Asia TV.

- 23/09/2010 Mathias Vermeulen took part in the FP7 project VIRTUOSO’s stakeholders’ consultation on ‘Privacy, open source information & border security’ at TNO in The Hague.

- 23/09/2010 Tom Sorell presented to SRC2010 (the annual Security Research event sponsored by the Commission) in Ostend a paper on the implications of DETECTER for commissioning research in future FP7 calls.

- 10/09/2010 John Guelke presented ‘The Use of Surveillance Technology by State and Private Actors’ at a conference on ‘Political Economy of Surveillance’ held at the Open University, Milton Keynes.

- 08/2010 Martin Scheinin released DETECTER deliverable D06.1 as an EUI-LAW Working Paper.

- 30/07/2010 John Guelke took part in a radio discussion on surveillance for security on ‘The World Today’ on the BBC world service.

- 01/07/2010 Martin Scheinin represented DETECTER in a one-day workshop on "Societal Security in Research & Development", convened by the Commission. He moderated one of two parallel working groups, on "Ethical and human rights aspects of security research" and reported on work done within DETECTER.

- 28/06-03/07/2010 Mathias Vermeulen took part in the European Criminal Law Association Network for The European Area of Criminal Justice in Brussels.

- 25/06/2010 Mathias Vermeulen presented at the ECPR Fifth Pan-European Conference in Porto.

- 24/06/2010 James Thurman presented "Human rights dimensions of increased access to and sharing of data in the fight against terrorism" at the ECPR Fifth Pan-European Conference on EU Politics in Porto.

- 19/06/2010 Martin Scheinin presented at a closed workshop between academics and Mr José Manuel Durão Barroso, President of the European Commission, and some members of his cabinet. The presentation was based on DETECTER deliverable D06.2 and included a reference to DETECTER (“EU Agenda for Global Governance: Countering International Terrorism”).

- 16/05/2010 Karol Nowak. ‘Terrorism and Counter-Terrorism: Legal and Ethical Implications’ at Jus Humanis International Human Rights Network Spring Forum, Lund, Sweden.
- 10/05/2010 James Thurman. 'Data-mining at the Border' at FP7 project INEX workshop, Brussels.
- 10/05/2010 Karol Nowak. 'Terrorism and Counter-Terrorism: Legal and Ethical Implications’ (keynote speaker) at The Human Cost of Terrorism conference, Lund.
- 14/04/10. Rozemarijn van der Hilst. 'The legal requirements for the use of interception of communications.' Annual Living in Surveillance Societies Conference, City University, London.

- 13/04/10. Tom Sorell. 'Privacy, Intrusion and the Prevention of Terrorism'. Annual Living in Surveillance Societies Conference, City University, London.

- 21/03/10. Tom Sorell. BBC West Midlands Radio interview.

- 14/03/10. Tom Sorell. BBC Midlands Report (television) on the DETECTER project.

- 09/03/10. Martin Scheinin spoke at a UN press conference on the use of detection technologies in the fight against terrorism and its impact on the right to privacy. (http://www.unmultimedia.org/tv/unifeed/d/14694.html) See also: http://www.ohchr.org/EN/NewsEvents/Pages/CounterTerrorismAndPrivacy.aspx. According to Google News, the remarks were picked up by at least 60 news outlets around the world, including Reuters, the Iranian press agency Press TV, Democracy Now and specialised airport press such as Aircargo Asia Pacific.

- 09/03/10 A DETECTER poster, sent by UoB, was presented at the second STRAW (Security Technology Active Watch) workshop in Madrid.

- 08/03/10 Martin Scheinin spoke about DETECTER at the plenary session of the United Nations Human Rights Council.

- 22/02/10 Mathias Vermeulen spoke about DETECTER at the Working Group on Countering Terrorists' Use of the Internet of the UN Counter Terrorism Implementation Task Force.
- 3/02/2010 Rozemarijn van der Hilst introduced the DETECTER project at the WG4 Meeting of the COST Action Living in Surveillance Societies in Gothenburg, Sweden.
- 04/12/09 Tom Sorell presented DETECTER at a FRONTEX meeting, Warsaw.
- 01/12/09 Karol Nowak. DETECTER was presented in poster format and via dedicated leaflets at the FOCUS Workshop on Swedish Security Research (www.foi.se/focus).

- 26/11/09. Karol Nowak. DETECTER and counter-terrorism issues were presented at a course for prosecutors in Stockholm delivered by the Swedish Prosecutor General.

- 19/11/09. Martin Scheinin presented DETECTER in his PhD seminar at the European University Institute.
- 19/11/2009. Tom Sorell presented DETECTER to the Public Administration MSc Seminar at the University of Birmingham. Participants included members of the West Midlands Counter-Terrorism squad as well as other public officials taking the course (around 30 people).
- 12/11/09. Mathias Vermeulen and Martin Scheinin presented DETECTER at a workshop convened by the EUI Law Department jointly with the Research Group on Constitutional Responses to Terrorism of the International Association of Constitutional Law, held at the EUI.
- 29/10/09. Mathias Vermeulen presented ‘Conflicting and Converging Legal Regimes in Europe’ at the INEX Conference on Exploring the Internal/External Security Continuum in Brussels, organised by the FP7-funded INEX project, and discussed DETECTER and related legal challenges within the Stockholm Programme.
- 4/10/09. Mathias Vermeulen discussed DETECTER in the privacy workshop at the Wilton Park Conference on Terrorism, Security & Human Rights: Opportunities for Policy Change, which was organised by the British Foreign Office.
- 12/10/09. Karol Nowak. DETECTER and policing issues were presented at the Police College in Växjö.

- 31/09/09. Professor Geir Ulfstein presented the DETECTER project at the Norwegian Research Council Conference, University of Oslo.
- 23/09/09. Tom Sorell. Presentation on DETECTER for UK Home Office Framework Programme 7 event, London.
- 18-19/09/09. Martin Scheinin talked about DETECTER in the session ‘Courts and the right to privacy’ at the round table on 'The Fight Against Terrorism: Challenges for the Judiciary' at the EUI in Florence. The roundtable was jointly organised by the Council of Europe, the Venice Commission and the European University Institute.
- 17-21/09/09. Rozemarijn van der Hilst. Paper presentation during the research course at the international conference ‘Should States Ratify Human Rights Conventions’, Oslo.
- 08/05/09. Tom Sorell. Presentation on DETECTER for EU Research Connections Conference, Prague.

List of Websites:
Project Website: www.detecter.eu
Contact: detecter@contacts.bham.ac.uk