Alongside the Project’s desk research, we have sought out and interviewed a range of actors involved in the scrutiny and review of counter-terrorism law and policy. Originally, we aimed for 12-15 ‘elite’ participants. To date, we have completed 24 interviews, far exceeding our own expectations. This surprising success prompted us to reflect on the process of identifying and approaching interviewees, conducting interviews and qualitatively analysing the data produced. Was it all plain sailing? What are the benefits of interviews for research? And what have we learnt from the challenges of using interviews to produce empirical data in our research? In this blog, we reflect on these questions from both the methodological and the researcher’s perspective.
Why interview to research CTR?
As we have explored in our blog over the course of this project, counter-terrorism is an aggregation of multifaceted, multi-agency law and policy that operates in a contested, security-heightened space. Understanding review is an important aspect of accountability, particularly where law changes at an unusually fast pace and much policy functions behind a wall of secrecy owing to security concerns. These same factors mean that counter-terrorism review can be hard to understand on paper alone. As we identified when we mapped the institutions and actors who might be involved in CTR, there are many institutions and actors, working at different levels and on different activities within the counter-terrorism space. Speaking to the people who undertake this work would help us to understand how counter-terrorism review works in practice and what values those actors are pursuing, as well as testing some of our hypotheses about the structure, interaction and quality of the review actually conducted. It would also illuminate what expectations CTR actors have of the impact of their work and what recommendations for improvement they suggest. In short, interviewing those who undertake counter-terrorism review would ground our research in the reality of what appears to be a complex, highly differentiated form of accountability. It would show us whether there were gaps in the system or whether the sprawl of different routes to review means that more aspects are scrutinised than previously thought, even if this scrutiny happens outside the public eye.
We chose to speak to these actors because they represent the offices, institutions and actors who actually conduct reviews of some kind, whether independently, within Parliament, as an arm’s-length body or regulator, or as a civil society organisation. Focusing on those who undertake scrutiny rather than on public perception was a difficult choice. After much discussion, and after exploring the viability of different ways of capturing public perception, such as online surveys or focus groups, we decided that, despite the immense value this would bring to the project and to the understanding of counter-terrorism review, it was outside the Project’s capacity. It would require a different set of methodological tools, probably an additional research team and much more time to do it justice. We hope that it can be taken up as a follow-on project and have kept an eye out for projects undertaking similar research. For example, we are interested in the work of the MCB’s National Listening Exercise on counter-terrorism and Muslim communities.
After receiving ethical approval, we sent invitations to a range of actors who undertake review. These included the Independent Reviewers of Terrorism Legislation, former Prime Ministers, Home Secretaries, and MPs on relevant Parliamentary Committees. For these invitations we took special care to cover the range of political viewpoints and leadership levels. We also approached Special Advocates representing terrorism suspects; representatives from the Investigatory Powers Tribunal, the Investigatory Powers Commissioner’s Office and the Independent Office for Police Conduct; and representatives from a range of civil society organisations. As might be expected, this created a steep hill of access to climb. In all, we invited nearly 100 people to participate in an interview, giving us an approximately 25% success rate. In approaching potential interviewees, we noticed how important it was not only to explain the aims and approach of the project clearly and succinctly (and to ensure we received informed consent), but also to indicate why we felt the invitee would make a useful contribution to the research. We also learnt that researchers undertaking empirical work should expect no reply from a large proportion of those contacted, but should not be disheartened.
What did we learn from the interviewing process itself?
As we are still undertaking analysis on the interview data gathered we should not yet comment on our findings. However, we can comment on how we decided what questions to ask, what skills we needed and learnt in conducting the interviews, and what techniques we utilised to begin analysing the data produced.
Choosing interview questions to ask a range of participants is part art, part science. As William M.K. Trochim highlights, we are largely concerned with the nomothetic, the general understanding, but to get at it, we have to explore the idiographic understandings of counter-terrorism review. As the project has a defined set of aims, we used these to choose a range of questions, which we wrestled into a base list that could be elaborated on depending on the interviewee’s knowledge and responses. This approach is referred to as semi-structured interviewing. Working as a team, identifying and editing questions together, was a distinct benefit in developing our interview questions. We sought to match the questions to our initial research queries but ensured they were open enough to enable the participant to elaborate on elements they thought significant. In so doing, we developed a set of questions that struck a good balance between descriptive and evaluative open questions and questions that sought to understand the processes and procedures of review.
As a researcher, interviewing participants is a curious activity. It is at once fascinating and demanding. It engages a set of skills unlike legal desk research. Legal and policy analysis, methodologically speaking, is highly technical and detailed, and requires the ability to translate legislation, judicial judgments and legalese into manageable concepts that are often distinct from colloquial understandings. Interviewing means being able to hold these skills in mind, but without leading the interviewee, descending into the technicalities of legal language (for example, the distinctions between the procedural and substantive aspects of a review), or imposing existing understandings of law and policy. The interviewer has to balance probing and challenging participants’ claims with being open to, and guided by, their interpretation. Without this second aspect, the data gathered is harder to compare and amalgamate across participants; it is difficult to move from the idiographic to the nomothetic.
On a less abstract level, research interviewers have to be able to ask and listen in a conversational manner, all while taking notes during the interview. While most of our interviews were recorded and transcribed, taking notes during the interview is a form of initial analysis, recording the researcher’s in-the-moment responses to, or understandings of, the interviewee’s views. Unsurprisingly, this is an intensive process and can often leave the interviewer and interviewee feeling physically and mentally tired. For the interviewer, once the interview itself is over there is also the need to write some basic reflections while it is still fresh in the mind. This can be a challenge when interviews are scheduled back-to-back (unavoidable when balancing busy diaries).
Interestingly, the experience of interviewing changes as more participants are interviewed. The interviewer develops new expectations and understandings; indeed, some of the questions may mature based on the responses of early participants. This is one reason why larger research projects often undertake a pilot study before completing the research proper. Unfortunately, the context of this research somewhat prohibits that, as gaining access to elite participants can be a one-time-only event.
In terms of how we are analysing the interview data, broadly we are following a ‘grounded theory’ approach, which is iterative in that the analysis arises from the data itself. In this approach, the ‘data’ are not just the interviews but also the desk and paper research we have undertaken, and together these elements form the foundation of our theory and recommendations about counter-terrorism review. As Kathy Charmaz, one of the modern proponents of grounded theory, explains, the methods “consist of systematic, yet flexible guidelines for collecting and analysing qualitative data to construct theories ‘grounded’ in the data themselves. The guidelines offer a set of general principles and heuristic devices rather than formulaic rules”.
Doubling our expected interview data has meant that more time is needed to analyse it thoroughly. The process of analysis for the Counter-terrorism Review Project has the added benefit of being a team effort, although coordinating and integrating three people’s thoughts brings its own challenges. In all, not only has the interview material produced a wealth of data, but the process of interviewing itself has greatly enriched the research, which we are currently writing up as a monograph.
We would like to thank all of our participants and all those whom we contacted to participate but declined for various reasons.