Achieving impact and influencing refugee and asylum seeker policy in Scotland: considerations from seeing policy “in action”

Blog post by Scot Hunter | Twitter: _hunterscot

What is the point of research if academic work is not being cited in policy? Why are academics told we need to achieve impact when policymakers rarely cite our work? As social scientists, why are we pitted against each other to gain impact when policymakers aren’t that interested? All questions I have been wrestling with since finishing an internship with the Scottish Government (SG), where I worked to help develop the New Scots Refugee Integration Strategy.

My background is in migration studies and researching outcomes for marginalised groups, so I jumped at the chance to work with the SG on asylum seeker and refugee policy and, overall, found it an enriching experience. The internship included analysing some application data but, most importantly, conducting a review of the New Scots policy to see what types of sources were being used to inform its development. Conducting this review challenged a few of my assumptions. For one, I thought the relationship between policy and research was more straightforward than it is in practice, and I was shocked to find that academic work is rarely cited. Second, I realised my understanding of ‘gaining impact’ and influencing policy was individualistic and rooted in personal gain rather than the greater good of the policy area. This blog is a response to these challenges: I will reflect on my experience of seeing policy in action and attempt to answer the burning questions that emerged.

At first, identifying and reviewing “policy documents” was quite daunting. Yes – the importance of policy was frequently mentioned throughout my studies, and to be honest, I’ve lost count of the number of times I’ve been told that as a researcher, I need to achieve “impact” with my work. Yet, the idea of influencing policy (what that means and how it is achieved) remained abstract, like something I’d figure out once I worked through my PhD and started doing more high-quality research as a “proper academic”. The expectation that my research would magically feed into policy as I advanced in my career meant that I wasn’t engaging with the policy itself. Instead, I focused on the theoretical and empirical studies in the field, leaving any consideration of how to engage with policy on the back burner.

Despite the initial anxiety, I identified several documents for review and examined what types of sources were being used to inform them. Independent academic journal articles were hardly cited throughout the policy documents – a somewhat alarming revelation considering the deluge of “training” and guidance I had received preaching the importance of influencing policy with my research. To be fair, some academic work was cited, but only when commissioned by either a government body or an NGO. There was also a reliance on wording from official legislation and sources to provide definitions for terms. Overall, this meant New Scots policy was primarily informed by the same sources as previous documents, without much consideration of the current empirical research in the field.

The observation that academic work was hardly cited in policy challenged everything I thought I knew about gaining “impact” and influencing policy with my work, leaving me confused. Why were academic journals not cited when I’ve been told they’re important? How can policy effectively address people’s needs without including the empirical findings of research that explores their experience? At first, I thought both issues could be addressed simply by recommending that policymakers fix this outrage and cite more academic work.

However, this internship has shown me that academic work is only one piece of the policy puzzle. Yes, our academic work is important, and we should strive to publish and gain impact, but I was unaware of so many other facets of policymaking. It also got me thinking about my understanding of how my research should fit into policy, and about the broader consequences of how we’re taught about “impact” in academia.

Admittedly, I was a bit naive in thinking the relationship between academia and influencing policy was clean-cut.

In reality, policymakers are not that interested. Academic journals are dense, full of technical jargon and inaccessible – not just in terms of content, but often hard to get hold of in the first place. Policymakers work to stricter deadlines and have less time to contemplate the nuances of complicated social issues, a hallmark of academic work. This is especially true for policy domains that deal with complex social relations, such as refugee integration, where simple solutions to problems are not possible. Cairney (2017) suggests policymakers are much more likely to cite academic work underpinned by randomised controlled trials and systematic reviews in domains such as medicine or the environment, where the causes of problems are more self-evident – the link between smoking and cancer, for instance. Citing research that does not identify any ‘singular’ or ‘root’ causes, and that deals with often polarising issues such as asylum, is not as appealing or efficient. But the argument about policymakers’ preference for STEM research is for another day.

It is also important to remember that just because policymakers are not directly citing academic work does not mean they have not been influenced by it somehow. Talks with senior colleagues made me realise that there are other ways of achieving impact – for example, through word of mouth – so some academics influence policymakers even when they are not cited. I think this deserves further investigation, and it would be interesting to interview academics about their views on this [adds to the list of research ideas].

In addition to my naivety, the internship and discussions with colleagues at the Scottish Policy & Research Exchange have made me consider some of the ethical issues tied up with the importance placed on achieving impact, which is drilled into my peers and me. This consideration also helped me understand why I was so shocked to discover that hardly any academic work was cited in New Scots. As researchers, we are told to “publish or perish” – our research needs to achieve impact, we need to influence policy, and the more publications we have, the better our job prospects will be. Embedded within all this guidance, however, is a sentiment of competitiveness: I feel in competition with my colleagues, rather than working towards impact for the greater good of our policy areas. We become fixated on individual gain, on having the most impressive list of publications, and on achieving more impact than each other.

Although I am still rethinking my conception of impact and how my research should fit into policy, I realise my experience and training have obscured the bigger picture. For example, policymakers are not only interested in academic journals, important as they are, but also look at various other sources. And, as discussed above, policymakers are not academics; perhaps we need to do more to make our research accessible and compatible with their working patterns.

The more I talked through these findings with colleagues, the less shocking and confusing my “revelation” became. Despite the emphasis on policy, these discussions slowly shifted my mindset and revealed that academic work only achieves high levels of impact in exceptional circumstances. It became clear that impact is often just a buzzword, continually thrown about when, in reality, we should be focusing on conducting high-quality research.

Overall, the internship has been an eye-opening experience for me, expanding my (mis)understanding of the relationship between academia and policy. As social scientists, directly influencing policy with our work seems to be a grey area: something we are told to strive for, but something that (from my experience with refugee policy in Scotland) rarely, if ever, happens. Is this similar in other policy areas or in different academic fields? Have other academics had a similar experience to me?

These considerations are critical in the context of my work for the British Academy-funded project looking at out-migration in Armenia, in which we plan to engage with policymakers and present our work to them. As academics looking to influence policy to help improve the conditions for young people and newcomer groups, what can be done to strengthen the relationship between policy and research? How can our empirical findings make a positive contribution?

In terms of how we are taught about impact, is it potentially time for a change? Do we need to move away from the individualistic model and start thinking of the bigger picture when it comes to influencing policy?

A work in progress…

Are ‘equitable research partnerships’ compatible with UK university administrative processes?


According to the British Academy’s ‘Humanities and Social Sciences Tackling Global Challenges’ call, ‘researchers from the Global South [must] be involved as equal partners’ in funded work. Such concerns also resonate with broader calls to ‘decolonise’ the university. But what do they mean in terms of project set up and administration between universities (here Stirling and Yerevan State University (YSU)) that work so differently? How can ‘best [administrative] practice’ reflect equality between institutions in such different circumstances?

A quick, not entirely linear timeline: we applied to the Global Challenges scheme in July 2020. In Autumn 2020, we get the grant! But Armenia is now at war. Since our application, tens of thousands of refugees have arrived from Artsakh/Nagorno-Karabakh; Syrian-Armenians have lived through (another) war, this time alongside their Republic of Armenia-born counterparts; while Indian immigration has stalled. By November, Armenia has lost the war; feels betrayed by Russia, its traditional ally; abandoned by the major powers (as during the 1915 genocide); and ever more threatened by Azerbaijan’s Turkey-backed military power. The political and economic hope associated with the 2018 Armenian ‘Velvet Revolution’ has dissipated. By early 2021, flights to Russia have resumed, young people are leaving again… and we must set up our project.

Guided by our very helpful (and busy) Research Office, we ask YSU to fill in a ‘due diligence’ questionnaire and develop a ‘collaboration agreement’. Most of the focus is on data protection and public liability insurance. ‘What if someone falls over a box?’ A good question in a UK context, but I start to feel a bit queasy. I try not to imagine the eyebrows raised in a country that has just been at war. Maybe someone will find the question funny? (Maybe someone who hasn’t lost a relative in Artsakh.)

My major mistake, though, is money. I read the financial provisions. They seem fine. But it seems that I didn’t read the appendix closely enough. We (the project team) think that YSU’s part of the money will just be transferred there. We think wrong. Things just ‘do not work that way’, and YSU ‘should not have signed’ the agreement if they thought they did. In the meantime, the UK government cuts ODA research funding. Ours is unaffected, thankfully, but what are YSU supposed to think? Somehow, we doubt they will have much confidence in the UK’s financial credibility right now (or welcome any lessons from us on how to manage funds).

Our overworked finance department is, of course, just doing their job to protect us and the university. But we are now in a position where YSU is paying charges (i.e. losing money) on an account in euros – set up at our request – which remains empty. Normal practice, it seems, equates to asking our collaborators in a lower-middle-income country, whose basic salaries do not include time for research, to work for free. After which they need to get others in administrative roles to work beyond their normal contracts to invoice our finance department to get paid.

Why didn’t I know this? Small-talk while developing relationships with possible research partners (also known as chatting to very nice people you want to work with), this is not. But why didn’t I know to ask about the basics? ‘How can we work together in a way that doesn’t make you worse off?’ might have been a starter. A colleague later tells me that on a previous project he did eventually manage to get payments made to individual accounts. Eyebrows were raised, hair pulled out (via very British emails), but something was worked out. It seems that other practices are possible; ones that recognise that normal practice is based on assumptions that may not apply elsewhere. Such approaches need to be normalised if UK universities are to work alongside international partners in a truly equitable way.

We appeal upwards and a compromise is worked out – an advance quarterly payment can be made, but the financial responsibility for this advance will lie with our Faculty, not the university. A huge sigh of relief. Perhaps we will maintain our good relations with YSU (and the finance department too)?

I return to filling out risk assessment forms. YSU doesn’t have an ethics committee in the same way as UK universities do, so our form is needed even for work done in Armenia by Armenian colleagues. There’s a special form for covid. I know that our Armenian colleagues cannot focus on covid risk assessment to the same extent as we can, given the recent war and the need to support refugees. The form continues: how will we mitigate the risks posed by ‘political tension’? ‘We will not go there/return if war threatens Yerevan’, I write, adding, cringing, that this option is ‘unavailable to our Armenian colleagues’. An ‘equitable research partnership’ remains the aim, but will be hard to realise.