The presentations can be viewed here: https://vimeopro.com/user39826906/the-1st-besp-symposium
- Dr. Michael Sanders: Practical Science - how we bring rigour into the evaluation of policy
- Prof. Sharon Simpson: Updated guidance on developing and evaluating complex interventions (UK Medical Research Council)
- Prof. Jeffrey Carpenter: Experimental innovations to aid evidence-based policy-making
- Prof. Martin Hagger: Why and how do interventions work? Evaluating mechanisms of impact
- Prof. Petri Ylikoski: Mechanism-based thinking
- Assoc. Prof. Jaakko Kuorikoski: Severe testing
- Dr. Samuli Reijula: The problem of extrapolation
- Prof. Kaisa Kotakorpi: Field Experiments with Tax Administration
- Dr. Jennifer McSharry: Evaluating the feasibility of implementing interventions into practice: the example of the Cardiac Health and Relationship Management and Sexuality (CHARMS) intervention
- Dr. Jouko Verho: Could we increase the use of randomised field experiments in Finland?
- Elina Aaltio & Nanne Isokuortti: The effectiveness of an intervention depends on implementation fidelity: why study it? The case of a child protection practice model
- Dr. Mira Fischer: Discussant
BIOS & ABSTRACTS
Michael Sanders is Executive Director of the What Works Centre for Children’s Social Care in the UK, and a Reader (Associate Professor) in Public Policy at King’s College London. His research primarily concerns social mobility and how public policy can be used to support the most vulnerable and least affluent members of society. He previously served as Chief Scientist of the Behavioural Insights Team, where he oversaw the design and implementation of more than 600 randomised controlled trials across the public policy spectrum.
Practical Science - how we bring rigour into the evaluation of policy
There has been a growing interest in uncovering "What Works" in government, and a growing interest in using randomised controlled trials to identify the impacts of government policies, from the small to the large. The UK's What Works Network now includes centres covering hundreds of billions of pounds of government spending a year, working to make gold standard evaluation mainstream. This talk will cover the lessons from this network, and thoughts for the future.
I am Professor of Behavioural Sciences and Health at the Medical Research Council/Chief Scientist Office Social and Public Health Sciences Unit in the Institute of Health and Wellbeing at the University of Glasgow. I am currently undertaking a 5-year Senior MRC Fellowship. I lead the Institute’s Solution Focussed Research Theme and the Complexity in Health Improvement Programme. My main interests are obesity, mental health, and behaviour change, in particular the maintenance of behaviour change. I am also interested in complexity and social network methods, as well as the design and delivery of interventions via the web and new technologies. I have methodological expertise in randomised controlled trials and in the development and evaluation of complex interventions, as well as mixed methods approaches and process evaluation.
Updated guidance on developing and evaluating complex interventions (UK Medical Research Council)
The UK Medical Research Council’s (MRC) Guidance on Developing and Evaluating Complex Interventions (Craig et al., 2008) is a key reference point in the field. It has been cited over 5000 times. However, the current guidance has been criticised on a number of grounds. We are updating the guidance to take account of developments in methods and practice, with an emphasis on identifying relevant methods used in a wider range of disciplines and sectors than the traditional core focus on randomised trials. This talk is an opportunity to learn more about the updated guidance and cutting-edge methods in the field of evaluation.
Learning objectives:
Participants will be familiar with the key challenges in developing and evaluating complex interventions.
Participants will be aware of current debates, methodological developments, and different perspectives, in developing and evaluating complex interventions.
Participants will understand the importance of considering contextual issues such as cultural, policy, and service and organizational factors in the design, evaluation and real world delivery of interventions.
I am the James Jermain Professor of Political Economy at Middlebury College. In addition, I am a research fellow at the Institute for the Study of Labor (IZA). My fields are behavioral and experimental economics, with applications to labor, public, and development economics. Most recently, I have been working with the State of Vermont to establish a research center to promote evidence-based policy-making in the state.
Experimental innovations to aid evidence-based policy-making
Although governments around the world have begun to embrace the notion of evidence-based policymaking, the design of economic experiments has been relatively stagnant. We examine a series of design innovations recently used to evaluate a forthcoming policy issued by the Consumer Financial Protection Bureau in the United States. The experiment combines elements of preference elicitation and structural estimation to individualize treatment parameters and better evaluate the impact of choice architecture on financial decision-making.
Martin Hagger is Professor of Psychology in Curtin University’s School of Psychology, and Finland Distinguished Professor (FiDiPro) in the Faculty of Sport and Health Sciences, University of Jyväskylä, Finland. He is Founding Director of the Health Psychology and Behavioural Medicine Research Group. He is also Adjunct Professor at Griffith University and Central Queensland University, and has been visiting Professor at the Universities of Rome, Bordeaux, and Grenoble, and at Hong Kong Baptist University. From July 2019 he will take up the position of Professor of Psychology at the University of California, Merced, USA. Professor Hagger’s research focuses on the social processes involved in people’s self-regulation of social and health behaviour. He aims to understand how psychological factors such as attitudes, intentions, self-control, planning, and motives affect health behaviour, and how health professionals can use this information to promote health behaviour change. He has made several contributions to advancing psychological theory, including theoretical integration and ego-depletion. Professor Hagger is editor-in-chief of Health Psychology Review and Stress and Health, and an editorial board member of ten international journals. Professor Hagger has received numerous awards and acknowledgements, including Distinguished International Affiliate of Division 38 (Health Psychology) of the American Psychological Association, the Distinguished Health Psychology Contribution Award from the International Association of Applied Psychology, and Fellow of the European Health Psychology Society.
Why and how do interventions work? Evaluating mechanisms of impact
Many behavioral interventions have demonstrated efficacy in changing their desired outcome variables, often a measure of behavior. However, trials testing intervention efficacy seldom evaluate how the interventions work. As interventions focusing on behavior change tend to adopt content that affects change in one or more key ‘determinants’ of behavior, evaluating how they work often entails measuring change in those determinants alongside change in the target outcome. Testing whether intervention effects on outcomes coincide with changes in the determinants is therefore key to understanding how interventions work. Knowledge of how interventions work provides important information for the design of future interventions, the updating of theories and ideas of behavior change, and the development of an evidence base on the conditions and contexts in which interventions may or may not work. In this presentation I will outline the importance of, and rationale for, evaluating how behavioral interventions work, and suggest ways in which intervention designers can incorporate strategies and methods to do so.
After this talk participants should be able to:
Describe a basic ‘model’ of how behavioral interventions ‘work’ in changing behavior
Outline the advantages and benefits of evaluating the mechanisms by which interventions work
List the key steps in evaluating how interventions work in impacting behavior change
Describe the types of research design and analytic procedures necessary to evaluate how an intervention works
Petri Ylikoski is Professor of Science and Technology Studies at the University of Helsinki. His research interests include theories of explanation and evidence, science studies, and social theory. His current research focuses on the foundations of mechanism-based social science, institutional epistemology, and the social consequences of artificial intelligence.
Mechanism-based thinking
Ylikoski presents the key ideas and motivations for mechanism-based theorizing in the social sciences. The presentation focuses on how mechanism-based thinking could strengthen experimental research, but also highlights the dangers of illusory understanding created by mechanistic storytelling.
Jaakko Kuorikoski is an associate professor in New Social Research at the University of Tampere. His main areas of specialization are the philosophy of economics and the philosophy of the social sciences, and he has published on scientific explanation, modeling, simulation, and causal reasoning.
Severe testing
Kuorikoski discusses field experiments as severe tests of a causal hypothesis about the sampled population and analyzes the debate about the internal validity of explanatory and pragmatic trials from the severe testing perspective.
Samuli Reijula is a university researcher at the University of Tampere, Finland. Samuli is also a member of the TINT Centre for the Philosophy of Social Science at the University of Helsinki, where he received his doctorate in 2013. His area of research is the philosophy of science, and he is particularly interested in questions related to evidence-based policy (behaviorally informed policy in particular) and the social epistemology of science.
The problem of extrapolation
Traditionally, the literature on extrapolative inference has mainly focused on identifying conditions that prevent us from transporting results about the causal effect of a treatment in one population to another population. Recent work on the notions of cause and causal structure (see, for example, Bareinboim & Pearl 2016) suggests, however, positive conditions under which extrapolative inference succeeds, i.e. conditions under which reliable estimates of causal effects in target populations can be derived.
Kaisa Kotakorpi is a research professor at the VATT Institute for Economic Research. Her main field of research is public economics, in particular the economics of taxation. Broadly defined, her research deals with the evaluation of public policies and the impact of the tax and benefit system on the behaviour and wellbeing of individuals. Before joining VATT in September 2017, she was professor of economics and vice head of the Department of Economics at the University of Turku. Kaisa also served as a member of the Finnish Economic Policy Council for the Council’s first five-year term (2014-2019). In August 2019 she will take up a position as Professor of Economics at Tampere University.
Field Experiments with Tax Administration
Field experiments run jointly by researchers and tax administrations have been used to study, for example, the determinants of tax evasion in various contexts. The talk discusses the specific rationale for using field experiments in this context, and the practical considerations that need to be taken into account when implementing field experiments with a public administration, in particular the tax authority. Practical examples are drawn from a field experiment implemented together with the Finnish Tax Administration to analyze the determinants and effects of tax evasion in the rental housing market.
Dr Jenny McSharry is a lecturer in the School of Psychology and Assistant Director of the Health Behaviour Change Research Group (HBCRG) at the National University of Ireland, Galway. Jenny is a chartered Health Psychologist and current chair of the Psychological Society of Ireland Division of Health Psychology. Jenny’s research aims to take a systematic approach to behaviour change to address healthcare challenges. Jenny has a particular interest in implementation science and the application of behaviour change to promote the uptake of research into practice. Through an Irish Research Council New Foundations Grant, Jenny led the development of the IMPlementation science Research NeTwork (IMPRNT) to build implementation science capacity in Ireland.
Evaluating the feasibility of implementing interventions into practice: the example of the Cardiac Health and Relationship Management and Sexuality (CHARMS) intervention
Evaluating the feasibility of implementing interventions into practice can benefit from the collection of quantitative and qualitative feasibility, fidelity, cost, and outcome data. This presentation will describe a mixed-methods approach to the assessment of feasibility using the example of the Cardiac Health and Relationship Management and Sexuality (CHARMS) multi-level intervention. The CHARMS intervention was developed to increase delivery of sexual counselling by healthcare professionals within pre-existing cardiac rehabilitation services. The presentation will also describe the generation of potential solutions to the feasibility issues identified using the ADePT process for decision-making after pilot and feasibility trials.
After this talk participants should be able to:
Outline key aspects of feasibility that can be evaluated using a mixed-methods approach
Generate potential solutions to identified feasibility issues using the decision-making after pilot and feasibility trials (ADePT) process
Consider the implications for full scale evaluation and roll out of interventions
Jouko Verho is a senior researcher at the VATT Institute for Economic Research. Previously, he worked at the Social Insurance Institution of Finland. His research interests are in labour, public, and health economics. Recently, Jouko has been involved in the design and evaluation of the Finnish basic income experiment.
Could we increase the use of randomised field experiments in Finland?
Randomised field experiments would provide more rigorous evidence for the design of labour market policies and social security. This talk discusses the benefits of using administrative registers in the implementation and evaluation of the experiments. It also provides examples of how policy evaluation can be conducted simultaneously with the regular implementation of policy programs and reforms. The practical examples include the basic income experiment and the implementation of the income register by the Social Insurance Institution.
Nanne Isokuortti, doctoral researcher, University of Helsinki
I am a doctoral researcher at the University of Helsinki, Faculty of Social Sciences (Social Work). My PhD research focuses on analysing the implementation process of the Systemic Practice Model for child protective services in Finland. Recently, I have conducted academic visits to the Universities of Oxford and Melbourne. Prior to joining the faculty, I worked as a social worker and development planner for a local authority in Finland.
Elina Aaltio, M.Soc.Sci, doctoral researcher, University of Jyväskylä
I am a doctoral researcher at the University of Jyväskylä, Faculty of Humanities and Social Sciences (Social Work), and a visiting researcher at the National Institute for Health and Welfare (THL). My doctoral dissertation focuses on the effectiveness of the Systemic Practice Model for child protection, using a realist evaluation approach. Previously, I worked as a researcher at THL, where I was in charge of the national evaluation of the nationwide Systemic Practice Model pilot (2017-2018).
The effectiveness of an intervention depends on implementation fidelity: why study it? The case of a child protection practice model
The effectiveness of an intervention depends on the success of its implementation. This presentation provides an introduction to implementation fidelity, which refers to the extent to which implementers adhere to the intervention as prescribed by its designers. Drawing on findings from a mixed-methods evaluation of the Systemic Practice Model, Isokuortti and Aaltio illustrate why it is important to evaluate implementation fidelity when assessing intervention outcomes.
Three main learning objectives:
- introduce the concept of implementation fidelity
- illustrate why it is important to undertake an implementation fidelity evaluation when assessing effectiveness
- provide a case example of an implementation fidelity evaluation in a child protection service context
Mira Fischer is a postdoctoral research associate at WZB Berlin and a research affiliate at IZA. Her background is in economics and philosophy and her research interests are in education, labor markets, organizations, and public policy. She uses lab and field experiments as well as analysis of survey and administrative data to study institutional determinants of changes in people's beliefs and behavior, and how these affect individual and social outcomes.
Commentary on Prof. Michael Sanders’ presentation