Workshops

The conference included five parallel workshops focusing on different issues within evidence use. The workshops had two aims: first, to discuss and reflect on the progress made in evidence-informed policy and practice across Europe, and in which areas; second, to consider the implications for the policy and practice of evidence use and for studies of it. A summary of each workshop is provided below.

Workshop 1: Using evidence at national, local and school levels
Workshop 2: PISA cake? The use of large-scale datasets in policy and practice
Workshop 3: School-University Knowledge Exchange Schemes
Workshop 4: Supporting evidence use in schools: Using Continuous Professional Development (CPD) as a means to effect change
Workshop 5: The media and evidence - informing the public debate

Chair: Elisabeth Buk-Berge, Ministry of Education and Research, Norway

This workshop focused on the ways in which different types and sources of evidence are used in policy and practice at national, local and school levels. In doing so, the workshop discussed the rationales for developing systems to collect, analyse or use evidence: for example, what is being sought through such systems, and why? The processes and experiences of creating, implementing and using systems to collect and use data to inform policy and practice were also of interest. The workshop reflected on how such systems are used in practice to inform decision-making, while remaining sensitive to the barriers and facilitators affecting their development, implementation and use. In addition, the workshop discussed the outcomes or results of using such systems at national, local and school/organisational levels, and to what ends.

Presentations

  • Using ICT to support policy making based on evidence at national level: the case of the distribution of educational opportunities in Chile Liliana Guzmán, University of Kaiserslautern, Germany
  • Early school leaving from an evidence informed perspective Kristof De Witte, Maastricht University, the Netherlands 
  • Evidence-informed policy making in the city of Antwerp: Networking, participative research and knowledge management Marleen Baillieul, City of Antwerp, Belgium
  • No title supplied Bram Wellens, City Schools Antwerp, Belgium
  • The process and experiences of developing, implementing and using a system to collect and use data to inform policy (the council of Amsterdam) and practice (school leaders of the secondary schools) Daan Wienke, Nederlands Jeugdinstituut, the Netherlands
  • The satisfaction of the educational community as a contributing factor to the quality in the school Carlos Gonçalves, Ministry of Education - Grouping of Schools D. Carlos I, Portugal 

 

Evidence-informed policy and practice in the city of Antwerp

Marleen Baillieul

City of Antwerp, Department of General Education Policy
 

The satisfaction of the educational community as a contributing factor to the quality in the school

Carlos Manuel Novais Gonçalves

Ministry of Education - Grouping of Schools D. Carlos I, Portugal

Using ICT to support policy making based on evidence at national level: The case of the distribution of educational opportunities in Chile

Liliana Guzmán, Patricio Rodríguez
University of Kaiserslautern, Germany

   

How the Dutch Youth Institute (NJi) contributes to safe and supportive secondary schools

Daan Wienke
Nederlands Jeugdinstituut

Early school leaving in the Netherlands: Policy and research
Kristof De Witte
Top Institute for Evidence Based Education Research, Maastricht University
Faculty of Business and Economics, KU Leuven

Implications for the policy of evidence use

  1. Analysis of good datasets and databases, both national and international, is necessary to make evidence more relevant to different contexts and thereby to enhance its use.
  2. Research that shows “what does not work” can provide valuable knowledge and should therefore be used.
  3. Users of research need specific competences, including the ability to understand and interpret results from different kinds of research evidence.

Implications for the practice of evidence use

  1. Evidence is used at different levels – international, national, local – for different purposes. Despite these differences, in order to be useful, existing evidence and knowledge should be tailored to the specific context of knowledge use.
  2. Development of competence for evidence use is a matter for organisations/institutions that use evidence. However, individual employees who use evidence ought to have a development plan for competence building in this field.

Implications for studying evidence use

1. More knowledge would be useful in the following areas:

  • Different meanings of “evidence use” in education (different countries, different stakeholders).
  • Evidence use at different levels – international, national, local – commonalities and differences.
  • What kind of knowledge is in demand for different purposes?
  • How is evidence provided by different international bodies like EU and OECD used in policymaking in different countries?
  • Studies of knowledge mobilisation.
  • Use of international and national portals to find research evidence.

2. User involvement at different stages in research enhances the relevance for users.
 

Chair: Andres Sandoval-Hernandez, International Association for the Evaluation of Educational Achievement - Data Processing and Research Center, Germany

This workshop aimed to exchange knowledge and understanding about how international large-scale datasets are being used by policy-makers and practitioners at all levels across Europe. It was particularly interested in how datasets such as PISA are used at different policy-making levels and how these uses are translated to school and practitioner levels. The workshop provided a forum for participants to share experiences of the difficulties and challenges of using large-scale datasets to inform policy and practice, and of how such difficulties have been overcome. In doing so, it aimed to increase knowledge about using, analysing and interpreting these datasets for policy and practice. The second session of the workshop (on day two) also included a more practically focused training session on interpreting and using these datasets.

Presentations

  • Why comparativists argue so much: combining narrative, imperfect knowledge and Grand Theory in comparative analysis Tim Oates, Cambridge Assessment, UK
  • What PISA can tell about educational systems, and what it can’t Eckhard Klieme, German Institute for International Educational Research (DIPF), Chair of the International Questionnaire Expert Group for PISA 2012/2015, Germany
  • Secondary Analyses of Data Acquired through the PISA Survey Jelena Markovic, Social Inclusion and Poverty Reduction Unit, Office of the Deputy Prime Minister for European Integration, Government of the Republic of Serbia
  • Linking home background, gender, and reading literacy - a comparative analysis with PISA 2009 data from Germany and Norway Sabine Wollscheid, The Campbell Collaboration, International    

 

Workshop round-up slides

 

Secondary Analyses of Data Acquired through the PISA Survey

Jelena Markovic
Social Inclusion and Poverty Reduction Unit

What PISA can tell us about educational systems - and what it can't
Eckhard Klieme
German Institute for International Educational Research (DIPF)

   

Linking home background, gender, and reading literacy – a comparative analysis with PISA 2009 data from Germany and Norway

Sabine Wollscheid, Norwegian Knowledge Centre for the Health Services
Are Turmo, University of Oslo

   

Implications for the policy of evidence use 

  1. International large-scale educational assessments (LSA) such as PISA, TIMSS and PIRLS are characterized by complex sampling methods and a cross-sectional design. For this reason, when they are used to inform policy making, recommendations have to be made with caution. For example, LSA are inadequate for drawing causal conclusions (e.g. “more autonomy to schools will result in higher student achievement”). Users of LSA must take care not to over-interpret the results of their analyses or to draw over-simplistic conclusions that lead to policy borrowing/lending processes.
  2. However, LSA are a very powerful source of information for identifying patterns related to good practices across countries. These good practices can be used as a starting point to promote ‘policy learning’ processes. Policy makers must interpret LSA results in the context of their own educational systems.

Implications for the practice of using evidence

LSA are not appropriate for reporting academic achievement at either the school or the student level. However, they can provide teachers, head-teachers and other actors with valuable information about practices (e.g. teaching strategies, leadership styles, supervision practices) that can be used to improve their own practice. Of course, the same warnings about the interpretation of analyses apply at this level too.

Implications for studying evidence use

It is crucial that policy makers, practitioners and other users of this information (including researchers) analyze and interpret LSA correctly. To this end, the organizations running these studies (e.g. OECD, IEA, ETS) offer specialized statistical software, training and internship programs, among other options. Universities also offer postgraduate programs focusing on, or including courses on, LSA. People interested in using information from LSA are encouraged to seek out training that allows them to use evidence derived from LSA effectively to inform policy making.
 

Chair: Andrew Morris, Educational Evidence Portal (EEP)

In this workshop, key messages were presented from a small study of 14 schemes involving knowledge exchange between schools and universities. Drawing on examples from six European countries, EIPPEE partners illustrated key themes arising from the study and led discussion on fundamental aspects of the schemes' effectiveness. The workshop began with a welcome and introductions from the workshop chair, followed by an overview of the findings of the study into relevant knowledge exchange schemes from six European countries. Participants then discussed key issues in the development, implementation and experience of such schemes, along with the main learning points arising from them. Five main school-university knowledge exchange schemes were presented in the workshop:

Presentations

  • Anna Kristín Sigurðardóttir, University of Iceland, Iceland
  • Mary Sheard, Institute for Effective Education, University of York, UK
  • Per Skoglund, Specialped. Skolmyndigheten Sweden
  • Tomislav Tudjman, Risbo, Erasmus University Netherlands 

Implications for the policy of evidence use

1. Exchange of knowledge between schools and universities helps develop teachers and potentially benefits learners. To move from an incidental to a systematic approach:

  • incentives need to be offered to both parties to collaborate;
  • structures need to be encouraged to support local schemes;
  • information and communication infrastructure needs to be created to enable knowledge from different schemes to accumulate, be shared and built upon.

2. Policy statements designed to raise standards need to move from condemning poor teaching to encouraging collaboration to find out how to do better.

Implications for the practice of using evidence 

  1. Leaders of schools, pre-schools and colleges need to develop evidence-using cultures and seek to grow them in partnership with universities and municipalities.
  2. Practitioners and researchers need to recognise differences in the forms of knowledge they work with, and cultures they work in, as they collaborate.
  3. Imbalances in the power relations between practitioners and researchers need to be taken into account in designing knowledge exchange schemes.

Implications for studying evidence use

  1. Models of knowledge exchange schemes need to be developed, drawing on existing theory and further, more comprehensive and detailed, empirical study.
  2. Approaches to assessing the impact of knowledge exchange schemes on outcomes for practitioners and learners need to be developed and initial assessments made. 

Chair: Frank de Jong, European Association for Practitioner Research on Improving Learning in Education (EAPRIL)

This workshop examined a range of services that support schools and other organisations in using different sources of evidence to improve performance. The first session of the workshop (day one) presented experiences from across Europe, focusing on how learning from these can inform future efforts in this area. The second session (day two) offered specific guidance for those who are interested in, and able to, develop an evidence-informed continuous professional development (CPD) strategy for their school or organisation. This session provided targeted advice about developing and implementing such a strategy (or the principles of such a strategy) in different organisational and educational contexts, and was based on the SKEIN service offered by the UK-based Centre for the Use of Research Evidence in Education.

Presentations

  • Promoting Data-Use in Schools: The Impact of the Principal and Implications for Professional Development for Principals Martin Stump, Department of Business Education, University of Mainz, Germany  A link to the project presented in this presentation is available in German at http://www.wipaed.uni-mainz.de/evis/
  • Research Alive – An initiative to close the gap between teaching practice and educational research Heino Schonfeld, Centre for Effective Services, Ireland
  • Pedagogical practice, evidence and evidence informed pedagogical practice? Michael Søgaard Larsen, Danish Clearinghouse for Educational Research, Denmark
  • A research based approach to evaluating and improving the effectiveness of schools as learning environments for teachers Philippa Cordingley, Centre for the Use of Research Evidence in Education, UK

Pedagogical Practice, evidence and evidence informed pedagogical practice

Michael Søgaard Larsen
Danish Clearinghouse for Educational Research
Institute of Education, University of Aarhus
 

Research Alive - An initiative to close the gap between teaching practice and educational research
Heino Schonfeld
Centre for Effective Services, Ireland

 

Sauce for the Goose: learning environments that work for pupils and staff
Philippa Cordingley
Lisa Bradbury
Centre for the Use of Research and Evidence in Education (CUREE)

Implications for the policy of evidence use

  1. Greater emphasis should be placed on ensuring that teachers receive adequate training and support to find, understand and use evidence during both initial and post-initial teacher education. The ability to understand and apply research evidence, and to monitor and evaluate the effectiveness of new methods, could form part of teacher standards/competencies.
  2. More recognition should be given to ‘practitioner’ research and other methods by research funding institutes, or alternative funding bodies should be created to encourage this type of research.
  3. Incentives should be given both in funding projects and publishing in non-traditional outlets (and open access where possible) to encourage greater cooperation between researchers and practitioners and better access to research evidence.

Implications for the practice of using evidence

  1. School leaders should pay attention to creating and sustaining a culture that recognises the value of evidence use and supports teachers to find, understand and use it, for example by setting up reading clubs or carefully planned continuous professional development opportunities.
  2. School leaders and principals should be aware of the effect that their leadership visions and ideals can have upon teacher behaviour and outcomes. ‘Thinking big’ has been shown to be a key factor in raising outcomes.

Implications for studying evidence use

  1. Communicating findings from studies of evidence use (covering both good and bad practice) would be useful, as would outlining their implications for practice.
  2. Involving users at all stages of the research process is fundamental to ensuring that it is relevant to the needs of potential users. More support and guidance should be given to researchers to enable this to happen in practice.

Chair: Rien Rouw, Ministry of Education, Culture and Science, The Netherlands

The profile of education has changed over the last twenty years, becoming a key priority for governments and the public. In this context, media reporting of education has also become more influential, yet, unlike other fields, robust evidence plays a minor role in informing the national debate. There are numerous barriers to using research in the press and media. Editors and journalists report that academic research is not always media friendly. The subject matter is not always in tune with topical issues or presented at the right time. It can be dense and rely too much on jargon or technical language. At the same time, researchers are wary that research can be reported out of context, or ‘cherry picked’ to support a particular news or political agenda. This workshop considered the important role that the media play in informing decision making within education policy and practice. It looked at some of the challenges and tensions that prevent effective use of evidence in media reporting. Finally, it explored what can be done to bridge the gaps between research and the media, including looking at some new initiatives that are helping the media find, understand and use research knowledge in their day-to-day work.

Presentations

  • The Scandinavian reception of a Campbell systematic review on bullying and the challenge of communicating research findings in the media Eamonn Noonan, The Campbell Collaboration, International    
  • Problems in knowledge mobilisation: how knowledge changes when it moves between contexts James Thomas, EIPPEE team member, EPPI Centre and the Social Science Research Unit, Institute of Education, University of London, UK
  • The Education Media Centre Jonathan Sharples, Institute for Effective Education, UK 
   

Workshop round-up slides

Problems in knowledge mobilisation: how knowledge changes when it moves between contexts

James Thomas
EPPI Centre

Implications for the policy of evidence use

System level

  1. Creation of an independent institute to report systematically on the state of the art of evidence and on its degree of robustness.
  2. Development of a dedicated media centre: matchmaking and brokering between evidence and science.
  3. Formation of a European network of dedicated media centres.
  4. A round table (consisting of various professions) at the point where evidence leaves the academic domain.

Organisational level

  1. Organisations should look to develop media strategies for communicating research findings.
  2. Where possible, working with the media should be at infrastructure level: dedicated unit or official.
  3. Capacity building to ensure staff are aware of the importance of communicating with the media and skilled in this.

Implications for the practice of using evidence

  1. Media training for researchers, including developing a series of common training modules (for example, courses around the use of social media tools, writing press releases).
  2. Training for journalists and media professionals to improve research literacy. Develop fact sheets for the media on research principles and methods (for example around correlation and causality, systematic reviews). Rationalize the conversation with both policymakers and journalists on the nature of evidence.
  3. Guidance to research organisations to build their capacity for effective media engagement.
  4. Recognition: rewards for media impact.

Implications for studying evidence use

1. Studies of evidence use by the media. Conduct detailed investigations on the press and media's use of research evidence across different EU countries. Issues to explore include:

  • How do the media currently find and use evidence?
  • What are the determining factors in the media's use/misuse of evidence?
  • Does this differ for different forms of media (for example social media)?
  • How does it differ across countries?
  • How influential is the media's reporting of research evidence on policy making?
  • What are the current gaps in terms of mobilising research evidence for the press/media?
  • What would help bridge those gaps?

2. Drawing on the findings from such studies, and the experience of the UK's Education Media Centre model, pilot a number of Education Media Centres in different EU countries.
3. Communication differentiated for various media audiences.
4. An inventory of good practices in connecting evidence and the media.
5. Analysis of ‘media logic’ in the field of education policy and practice.
6. Two-sided interaction: the media as a source of information for researchers.
7. The opportunities and threats of social media.

Keynote speech presentations

The European Union's actions and policy to improve literacy performance in the EU

Daphne De Wit
School education: literacy; School Education and Comenius Unit, Directorate-General for Education and Culture, European Commission
 

EU Education Policy

Jan Pakulski
Directorate-General for Education and Culture, European Commission

 

Putting evidence into action in schools: Experiences from England

Robbie Coleman
Education Endowment Foundation (EEF)

Making children ready for school: A German example for the cooperation between research, policy, and practice
Marcus Hasselhorn
German Institute for International Educational Research (DIPF)

 

Educational research and evidence-based policy: Funding educational research by the Federal Ministry in Germany
Dr. Stephanie Schaerer
Project Management Agency (DLR-PT) for the Federal Ministry of Education and Research in Germany

When evidence confronts politics: competing rationalities in the “smart state”
Tracey Burns
Centre for Educational Research and Innovation, OECD 

  

Engagement across Europe: The EIPPEE Network