Edinburgh Open Research Conference 2022

The first edition of the Edinburgh Open Research Conference took place on 27 May 2022 in a hybrid format at the John McIntyre Conference Centre/online. The conference was organised by Edinburgh ReproducibiliTea, Edinburgh Open Research Initiative and the University of Edinburgh Information Services – Library Research Support. This event was funded by Information Services & the University of Edinburgh Student Experience Grant.

On this page you will find:

  • The conference programme
  • Links to published materials (including recordings, slides, posters etc.)
  • A glossary for commonly used open research-related terms
  • A list of introductory resources on open research

Conference Schedule

Recordings & other materials

Recordings and other materials from the conference can be found here, in a journal publication by the University.
We have also uploaded additional materials to our OSF page.

We will provide an overview of links and materials here as well:


| Title | Speaker | Link to recording | Link to slides |
| --- | --- | --- | --- |
| Open Access, Data Management and Emerging Challenges to International Research | Gavin McLachlan | Recording | Slides |
| Open Research, Research Culture and Research Integrity | Malcolm Macleod | Recording | Slides |
| Research Culture and Open Research | Jane Hillston | Recording | Slides |
| The Edinburgh Open Research Roadmap | Dominic Tate | Recording | Slides (Roadmap Jan 2022) |
| Diamond Open Access Publishing in the Library | Rebecca Wojturska | Recording | Slides |
| Scottish Universities Open Access Press: An Introduction | Dominique Walker | Recording | Slides |
| FAIRification as a Team Sport | Susanna-Assunta Sansone | Recording | Slides |
| Open research in the making: or a call for co-creators | Eugénia Rodrigues | Recording | Slides |
| Introducing a Framework for Open and Reproducible Research Training (FORRT) | Flavio Azevedo | | |
| How does open research impact student outcomes? A Big Team Science review and evidence synthesis | Madeleine Pownall | Recording | Slides |
| The Benefits of an Open-Science Approach in Student Research Projects | Emma MacKenzie & Felicity Anderson | Recording | Slides |
| Open Research in the Classroom: A call for participants | Emma Wilson | Recording | Slides |
| Research Integrity: the view from the Research Office | Alan Campbell | Recording | Slides |
| The Intersections between DORA, Open Research, and Equity | Stephen Curry | Recording | Slides |


| Title | Speaker | Link to slides/materials |
| --- | --- | --- |
| Open Research & Public Engagement | Ann Grand | OSF |
| Open Research Tools & Support at the University of Edinburgh | Research Support Services | OSF |
| Registered Reports | Niamh MacSweeney, Kelly Wolfe & Chris Chambers | OSF |
| What Does Open Research Mean for Artificial Intelligence? | Sarah Bennett, Benedetta Catanzariti & Auste Simkute | OSF |
| Open Research: Where to Start? | Kaitlyn Hair, with Q&A panel: Niamh MacSweeney & Sam Haynes | OSF |
| Overcoming Barriers to Embedding Open Research Practices in Teaching/Mentoring | Madeleine Pownall | OSF |
| Open Research Community Meeting: Your Thoughts on the Roadmap | Robin Rice & Ros Attenborough | OSF (feedback) |
| Data Version Control for Researchers | Stefano Coretta | GitHub |


| Title | Speaker | Link to recording | Link to PDF |
| --- | --- | --- | --- |
| Not going to waste: preserving Scotland’s COVID-19 wastewater data | Livia Scorza (1st place) | Recording | PDF |
| Building and sustaining a love of reading in children: The Love to Read Project | Emily Oxley (2nd place) | Recording | PDF |
| The impact of Augmented Reality (AR) books on the reading engagement and comprehension of child readers | Kawla Alhamad | Recording | PDF |
| Depression Detectives | Iona Beange & Sophia Collins | Recording | PDF |
| Open science in experimental autism research: a replication study of information transfer within and between autistic and non-autistic people | Catherine Crompton | Recording | PDF |
| Collaborative learning of new information in older age: a systematic review | Kelly Wolfe | Recording | PDF |
| Open research in the classroom: a call for participants | Emma Wilson | Recording | PDF |
| tidyqpcr | Sam Haynes | Recording | PDF |
| Effects of global discourse coherence on local contextual predictions | Georgia-Ann Carter | Recording | PDF |
| Two Initiatives Furthering Open Research | Rory Macneil | Recording | PDF |
| Research management in “Many Analysts” projects | Stefano Coretta | Recording | PDF |


Glossary

Here are a few terms commonly discussed in the context of open research, categorised by theme rather than alphabetically.
Please see the Framework for Open and Reproducible Research Training (FORRT) glossary for a more extensive list.

Open Science/Open Research/Open Scholarship
An umbrella term for a set of principles and practices related to academic transparency, reproducibility, accessibility and integrity.
These terms are often used interchangeably, yet there are some differences in how inclusive each term is.
“Open Science” is regarded as the least inclusive term because it implies that it is exclusive to the disciplines that are commonly considered “the sciences”.
“Open Research” is a more inclusive term as it refers to all open practices in research in all disciplines, not just “the sciences”.
“Open Scholarship” is considered the most inclusive term, as it extends to all disciplines and scholarly activities, including non-research activities such as teaching.
Open Access
Making scholarly outputs (e.g. published research papers, monographs, data) freely and openly accessible online, so that readers do not have to pay a subscription fee to read or access the work.
Green Open Access: the work is openly accessible from a public repository (e.g. PURE, a preprint server)
Gold Open Access: the work is immediately openly accessible upon publication via the journal website
Platinum/Diamond Open Access: a subset of Gold OA in which all works in the journal are immediately accessible upon publication from the journal website, without the authors needing to pay article processing charges (APCs)
Plan S
An initiative for moving towards full open-access scholarly publishing, launched in 2018 by “cOAlition S”, a consortium of national research agencies and funders from 12 European countries. Read more here
Research metrics
Quantitative measurements designed to evaluate research outputs and their impacts.
Journal Impact Factor (JIF)
A metric for journals. The JIF of a journal for a particular year is calculated as:
(total citations that year to articles the journal published during the previous two years) / (total citable articles the journal published during those two years)
This gives the average number of citations for recently published articles in that journal. Based on the premise that good-quality articles, published in respected and well-read journals with rigorous peer review processes, receive more citations, JIFs are often taken as a proxy for the quality of a journal and its papers. However, there are several important reasons why we should NOT rely on JIFs as a measure of research quality:

  • The JIF can be influenced in several ways and be misleading (e.g. a few highly cited articles can inflate the number significantly).
  • The two-year window in the calculation does not allow for fair comparison between disciplines (in some fields knowledge and citations are accrued more slowly).
  • The academic world has grown to rely so heavily on the impact factor that it has become a quick way to evaluate individual researchers (i.e. by the impact factors of the journals their work was published in) when deciding who gets a promotion or grant, rather than actually reading and properly evaluating their work.

Read more here and here
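The arithmetic behind the JIF is a simple ratio; a minimal sketch in Python, using entirely hypothetical numbers for an imaginary journal:

```python
def journal_impact_factor(citations_to_prev_two_years: int,
                          citable_items_prev_two_years: int) -> float:
    """Average citations per citable item published in the previous two years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Hypothetical journal: in 2022 its 2020-21 output (200 citable articles)
# received 600 citations, giving a 2022 JIF of 3.0.
jif = journal_impact_factor(600, 200)
```

Note how a handful of highly cited articles in the numerator can raise the average for every paper in the journal, which is one reason the JIF is a poor proxy for the quality of any individual article.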
H-Index
A metric for individual researchers, used as a proxy for how productive and influential a researcher is. It is the largest number h such that the researcher has published h papers that have each been cited at least h times (e.g. an H-index of 17 means the researcher has published at least 17 papers that have each been cited at least 17 times).
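The H-index can be computed by sorting a researcher's citation counts in descending order and finding the last position at which the count still matches the rank; a minimal sketch with made-up citation counts:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that at least h papers have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still has at least `rank` citations
        else:
            break
    return h

# A researcher whose papers were cited [10, 8, 5, 4, 3] times has an H-index of 4:
# four papers have at least 4 citations each, but not five papers with 5+.
print(h_index([10, 8, 5, 4, 3]))  # 4
```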
The Research Excellence Framework (REF)
The UK’s system for assessing the excellence of research in UK higher education providers. The REF outcomes are used to inform the allocation of around £2 billion per year of public funding for universities’ research. Read more here
Reproducibility
Re-performing the same analysis on the same data with the same methods, by a different analyst, to see whether the methods are reproducible and give the same result.
Replicability
Re-performing the experiment on a different data set, to see whether the study outcome can be replicated.
P-hacking
Also known as data dredging, data fishing, data snooping or data butchery: exploiting data analysis to find patterns that can be presented as statistically significant when in reality there is no underlying effect (i.e. trying different analysis methods until you get a significant result, and then reporting only that result).
HARKing
Hypothesizing After Results are Known: presenting a post hoc hypothesis (one based on interpreting the results) in a research report as if it were an a priori hypothesis (one determined before running the analysis, which the analysis would test).
Pre-print
A preprint is a full draft of a research paper that is shared publicly before it has been peer reviewed. Most preprints are given a digital object identifier (DOI) so they can be cited in other research papers.
Examples of preprint servers are bioRxiv and PsyArXiv.
Pre-registration
Specifying your research plan in advance of your study (particularly distinguishing your confirmatory and exploratory questions and strategies) and submitting it to a registry.
Read more here
Registered report
A publishing format in which you submit an introduction and methods section (and sometimes a pilot data analysis) for a proposed study, which then undergoes Stage 1 peer review. If the registered report receives in-principle acceptance, the study will be published regardless of its results and outcomes, provided you conduct the research as stated. You then conduct the study as described in your methods, write up the results and submit the full report, which goes through Stage 2 peer review before publication.
Read more here
FAIR practice
Particularly refers to data and project management practices and how to ensure these are Findable, Accessible, Interoperable and Reusable. Read more here
San Francisco Declaration on Research Assessment (DORA)
A set of recommendations to improve the ways in which the output of scientific research is evaluated by funding agencies, academic institutions, and other parties. Read more here
League of European Research Universities (LERU)
An established network of 23 research-intensive universities in Europe, which develops and disseminates views on research, innovation and higher education, helping to shape policy at the EU level. One of its publications is a roadmap template for universities containing 41 recommendations on open science. In January 2022, the Library published the Edinburgh Open Research Roadmap, which assesses the progress made at the University of Edinburgh and details steps forward.
Citizen science
The practice of public participation and collaboration in scientific research to increase scientific knowledge, for example through co-production of research or consultancy.
Public engagement
The many ways in which the activity and benefits of higher education and research can be shared with the public for mutual benefit. This includes citizen science projects, but also activities such as public lectures, science festivals, exhibitions, and communication through publications, radio and TV.
Open Research Culture
A research culture/environment in which principles of transparency, openness and reproducibility are considered the norm and valued as important features of research, and are reflected in policies and incentive structures. Read more here
Equity
Equity (as opposed to “equality”) recognizes that each person has different circumstances, and allocates the exact resources and opportunities needed to reach an equal outcome.
The Edinburgh Open Research Roadmap
A self-assessment of the University of Edinburgh’s readiness for Open Research, based on the 37 criteria set out in the LERU Open Science Roadmap. The assessment was carried out by the Library Research Support Team. The Roadmap will be presented at the conference, and one of the workshops in round 2 will be a feedback meeting on this document. You can find the document here.

Introductory Resources


  • Farran, E., Silverstein, P., Ameen, A., Misheva, I., & Gilmore, C. (2020). Open Research: Examples of good practice, and resources across disciplines. https://doi.org/10.31219/osf.io/3r8hb

YouTube channels:

Introductory videos:

Resources @ University of Edinburgh:



Other resources