ReproducibiliTea Blog

Easing Into Open Science 17/09/2021 with Dr Priya Silverstein

written by Laura Klinkhamer (co-organiser of Edinburgh ReproducibiliTea)

Dr. Priya Silverstein

In this session we took a look at the following paper:
Ummul-Kiram Kathawalla, Priya Silverstein, Moin Syed; Easing Into Open Science: A Guide for Graduate Students and Their Advisors. Collabra: Psychology 4 January 2021; 7 (1): 18684. doi:

and we were joined by one of the authors, Dr Priya Silverstein for a live Q&A.

The paper is a great place to start for people who are new to open research concepts and provides a very useful summary and guide for some practices you could consider applying to your research. It introduces open science (now often referred to as open research, to include the academic disciplines that would not describe themselves as a “science”), as a “broad term that refers to a variety of principles and behaviors pertaining to transparency, credibility, reproducibility, and accessibility” (Kathawalla et al., 2021, p. 2). The paper is written specifically for the types of situations that graduate students are more likely to encounter, but the practices described are broadly applicable to researchers of any career stage.

Paper Summary

Eight Open Research (OR) practices are outlined in this guide, classified according to the authors' perception of how difficult each is to implement.

Practice 1 is to set up or join an open research (journal) club, such as with the ReproducibiliTea organisation. This can be a quick and efficient way of getting to grips with some key concepts of the reproducibility and OR movement, while meeting new people along the way and growing your network.
For researchers in Edinburgh – we encourage you to join the Edinburgh Open Research Initiative Teams group, which serves as a hub for bringing people interested in OR together. The University of Edinburgh now also has an Open Research Blog and newsletter that you can sign up for here.

Practice 2 refers to thinking about your project workflow, in particular setting up your file organisation, data access regulations and keeping clear records so that (future) you and others can quickly get an overview of your project and are able to reproduce the outcomes. For more information and tips on how to work reproducibly, we refer you to Kaitlyn Hair’s talk (Edinburgh ReproducibiliTea session Nov 2020) on selfish reasons to work reproducibly and Ralitsa Madsen’s talk on RSpace, a platform that you could consider to set up a project workflow in addition to the freely accessible Open Science Framework.
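The paper doesn't prescribe a particular layout, but as a purely illustrative sketch (all folder and file names here are hypothetical), a project folder might be organised like this:

```
my-project/
├── README.txt       # what each file/folder is and how it fits the project
├── data/
│   ├── raw/         # untouched original data (treat as read-only)
│   └── clean/       # processed data actually used in the analyses
├── code/            # annotated analysis scripts
└── outputs/         # figures and tables generated by the code
```

Keeping raw data separate and read-only means everything downstream can always be regenerated from the originals.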

Practice 3 is about preprints: publishing your manuscript before or during peer review. First check what the policy on preprints is at the journal where you intend to publish, either by messaging them or by checking this source suggested by Priya. Preprints are a way to bring your research out to the world even if publication is delayed or the manuscript is rejected, and they can increase the number of times your work is cited. There are free servers that you can upload your manuscript to as a preprint, such as bioRxiv for biology.

Practice 4 refers to creating reproducible code/analyses. It is very helpful for your project workflow and reproducibility to write your code/analysis plans in such a way that it is clear beyond a doubt to others, and to your future self, what you did. Annotating your steps and writing README files are very useful practices. A README is a basic text file describing, for instance, what files are in your project space/folder and what role they play in your project (e.g. data_spreadsheet_version3 contains the clean data on x participants that is used for Analysis B).
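As a minimal sketch of what such self-documenting analysis code might look like (the file name, variable names and "Analysis B" label are hypothetical illustrations, not taken from the paper):

```python
# analysis_b.py : summary statistics for (hypothetical) Analysis B.
# Reproducibility notes:
#   - input: the clean data described in the project README
#     (e.g. data_spreadsheet_version3); inlined below for illustration
#   - a fixed seed makes any simulated/resampled steps repeatable
import random
import statistics

RANDOM_SEED = 2021
random.seed(RANDOM_SEED)

def summarise_scores(scores):
    """Return n, mean and sample standard deviation for a list of scores."""
    return {
        "n": len(scores),
        "mean": statistics.mean(scores),
        "sd": statistics.stdev(scores),
    }

if __name__ == "__main__":
    # Illustrative data; a real script would load the clean data file here.
    scores = [12, 15, 11, 14, 13]
    print(summarise_scores(scores))
```

The point is not the statistics themselves but that a reader (including future you) can tell from the comments alone which data file the script expects and which analysis it belongs to.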

Practice 5 is sharing data. This is very useful to the scientific community, and there are many platforms to which you could upload your project's anonymised data set (e.g. the OSF again). However, it is very important to make sure you are legally allowed to share the data. This will depend on your local and wider data regulation frameworks (e.g. in the EU, the GDPR applies) as well as on exactly what was stated in the consent forms (if applicable to your project, of course).
There are also options to upload only part of your data set, or to set up a system through which others can access more sensitive data. Talk to your supervisors/collaborators and check University research support services to see what would be most suitable in your case (for instance, see this resource for the School of PPLS).

Practice 6 is being very open in your manuscript writing. In a way, it's fascinating how the norm in manuscript writing is to present the research story as an almost perfect execution of a plan with a happy ending (i.e. significant results), whereas in reality researchers often struggle with all kinds of issues and end up with a manuscript that is only loosely connected to the original research idea. This is not very realistic, and actually harmful to scientific integrity. So if we allow ourselves to be human, to make mistakes, and to let others read about and learn from those mistakes, wouldn't that make life easier?

Practices 7 & 8 are related.
Pre-registration: a time-stamped, read-only version of your research plan created before you begin data collection/analysis.
Registered report: similar to a pre-registration, but your research plan undergoes peer review before the results are known. There is a helpful resource on the Center for Open Science website here.
Both practices are very useful ways of making you sit down and plan your research before executing it. In the case of registered reports, you will also receive feedback before executing it, which may be much more useful than receiving feedback after the fact, as in the regular peer-review system. Although you state what you intend to research in a pre-registration or registered report, it is important to realise that you are not signing a binding contract. If it turns out that another method or additional exploratory analyses would be interesting for your research question, you are of course able to make changes. It is, however, your responsibility to transparently report and justify these changes.
As Niamh summarised: these practices do not stifle creativity, but create accountability.

It is important to realise that engaging in OR practices is not an all-or-nothing approach. It's much more about adopting a critical mindset and taking (small) steps that are suitable for you and your specific project.


During the discussion, Priya mentioned that if the paper were written this year she would probably include the same practices, but elaborate on the increased number of ways in which they could be applied. For example, one thing that has changed in recent years is that registered reports have become available for projects involving secondary data analysis (rather than only for projects where the data is still to be collected).

Another interesting development is Peer Community In Registered Reports, which facilitates scheduled peer review. You indicate beforehand when you intend to hand in Stage 1 of your registered report (Introduction & Methods), and the community tries to arrange reviewers for that particular time frame, meaning that the peer review process can be completed much more rapidly. Priya mentioned that one of the main criticisms of registered reports in the past was that it was unclear when researchers could start their analysis/data collection, because of uncertainty about the duration of peer review. This new facility makes registered reports an even more attractive option.

Will Cawthorn added that Review Commons is another place you can send your manuscript to for general peer review. If the manuscript then passes review, you can choose from a list of journals where to publish your paper. This approach decreases redundancy in peer review (i.e. if rejected from one journal, you don’t go through the roulette wheel of another round of completely new peer review).
Priya confirmed that this procedure is also in place for PCI registered reports. From the website: “Following the completion of peer review, authors of RRs that are positively recommended have the option to publish their articles in the growing list of PCI RR-friendly journals that have committed to accepting PCI RR recommendations without further peer review.”

We also had a group discussion about how we could further promote OR in the University. One of the suggested routes was to include more OR practices in undergraduate and postgraduate course curricula. Will Cawthorn also referred to an OR roadmap for the University of Edinburgh that he and several others (including many from the Library and Research Support Services) are working on, which is due to be published soon. Priya emphasised that it is important to create momentum through both bottom-up and top-down initiatives at the same time to bring about real change in research culture. This nicely connected to our next session on Friday 15 October, which will be on how to build an open research culture in your lab/research group, by Dr Will Cawthorn (LERU Open Science Ambassador for the University of Edinburgh).

Then I said something silly about how we should all jump aboard the Open Research train and Priya kindly replied with a “choo choo!” making me feel slightly less embarrassed.

All things considered, we look back at a successful first session of this academic year!

The presentation slides and meeting recording can be found on our OSF page. The meeting recordings can also be found on our YouTube channel.

If you’d like to stay up to date with Edinburgh ReproducibiliTea, please consider joining our mailing list by filling in this form and/or following us on Twitter.

For any questions/suggestions, please send us an email:

EORI Bulletin

02/08/2021 – 5-minute update

EORI keeps an eye on changes in the fields of Open Science, FAIR data principles, and others, and directs any interested parties to important updates:

  • Of significance, the European Research Council has banned grant applicants from including journal metrics when applying for grants (here). This is a welcome step forward for research being considered based on quality rather than where it’s been published. Hopefully we’ll see more of this from other funders.
  • Clinical trial data from many FDA-approved drugs are still not available (here); for 3/4 of products, one or more relevant trials are unavailable for independent inspection. Further, around 1/4 of these breached legal requirements. There are suggestions that introducing European-style legislation could help remedy this (here) and make the relevant data accessible. 
  • This systematic review and meta-analysis examines attempts to improve the peer review process in biomedical research. It neatly synthesises the results of various interventions, and may give us an idea of how the peer review process could change for the better in the future. 
EORI Bulletin

21/07/2021 – 5-minute update

EORI keeps an eye on changes in the fields of Open Science, FAIR data principles, and others, and directs any interested parties to important updates:

  • Here is a well-laid-out resource for finding information on almost all aspects of Open Science. It's useful since it links to primers such as this one, which explains Open Peer Review in an easy-to-understand way, or this one on applying the FAIR principles. 
  • A call to action we can all get behind! This article argues that we shouldn't review submissions for journals which effectively profit from restricting access to knowledge – which they've not paid to generate – behind a paywall. I personally refuse to review for any journal which isn't acceptably open access, and I encourage everyone else to do the same! The argument resurfaces as a new route to Open Access, Quartz Open Access, is announced. Lots to consider.
  • The Confederation of Open Access Repositories (COAR) has launched a strategy to modernise repositories (here). Plans will be developed July-Sept 2021, and will help repositories to maximise the roles they can play. It comes as another tool has been announced (here) for assessing the alignment of biomedical data repositories with open, FAIR, citation and trustworthiness principles. Additionally, this work identifies some of the barriers to data sharing through repositories and other platforms. 

The best way to get more updates is to follow EORI on Twitter.

EORI Bulletin

05/07/2021 – 5-minute update

After a brief respite, EORI is back!

EORI keeps an eye on changes in the fields of Open Science, FAIR data principles, and others, and directs any interested parties to important updates:

  • As a nice punchy start, if you're interested in a guide to Open Science, two handbooks (here and here) contain a lot of neatly presented information about how to get cracking with Open Science and Open Research practices. Considering that practising Open Research can be a gateway to the 'leadership table' for ECRs, the guides are great resources not only for practising Open Science but for moving up the ladder because of it.
  • We’ve previously mentioned the citation advantage on Open Access. Adding to this is this study which found an increase in news media mentions of Open Access. Overall, some studies report an advantage, others report no advantage, and there’s some suggestion that it might be field dependent. Thankfully, a group as conducted a systemic review to try and explore this (here). Though not conclusive, they bring together many of the studies conducted for us all to see. 
  • In case you needed any further reasons, here’s an argument that Open research can help in the fight against climate change!

As we were off for a week, here are some quickfire mentions:

  • This work touches on some rewards for supporting Open research practices.
  • This work argues that empowering ECRs is one key to improving research quality. 
  • Finally, this work uses game theory to reason that publishers will converge on an Open Access publishing strategy: good news for Open Science advocates everywhere!
EORI Bulletin

07/06/2021 – 5-minute update

EORI keeps an eye on changes in the fields of Open Science, FAIR data principles, and others, and directs any interested parties to important updates:

  • Worryingly, but somewhat predictably, nonreplicable publications are cited more than replicable ones (here). Worse, the difference in citation rates does not change after the failure to replicate is published. If you're using R, there's thankfully a recently developed tool/package, Easyreporting, to help with reproducibility in code, but if you're not then we need to find other ways. It's worrying that knowledge known to be false continues to be cited and spread, like science's equivalent of fake news. This comes as others have made a suggestion about citation culture: the right to refuse citations. It's discussed as a potential reaction to being cited by predatory journals or by papers with questionable ethics/methods etc., and they make some interesting points. 
  • Following the news, mentioned in the last update, that Clarivate Analytics had bought ProQuest, there's pushback and concern from the community (here). The drive towards monopolistic control of these systems and data is spurring calls for regulation and oversight. Considering Times Higher Education's recent call for academics to become involved in the Open Access struggle, this could be a good place to start. It also comes as SAGE journals have announced that they're offering Open Peer Review using Clarivate's Web of Science portal (here), which is simultaneously a great initiative (of which EORI thoroughly approves) and a monopoly-building action. 
  • Here’s a nice explainer behind preprints, and there’s an interesting new course dedicated to them (here). 2/3 of preprints go on to be published in journals (here), which could be suggestive of the amount of knowledge which never sees publication, or possibly of the issues which arise in 1/3 of work. Either way, accessing this data can be of great use. 
EORI Bulletin

24/05/2021 – 5-minute update

EORI keeps an eye on changes in the fields of Open Science, FAIR data principles, and others, and directs any interested parties to important updates:

  • In a previous update, we mentioned Sci-Hub, a website which gives access to research articles by cycling through IP addresses until it finds one which is permitted, and the FBI's attempts to shut it down by accessing the founder's data and Apple account. Thankfully, fans of the site are mobilising via Reddit to save the platform by backing up its combined 77 TB of data, a rather large task. Even though Sci-Hub's website is still online, it's been unable to add any more papers since this latest attack began, meaning that previous manuscripts can still be accessed with it (for now) but newer manuscripts cannot. Vice covers this here. Alexandra Elbakyan, the website's founder, reasons that corporations are gatekeeping knowledge for profit, and that the public are the 'real' owners of that information. More developments will surely come! Also, if you want to get around the block that the UK's internet suppliers have put on Sci-Hub, there's a guide here.
  • Clarivate have bought ProQuest for the hefty sum of $5.3 billion (here), adding to their ever-expanding portfolio of acquired library-services companies. Reaction to this could be generously described as mixed. It comes as they've introduced their new research metric, the Journal Citation Indicator (JCI), which aims to normalise citation and publication rates across research fields into a single journal-level metric. However, following the rise of the Declaration on Research Assessment (DORA), which effectively states that assessing research with single-value metrics is inappropriate and that research should instead be judged on its own merits, the scientific community is moving away from simplistic reductions of research to arbitrary numbers. Other well-intentioned single-value metrics developed for this purpose, such as the H-index, are also inappropriate. At best, the JCI can be described as well-intentioned, but considering it's a black-boxed calculation with the potential to cause many more problems than it solves, it's a wonder why anyone spent time developing it. Unless, of course, its intention is to financially benefit the company rather than the scientific community; but given the narrative framing of its introduction by Clarivate (here), could that possibly be the case? I'll leave you to decide.
  • After some long entries, here’s a short one: Dockstore is an open-source platform for publishing, sharing, and finding bioinformatics tools and workflows. More info here.
EORI Bulletin

10/05/2021 – 5-minute update

EORI keeps an eye on changes in the fields of Open Science, FAIR data principles, and others, and directs any interested parties to important updates:

  • Many people will know of Sci-Hub, a website which gives access to research articles by cycling through IP addresses until it finds one which is permitted. The founder, Alexandra Elbakyan, is being pursued by the FBI as the American legal system tries to have the site shut down (here). Amusingly, Sci-Hub may be beneficial to research: Indian research which is available on Sci-Hub garners more citations than work which is not (here). Considering this, and the morality of Open Access (here), researchers should be in favour of the website, yet journals and publishers are not. As a website that gives access to (usually publicly funded) research, it seems to occupy a grey zone of being morally right but legally wrong. Worth keeping an eye on developments.
  • There’s an interesting question in the debate about predatory journals and conferences. Even if the journals/ conferences are predatory, the science held within the journals may still be perfectly sound, and the conferences may still yield genuine networking opportunities. Additionally, some publish in them as their articles may not be accepted elsewhere, meaning that without predatory journals/ conference that work & data may not be reported for some time, if at all. The question is, considering these (and cost aside), are predatory journals & conferences actually a bad thing? Influencing the answer, one group found that research in predatory journals are less statistically sound and data presentation worse (here). This finding runs against potential ‘benefits’ from predatory publishers. 
  • Nice and quickly, Dr Rhodri Leng presented to the Riot Science Club recently about citation biases (here). Well worth watching!
EORI Bulletin


EORI keeps an eye on changes in the fields of Open Research, FAIR data principles, and others, and directs any interested parties to important updates:

  • This piece argues convincingly for Open Peer Review, where reviewer reports and interactions between the parties involved in the peer review process are published alongside the primary research output. It does, however, miss some benefits of Open Peer Review, such as providing a window for less experienced scientists to learn from and about the peer review process (important, as less-established scientists are among the best peer reviewers) and combating biases in reviewing (here and here). 
  • This resource has some really straightforward Open Access tools to use. Good to play with!
  • There are some interesting upcoming talks/symposia on various aspects of Open Research, including Open Repositories, Peer Review and Pre-prints, Open Scholarship week events, critically analysing scientific reform, etc. There are frequent talks on almost all areas of open science, targeted at all levels from the uninitiated to seasoned Open Research scholars. Additionally, many talks end up on YouTube, allowing anyone to access them at any time (e.g. Edinburgh Reproducibility's account). If you're interested in any area of Open Science, just search for it!
ReproducibiliTea Blog

A selfish guide to RSpace: Why and How?

In this session, Postdoctoral Research Fellow Dr Ralitsa Madsen covers why using an Electronic Lab Notebook (ELN) is a great idea. Dr Madsen suggests that using an ELN for a reproducible workflow brings many rewards, including:

  • Saving a lot of time when reading, searching documents and so on.
  • If you are a postgraduate or graduate student, it will be more convenient when retrieving the details you need for the materials and methods section of your research.
  • An ELN makes it easier to collaborate not only within the group but also outside it.
  • Lab members can pick up where you left off, ensuring the continuity of the research.
  • It is much safer to rely on an ELN than on your hard drive. Your documents will be accessible even if your computer gets damaged or stolen.
  • It is necessary to have extensive documentation, version control and traceability of your work if you would like to make a patent application.
  • In addition, RSpace is well integrated with many other services like Mendeley, Microsoft Office, Dropbox and Google Drive, as well as repositories like GitHub.

After naming several great reasons, Dr. Madsen goes on to do a walk-through of the tool and gives useful tips to facilitate RSpace adoption within the lab: 

First, you should think FAIR: are your documents easily findable? Are they accessible to researchers inside and outside of the lab? Are they interoperable? Can others read through the lab book and reuse your protocol for their experiment?

But for this to work, says Dr Madsen, you also need to create:

  • A lab book entry template, which will ensure consistency and make it easier to collaborate,
  • Notebook-based project organisation,
  • Data storage rules that encourage the use of external repositories, and
  • Consistent file naming rules.

Do not forget to check out Dr Ralitsa Madsen's RSpace demonstration on the Edinburgh Reproducibility YouTube channel if you haven't already!

This blog post was written by Bengü Kalo.

Find more information about RSpace here.

Edinburgh RT YouTube Channel

Edinburgh RT OSF page

Edinburgh RT Mailing List

EORI Bulletin


EORI keeps an eye on changes in the fields of Open Science, FAIR data principles, and others, and directs any interested parties to important updates:

  • Wikipedia has launched a new project, Wikiexperiments, which aims to collect and upload more videos of scientific experiments being conducted. For openness this is a great idea, but there is also the potential for Wikipedia to extend this and act as a sort of open repository for training videos etc. in the future. If that happens, it could be a great thing for openness and reproducibility. Let's keep an eye on it!
  • The International Science Council has established the steering group for the next phase of its project addressing scientific publishing (here). The group will work on enabling efficient dissemination and use of scientific work as part of Open Science, and it'll be interesting to see what they come out with. Some previous work and recommendations of the group are here.
  • Yet more work shows that publishing Open Access increases citations and altmetric numbers, this time in electrophysiology. Moreover, journals converting to Open Access see increased citations, which benefits the journals (here). These findings, combined with ethical reasons for scientists not to review for commercial journals (explored here), leave little justification for journals not to convert to Open Access publishing, though we may be biased on this conclusion…