HONG KONG PRINCIPLES

Best Practices

Below are some best practices for applying the Hong Kong Principles:


Ottawa Hospital Research Institute

Which HKP does the best practice concern?

Reward the practice of open science

What is the best practice about?

Requiring research institute researchers and staff to include a data sharing statement along with every journal submission or research report

How has the best practice been implemented?

We developed an institutional grassroots open science committee to help gauge the views of the institute’s faculty and staff about sharing data, analytical code, and other information along with each journal submission or formal report. We also developed a comprehensive 10-module online data sharing educational program. The organization’s scientific director is preparing to send a formal email to all faculty and staff recommending that they take the course, which includes a certificate that could be included in their promotion dossier. We are currently benchmarking current (2019) data sharing practices. The program will be implemented in 2021, and we will provide a more complete report documenting data sharing implementation at that time.

Where can I learn more on the best practice?

http://www.ohri.ca/journalology/data-and-materials-sharing



Amsterdam University of Applied Sciences

Which HKP does the best practice concern?

Assess responsible research practices
Value complete reporting

What is the best practice about?

Within the research community at the ‘Urban Vitality’ Center of Expertise (UV-CoE), the Open Science Support Desk (OSSD) implemented a 14-item Open Science checklist. The checklist was inspired by, and where possible its items are explicitly linked to, the REWARD research waste pillars and the articles of the Netherlands Code of Conduct for Research Integrity (2018).

Our direct goal is to monitor our researchers’ adherence to the 14 practices. By measuring who experiences hurdles, and where, we expect to be able to tailor our future OSSD support services and educational activities and thereby improve adherence.

How has the best practice been implemented?

With the support of the UV-CoE’s Board, the checklist is embedded in other OSSD activities, such as a consultation service by data stewards, quantitative and qualitative methodologists, and a web-based Open Science research manual. Inspired by initiatives at, for example, the Berlin Institute of Health, we strive to use our researchers’ Open Science accomplishments in decisions around hiring and career development. We are currently running a pilot assessment of adherence to the 14 principles on a sample of UV-CoE output, and we are liaising with the research group of Dr. Tracey Weissgerber (Berlin) to explore automation of some of these assessments.

Where can I learn more on the best practice?

https://uvaauas.figshare.com/articles/online_resource/Open_science_checklist_Urban_Vitality/12213467

https://www.amsterdamuas.com/uv-openscience/

An evaluation report can be found here: https://doi.org/10.21943/auas.13557530.v1



British Neuroscience Association

Which HKP does the best practice concern?

Assess responsible research practices
Reward the practice of open science
Value complete reporting
Acknowledge a broad range of research activities
Recognise essential other tasks like peer review and mentoring

What is the best practice about?

Equipping neuroscience researchers with the tools to practice open science and other credible research practices: by promoting the concept of preregistration posters; by supporting reproducibility, replicability, and reliability in science through its own open access journal (Brain and Neuroscience Advances); by placing explicit value on transparency and reproducibility of research, and focusing on quality rather than quantity in research; and by recognising efforts to make research as reproducible, replicable, robust, and reliable as possible through the BNA Credibility Prize.

How has the best practice been implemented?

Principle 1: Assess researchers on responsible practices from conception to delivery, including the development of the research idea, research design, methodology, execution, and effective dissemination

As part of the BNA’s work to improve credibility in neuroscience, we have promoted the concept of preregistration posters to the neuroscience community. Prereg posters are a new format that supplements the Center for Open Science’s preregistration service, offering a useful additional step that researchers can take prior to (or alongside) placing a research plan in a registry, helping to counter publication bias and non-reproducibility and to strengthen the credibility of their work.

The BNA2019 Festival of Neuroscience was the first large conference to support prereg posters in significant numbers, with nearly a fifth of all presented posters (100/491) in this new format, covering a diverse range of neuroscience topics and disciplines. We have highlighted the importance of providing clear information to both submitters and reviewers on what prereg posters should contain, and how they differ from standard conference posters: https://bnacredibility.org.uk/preregposters

We have also developed a prereg poster badge to help promote use of this poster format:

https://bnacredibility.org.uk/prereg-poster-badge

Principle 2: Value the accurate and transparent reporting of all research, regardless of the results

Brain and Neuroscience Advances is the BNA’s society-owned journal. The journal was launched in 2017 and is at the forefront of fully open-access publishing, playing a major role in the BNA’s Credibility in Neuroscience campaign by being fully open access, publishing null results and Registered Reports, and using CRediT and Transparency and Openness Promotion (TOP) badges, all features which help the publishing process support reproducibility, replicability, and reliability in science.

Principle 3: Value the practices of open science (open research)—such as open methods, materials, and data

Part of the BNA’s Credibility in Neuroscience campaign focuses on equipping neuroscience researchers with the tools to practice open science and other credible practices. We provide guidance through our toolkits on a simple set of actions that researchers working in different areas of neuroscience can take to increase credibility, including in vitro and in vivo neuroscience: https://bnacredibility.org.uk/toolkits. We also recognise that, for many neuroscientists, it can be daunting to dive into open science if they are new to the area, so we encourage individuals to take single steps to begin that journey; see, for example, the advice we provide to academics at different career stages on what they can do: https://bnacredibility.org.uk/academia

Principle 4: Value a broad range of research and scholarship, such as replication, innovation, translation, synthesis, and meta-research

A key aim of the BNA’s Credibility in Neuroscience campaign is to change the landscape in which neuroscientists operate so that the influences which drive research also drive the most credible research. We believe the Research Excellence Framework (REF) needs to be reformed to ensure it places explicit value on transparency and reproducibility of research, focussing on quality rather than quantity, and we will be making the case for this in 2021.

Principle 5: Value a range of other contributions to responsible research and scholarly activity, such as peer review for grants and publications, mentoring, outreach, and knowledge exchange

In 2021 we launched the BNA Credibility Prize to recognise efforts to make research as reproducible, replicable, robust, and reliable as possible. Within the prize guidelines, we have specifically highlighted that such contributions can take the form of a range of activities, including “Teaching/mentoring or other activities aimed at improving research culture within neuroscience that strengthen credibility” (https://bnacredibility.org.uk/prizes).

We have also changed the criteria for submitting a session proposal to our biennial Festival of Neuroscience: instead of providing a key publication for each speaker, submitters should provide “One recent publication or other research output”, as a step towards acknowledging outputs other than publications alone.

Where can I learn more on the best practice?

https://bnacredibility.org.uk/
