TheHongKongPrinciplesforAssessingResearchers:1
FosteringResearchIntegrity2
3
DavidMoher1,2,LexBouter3,4,SabineKleinert5,PaulGlasziou6,MaiHarSham74
VirginiaBarbour8,Anne‐MarieCoriat9,NicoleFoeger10,UlrichDirnagl11
5
6
1CentreforJournalology,ClinicalEpidemiologyProgram,OttawaHospitalResearchInstitute;2Schoolof7
EpidemiologyandPublicHealth,UniversityofOttawa,Ottawa,Canada;3DepartmentofEpidemiology8
andBiostatistics,AmsterdamUniversityMedicalCenters,locationVUmc;4DepartmentofPhilosophy,9
FacultyofHumanities,VrijeUniversiteit,Amsterdam,TheNetherlands;5TheLancet,LondonWallOffice,10
London,UK;6InstituteforEvidence‐Basedhealthcare,BondUniversity,GoldCoast,Qld,Australia;and11
7SchoolofBiomedicalSciences,LKSFacultyofMedicine,TheUniversityofHongKong,Pokfulam,Hong12
KongSAR,China;8QueenslandUniversityofTechnology(QUT),Brisbane,Australia;9WellcomeTrust,13
London;10AustrianAgencyforResearchIntegrity,Vienna,Austria;11BerlinInstituteofHealth,QUEST14
CenterforTransformingBiomedicalResearch,Berlin,Germany15
16
DavidMoher:ORCID0000‐0003‐2434‐420617
LexBouter:ORCID0000‐0002‐2659‐548218
SabineKleinert:ORCID0000‐0001‐7826‐118819
PaulGlasziou:ORCID0000‐0001‐7564‐073X20
MaiHarSham:ORCID0000‐0003‐1179‐783921
VirginiaBarbour:ORCID:0000‐0002‐2358‐244022
Anne‐MarieCoriat:ORCID0000‐0003‐2632‐174523
NicoleFoeger:ORCID0000‐0001‐7775‐732524
UlrichDirnagl:ORCID0000‐0003‐0755‐611925
26
27
13thSeptember2019 28
Abstract29
30
Theprimarygoalofresearchistoadvanceknowledge.Forthatknowledgetobenefitresearchand31
society,itmustbetrustworthy.Trustworthyresearchisrobust,rigorousandtransparentatallstagesof32
design,executionandreporting.InitiativessuchastheSanFranciscoDeclarationonResearch33
Assessment(DORA)andtheLeidenManifestohaveledthewaybringingmuchneededglobalattention34
totheimportanceoftakingaconsidered,transparentandbroadapproachtoassessingresearchquality.35
Sincepublicationin2012theDORAprincipleshavebeensigneduptobyover1500organizationsand36
nearly15,000individuals.Despitethissignificantprogress,assessmentofresearchersstillrarelyincludes37
considerationsrelatedtotrustworthiness,rigorandtransparency.WehavedevelopedtheHongKong38
Principles(HKP)aspartofthe6thWorldConferenceonResearchIntegritywithaspecificfocusonthe39
needtodriveresearchimprovementthroughensuringthatresearchersareexplicitlyrecognizedand40
rewarded(i.e.,theircareersareadvanced)forbehaviorthatleadstotrustworthyresearch.TheHKP41
havebeendevelopedwiththeideathattheirimplementationcouldassistinhowresearchersare42
assessedforcareeradvancementwithaviewtostrengthenresearchintegrity.Wepresentfive43
principles:responsibleresearchpractices;transparentreporting;openscience(openresearch);valuing44
adiversityoftypesofresearch;andrecognizingallcontributionstoresearchandscholarlyactivity.For45
eachprincipleweprovidearationaleforitsinclusionandprovideexampleswheretheseprinciplesare46
alreadybeingadopted.47
Introduction48
Inaquesttoadvanceknowledge,researcherspublishapproximately1.5millionjournalarticleseach49
year.Thepresumptionisthatthisliteraturecanbeusedbyotherresearchers,stakeholders,andthe50
widersocietybecauseitistrusted,robust,rigorousandcomplete.51
52
Theapproachtakentovalidatingresearchanditsoutcomesdiffersdependingonthenatureofthe53
research.Forexample,torigorouslyexaminetheeffectsofahealthintervention,trialparticipants54
(humanoranimal)aretypicallyrequiredtoberandomizedtotheinterventionbeingstudied.Many55
researchersadvocateregistrationofprotocolsasawaytoensuretransparencyandtoenableothersto56
engagewiththeirresearch.Subsequently,theuseofreportingguidelinescanhelpensurecompleteand57
transparentreportingoftheresearchers’methodsandresults.Whentheresearchisbeing58
disseminated,theresearchteamwouldensurethattheassociateddata,materialsandanyanalytical59
codearemadeavailableasanintegralpartofpublication.Suchdatasharingfacilitatesre‐analysisofthe60
datatocheckreproducibilityandtoperformsecondaryanalyses.61
62
Althoughsomemechanismsexisttosupportresearchersinensuringtransparencyatallstagesofdesign,63
executionandreporting,thereisnotwidespreadadoptionofthesepractices.Therearemany64
interwovenreasonsforthis.Onecontributingfactor,weargue,isthatlittleemphasisisplacedonthe65
rigorofresearchwhenhiring,reviewingandpromotingresearchers.Workingtogetheracrossthe66
researchsectorasawholetoaddressthissystemicissue,webelieve,offersaglobalopportunityto67
improveresearchandimpact.68
69
WedevelopedtheHongKongPrinciples(HKP)aspartofthe6thWorldConferenceonResearchIntegrity70
(WCRI)specificallytodrivegreaterrecognitionforresearcherswhocommittorobust,rigorousand71
transparentpractices(i.e.,theircareersareadvanced)(seeFigure).Ifimplemented,theHKPcouldplaya72
criticalroleinevidence‐basedassessmentsofresearchersandputresearchrigorattheheartof73
assessment,aswellasopenupresearchtothewiderbenefitofsociety.74
75
Weproposefiveprinciples,eachwitharationaleforitsinclusion.Weillustratetheseprincipleswith76
exampleswhereweknowtheyexist.Theseexamplesarenotexhaustive,andmanyarerelevanttomore77
thanoneprinciple.Together,theyillustrateofabreadthofapproachesastohowtheseprinciplescan78
operateattheveryhighestlevelsofinternationalresearch.79
80
EarlydraftsoftheHKPwerecirculatedtothe700participantsregisteredforthe6thWCRI.Further81
discussionstookplaceduringtwosessionsatthe6thWCRI.Apenultimateversionwasuploadedonthe82
6thWCRIwebsiteaftertheconference.Morethan100peopleprovidedinputandfeedback.We83
acknowledgeallofthesevaluablecontributionsandthegloballeadershipofthoseworkingontheSan84
FranciscoDeclarationonResearchAssessment(DORA),theLeidenManifestoandotherinitiativesto85
promotetheresponsibleuseofmetrics,whichhavelaidthefoundationsformuchofourwork(1‐4).86
87
Principles88
Principle1:Assessresearchersonresponsiblepracticesfromconceptiontodelivery,includingthe89
developmentoftheresearchidea,researchdesign,methodology,executionandeffective90
dissemination.91
92
Rationale93
Thenumbersofpublications,citations,andtotalvolumeofgrantsareoftenstillthedominantmetrics94
usedbyresearchinstitutionsforassessingandrewardingtheirresearchers(1‐4).Providingbonusesto95
academicsforpublishingincertainjournals(i.e.,meritpay)isalsocommoninmanypartsoftheworld96
(5‐7).Theseassessmentcriteriatellassessorslittleabouttheresearchersandtherigoroftheirwork;97
thustheyarenotparticularly‘responsible’metrics.Thesemetricscanalsobeundulyinfluencedbyfield98
andcitationpracticesandprovidelittleinformationaboutapublication’s(andthereforearesearcher’s)99
contributionstoresearchandsociety.Othercriteriaarerequiredtoprovideabroaderviewofmarkers100
ofbestpractices:forexample,theextenttowhicharesearcherdevelopsresearchquestionswiththe101
involvementofappropriatemembersofthepublic(seeFigure).102
103
Currentimplementation104
TheCanadianInstitutesofHealthResearch’sStrategyforPatient‐OrientedResearch(SPOR)isamulti‐105
million‐dollarinitiativetobringpatientsintoabroadrangeofactivitiesregardingresearchacross106
Canadianprovincesandterritories(8).Patientsarenowactiveinthedevelopmentofresearchprojects107
insettingprioritiesandformulatingstudyquestions.TheOntarioresponse(OntarioSUPPORTUnit)has108
includedaseriesofarticleswithpatientstakingaleadershiproleinco‐authoringthecontent(9).Inthe109
UK,theJamesLindAlliance,fundedbytheUKNationalInstituteofHealthResearch(NIHR),isa110
successfulexampleofincludingpatients,carersandclinicianstodeveloppriority‐settingpartnerships111
(10)andquestionformulation(11).112
113
WithafocusonenhancingreproducibilitytheUSNationalInstitutesofHealth(NIH)haverevisedtheir114
applicationinstructionsandreviewcriteriatostrengthenscientificrigorandtransparency(12).Oneof115
theresourcestheyrecommendistheExperimentalDesignAssistant(EDA)developedbyTheNational116
CentrefortheReplacement,Refinement&ReductionofAnimalsinResearch(NC3Rs).This10‐module117
onlinetoolwasdevelopedtoassistresearchersinthedesignandanalysisofanimalexperiments.It118
includesdedicatedsupportforrandomization,blindingandsamplesizecalculation.Itcanalsobeused119
tohelpresearcherspreparetheexperimentaldesigninformationandanalysisplanrequestedforgrant120
applications(13).TheNC3RsencouragestheuseoftheEDAsothatapplicantscangenerateanEDA121
report,whichcanbesubmittedinplaceoftheexperimentaldesignandmethodologyappendix.122
123
Otherexamplesofpreferredcriteriaincludesocialmediametricsasindicatorsofdisseminatingresearch124
(14),publiclecturesabouttheresultsofaresearchproject,publicengagementandothertypesof125
eventsthatbringtogetherfunders,researchersandotherstakeholderstoworkonaneffective126
communicationplanoftheresearchprogram(15).OrganizationssuchastheWellcomeTrustaretaking127
aholisticattitudetoredefiningtheirapproachtoengagementexplicitlytohelppeoplefeelempowered128
toaccess,use,respondtoandcreatehealthresearch(16).129
130
Principle2:Valuetheaccurateandtransparentreportingofallresearch,regardlessoftheresults.131
132
Rationale133
Failuretopublishallfindingsofallstudiesseriouslydistortstheevidencebasefordecisionmaking.For134
example,asystematicreviewoftrialsofreboxetinefortreatingdepressionfoundthatalmostthree135
quartersofincludedpatientswereinunpublishedtrials(17).Selectivepublishingofresearchwith136
positiveresults(i.e.,publicationbias)distortsscience’sevidence‐baseandhasbeendemonstratedina137
varietyofdisciplinesincludingeconomics,psychologyandclinicalandpreclinicalhealthresearch(e.g.,138
18).Furthermore,thefrequencyofotherreportingbiases(e.g.,switchedprimaryoutcomeswithout139
disclosure,andspin)isaround30%(19).Thisisunacceptablyhighanddiminishesthetrustworthiness140
andintegrityofresearch(7).ItalsoappearsthatPromotionandTenureCommittees(PTCs)generallydo141
notgivesufficientimportancetoregisteringprotocolsanddataanalysisplans,fullpublishingof142
completedstudiesormakingdata,code,andmaterialsavailable(20).143
144
Currentimplementation145
Studyregistrationandreportingguidelinesareusefultoolstohelpimprovethecompletenessand146
transparencyofaverybroadspectrumofresearch(e.g.,21‐24).Aspartoftheeditorialpoliciesofthe147
WellcomeTrust’sopenaccesspublishingplatform(WellcomeOpenResearch(WOR)),authorsare148
requiredtousereportingguidelinesforprotocols(e.g.,SPIRIT)andcompletedstudies(e.g.,ARRIVE)149
(25).Otherfunders,suchasGatesOpenResearch(26),theNC3RsGateway(27)andtheAssociationof150
MedicalResearchCharities(28),dolikewise.Tohelpreducepublicationbias,WORalsorequires151
registrationthroughoneofseveraldifferentoptions(e.g.,registeredreports)(25).Similarly,topromote152
theregistrationandpublicationofallresearchtheNIHRintheUKindicatethat“Whensubmittingan153
applicationtoNIHRprogrammesforfundingforanewclinicaltrial,theapplicantmustdisclosepast154
publicationandtrialRegistrationhistoryforanyrelevantpublicationsandresearchgrantsheld,155
referencedintheapplication.”(29).Whiletheseareexamplesofbestpracticefromfunders,wewere156
unabletofindanyresearchinstitutionthathasincorporatedthemintoresearcherassessments(20).157
158
Severalresearchinstitutions(e.g.,UniversityofToronto)arenowrecommendingthattheirclinical159
trialistsuseSEPTRE(30),aweb‐basedprotocolcreationandmanagementtool.WhenSEPTREisused,160
protocolinformationfortrialsisautomaticallyregisteredinclinicaltrials.gov.Thissavestimeandhelps161
theresearchers,andtheirresearchinstitutions,tomaintainbestpublicationpractices(e.g.,trial162
registration).Somejournalsinthesocialsciences,particularlypsychology,useregisteredreportstohelp163
ensurethatresearchispublishedregardlessofitsresults(31,32).164
165
Principle3:Valuethepracticesofopenscience(openresearch)‐suchasopenmethods,materialsand166
data.167
168
Rationale 169
Opennessinresearchismorethanjustaccesstoresearch–itbringsequalitytotheresearchprocess.It170
encompassesarangeofpracticesacrosstheentirelifecycleofresearch(33).Accesstoresearchshould171
notbeaboutwhohastheresourcestopaytoseebehindapaywall,typicallysubscriptionjournals.172
Healthcareandsocialpolicydecisionsshouldbemadebasedonaccesstoallresearchknowledgerather173
thanonlyapartofit(34).Aconsiderableamountofpublicfundsisusedforresearchanditsresultscan174
haveprofoundsocialimpact.Preclinicalscientistsarecommittingtoopenlysharetheirlaboratory175
notebooks(35)tostreamlineresearch,fostercollaborationsandreduceunnecessaryduplication.Inan176
efforttodeterquestionableauthorshippractices,theConsortiaAdvancingStandardsinResearch177
AdministrationInformationsupportstheCRediTtaxonomy(36)asawayforresearchauthorstomore178
openlydescribehoweachpersonhascontributedtoaresearchproject.179
180
Datasharingisanotherexampleofopennessbutisnotcommonpracticeinclinicalresearch(withsome181
exceptions,suchasgenetics)(37)althoughpatientsseemsupportiveofsharingtheirdata,atleastof182
randomizedtrialstheyhaveparticipatedin(38).Datasharingisalsonotconsideredstandardinmany183
otherdisciplines.Withoutdatasharingitisdifficulttochecktheselectivityofreports;datasharingiskey184
toaddressingthereproducibilitycrisis(39)andbuildingtrust(40).Therearevaryingestimatesasto185
whichproportionofresearchismadeavailablethroughopenaccessmediums,suchasopenaccess186
journals,repositories,oraspreprints,butitisfarfrom100%(41).187
188
Currentimplementation189
GhentUniversity,Belgium,hasemployeddatasharingguidancestating,“Sounddatamanagementisa190
basicrequirementforthis(academicanalysis)andprovidesadditionalguaranteesforaflawless191
methodology,forsharing,andreusingdatabyotherresearchersinanOpenSciencecontextandforthe192
accountabilityofaresearchersownacademicintegrity"(42).TheNanyangTechnologicalUniversity193
(NTU),Singapore,implementedanOpenAccesspolicyin2011.AllNTUfacultyandstaffmustdeposit194
theirfinalpeer‐reviewedmanuscriptofjournalarticlesandconferencepapersintheDigitalRepository195
(DR‐NTU)maintainedbytheLibraryuponacceptanceoftheirpublications.AtNTU’sfacultyofmedicine,196
randomdataauditsareconductedonthesubmitted(required)DataManagementPlans(DMPs)and197
checksaremadetoseeifthefinaldataareindeedsharedonNTU’sopenaccessdatarepositoryDR‐198
NTU.199
200
TohelpfacilitatedatasharingtheUniversityofCambridgehasintroducedtheconceptof‘data201
champions’(43).Here,volunteersadvisemembersoftheresearchcommunityonproperhandlingof202
researchdatasupportingtheuseoftheFindable,Accessible,Interoperable,andRe‐usable(FAIR)203
researchprinciples(44).DelftUniversityofTechnology,TheNetherlands,hastakenthisconceptastep204
furtherandimplementeditasacareerassessmentcriterion(45).TheUniversityofGlasgow’sacademic205
promotioncriteriaexplicitlyallowsfordatasharingasaresearchandscholarshipoutput(tosupport206
replication)(46).207
208
Somejournalshavealsoestablishedstrongdatasharingpolicies.Forexample,thePLOSjournals209
“requireauthorstomakealldataunderlyingthefindingsdescribedintheirmanuscriptfullyavailable210
withoutrestrictionatthetimeofpublication.Whenspecificlegalorethicalrequirementsprohibitpublic211
sharingofadataset,authorsmustindicatehowresearchersmayobtainaccesstothedata.Refusalto212
sharedataandrelatedmetadataandmethodsinaccordancewiththispolicywillbegroundsfor213
rejection.”(47).Giventhatsocietalbenefitispartofanemergingcareerassessment,clinicalresearchers214
shouldalsorespondtoagrowingviewthatpatientswanttheirdatashared(38).215
216
Openresearchissupportedbykeyinfrastructurecompliance,suchasrequiringanOpenResearcherand217
ContributorID(ORCID)byeveryresearcher,wherebyeachresearchercanbeuniquelyidentified.A218
recentletterfromglobalfunderscommittingtoImplementingORCIDsforallresearchersisasignificant219
stepforward(48).ThiswasrecentlyimplementedattheOttawaHospitalResearchInstitute(49).In220
AustraliaandNewZealandthereisaconsortiumthatsupportsORCIDnationally.221
222
TheNIHpromotestheuseofpreprintsingrantapplications(50)asdoallmajorUKpublicfunders(e.g.,223
MedicalResearchCouncil,UK)(51),TheWellcomeTrustmadethemcompulsoryforworkinhealth224
emergenciesandpromotestheirusewidelyinparticularforearlycareerresearchers(52).225
226
Principle4:Valueabroadrangeofresearchandscholarship,suchasreplication,innovation,227
translation,synthesis,andmeta‐research.228
229
Rationale230
Asystemthatrewardsbenefittosocietyandencouragestrustworthyandimportantresearchneedsto231
takethedifferenttypesofresearchintoaccount:creatingnewideas;testingthem;replicatingkey232
findings;synthesisofexistingresearch;developingandvalidatingnewtools;measuresormethods;etc.233
Differentindicatorsandcriterianeedtobedevelopedthatarerelevanttothesedifferenttypesand234
stagesofresearch(seeFigure).Thisincludesdifferenttimeframesofassessmentfordifferenttypesof235
research.236
237
Incentivesthatencourageonefixedideaofthe‘rightkind’ofresearchwillbeslow,orevenstall,238
progress.So‐calledblue‐skyresearchthatbuildsonchancefindingsorcuriosity‐drivenresearchbased239
on‘out‐of‐the‐box’thinkingshouldbepossibleandencouraged,aswellinanacademicrewardsystem240
thatvaluessocietalprogress(53).Forexample,thediscoveryofgrapheneattheUniversityof241
Manchester,UK,wastheresultofFridayafternoondiscussionsoutsidethe‘normal’researchactivities242
(54).Otherexamplesfromabroadrangeofdisciplinesexist(55).Theshort‐termnatureofacademic243
rewardcyclesmakesthiskindofresearchlessattractiveforfunders,institutionsandindividual244
researchers.Equally,replicationstudiesorresearchsynthesiseffortsareoftennotregardedas245
innovativeenoughinresearcherassessmentsdespitetheircriticalimportanceforthecredibilityof246
research,orforabalancedandrobustsystematicpresentationofallavailableevidence,respectively247
(39,56).ThisisnotuniversallyappreciatedbyPTCs.Researchonresearchandmeta‐researchare248
practicedat,forexample,atMETRICS(Stanford,USA)(57),QUEST(Berlin,Germany)(58)whosefocusis249
onclinicalandpreclinicalmeta‐research,andtheMetaResearchCenteratTilburgUniversity(59)250
(Tilburg,TheNetherlands)whosefocusisonthesocialsciences.Suchactivitiesareimportanttoinform251
andimproveresearchpracticesandthereforecontributetomakingresearchmorereliableandrelevant.252
253
Currentimplementation254
Somefundershavealreadyrecognizedtherelevanceofabroadrangeofresearchactivities.The255
ResearchImpactAssessmentPlatform(Researchfish)workstocapturesomeofthisdiversityandcan256
generatereportsontheimpactofabroadspectrumoffundedresearch(60).TheWellcomeSuccess257
Frameworkhighlightstheimportanceofalong‐termvisionandsharedobjectivesinordertotakea258
morebalancedapproachtoassessment(61).TheGermanFederalMinistryofScienceandEducationis259
fundingpreclinicalconfirmatorytrials(62).260
261
TheWellcomeTrusthasdevelopedanewLongitudinalPopulationStudiesStrategy,fundeddatare‐use262
prizes(63)andsupportsresearchonresearch(64).Allapproachesareaimedatvaluingabroadrangeof263
scholarshipandmaximizingthevalueofresearch.TheNetherlandsOrganizationforScientificResearch264
isinitsthirdcallforreplicationstudies(65).Researchonresearchandmeta‐researcharealsogaining265
momentumandnowhavesomeformaloutlets.Forexample,PLOSBiologyandeLIFEhaveameta‐266
researchsectionintheirjournals(66,67).Wewereunabletofindanyacademicinstitutionthathas267
incorporatedreplicationormeta‐researchintotheircareerassessmentportfolio(20).NIHRrequiresthe268
completionofasystematicreviewpriortofundinganynewresearch(68).TheNC3Rshavealso269
promotedtheimportanceofsystematicreviewsforprovidingarationaleforprojectproposals(69,70).270
Intheeventthatsuchareviewdoesnotexist,theyprovidefundingtoperformone.271
272
Principle5:Valuearangeofothercontributionstoresponsibleresearchandscholarlyactivity,suchas273
peerreviewforgrantsandpublications,mentoring,outreach,andknowledgeexchange.274
275
Rationale276
AsdiscussedalongsidePrinciple1,researchassessmentsfrequentlyfocusonanarrowrangeofeasyto277
measuremetricsincludingpublications,citationsandfundingincome(1,20).Fortheresearchecosystem278
tofunctionoptimally,otherresearchactivitiesarealsoessential.Peerreviewremainsthecornerstone279
ofqualityassessmentofgrants,publicationsandconferences.Thequalityofpeerreviewcontributions280
tojournalsandfunders,shouldalsobepartofassessmentsforpromotionandtenureasshould281
contributionstovariousresearchinfrastructure,oversight,orregulations.Equally,contributionsto282
improvementsthatgobeyondanindividual‐centeredapproachforassessmentshouldbeconsidered.283
TheseactivitiesarecurrentlylargelymissingfromPTCs(20).Contributionstodevelopingthecareersof284
othersatallstagesoftheircareeriscriticalasarecontributionsvariouscommitteesrelatedtoresearch285
(e.g.,assumingtheroleofaneditor).Howbesttodothiswithoutcreatingfurtherbarriersand286
bureaucracy,however,haslongbeendebated(71).287
288
Anyrewardsystemthathasthewholeresearchenterpriseatheartandaimstofosteraclimate289
conducivetotrustworthyandusefulresearchwiththehighestregardtointegrity,needstofindwaysto290
incorporatethesevitalrolesintoitsoverallassessmentstructure.291
292
Currentimplementation293
MacquarieUniversity,Sydney,Australia,hassomeexcitinginitiativesintheirnewacademicpromotion294
policywhichincludesfivepillarsoneofwhichisinleadershipandcitizenship.Hereresearcherscanshow295
theiralignmentwiththeuniversity’svaluesandbroadercontributiontotheuniversity,andits296
community(72).Asaresultofthisimplementation,thenumberofpromotionapplicationsincreasedby297
50%andthenumberofwomenpromotedhasalsoincreased.298
299
TheUniversityofGlasgow’sacademicpromotioncriteriaexplicitlyrewardsresearchersforparticipation300
inpeerreviewandotherrelatedactivities(e.g.,journaleditorship)(73,74).Inorderforthistooccur,itis301
necessarytohaveorganizationsthatcanprovidereviewerswithapermanentidentifier(aDigitalObject302
Identifier(DOI))forjournalsthatpublishOpenReviews(75)thatcanbeincludedinaresearcher’sCVor303
whichcanaggregatecompletedpeerreviews(76).Suchpoliciesmightalsohelppromotemore304
meaningfulinvolvementintraininginpeerreview(76).TheUniversityofExeter,UK,hasdeveloped305
‘ExeterAcademic’,ahubtohelptheirresearchersnavigatecareerprogression(77).Leadershipand306
citizenshiparetwo(offive)majorareasoffocus.Theformerincludesmentoringandthelatterincludes307
avenuestodisseminateresearchknowledgefromtheuniversity’sresearchers.308
309
TheFinnishAdvisoryBoardonResearchIntegrity(TENK)templateforresearcherCVsincludesabroad310
spectrumofcontributionsincludingmentoringand‘trustinsociety’(78).Asameasureofmentorship,311
MaastrichtUniversity,TheNetherlandsassessesthecareerprogressionofitsPhDgraduates(79).We312
wereunabletoidentifyresearchinstitutionsthatrewardresearcherswhohaveparticipatedintraining313
coursesonhigh‐qualitymentorship(20).314
315
TheIrishHealthResearchBoard(HRB)hasaknowledgeexchangeanddisseminationgrantprogram316
providingexistingHRB‐fundedresearcherswithanopportunitytoseeksupplementaryfundingfor317
exchangeanddisseminationactivitiesthatcanaccelerateandmaximizethepotentialtranslationand318
impactoftheresearchfindings,andlearninggained,onpolicyorpracticeandhealthoutcomes(80).A319
similarschemeexiststhroughtheCanadianInstitutesofHealthResearch(81)andtheNC3RsSkillsand320
KnowledgeTransfergrants(82)andtheirCrackITopeninnovationplatform(83).321
322
Wellcome’sgrantformslimitthenumberofpublicationsapplicantscansubmitandexplicitlyinvite323
applicantstodetailotherachievements.Thisiscombinedwithexplicitguidanceforpanelmembers324
remindingthemoftheimportanceoftakingabroadviewwhenassessingindividuals(84).325
326
Discussion327
TheHKPfocusonpromotingassessmentpracticesthatstrengthenresearchrigorisdeliberate,328
concentratingprimarilyonwhatresearchinstitutionscandotomodifythecriteriausedbyPTCsfor329
careerassessments.TheHKPdonotaddressotherrelevantissues,suchasdiversity,prejudiceand330
unconsciousbiasinhiringandpromotion.331
332
Dissemination333
TheWorldConferencesonResearchIntegrity(WCRI)Foundation(85)andtheREduceresearchWaste334
AndReviewDiligence(REWARD)Alliance(86)willmaketheHKPavailableontheirwebsites.This‘home’335
willincludetheprinciples,thesignatories,infographics,translationsintoseverallanguages(ongoing),336
futureimplementationplans(ongoing),andcrucially,aplacetohighlightthosewhohaveendorsedthe337
HKP.Beyondjournalpublication,wearedevelopingothersynergisticdisseminationroutes.338
339
EndorsementandUptake340
ResearchinstitutionsarekeytotheHKP.Theyarethehomeofcurrentandfutureresearchers,where341
promotionandtenureassessmentsarecarriedout.TohelpfacilitateHKP‘ontheground’,localkey342
opinionleaders,andtheirendorsement,shouldbeincludedinanyplan.TheHKPhavebeenrecognized343
bytheGoverningBoardoftheWCRIFoundationandtheSteeringCommitteeoftheREWARDAlliance.344
Weinviteacademicinstitutions,funders,othergroupsandindividualstodolikewiseontheWCRI345
Foundation’swebsite.346
347
Weareinvitingindividualsandorganizationstodeliverbrief(2‐3minutes)YouTubetestimonialsasto348
howtheyhaveimplementedtheHKP(categorizedbystakeholdergroup)andwewillprovidealinkto349
thesevideosontheWCRIFoundationwebsite.Thisapproachcanserveasapragmaticwayfor350
individualsandorganizationstoshowhowtheyareendorsingandusingtheHKPandasanudgeto351
otherstodolikewise.352
353
Toimplementsomeoftheseprinciplesislikelystraightforwardalthoughthismightnotbethecasefor354
allprinciples.Todosorequiresmoreunderstandingofthecomplexitiesoftoday’sresearch355
environment,suchastheavailabilityofinstitutionalinfrastructure,whethercurrentCVformatsare356
optimaltocollectbestpractices,enablingtransparencyaboutcareerassessment,andconsideringcloser357
alignmentwithpoliciesoffunders.358
359
Wewouldliketoevaluateourapproachanddeveloptoolkitsforthoseinterestedinwaystoimplement360
thefiveprinciples.Wewillworkwithsignatoriestotakethisforward.WeseetheHKPasanimportant361
stepalongthewaytoimprovingresearchintegrityandweencourageanongoingdialoguetosupport362
implementationoftheseimportantprinciples. 363
Acknowledgments364
Themanyparticipantstothe6thWorldConferenceonResearchIntegritywhoprovidedfeedbackon365
earlierversionsofthedocumentandactivelyparticipatedinthefocusgroupsessionsduringthe366
conference.367
368
369
References370
.MoherD,NaudetF,CristeaIA,MiedemaF,IoannidisJPA,GoodmanSN.Assessingscientistsforhiring,371
promotion,andtenure.PLoSBiol2018;16(3):e2004089372
2.AmericanSocietyforCellBiology.DORA.DeclarationonResearchAssessment.[Internet]Available373
from:http://www.ascb.org/dora/.Accessed4thAugust2019374
3.HicksD,WoutersP,WaltmanL,deRijckeS,RafolsI.Bibliometrics:TheLeidenManifestoforresearch375
metrics.Nature2015;520(7548):429–31376
4.KretserA,MurphyD,BertuzziS,AbrahamT,AllisonDB,BoorKJ,DwyerJ,GranthamA,HarrisLJ,377
HollanderR,Jacobs‐YoungC,RovitoS,VafiadisD,WotekiC,WyndhamJ,YadaR.ScientificIntegrity378
PrinciplesandBestPractices:RecommendationsfromaScientificIntegrityConsortium.SciEngEthics.379
2019Apr;25(2):327‐355.380
5.ZaunerH,NogoyNA,EdmundsSC,ZhouH,GoodmanL.Editorial:Weneedtotalkabout381
authorship,GigaScience,Volume7,Issue12,December2018,382
giy122,https://doi.org/10.1093/gigascience/giy122383
6.QuanW,ChenB,ShuF.PublishOrimpoverish:Aninvestigationofthemonetaryrewardsystemof384
scienceinChina(1999–2016).[Internet]Availablefrom:385
https://arxiv.org/ftp/arxiv/papers/1707/1707.01162.pdf.386
7.OsterlohM,FreyBS.RankingGames.EvaluationRev2014;39(1):102–129)387
8.http://www.cihr‐irsc.gc.ca/e/41204.htmlAccessed4thAugust2019388
9.http://www.cmaj.ca/content/190/supplementAccessed4thAugust2019389
10.http://www.jla.nihr.ac.uk/Accessed4thAugust2019390
11.BooteJD,DalgleishM,FreemanJ,JonesZ,MilesM,RodgersH.Butisitaquestionworthasking?A391
reflectivecasestudydescribinghowpublicinvolvementcanleadtoresearchers’ideasbeingabandoned.392
HealthExpect2012;publishedonlineMay31.DOI:10.1111/j.1369‐7625.2012.00771.x393
12.https://grants.nih.gov/policy/reproducibility/index.htmAccessed4thAugust2019394
13.https://www.nc3rs.org.uk/experimental‐design‐assistant‐edaAccessed4thAugust2019395
14.http://faculty.washington.edu/sr320Accessed4thAugust2019396
15.https://socialmedia.mayoclinic.org/2016/05/25/mayo‐clinic‐includes‐social‐media‐scholarship‐397
activities‐in‐academic‐advancement/398
16.https://wellcome.ac.uk/news/wellcomes‐approach‐engaging‐public‐going‐change399
17.EydingD,LelgemannM,GrouvenU,HarterM,KrompM,KaiserT,etal.Reboxetineforacute400
treatmentofmajordepression:systematicreviewandmeta‐analysisofpublishedandunpublished401
placeboandselectiveserotoninreuptakeinhibitorcontrolledtrials.BMJ2010;341:c4737..2402
18.ChanA‐W,SongF,VickersA,etal.Increasingvalueandreducingwaste:addressinginaccessible403
research.Lancet2014;publishedonlineJan8.http://dx.doi.org/10.1016/S0140‐6736(13)62296‐5.)404
19.DwanK,AltmanDG,ArnaizJA,BloomJ,ChanAW,CroninE,etal:Systematicreviewoftheempirical405
evidenceofstudypublicationbiasandoutcomereportingbias.PloSOne2008;3:e3081406
20.RiceDB,RaffoulH,IoannidisJPA,MoherD.Academiccriteriaforpromotionandtenureinfaculties407
ofmedicine:Across‐sectionalanalysisof170universities[Unpublished]408
21.CoboE,CortésJ,RiberaJM,etal.:Effectofusingreportingguidelinesduringpeerreviewonquality409
offinalmanuscriptssubmittedtoabiomedicaljournal:maskedrandomisedtrial.BMJ.2011;343:d6783410
22.TurnerL,ShamseerL,AltmanDG,etal.Consolidatedstandardsofreportingtrials(CONSORT)and411
thecompletenessofreportingofrandomisedcontrolledtrials(RCTs)publishedinmedicaljournals.412
CochraneDatabaseSystRev2012;11:MR000030413
23.TunisAS,McInnesMD,HannaR,EsmailK.Associationofstudyqualitywithcompletenessof414
reporting:havecompletenessofreportingandqualityofsystematicreviewsandmeta‐analysesinmajor415
radiologyjournalschangedsincepublicationofthePRISMAstatement?Radiology.2013;269(2):413‐426416
24.KorevaarDA,WangJ,vanEnstWA,LeeflangMM,HooftL,SmidtN,etal.Reportingdiagnostic417
accuracystudies:someimprovementsafter10yearsofSTARD.Radiology.2015;274(3):781‐9418
25.https://wellcomeopenresearch.org/about/policiesAccessed4thAugust2019419
26.https://gatesopenresearch.org/420
27.https://f1000research.com/nc3rs421
28.https://amrcopenresearch.org422
29.https://www.nihr.ac.uk/about‐us/documents/NIHR‐Policy‐on‐Clinical‐Trial‐Registration‐and‐423
Disclosure‐of‐Results.pdfAccessed4thAugust2019424
30.https://www.spirit‐statement.org/Accessed4thAugust2019425
31.WichertsJM,VeldkampCL,AugusteijnHE,BakkerM,vanAertRC,vanAssenMADegreesoffreedom426
inplanning,running,analyzingandreportingpsychologicalstudies:achecklisttoavoidp-hacking.Front427
Psych2016;7:1832428
32.NosekBA,EbersoleCR,DeHavenAC,MellorDT.Thepreregistrationrevolution.PNAS2018;429
115:2600–6430
33.AllenC,MehlerDMA(2019)Opensciencechallenges,benefitsandtipsinearlycareerandbeyond.431
PLoSBiol17(5):e3000246432
34.LiberatiA.Anunfinishedtripthroughuncertainties.BMJ2004;328:531433
35.https://openlabnotebooks.org/Accessed4thAugust2019434
36.Brand,A.;Allen,L.;Altman,M.;Hlava,M.;Scott,J.,BeyondAuthorship:attribution,contribution,435
collaboration,andcredit.LearnedPublishing2015,28(2),151‐155436
37.NaudetF,SakarovitchC,JaniaudP,CristeaI,FanelliD,MoherD,IoannidisJ.Datasharingand437
reanalysisofrandomisedcontrolledtrialsinleadingbiomedicaljournalswithfulldatasharingpolicy:438
surveyofstudiespublishedinTheBMJandPLOSMedicine.(2018)BMJ,360:k400439
38.MelloMM,LieouV,GoodmanSN.Clinicaltrialparticipants’viewsoftherisksandbenefitsofdata440
sharing.NEJM2018;378(23):2202–11441
39.Munafò,M.R.,Nosek,B.A.,Bishop,D.V.M.,Button,K.S.,Chambers,C.D.,PercieduSert,N.,&442
Ioannidis,J.P.A.(2017).Amanifestoforreproduciblescience.NatureHumanBehaviour,1(1),0021443
40.https://www.pewresearch.org/science/wp‐444
content/uploads/sites/16/2019/08/PS_08.02.19_trust.in_.scientists_FULLREPORT.pdf).445
41.AcceleratingScienceandPublicationinbiologyhttps://asapbio.org/Accessed4thAugust2019446
42.https://www.ugent.be/en/research/research‐ugent/research‐strategy/indicators.htmAccessed4th447
August2019448
43.https://www.data.cam.ac.uk/intro‐data‐championsAccessed4thAugust2019449
44.WilkinsonMD,DumontierIJ,AalbersbergG,AppletonM,AxtonA,BaakN,etal.TheFAIRGuiding450
Principlesforscientificdatamanagementandstewardship.SciData2016;3(1):160018451
45.https://www.tudelft.nl/en/library/current‐topics/research‐data‐management/r/data‐452
stewardship/data‐champions/453
46.https://www.gla.ac.uk/media/media_498056_en.pdfAccessed4thAugust2019454
47.https://journals.plos.org/plosone/s/data‐availabilityAccessed4thAugust2019455
48.https://ukorcidsupport.jisc.ac.uk/2018/12/funders‐sign‐up‐to‐orcid‐open‐letter/456
49.OHRIandORCID457
50.https://grants.nih.gov/grants/guide/notice‐files/not‐od‐17‐050.htmlAccessed4thAugust2019458
51.https://mrc.ukri.org/research/policies‐and‐guidance‐for‐researchers/preprints/459
52.https://wellcome.ac.uk/news/more‐positive‐culture‐phd‐training460
53.AmonA.Acaseformorecuriosity‐drivenbasicresearch.MolBiolCell2015;26:3690–1461
54.https://www.graphene.manchester.ac.uk/learn/discovery‐of‐graphene/Accessed4thAugust2019462
55.otherexamplesexist463
56.CamererCF,DreberA,HolzmeisterF,HoT‐H,HuberJ,JohannessenJ,etal.Evaluatingthe464
replicabilityofsocialscienceexperimentsinNatureandSciencebetween2010and2015.NatureHum465
Behav2018;2:637–44466
57.https://metrics.stanford.edu/Accessed4thAugust2019467
58.https://www.bihealth.org/en/quest‐center/mission‐approaches/Accessed4thAugust2019468
59.https://metaresearch.nl/Accessed4thAugust2019469
60.https://www.researchfish.net/Accessed4thAugust2019470
61.https://wellcome.ac.uk/news/how‐weve‐defined‐what‐success‐looks‐wellcomes‐work471
62.http://www.dlr.de/pt/Portaldata/45/Resources/Dokumente/GF/Outline_Application_Preclinical_Con472
firmatory_Study_2018.docx.Accessed4thAugust2019473
63.https://wellcome.ac.uk/news/new‐data‐re‐use‐prizes‐help‐unlock‐value‐research474
64.https://wellcome.ac.uk/funding/people‐and‐projects/grants‐awarded?scheme_id=3569475
65.https://bit.ly/2H1PIt3Accessed4thAugust2019476
66.https://collections.plos.org/meta‐research‐evaluation‐and‐scientometricsAccessed4thAugust2019477
67,https://elifesciences.org/collections/8d233d47/meta‐research‐a‐collection‐of‐articles478
68.https://www.nihr.ac.uk/about‐us/documents/NIHR‐Policy‐on‐Clinical‐Trial‐Registration‐and‐479
Disclosure‐of‐Results.pdfAccessed4thAugust2019480
69.https://www.nc3rs.org.uk/funding‐scheme‐priority‐areas481
70.https://www.nc3rs.org.uk/camaradesnc3rs‐systematic‐review‐facility‐syrf482
71.Thescholarlykitchen.[Internet].Availablefrom:483
https://scholarlykitchen.sspnet.org/2018/10/18/credit‐for‐peer‐review‐what‐exactly‐does‐that‐mean/484
72.https://www.mq.edu.au/thisweek/2017/04/13/new‐academic‐promotion‐scheme/#.XXvNkZNKhBw485
73.https://www.gla.ac.uk/media/media_498056_en.pdf).Accessed4thAugust2019486
74.Boyer,E.L.(1990)Scholarshipreconsidered:Prioritiesoftheprofessoriate.CarnegieFoundationfor487
theAdvancementofTeaching.488
75.https://f1000research.com/for‐referees/guidelinesAccessed4thAugust2019489
76.https://publons.com/about/homeAccessed4thAugust2019490
77.http://www.exeter.ac.uk/staff/exeteracademic/yourdevelopment/Accessed4thAugust2019491
78.https://www.tenk.fi/sites/tenk.fi/files/CV_english_270613.pdfAccessed4thAugust2019492
79.493
80.https://www.hrb.ie/funding/funding‐awarded/platforms‐programmes‐and‐projects/494
81.http://www.cihr‐irsc.gc.ca/e/46949.html495
82.https://www.nc3rs.org.uk/skills‐and‐knowledge‐transfer‐grants496
83.https://nc3rs.org.uk/crackit/497
84.https://wellcome.ac.uk/sites/default/files/induction‐pack‐for‐committee‐members‐2018.pdf498
85.https://www.wcrif.org/Accessed4thAugust2019499
86.http://rewardalliance.net/Accessed4thAugust2019500
501
502
Figure1:Robust,rigorousandtransparentpracticeandimpact503
504
Research
stage
Potentialmeasuresofrigorous
researchpractice
Importancetoresearchquality
Question
Knowledgesynthesis
Priority‐settingexercise;
stakeholder(s)engagement;
Usefulandrelevantresearch
thatbuildsonpreviousresearch
Design
Openprotocols;
(Pre)registration
Reuseofprotocolbyothers
Reducespublicationbiasand
otherreportingbiases;
Enhancesreproducibility
Conduct Qualityassuranceofdata;
Datasharing;sharing
materials
Reuseofdata/materialsby
others
Allowsdataaggregation,data
reuse,andtransparency
Analysis Analyticalcodesharing Enhancesreproducibility
Report Transparency;openaccess;
Useofreportingguidelines
Enhancesopennessand
accessibility
Dissemination Impactonresearch(including
altmetrics;citations)
Impactonpractice/society
Focusesonoutcomes&impact
ofresearch
505
1Itemsinblackaremeasuresofresponsibleresearchpractice;itemsinredaremeasuresofuseby506
others507
... Hence researchers working on eye-catching topics may thrive in an evaluation system based on impact factors. But for many of our participants, it hampered a responsible research climate because the "impact fetish" steered researchers away from supervising or peer review, research-related activities that are also important (Moher et al. 2020). ...
... rsite itlei den.nl/binar ies/conte nt/asset s/geest eswet ensch appen /pdfs/best-pract ices-for-phd-super visio n.pdf) where they translate commitments into concrete actions that both the PhD supervisor and the PhD student can take. One possible avenue would be to incorporate good mentorship into the reward system, making it a scientific activity that is valued in its own right, as described in the recently released Hong Kong Principles for Assessing Researchers (Moher et al. 2020). Principle 5 reads: "Value a range of other contributions to responsible research and scholarly activity, such as peer review for grants and publications, mentoring, outreach, and knowledge exchange." ...
Article
Full-text available
The research climate plays a key role in fostering integrity in research. However, little is known about what constitutes a responsible research climate. We investigated academic researchers’ perceptions on this through focus group interviews. We recruited researchers from the Vrije Universiteit Amsterdam and the Amsterdam University Medical Center to participate in focus group discussions that consisted of researchers from similar academic ranks and disciplinary fields. We asked participants to reflect on the characteristics of a responsible research climate, the barriers they perceived and which interventions they thought fruitful to improve the research climate. Discussions were recorded and transcribed at verbatim. We used inductive content analysis to analyse the focus group transcripts. We conducted 12 focus groups with 61 researchers in total. We identified fair evaluation, openness, sufficient time, integrity, trust and freedom to be mentioned as important characteristics of a responsible research climate. Main perceived barriers were lack of support, unfair evaluation policies, normalization of overwork and insufficient supervision of early career researchers. Possible interventions suggested by the participants centered around improving support, discussing expectations and improving the quality of supervision. Some of the elements of a responsible research climate identified by participants are reflected in national and international codes of conduct, such as trust and openness. Although it may seem hard to change the research climate, we believe that the realisation that the research climate is suboptimal should provide the impetus for change informed by researchers’ experiences and opinions.
... Esta última iniciativa se enfoca en la necesidad de mejorar la investigación y asegura que los investigadores serán reconocidos y recompensados por un comportamiento que conduzca a una investigación confiable; es decir, aquella que tiene atributos de robustez, rigor y transparencia. (5) Los estudios bibliométricos también necesitan ser confiables para contribuir al avance del conocimiento y facilitar la toma de decisiones. Deben cumplir con otras dos características: la replicabilidad, lo que significa que el estudio está lo suficientemente detallado como para que pueda ser replicado por otro investigador, y la reproducibilidad, que se refiere a instancias en que los datos originales y los códigos de computadora se utilizan para regenerar los resultados. ...
... Additional actions to improve methodological quality and transparency of RCTs include trial tracker initiatives aimed at reducing non-publication of clinical trials [12] and fostering responsible research practices. At the most recent World Conference on Research Integrity, the Hong Kong Principles were proposed for responsible research practices, transparent reporting, open science, valuing research diversity, and recognizing contributions to research and scholarly activity [13]. ...
Article
Full-text available
Many randomized controlled trials (RCTs) are biased and difficult to reproduce due to methodological flaws and poor reporting. There is increasing attention for responsible research practices and implementation of reporting guidelines, but whether these efforts have improved the methodological quality of RCTs (e.g., lower risk of bias) is unknown. We, therefore, mapped risk-of-bias trends over time in RCT publications in relation to journal and author characteristics. Meta-information of 176,620 RCTs published between 1966 and 2018 was extracted. The risk-of-bias probability (random sequence generation, allocation concealment, blinding of patients/personnel, and blinding of outcome assessment) was assessed using a risk-of-bias machine learning tool. This tool was simultaneously validated using 63,327 human risk-of-bias assessments obtained from 17,394 RCTs evaluated in the Cochrane Database of Systematic Reviews (CDSR). Moreover, RCT registration and CONSORT Statement reporting were assessed using automated searches. Publication characteristics included the number of authors, journal impact factor (JIF), and medical discipline. The annual number of published RCTs substantially increased over 4 decades, accompanied by increases in authors (5.2 to 7.8) and institutions (2.9 to 4.8). The risk of bias remained present in most RCTs but decreased over time for allocation concealment (63% to 51%), random sequence generation (57% to 36%), and blinding of outcome assessment (58% to 52%). Trial registration (37% to 47%) and the use of the CONSORT Statement (1% to 20%) also rapidly increased. In journals with a higher impact factor (>10), the risk of bias was consistently lower with higher levels of RCT registration and the use of the CONSORT Statement. Automated risk-of-bias predictions had accuracies above 70% for allocation concealment (70.7%), random sequence generation (72.1%), and blinding of patients/personnel (79.8%), but not for blinding of outcome assessment (62.7%). In conclusion, the likelihood of bias in RCTs has generally decreased over the last decades. This optimistic trend may be driven by increased knowledge augmented by mandatory trial registration and more stringent reporting guidelines and journal requirements. Nevertheless, relatively high probabilities of bias remain, particularly in journals with lower impact factors. This emphasizes that further improvement of RCT registration, conduct, and reporting is still urgently needed.
... Providing RI training courses and education, as well as developing infrastructure for adequate data management were also mentioned in many documents as an important responsibility of research organisations. All this reflects the organisations' valuable role in creating an environment and organisational culture in which researchers will be motivated to pertain to RI principles and rules in their work (Forsberg et al. 2018;Moher et al. 2019;Lerouge and Hol 2020). ...
Article
Full-text available
Research integrity (RI) is a continuously developing concept, and increasing emphasis is put on creating RI promotion practices. This study aimed to map the existing RI guidance documents at research performing organisations (RPOs) and research funding organisations (RFOs). A search of bibliographic databases and grey literature sources was performed, and retrieved documents were screened for eligibility. The search of bibliographical databases and reference lists of selected articles identified a total of 92 documents while the search of grey literature sources identified 118 documents for analysis. The retrieved documents were analysed based on their geographical origin, research field and organisational origin (RPO or RFO) of RI practices, types of guidance presented in them, and target groups to which RI practices are directed. Most of the identified practices were developed for research in general, and are applicable to all research fields (n = 117) and medical sciences (n = 78). They were mostly written in the form of guidelines (n = 136) and targeted researchers (n = 167). A comprehensive search of the existing RI promotion practices showed that initiatives mostly come from RPOs while only a few RI practices originate from RFOs. This study showed that more RI guidance documents are needed for natural sciences, social sciences, and humanities since only a small number of documents was developed specifically for these research fields. The explored documents and the gaps in knowledge identified in this study can be used for further development of RI promotion practices in RPOs and RFOs.
... DORA articulates the need 'to assess research on its own merits rather than on the basis of the journal in which the research is published,' and to make assessments based on content rather than publication metrics. In DORA, the Leiden Manifesto, 34 and the more recent Hong Kong Manifesto, 35 there is a growing focus on the quality of content in research when assessing researchers. Similarly, Plan S commits to assess research on 'the intrinsic merit of the work and not consider the publication channel, its impact factor (or other journal metrics), or the publisher' 36 . ...
Article
Full-text available
Universities want a voluntary non-exclusive licence from authors to disseminate publications. This practitioner case study explores an innovative model to communicate and advance open and equitable scholarship through the implementation of the Global University Publications Licence at the University of Nottingham Ningbo China. The paper explains the licence policy and key influences, including: Copyright Law of the People’s Republic of China and the Declaration on Research Assessment (DORA). The University approved the Global University Publications Licence, with implementation from 1 August 2019. It is available in Chinese and English. Since implementation, the University has retained rights for 74% of research publications submitted; 100% of those publications are available through the University with a CC-BY licence and zero embargo. The Open Scholarship Model provides an equitable approach to versions and citation. The paper concludes by suggesting university libraries can exploit Copyright Law in China to progress open scholarship strategies, including: recognition of employers as authors of works; a priority right to the exploitation of works; and, an embargo protection of two years after the completion of the work. The author’s final version of publications can be open, discoverable, cited and preserved through trusted universities with global reputations for high quality research.
Article
Full-text available
Zusammenfassung Transfer ist ein integraler Bestandteil der hochschulischen Aufgaben in Forschung und Lehre. Aus verschiedensten Gründen wird dies jedoch auch heute noch in der Breite der Hochschulwelten nicht so gesehen. Zwar wurde ver-sucht, über die Third Mission Diskussion Transfer als wei-tere Leistungssäule der Hochschulen einzuführen, dies führte letztendlich jedoch nicht zum erwünschten Ergeb-nis. Weder Transferleistung noch Transferkompetenzen an Hochschulen wurden erkennbar auf-und ausgebaut und genutzt. Der eigentliche Grund für die suboptimale Trans-ferleistung an Hochschulen ist die vorherrschende Reputa-tionslogik als praktiziertes "Regelwerk" der wissenschaft-lichen Leistungsanerkennung. Dies führt zu einer völligen Überbewertung der wissenschaftlichen Publikationsaktivi-täten. Forschungserkenntnisse verbleiben dadurch im wis-senschaftlichen Publikationsumfeld und gelangen erst gar nicht durch den Transfer in die Anwendungspraxis. Ent-sprechend werden an Hochschulen auch keine oder nur we-nig Transferkompetenzen aufgebaut und genutzt. Dies gilt es mittelfristig zu ändern. Transferkompetenzen müssen an Hochschulen aufgebaut werden. Hierzu ist in erster Li-nie einmal ein Verständnis über die Grundlagen des Transfers und des Transfergeschehens zu vermitteln. Ein solches muss Transferwege, Transferebenen und Trans-ferdialoge einbeziehen. Die notwendigen Kompetenzen, um Transfer zu betreiben und aktiv am Transfergeschehen teilzunehmen, können in einem Qualifikationsprogramm innerhalb der Hochschule oder hochschulübergreifend ver-mittelt werden. Ein Qualifikationsrahmen skizziert dabei Umfang und Inhalte der Kompetenzfelder, die es zu ver-mitteln gilt.
Chapter
Full-text available
In many countries, attention for fostering research integrity started with a misconduct case that got a lot of media exposure. But there is an emerging consensus that questionable research practices (QRPs) are more harmful due to their high prevalence. QRPs have in common that they can help to make study results more exciting, more positive and more statistically significant. That makes them tempting to engage in. Research institutions have the duty to empower their research staff to steer away from QRPs and to explain how they realise that in a Research Integrity Promotion Plan. Avoiding perverse incentives in assessing researchers for career advancement is an important element in that plan. Research institutions, funding agencies and journals should make their research integrity policies as evidence based as possible. The dilemmas and distractions researchers face are real and universal. We owe it to society to collaborate and to do our utmost best to prevent QRPs and to foster research integrity.KeywordsResearch integrityResearch misconductFabricationFalsificationQuestionable research practicesMeta-research
Article
In this essay, we argue that colleges of education, particularly those at research-intensive institutions, favor simplistic notions of scholarly impact and that this trend has concerning implications for the field, for researchers, and for the public at large. After describing the challenges and shortcomings of the current models of research assessment in education, we outline an alternative proposal in which trustworthiness and usability of research would complement traditional metrics of scholarly relevance. This proposal encourages a twofold approach to research assessment that involves (1) a more thorough analysis of the limitations and problems generated by the use of simplistic notions of scholarly impact, and (2) a commitment to the implementation of more equitable systems based on a broader range of assessment measures to assess faculty research contributions.
Article
The field of prevention science aims to understand societal problems, identify effective interventions, and translate scientific evidence into policy and practice. There is growing interest among prevention scientists in the potential for transparency, openness, and reproducibility to facilitate this mission by providing opportunities to align scientific practice with scientific ideals, accelerate scientific discovery, and broaden access to scientific knowledge. The overarching goal of this manuscript is to serve as a primer introducing and providing an overview of open science for prevention researchers. In this paper, we discuss factors motivating interest in transparency and reproducibility, research practices associated with open science, and stakeholders engaged in and impacted by open science reform efforts. In addition, we discuss how and why different types of prevention research could incorporate open science practices, as well as ways that prevention science tools and methods could be leveraged to advance the wider open science movement. To promote further discussion, we conclude with potential reservations and challenges for the field of prevention science to address as it transitions to greater transparency, openness, and reproducibility. Throughout, we identify activities that aim to strengthen the reliability and efficiency of prevention science, facilitate access to its products and outputs, and promote collaborative and inclusive participation in research activities. By embracing principles of transparency, openness, and reproducibility, prevention science can better achieve its mission to advance evidence-based solutions to promote individual and collective well-being.
Article
Research evaluation is often understood as something similar to a competition, where an evaluation panel’s task is to award the most excellent researchers. This interpretation is problematic, insofar as excellence is at best a multi-dimensional concept and at worst an ill-defined term, because it assumes that there exists some ground truth as to who the very best researchers are and that all an evaluation panel needs to do is uncover this ground truth. Therefore, instead of focusing on competition, the Swiss National Science Foundation focused on active decision-making and sought inspiration in the deliberation proceedings of a jury trial for the design of a new evaluation procedure for an academic award. The new procedure is based upon fully anonymised documents consisting of three independent parts (achievements, impact and prominence). Before the actual evaluation meeting, the panel, which includes non-academic experts, pre-evaluates all nominations through a pseudo-randomly structured network, such that every nomination is reviewed by only six members of the panel. Evaluation decisions are based upon anonymous votes, structured discussions in the panel, ranking (as opposed to rating) of nominees, and data-rich figures providing an overview of each nominee’s position along various dimensions and of the rankings provided by the individual panel members. The proceedings are overseen by an academic chair, focusing on content, and a procedural chair, focusing on process and compliance. Combined, these elements form a highly structured deliberation procedure consisting of individual steps through which nominations proceed, each of which feeds into either the next step or the final verdict. The proposed evaluation process has been successfully applied in the real world in the evaluation of the Swiss Science Prize Marcel Benoist, Switzerland’s most prestigious academic award.
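The "pseudo-randomly structured network" described in this abstract is, at its core, a balanced assignment problem: every nomination must be reviewed by exactly six panel members while no member carries a disproportionate load. The abstract does not disclose the SNSF's actual algorithm, so the Python sketch below is only a minimal illustration of one way such an assignment could be generated; the function name, the example panel, and the least-loaded-first tie-breaking rule are all assumptions.

    import random

    def assign_reviewers(nominations, panel, per_nomination=6, seed=2019):
        """Assign each nomination to `per_nomination` panel members,
        spreading the reviewing load as evenly as possible."""
        rng = random.Random(seed)  # seeded so the draw is reproducible
        load = {member: 0 for member in panel}
        assignment = {}
        for nomination in nominations:
            # Prefer the least-loaded members; break ties pseudo-randomly.
            ranked = sorted(panel, key=lambda m: (load[m], rng.random()))
            chosen = ranked[:per_nomination]
            for member in chosen:
                load[member] += 1
            assignment[nomination] = chosen
        return assignment

    panel = [f"member_{i}" for i in range(12)]
    nominations = [f"nominee_{i}" for i in range(8)]
    print(assign_reviewers(nominations, panel))

Sorting by current load keeps duties even across the panel, while the seeded random tie-break preserves the pseudo-random character the abstract describes.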
Article
Background: The objective of this study was to determine the presence of a set of prespecified criteria used to assess scientists for promotion and tenure within faculties of medicine among the U15 Group of Canadian Research Universities. Methods: Each faculty guideline for assessing promotion and tenure was reviewed, and the presence of five traditional (peer-reviewed publications, authorship order, journal impact factor, grant funding, and national/international reputation) and seven nontraditional (citations, data sharing, publishing in open access mediums, accommodating leaves, alternative ways for sharing research, registering research, using reporting guidelines) criteria was collected by two reviewers. Results: Among the U15 institutions, four of five traditional criteria (80.0%) were present in at least one promotion guideline, whereas only three of seven nontraditional incentives (42.9%) were present in any promotion guideline. When assessing full professors, guidelines listed a median of three traditional criteria versus one nontraditional criterion. Conclusion: This study demonstrates that faculties of medicine among the U15 Group of Canadian Research Universities base assessments for promotion and tenure on traditional criteria. Some of these metrics may reinforce problematic practices in medical research. These faculties should consider incentivizing criteria that can enhance the quality of medical research.
Article
The movement towards open science is a consequence of seemingly pervasive failures to replicate previous research. This transition comes with great benefits but also significant challenges that are likely to affect those who carry out the research, usually early career researchers (ECRs). Here, we describe key benefits, including reputational gains, increased chances of publication, and a broader increase in the reliability of research. The increased chances of publication are supported by exploratory analyses indicating that null findings are substantially more likely to be published via open registered reports than via more conventional methods. These benefits are balanced by challenges that we have encountered, involving increased costs in terms of flexibility and time and issues with the current incentive structure, all of which seem to affect ECRs acutely. Although there are major obstacles to the early adoption of open science, open science practices should overall both benefit ECRs and improve the quality of research. We review three benefits and three challenges and provide suggestions, from the perspective of ECRs, for moving towards open science practices, which we believe scientists and institutions at all levels would do well to consider.
Article
A Scientific Integrity Consortium developed a set of recommended principles and best practices that can be used broadly across scientific disciplines as a mechanism for consensus on scientific integrity standards and to better equip scientists to operate in a rapidly changing research environment. The two principles that represent the umbrella under which scientific processes should operate are as follows: (1) Foster a culture of integrity in the scientific process. (2) Evidence-based policy interests may have legitimate roles to play in influencing aspects of the research process, but those roles should not interfere with scientific integrity. The nine best practices for instilling scientific integrity in the implementation of these two overarching principles are (1) Require universal training in robust scientific methods, in the use of appropriate experimental design and statistics, and in responsible research practices for scientists at all levels, with the training content regularly updated and presented by qualified scientists. (2) Strengthen scientific integrity oversight and processes throughout the research continuum with a focus on training in ethics and conduct. (3) Encourage reproducibility of research through transparency. (4) Strive to establish open science as the standard operating procedure throughout the scientific enterprise. (5) Develop and implement educational tools to teach communication skills that uphold scientific integrity. (6) Strive to identify ways to further strengthen the peer review process. (7) Encourage scientific journals to publish unanticipated findings that meet standards of quality and scientific integrity. (8) Seek harmonization and implementation among journals of rapid, consistent, and transparent processes for correction and/or retraction of published papers. (9) Design rigorous and comprehensive evaluation criteria that recognize and reward the highest standards of integrity in scientific research.
Article
Being able to replicate scientific findings is crucial for scientific progress. We replicate 21 systematically selected experimental studies in the social sciences published in Nature and Science between 2010 and 2015. The replications follow analysis plans reviewed by the original authors and pre-registered prior to the replications. The replications are high powered, with sample sizes on average about five times higher than in the original studies. We find a significant effect in the same direction as the original study for 13 (62%) studies, and the effect size of the replications is on average about 50% of the original effect size. Replicability varies between 12 (57%) and 14 (67%) studies for complementary replicability indicators. Consistent with these results, the estimated true-positive rate is 67% in a Bayesian analysis. The relative effect size of true positives is estimated to be 71%, suggesting that both false positives and inflated effect sizes of true positives contribute to imperfect reproducibility. Furthermore, we find that peer beliefs of replicability are strongly related to replicability, suggesting that the research community could predict which results would replicate and that failures to replicate were not the result of chance alone.
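The roughly fivefold sample sizes reported here follow from standard power arithmetic: for a two-sample comparison, the required n per group scales approximately with 1/d², so a replication powered to detect an effect half the size of the original estimate needs about four times the original sample, and more once power is raised above the conventional 80%. The Python sketch below checks this scaling with statsmodels; the specific effect sizes and power targets are illustrative assumptions, not the study's actual design parameters.

    # Rough check that required n scales as ~1/d^2 for a two-sample t-test.
    # Effect sizes and power targets are illustrative, not taken from the study.
    from statsmodels.stats.power import TTestIndPower

    solver = TTestIndPower()
    n_original = solver.solve_power(effect_size=0.50, alpha=0.05, power=0.80)
    n_replication = solver.solve_power(effect_size=0.25, alpha=0.05, power=0.90)

    print(f"original design:    {n_original:.0f} per group")    # ~64
    print(f"replication design: {n_replication:.0f} per group") # ~337
    print(f"ratio: {n_replication / n_original:.1f}x")          # ~5x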
Article
Assessment of researchers is necessary for decisions of hiring, promotion, and tenure. A burgeoning number of scientific leaders believe the current system of faculty incentives and rewards is misaligned with the needs of society and disconnected from the evidence about the causes of the reproducibility crisis and suboptimal quality of the scientific publication record. To address this issue, particularly for the clinical and life sciences, we convened a 22-member expert panel workshop in Washington, DC, in January 2017. Twenty-two academic leaders, funders, and scientists participated in the meeting. As background for the meeting, we completed a selective literature review of 22 key documents critiquing the current incentive system. From each document, we extracted how the authors perceived the problems of assessing science and scientists, the unintended consequences of maintaining the status quo for assessing scientists, and details of their proposed solutions. The resulting table was used as a seed for participant discussion. This resulted in six principles for assessing scientists and associated research and policy implications. We hope the content of this paper will serve as a basis for establishing best practices and redesigning the current approaches to assessing scientists by the many players involved in that process.
Article
Objectives To explore the effectiveness of data sharing by randomized controlled trials (RCTs) in journals with a full data sharing policy and to describe potential difficulties encountered in the process of performing reanalyses of the primary outcomes. Design Survey of published RCTs. Setting PubMed/Medline. Eligibility criteria RCTs that had been submitted and published by The BMJ and PLOS Medicine subsequent to the adoption of data sharing policies by these journals. Main outcome measure The primary outcome was data availability, defined as the eventual receipt of complete data with clear labelling. Primary outcomes were reanalyzed to assess to what extent studies were reproduced. Difficulties encountered were described. Results 37 RCTs (21 from The BMJ and 16 from PLOS Medicine) published between 2013 and 2016 met the eligibility criteria. 17/37 (46%, 95% confidence interval 30% to 62%) satisfied the definition of data availability, and 14 of the 17 (82%, 59% to 94%) were fully reproduced on all their primary outcomes. Of the remaining RCTs, errors were identified in two, although the reanalyses reached similar conclusions, and one paper did not provide enough information in the Methods section to reproduce the analyses. Difficulties identified included problems in contacting corresponding authors and their lack of resources for preparing the datasets. In addition, data sharing practices ranged widely across study groups. Conclusions Data availability was not optimal in two journals with a strong policy for data sharing. When investigators shared data, most reanalyses largely reproduced the original results. Data sharing practices need to become more widespread and streamlined to allow meaningful reanalyses and reuse of data. Trial registration Open Science Framework osf.io/c4zke.
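The headline figure "17/37 (46%, 95% confidence interval 30% to 62%)" is a plain binomial proportion with its confidence interval and can be checked in a couple of lines. In the sketch below, the choice of the normal-approximation interval is our assumption, made because it reproduces the reported bounds; the paper does not state which method it used.

    # Check the reported 95% CI for 17 of 37 datasets judged available.
    # The 'normal' method is an assumption; the paper does not name its method.
    from statsmodels.stats.proportion import proportion_confint

    low, high = proportion_confint(count=17, nobs=37, alpha=0.05, method="normal")
    print(f"17/37 = {17/37:.0%}, 95% CI {low:.0%} to {high:.0%}")
    # -> 17/37 = 46%, 95% CI 30% to 62%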
Article
Purpose The purpose of this paper is to present the landscape of the cash-per-publication reward policy in China and reveal its trend since the late 1990s. Design/methodology/approach This study is based on an analysis of 168 university documents regarding the cash-per-publication reward policy at 100 Chinese universities. Findings Chinese universities offer cash rewards ranging from USD 30 to USD 165,000 for papers published in journals indexed by Web of Science, and the average reward amount has been increasing over the past ten years. Originality/value The cash-per-publication reward policy in China had never been systematically studied before, except in some case studies. This is the first paper to reveal the landscape of the cash-per-publication reward policy in China.
Article
Improving the reliability and efficiency of scientific research will increase the credibility of the published scientific literature and accelerate discovery. Here we argue for the adoption of measures to optimize key elements of the scientific process: methods, reporting and dissemination, reproducibility, evaluation and incentives. There is some evidence from both simulations and empirical studies supporting the likely effectiveness of these measures, but their broad adoption by researchers, institutions, funders and journals will require iterative evaluation and improvement. We discuss the goals of these measures, and how they can be implemented, in the hope that this will facilitate action toward improving the transparency, reproducibility and efficiency of scientific research.
Article
Background Sharing of participant-level clinical trial data has potential benefits, but concerns about potential harms to research participants have led some pharmaceutical sponsors and investigators to urge caution. Little is known about clinical trial participants’ perceptions of the risks of data sharing. Methods We conducted a structured survey of 771 current and recent participants from a diverse sample of clinical trials at three academic medical centers in the United States. Surveys were distributed by mail (350 completed surveys) and in clinic waiting rooms (421 completed surveys) (overall response rate, 79%). Results Less than 8% of respondents felt that the potential negative consequences of data sharing outweighed the benefits. A total of 93% were very or somewhat likely to allow their own data to be shared with university scientists, and 82% were very or somewhat likely to share with scientists in for-profit companies. Willingness to share data did not vary appreciably with the purpose for which the data would be used, with the exception that fewer participants were willing to share their data for use in litigation. The respondents’ greatest concerns were that data sharing might make others less willing to enroll in clinical trials (37% very or somewhat concerned), that data would be used for marketing purposes (34%), or that data could be stolen (30%). Less concern was expressed about discrimination (22%) and exploitation of data for profit (20%). Conclusions In our study, few clinical trial participants had strong concerns about the risks of data sharing. Provided that adequate security safeguards were in place, most participants were willing to share their data for a wide range of uses. (Funded by the Greenwall Foundation.)
Article
Progress in science relies in part on generating hypotheses with existing observations and testing hypotheses with new observations. This distinction between postdiction and prediction is appreciated conceptually but is not respected in practice. Mistaking the generation of postdictions for the testing of predictions reduces the credibility of research findings. However, ordinary biases in human reasoning, such as hindsight bias, make it hard to avoid this mistake. An effective solution is to define the research questions and analysis plan before observing the research outcomes, a process called preregistration. Preregistration distinguishes analyses and outcomes that result from predictions from those that result from postdictions. A variety of practical strategies are available to make the best possible use of preregistration in circumstances that fall short of the ideal application, such as when the data are preexisting. Services are now available for preregistration across all disciplines, facilitating a rapid increase in the practice. Widespread adoption of preregistration will sharpen the distinction between hypothesis generation and hypothesis testing and will improve the credibility of research findings.