
dc.contributor.advisor: Lai, Sue M
dc.contributor.author: Fraga, Garth
dc.date.accessioned: 2021-07-20T20:51:08Z
dc.date.available: 2021-07-20T20:51:08Z
dc.date.issued: 2020-05-31
dc.date.submitted: 2020
dc.identifier.other: http://dissertations.umi.com/ku:17074
dc.identifier.uri: http://hdl.handle.net/1808/31745
dc.description.abstract: Diagnostic accuracy studies compare the performance of an index test against a reference standard test. The Standards for Reporting of Diagnostic Accuracy Studies (STARD) is a checklist of items that should be reported in scientific studies to improve completeness and transparency. The primary objective of this project was to calculate the frequency of STARD items reported in diagnostic accuracy studies from the 2017-2018 pathology scientific literature. Two raters independently scored 171 articles for compliance in reporting 34 STARD items. There was excellent inter-rater reliability (Cohen kappa coefficient = 0.8773). The mean number of STARD-recommended items reported was 15.44 ± 3.59, with a range of 4-28 out of a maximum possible score of 34. Excluding not-applicable items such as test-induced adverse events, overall adherence to STARD reporting recommendations was 50%. There was substantial variation in individual item reporting, with > 75% reporting of 8/34 items and < 25% reporting of 11/34 items. Less than 10% of the articles included pre-specified hypotheses, rationale for choice of the reference standard, subgroup analyses for confounding, sample size calculations, subject flow diagrams, time intervals and/or interventions between index and reference tests, adverse events caused by testing, study registration numbers, or links to full study protocols. Significantly more items were reported in articles from journals that encouraged STARD usage in their author guidelines (16.15 vs. 14.84, P = 0.0165). The frequency of STARD item reporting was independent of journal impact factor, article citation count, ICMJE reporting standards endorsement, anatomic/clinical pathology disciplines, and pathology subspecialty. These findings demonstrate variable compliance with STARD 2015 reporting recommendations in the recent pathology scientific literature. Requiring authors to submit completed STARD checklists during manuscript submission might improve compliance.
dc.format.extent: 47 pages
dc.language.iso: en
dc.publisher: University of Kansas
dc.rights: Copyright held by the author.
dc.subject: Pathology
dc.subject: Library science
dc.subject: Public policy
dc.subject: Diagnostic test
dc.subject: Reporting guidelines
dc.subject: Standards for Reporting of Diagnostic Accuracy Studies (STARD)
dc.title: Adherence to STARD 2015 Reporting Recommendations in Pathology
dc.type: Thesis
dc.contributor.cmtemember: Plapp, Fred V
dc.contributor.cmtemember: Vukas, Rachel R
dc.thesis.degreeDiscipline: Preventive Medicine and Public Health
dc.thesis.degreeLevel: M.S.
dc.identifier.orcid: https://orcid.org/0000-0001-5567-0165
dc.rights.accessrights: openAccess
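
The inter-rater reliability cited in the abstract above (Cohen kappa = 0.8773) measures agreement between the two raters beyond chance. As a rough illustration only, the sketch below shows how such a kappa could be computed from two raters' item-level compliance scores; the rater arrays and the use of scikit-learn's cohen_kappa_score are assumptions for illustration, not the thesis's actual data or methods.

```python
# Hypothetical sketch (not code from the thesis): Cohen's kappa for two raters'
# item-level compliance judgments, analogous to the inter-rater reliability
# reported in the abstract.
from sklearn.metrics import cohen_kappa_score

# Illustrative placeholder data: 1 = STARD item reported, 0 = not reported,
# one entry per (article, item) judgment for each rater.
rater_a = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
rater_b = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.4f}")
```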

