Publication record


Publication date

May 2019

Journal

IEEE Transactions on Medical Imaging

Authors

Identified members of Cancéropôle Est:
Prof. WEMMERT Cédric


All authors:
Grote A, Schaadt NS, Forestier G, Wemmert C, Feuerhake F

Abstract

Crowdsourcing in pathology has so far been applied to tasks assumed to be manageable by non-experts. Demand remains high for annotations of more complex elements in digital microscopic images, such as anatomical structures. This paper therefore investigates conditions that enable crowdsourced annotation of high-level image objects, a complex task generally considered to require expert knowledge. Seventy-six medical students without specific domain knowledge voluntarily participated in three experiments and solved two relevant annotation tasks on histopathological images: 1) labeling of images showing tissue regions and 2) delineation of morphologically defined image objects. We focus on methods to ensure sufficient annotation quality, including several tests on the required number of participants and on the correlation of participants' performance across tasks. In a setup simulating the annotation of images with limited ground truth, we validated the feasibility of a confidence score using full ground truth. For this, we computed a majority vote with weighting factors based on the individual assessment of contributors against a scattered gold standard annotated by pathologists. In conclusion, we provide guidance on task design and quality control to enable a crowdsourced approach to obtaining the accurate annotations required in the era of digital pathology.
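The weighted majority vote described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the function names (`contributor_weights`, `weighted_majority_vote`) and the data layout (per-contributor label dictionaries, a small gold-standard subset) are hypothetical, assuming weights are simply each contributor's accuracy on the gold-standard images:

```python
from collections import defaultdict

def contributor_weights(gold_labels, contributions):
    """Weight each contributor by accuracy on the gold-standard subset.

    gold_labels: {image_id: label} for the scattered gold standard.
    contributions: {contributor_id: {image_id: label}}.
    """
    weights = {}
    for contributor, labels in contributions.items():
        scored = [(img, lab) for img, lab in labels.items() if img in gold_labels]
        if not scored:
            weights[contributor] = 0.0  # no overlap with gold standard
            continue
        correct = sum(1 for img, lab in scored if lab == gold_labels[img])
        weights[contributor] = correct / len(scored)
    return weights

def weighted_majority_vote(contributions, weights):
    """Aggregate a consensus label per image using contributor weights."""
    votes = defaultdict(lambda: defaultdict(float))
    for contributor, labels in contributions.items():
        w = weights.get(contributor, 0.0)
        for img, lab in labels.items():
            votes[img][lab] += w
    # Pick the label with the highest accumulated weight for each image.
    return {img: max(tally, key=tally.get) for img, tally in votes.items()}

# Toy example: contributor "b" disagrees with the gold standard and is
# therefore down-weighted to zero in the final vote.
gold = {"img1": "tumor"}
contribs = {
    "a": {"img1": "tumor", "img2": "tumor"},
    "b": {"img1": "stroma", "img2": "stroma"},
    "c": {"img1": "tumor", "img2": "tumor"},
}
w = contributor_weights(gold, contribs)
consensus = weighted_majority_vote(contribs, w)
```

Under this scheme, contributors who match the pathologists' gold standard dominate the vote on the remaining, unlabeled images; a confidence score could then be derived from the margin between the winning label's weight and the runner-up's.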

Keywords

Crowdsourcing/methods; Decision Making/physiology; Feasibility Studies; Histocytochemistry/classification; Humans; Image Processing, Computer-Assisted; Reproducibility of Results; Students, Medical

Reference

IEEE Trans Med Imaging. 2019 May;38(5):1284-1294.