Document Type

Article

Publication Date

4-23-2024

Keywords

JMG, Animals, Mice, Guinea Pigs, Humans, Rats, Swine, Cochlea, Hair Cells, Auditory, Microscopy, Fluorescence, Machine Learning

JAX Source

Sci Data. 2024;11(1):416.

ISSN

2052-4463

PMID

38653806

DOI

https://doi.org/10.1038/s41597-024-03218-y

Abstract

Our sense of hearing is mediated by cochlear hair cells, of which there are two types organized in one row of inner hair cells and three rows of outer hair cells. Each cochlea contains 5,000-15,000 terminally differentiated hair cells, and their survival is essential for hearing as they do not regenerate after insult. It is often desirable in hearing research to quantify the number of hair cells within cochlear samples, both in pathological conditions and in response to treatment. Machine learning can be used to automate the quantification process but requires a vast and diverse dataset for effective training. In this study, we present a large collection of annotated cochlear hair-cell datasets, labeled with commonly used hair-cell markers and imaged using various fluorescence microscopy techniques. The collection includes samples from mouse, rat, guinea pig, pig, primate, and human cochlear tissue, from normal conditions and following in-vivo and in-vitro ototoxic drug application. The dataset includes over 107,000 hair cells that have been identified and annotated as either inner or outer hair cells. This dataset is the result of a collaborative effort from multiple laboratories and has been carefully curated to represent a variety of imaging techniques. With suggested usage parameters and a well-described annotation procedure, this collection can facilitate the development of generalizable cochlear hair-cell detection models or serve as a starting point for fine-tuning models for other analysis tasks. By providing this dataset, we aim to give other hearing research groups the opportunity to develop their own tools with which to analyze cochlear imaging data more fully, accurately, and with greater ease.

Comments

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0
