OrgaQuant: Human Intestinal Organoid Localization and Quantification Using Deep Convolutional Neural Networks
Published in: Scientific Reports, 2019-08, Vol. 9 (1), p. 12479-7, Article 12479
Main Authors: , , ,
Format: Article
Language: English
Summary: Organoid cultures are proving to be powerful in vitro models that closely mimic the cellular constituents of their native tissue. Organoids are typically expanded and cultured in a 3D environment using either naturally derived or synthetic extracellular matrices. Assessing the morphology and growth characteristics of these cultures has been difficult due to the many imaging artifacts that accompany the corresponding images. Unlike single cell cultures, there are no reliable automated segmentation techniques that allow for the localization and quantification of organoids in their 3D culture environment. Here we describe OrgaQuant, a deep convolutional neural network implementation that can locate and quantify the size distribution of human intestinal organoids in brightfield images. OrgaQuant is an end-to-end trained neural network that requires no parameter tweaking; thus, it can be fully automated to analyze thousands of images with no user intervention. To develop OrgaQuant, we created a unique dataset of manually annotated human intestinal organoid images with bounding boxes and trained an object detection pipeline using TensorFlow. We have made the dataset, trained model and inference scripts publicly available along with detailed usage instructions.
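The summary describes an end-to-end TensorFlow object-detection pipeline whose inference scripts batch-process brightfield images with no user intervention. As a minimal sketch only, not the authors' published code: assuming a frozen inference graph exported by the TensorFlow Object Detection API, detected bounding boxes can be converted into an organoid size distribution as shown below. The model path, image path, score threshold, and microns-per-pixel calibration are illustrative placeholders; the tensor names follow the Object Detection API's standard conventions rather than anything stated in the record.

```python
"""Minimal sketch (not the authors' code): query a frozen TensorFlow
Object Detection API graph and turn its bounding boxes into an
organoid size distribution. Paths, threshold, and calibration are
illustrative placeholders."""
import numpy as np
import tensorflow as tf
from PIL import Image

MODEL_PATH = "frozen_inference_graph.pb"  # placeholder model path
SCORE_THRESHOLD = 0.5                     # placeholder confidence cutoff
MICRONS_PER_PIXEL = 1.0                   # placeholder camera calibration

# Load the frozen graph once; the same session can then score
# thousands of images with no per-image parameter tweaking.
graph = tf.Graph()
with graph.as_default():
    graph_def = tf.compat.v1.GraphDef()
    with tf.io.gfile.GFile(MODEL_PATH, "rb") as f:
        graph_def.ParseFromString(f.read())
    tf.compat.v1.import_graph_def(graph_def, name="")


def organoid_diameters_um(image_path, sess):
    """Estimate organoid diameters (in µm) for one brightfield image."""
    image = np.array(Image.open(image_path).convert("RGB"))
    h, w = image.shape[:2]
    # Standard TF Object Detection API output/input tensor names.
    boxes, scores = sess.run(
        ["detection_boxes:0", "detection_scores:0"],
        feed_dict={"image_tensor:0": image[None, ...]},
    )
    ymin, xmin, ymax, xmax = boxes[0][scores[0] >= SCORE_THRESHOLD].T
    # Boxes are in normalized coordinates; approximate each organoid's
    # diameter by the mean of its box height and width in pixels.
    side_px = ((ymax - ymin) * h + (xmax - xmin) * w) / 2.0
    return side_px * MICRONS_PER_PIXEL


with tf.compat.v1.Session(graph=graph) as sess:
    diameters = organoid_diameters_um("organoid_culture.jpg", sess)
    print(f"{diameters.size} organoids, median {np.median(diameters):.1f} µm")
```

Here each organoid's diameter is approximated by the mean side length of its bounding box, matching the size-distribution readout the summary describes; the actual OrgaQuant dataset, trained model, and inference scripts are publicly available from the authors with detailed usage instructions.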
ISSN: 2045-2322
DOI: 10.1038/s41598-019-48874-y