Imaging techniques enable a detailed look inside an organism. But interpreting the data is time-consuming and requires a great deal of experience. Artificial neural networks open up new possibilities: they need just seconds to interpret whole-body scans of mice and to segment and depict the organs in color instead of in various shades of gray, which considerably facilitates the analysis.
How big is the liver? Does it change if medication is taken? Is the kidney inflamed? Is there a tumor in the brain, and have metastases already developed? To answer such questions, bioscientists and doctors have so far had to screen and interpret a wealth of data.
“The analysis of three-dimensional imaging processes is very complicated,” explains Oliver Schoppe. Together with an interdisciplinary research team, the TUM researcher has now developed self-learning algorithms that will help analyze bioscientific image data in the future.
At the core of the AIMOS software (the name stands for AI-based Mouse Organ Segmentation) are artificial neural networks that, like the human brain, are capable of learning. “You used to have to tell computer programs exactly what you wanted them to do,” says Schoppe. “Neural networks don’t need such instructions: it’s sufficient to train them by presenting a problem and a solution multiple times. Gradually, the algorithms start to recognize the relevant patterns and are able to find the right solutions themselves.”
Training self-learning algorithms
In the AIMOS project, the algorithms were trained with the help of images of mice. The objective was to assign the image points from 3-D whole-body scans to specific organs, such as the stomach, kidneys, liver, spleen, or brain. Based on this assignment, the program can then display the exact position and shape of each organ.
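In machine-learning terms, this is a supervised voxel-classification task: every point in the scan receives an organ label, and the network is trained on scans that experts have already labeled. The sketch below illustrates the principle in PyTorch. The toy architecture, tensor shapes, and the train_step helper are illustrative assumptions, not the AIMOS implementation; segmentation networks in this field are typically more elaborate U-Net variants.

```python
# Minimal sketch of supervised voxel labelling, assuming a toy fully
# convolutional network. This is NOT the AIMOS code; names and shapes
# are illustrative.
import torch
import torch.nn as nn

NUM_CLASSES = 6  # e.g. background, stomach, kidneys, liver, spleen, brain

# Maps a 3-D grayscale scan to a per-voxel score for each organ class.
model = nn.Sequential(
    nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv3d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv3d(16, NUM_CLASSES, kernel_size=1),  # per-voxel class scores
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()  # compares predictions to expert labels

def train_step(scan, expert_labels):
    """scan: (1, 1, D, H, W) float tensor; expert_labels: (1, D, H, W) int64 tensor."""
    optimizer.zero_grad()
    scores = model(scan)                   # (1, NUM_CLASSES, D, H, W)
    loss = loss_fn(scores, expert_labels)  # penalize wrongly labeled voxels
    loss.backward()                        # nudge weights toward the expert answer
    optimizer.step()
    return loss.item()

# "Presenting a problem and a solution multiple times," as the article puts it:
# for scan, labels in annotated_scans: train_step(scan, labels)
```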
“We were lucky enough to have access to several hundred images of mice from a different research project, all of which had already been interpreted by two biologists,” recalls Schoppe. The team also had access to fluorescence microscopic 3-D scans from the Institute for Tissue Engineering and Regenerative Medicine at the Helmholtz Zentrum München.
Using a special technique, the researchers were able to completely remove the pigments from deceased mice. The transparent bodies could then be imaged with a microscope, step by step and layer by layer. The distance between measuring points was just six micrometers, equivalent to the size of a cell. Biologists had localized the organs in these datasets as well.
Artificial intelligence improves accuracy
At TranslaTUM, the computer scientists presented the data to their new algorithms. And these learned faster than expected, Schoppe reports: “We only needed around ten whole-body scans before the software was able to successfully analyze the image data on its own, and within a matter of seconds. It takes a human hours to do this.”
The team then checked the reliability of the artificial intelligence with the help of 200 further whole-body scans of mice. “The result shows that self-learning algorithms are not only faster at analyzing biological image data than humans, but also more accurate,” sums up Professor Bjoern Menze, head of the Image-Based Biomedical Modeling group at TranslaTUM at the Technical University of Munich.
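A common way to quantify such agreement between an algorithm’s segmentation and a human annotation is a voxel-overlap measure such as the Dice coefficient. The helper below is an illustrative sketch of that kind of check, not the evaluation code from the study.

```python
# Hedged sketch: comparing an automatic segmentation with a human
# annotation via the Dice coefficient (voxel overlap). Illustrative
# only; not the study's evaluation pipeline.
import numpy as np

def dice_score(pred, truth, organ_id):
    """Dice overlap for one organ label; 1.0 means perfect agreement."""
    p = (pred == organ_id)   # voxels the algorithm assigned to this organ
    t = (truth == organ_id)  # voxels the biologists assigned to it
    denom = p.sum() + t.sum()
    if denom == 0:           # organ absent in both masks: treat as perfect
        return 1.0
    return 2.0 * np.logical_and(p, t).sum() / denom

# Example: compare two label volumes of the same shape.
pred = np.random.randint(0, 6, size=(4, 4, 4))
truth = np.random.randint(0, 6, size=(4, 4, 4))
print(dice_score(pred, truth, organ_id=3))
```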
The intelligent software is to be used in the future primarily in basic research: “Images of mice are vital, for example, for investigating the effects of new medications before they are given to humans. Using self-learning algorithms to analyze this image data will save a lot of time in the future,” emphasizes Menze.
Oliver Schoppe et al., “Deep learning-enabled multi-organ segmentation in whole-body mouse scans,” Nature Communications (2020). DOI: 10.1038/s41467-020-19449-7