FEDmix: Evolutionary Deep Neural Network Mixture Learning from Distributed Data for Robust Automated Medical Image Analysis
Automated Medical Image Analysis (MIA) has the potential to truly innovate clinical practice by offering solutions to routine yet key tasks, such as segmentation (i.e., delineating organs).
Recent advances in machine learning (ML), in particular in Deep Neural Networks (DNNs), have led to an explosive growth of successful MIA studies reported in the academic literature, so the time appears right for such innovations to find widespread real-world uptake. Yet these tasks are still often performed manually in daily clinical practice, a labor-intensive process.
In this proposal, we integrate DNNs with other state-of-the-art computational intelligence techniques, in particular evolutionary algorithms (EAs), to overcome two key obstacles on the path toward widespread clinical uptake of computationally intelligent MIA techniques: 1) observer variation in the definition of a ground truth, and 2) image quality variation due to different acquisition protocols and scanners at different institutes. In particular, we design and develop efficient-computing-compatible implementations of mixtures of DNNs, the results of which can be fused with results learned from other data sets (i.e., from different institutes).
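The fusion step described above can be illustrated with a minimal sketch. The function name, array shapes, and the weighted-averaging rule below are illustrative assumptions, not the project's actual method: it simply averages per-voxel class probabilities from several segmentation models (e.g., DNNs trained at different institutes), with optional per-model weights such as those an EA might evolve.

```python
import numpy as np

def fuse_segmentations(prob_maps, weights=None):
    """Fuse per-voxel class probabilities from several models by
    weighted averaging, then return the per-voxel label map.

    prob_maps: list of arrays, each of shape (num_classes, H, W)
    weights:   optional per-model weights (hypothetically, evolved by an EA);
               defaults to a uniform average
    """
    stacked = np.stack(prob_maps)  # (num_models, num_classes, H, W)
    if weights is None:
        weights = np.full(len(prob_maps), 1.0 / len(prob_maps))
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalize to sum to 1
    # Weighted mean over the model axis:
    fused = np.tensordot(weights, stacked, axes=1)  # (num_classes, H, W)
    return fused.argmax(axis=0)  # per-voxel class labels

# Toy example: two "institute models" on a 2x2 image with 2 classes.
a = np.array([[[0.9, 0.2], [0.6, 0.1]],
              [[0.1, 0.8], [0.4, 0.9]]])
b = np.array([[[0.7, 0.6], [0.2, 0.3]],
              [[0.3, 0.4], [0.8, 0.7]]])
labels = fuse_segmentations([a, b])  # -> [[0, 1], [1, 1]]
```

Averaging probabilities (rather than hard labels) lets a confident model outvote an uncertain one, which is one simple way that results learned at different institutes could be combined.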
To maintain sufficient focus while doing so, we consider an elementary but key MIA task: segmentation. Moreover, through an application in radiotherapy treatment planning, in collaboration with the Academic Medical Center in Amsterdam, we validate the newly developed technology on real-world patient data within the runtime of the proposed project.
This call is closed. The projects have been awarded; the current projects are listed at the bottom of this page.