nnU-Net (v2) is a deep learning-based segmentation method that automatically configures itself, including preprocessing, network architecture, training and post-processing for any new task in the biomedical domain.
The key design choices in this process are modeled as a set of fixed parameters, interdependent rules and empirical decisions. Without manual intervention, nnU-Net surpasses most existing approaches, including highly specialized solutions, on 23 public datasets used in international biomedical segmentation competitions. Pretrained models for all datasets used in the model’s reference evaluation study are available for download from Zenodo. The nnU-Net model is publicly available on GitHub as an out-of-the-box tool, making state-of-the-art segmentation accessible to a broad audience by requiring neither expert knowledge nor computing resources beyond those needed for standard network training.
Description: The figure below, obtained from the model’s reference publication, shows exemplary segmentation results generated by nnU-Net for a variety of datasets:

The nnU-Net model can automatically adapt to new datasets. The figure below shows how nnU-Net systematically addresses the configuration of the entire segmentation pipeline and provides a visualization and description of the most relevant design choices.

With this model, its developers aimed to outline a new path between the status quo of primarily expert-driven method configuration in biomedical segmentation on the one hand and primarily data-driven AutoML approaches on the other. Specifically, they defined a recipe that systematizes the configuration process on a task-agnostic level and drastically reduces the search space for empirical design choices when a new task is given:
- Collect design decisions that do not require adaptation between datasets and identify a robust common configuration (‘fixed parameters’).
- For as many of the remaining decisions as possible, formulate explicit dependencies between specific dataset properties (‘dataset fingerprint’) and design choices (‘pipeline fingerprint’) in the form of heuristic rules to allow for almost-instant adaptation on application (‘rule-based parameters’).
- Learn only the remaining decisions empirically from the data (‘empirical parameters’).
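The rule-based step above can be illustrated with a small sketch. This is not nnU-Net’s actual implementation; the fingerprint structure, the anisotropy threshold and the percentile choice below are simplified, hypothetical stand-ins for the heuristics described in the reference publication (e.g. resampling all cases to the median voxel spacing, with special handling of strongly anisotropic data).

```python
import statistics

def dataset_fingerprint(spacings):
    """Hypothetical 'dataset fingerprint': raw per-axis voxel spacings
    collected over all training cases (each case is an (x, y, z) tuple)."""
    return {"spacings_per_axis": [list(axis) for axis in zip(*spacings)]}

def rule_based_target_spacing(fingerprint, anisotropy_threshold=3.0):
    """Heuristic rule mapping the dataset fingerprint to one entry of the
    'pipeline fingerprint': resample all cases to the median spacing, but
    for a strongly anisotropic axis use a lower percentile so that
    thick-slice data is not resampled too aggressively."""
    axes = fingerprint["spacings_per_axis"]
    target = [statistics.median(axis) for axis in axes]
    if max(target) / min(target) > anisotropy_threshold:
        coarse = target.index(max(target))
        # 10th percentile of the coarse axis across cases (illustrative choice)
        target[coarse] = statistics.quantiles(axes[coarse], n=10,
                                              method="inclusive")[0]
    return target

# Example: thick-slice CT-like spacings; the z-axis rule kicks in.
fp = dataset_fingerprint([(0.8, 0.8, 5.0), (0.7, 0.7, 5.0), (0.9, 0.9, 4.0)])
target_spacing = rule_based_target_spacing(fp)
```

Because such rules are pure functions of the dataset fingerprint, they can be evaluated almost instantly on a new dataset, which is what makes this part of the configuration essentially free compared with empirical search.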
This recipe, as implemented in the nnU-Net model, was validated on the ten datasets provided by the Medical Segmentation Decathlon. The resulting segmentation method (nnU-Net) is able to perform automated configuration for arbitrary new datasets. In contrast to existing research methods:
- nnU-Net is holistic, that is, its automated configuration covers the entire segmentation pipeline (including essential topological parameters of the network architecture) without any manual decisions.
- nnU-Net is fast; its automated configuration comprises a simple execution of rules plus only a few empirical choices, thus requiring virtually no compute resources beyond standard model training.
- nnU-Net is data efficient; encoding design choices based on a large and diverse data pool serves as a strong inductive bias for application to datasets with limited training data.
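The “few empirical choices” mentioned above can be sketched as a selection over candidate pipeline configurations based on cross-validation performance. The sketch below is illustrative only: the configuration names and Dice scores are hypothetical, and the real selection procedure in nnU-Net also considers ensembles of configurations.

```python
import statistics

def select_empirical_configuration(cv_dice_per_config):
    """Pick the candidate configuration with the best mean
    cross-validation Dice score (one score per fold)."""
    best = max(cv_dice_per_config,
               key=lambda cfg: statistics.mean(cv_dice_per_config[cfg]))
    return best, statistics.mean(cv_dice_per_config[best])

# Hypothetical 5-fold cross-validation Dice scores per candidate pipeline.
scores = {
    "2d":         [0.85, 0.84, 0.86, 0.85, 0.83],
    "3d_fullres": [0.88, 0.89, 0.87, 0.90, 0.88],
    "3d_cascade": [0.88, 0.88, 0.87, 0.89, 0.88],
}
best_cfg, best_dice = select_empirical_configuration(scores)
```

Since only a handful of candidates need to be trained and compared, this empirical step stays within the budget of ordinary model training rather than requiring a large-scale architecture search.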
The general applicability of nnU-Net’s automated configuration is demonstrated on 13 additional datasets in the model’s reference publication. Altogether, results are reported on 53 segmentation tasks, covering an unprecedented diversity of target structures, image types and image properties. As an open-source tool, nnU-Net can simply be trained out of the box to generate state-of-the-art segmentations.
Data Availability: All 23 datasets used in the model’s reference evaluation study are publicly available and can be accessed via their respective challenge websites as follows:
- D1–D10 Medical Segmentation Decathlon;
- D11 Beyond the Cranial Vault (BCV)-Abdomen;
- D12 PROMISE12;
- D13 ACDC;
- D14 LiTS;
- D15 MSLes;
- D16 CHAOS;
- D17 KiTS;
- D18 SegTHOR;
- D19 CREMI;
- D20–D23 Cell Tracking Challenge.
Results: nnU-Net outperformed specialized pipelines in a range of diverse tasks. The figure below provides an overview of the quantitative results achieved by nnU-Net and the competing challenge teams across all 53 segmentation tasks. Despite its generic nature, nnU-Net outperforms most existing segmentation solutions, even though the latter were specifically optimized for the respective task. Overall, nnU-Net sets a new state of the art on 33 of 53 target structures and otherwise shows performance on par with or close to the top leaderboard entries.
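The segmentation quality underlying these comparisons is, in the reference publication, largely reported as the Dice similarity coefficient. As a reminder of what that metric measures, here is a minimal sketch for binary masks represented as flat 0/1 sequences (an illustrative simplification; real evaluations operate on multi-dimensional, multi-class label volumes):

```python
def dice_coefficient(pred, target):
    """Dice similarity coefficient between two binary masks,
    given as equal-length sequences of 0/1 labels.
    Ranges from 0.0 (no overlap) to 1.0 (perfect overlap)."""
    assert len(pred) == len(target)
    intersection = sum(p * t for p, t in zip(pred, target))
    denom = sum(pred) + sum(target)
    if denom == 0:
        return 1.0  # both masks empty: treated as perfect agreement
    return 2.0 * intersection / denom

# One overlapping voxel out of two predicted and one reference voxel:
score = dice_coefficient([1, 1, 0, 0], [1, 0, 0, 0])  # 2*1 / (2+1)
```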

Release Notes for nnU-Net V2: The core of the old nnU-Net (nnU-Net v1) was developed in a short time period while participating in the Medical Segmentation Decathlon challenge in 2018. Consequently, code structure and quality were not the best. Many features were added later on and didn’t quite fit into the nnU-Net design principles.
nnU-Net V2, on the other hand, is a complete overhaul. The “delete everything and start again” kind. So everything is better (in the author’s opinion). While the segmentation performance remains the same, a lot of cool stuff has been added. It is now also much easier to use as a development framework and to manually fine-tune its configuration for new datasets. A big driver for the reimplementation was also the emergence of Helmholtz Imaging, prompting the developers to extend nnU-Net to more image formats and domains, as highlighted here.
Acknowledgments: nnU-Net is developed and maintained by the Applied Computer Vision Lab (ACVL) of Helmholtz Imaging and the Division of Medical Image Computing at the German Cancer Research Center (DKFZ).


Claim: The nnU-Net model sets a new state of the art in various semantic segmentation challenges and displays strong generalization characteristics, requiring neither expert knowledge nor compute resources beyond standard network training. As indicated by Litjens et al. and quantitatively confirmed by the results presented in the model’s reference publication, method configuration in biomedical imaging used to be considered a “highly empirical exercise”, for which “no clear recipe can be given”. Based on the recipe described above, nnU-Net is able to automate this often insufficiently systematic and cumbersome procedure and may thus help alleviate this burden. nnU-Net is meant to be leveraged as an out-of-the-box tool for state-of-the-art segmentation, as a standardized and dataset-agnostic baseline for comparison, and as a framework for the large-scale evaluation of novel ideas without manual effort.
