
Evaluation of a Learning-based Deformable Registration Method on Abdominal CT Images - 25/04/20

DOI: 10.1016/j.irbm.2020.04.002
R. Bhattacharjee a, F. Heitz b, c, V. Noblet c, d, S. Sharma a, N. Sharma a
a School of Biomedical Engineering, Indian Institute of Technology (B.H.U.), Varanasi, 221005, India 
b Signal and Image Processing, Telecom Physique Strasbourg, France 
c IMages, leArning, Geometry and Statistics, Icube Laboratory, University of Strasbourg, 67412 Illkirch CEDEX, France 
d Icube Laboratory, University of Strasbourg, CNRS, 67412 Illkirch CEDEX, France 

Corresponding author.
In press. Proofs corrected by the author. Available online since Saturday, 25 April 2020.

Abstract

Background

Reliable image comparison, based on fast and accurate deformable registration methods, is recognized as a key step in the diagnosis and follow-up of cancer, as well as in radiation therapy planning and surgery. In the particular case of abdominal images, the images to compare often differ widely from each other due to organ deformation, patient motion, movements of the gastrointestinal tract, or breathing. As a consequence, there is a need for registration methods that can cope with large, highly non-linear deformations, both local and global.

Method

Deformable registration of medical images traditionally relies on the iterative minimization of a cost function involving a large number of parameters. For complex deformations and large datasets, this process is computationally very demanding, leading to processing times that are incompatible with the clinical routine workflow. Moreover, the highly non-convex nature of these optimization problems entails a high risk of convergence toward local minima. Recently, deep learning approaches using Convolutional Neural Networks (CNN) have led to major breakthroughs by providing computationally fast unsupervised methods that register 2D and 3D images within seconds. Among the proposed approaches, the VoxelMorph framework pioneered the unsupervised learning of the complex mapping, parameterized by a CNN, between each pair of 2D or 3D images and the corresponding deformation field, by minimizing a standard intensity-based similarity metric over the whole learning database. VoxelMorph has so far only been evaluated on brain images. The present study proposes to evaluate this method in the context of inter-subject registration of abdominal CT images, which are more challenging to register than brain images due to greater anatomical variability and significant organ deformations.
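The unsupervised objective described above — an intensity-based similarity term plus a regularizer that encourages smooth deformation fields — can be illustrated with a minimal NumPy sketch. This is a toy 2D stand-in, not the actual VoxelMorph implementation: the bilinear `warp`, the gradient-based smoothness penalty, and the `lam` weight are simplified assumptions for illustration only.

```python
import numpy as np

def warp(moving, flow):
    """Warp a 2D image by a dense displacement field flow of shape (2, H, W),
    using bilinear interpolation with edge clamping."""
    h, w = moving.shape
    gy, gx = np.mgrid[0:h, 0:w].astype(float)
    y = np.clip(gy + flow[0], 0, h - 1)
    x = np.clip(gx + flow[1], 0, w - 1)
    y0 = np.floor(y).astype(int); x0 = np.floor(x).astype(int)
    y1 = np.minimum(y0 + 1, h - 1); x1 = np.minimum(x0 + 1, w - 1)
    wy = y - y0; wx = x - x0
    top = moving[y0, x0] * (1 - wx) + moving[y0, x1] * wx
    bot = moving[y1, x0] * (1 - wx) + moving[y1, x1] * wx
    return top * (1 - wy) + bot * wy

def voxelmorph_style_loss(fixed, moving, flow, lam=0.01):
    """MSE similarity between fixed and warped moving image, plus a
    smoothness penalty on the spatial gradients of the flow."""
    warped = warp(moving, flow)
    sim = np.mean((fixed - warped) ** 2)
    dy = np.diff(flow, axis=1)
    dx = np.diff(flow, axis=2)
    smooth = np.mean(dy ** 2) + np.mean(dx ** 2)
    return sim + lam * smooth
```

In the learning-based setting, a CNN predicts `flow` from the image pair and this loss is minimized over the whole training database, so that registration at test time is a single fast forward pass rather than a per-pair optimization.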

Results

The performance of VoxelMorph was compared with the current top-performing non-learning-based deformable registration method, Symmetric Normalization (SyN), implemented in ANTs, on two representative databases: LiTS and 3D-IRCADb-01. Three experiments were carried out on 2D or 3D data, using atlas-based or pairwise registration, with two different similarity metrics, namely mean squared error (MSE) and cross-correlation (CC). Registration accuracy was measured by the Dice score, which quantifies the volume overlap for the selected anatomical region.
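The Dice score used here measures the overlap of two binary segmentation masks A and B as 2|A∩B| / (|A| + |B|), ranging from 0 (no overlap) to 1 (perfect overlap). A minimal implementation might look like the following sketch (the function name and the empty-mask convention are our own choices, not taken from the paper):

```python
import numpy as np

def dice_score(seg_a, seg_b):
    """Dice coefficient 2|A∩B| / (|A| + |B|) for two binary masks."""
    a = np.asarray(seg_a).astype(bool)
    b = np.asarray(seg_b).astype(bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        # Both masks empty: treat as a perfect match by convention.
        return 1.0
    return 2.0 * np.logical_and(a, b).sum() / denom
```

For example, two 8-pixel masks sharing 4 pixels yield a Dice score of 2·4/16 = 0.5.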

All three experiments show that the two deformable registration methods significantly outperform affine registration, and that the accuracy of VoxelMorph is comparable to, or even better than, that of the reference non-learning-based registration method ANTs (SyN), with a drastically reduced computation time.

Conclusion

By replacing a time-consuming optimization problem with a trained registration function, VoxelMorph represents an outstanding achievement among learning-based registration algorithms: it performs deformable registration of abdominal images with near-equivalent accuracy while reducing the computation time, in comparison to ANTs (SyN) on a CPU, from minutes to seconds and from seconds to milliseconds.

The full text of this article is available as a PDF.

Graphical abstract


Highlights

Deformable registration of abdominal images is performed using a deep learning technique.
Three experiments are carried out: 2D atlas-based, 2D pairwise, and 3D pairwise registration.
Inter-subject registration results are competitive, with a reduced runtime.


Keywords: Deformable image registration, Abdominal CT image, Unsupervised convolutional neural networks, VoxelMorph


© 2020 AGBM. Published by Elsevier Masson SAS. All rights reserved.