Learning Deep Similarity Metric for 3D MR-TRUS Registration

Purpose: The fusion of transrectal ultrasound (TRUS) and magnetic resonance (MR) images for guiding targeted prostate biopsy has significantly improved the biopsy yield of aggressive cancers. A key component of MR-TRUS fusion is image registration. However, it is very challenging to obtain a robust automatic MR-TRUS registration due to the large appearance difference between the two imaging modalities. The work presented in this paper aims to tackle this problem by addressing two challenges: (i) the definition of a suitable similarity metric and (ii) the determination of a suitable optimization strategy. 

Methods: This work proposes the use of a deep convolutional neural network to learn a suitable similarity metric. We also use a composite optimization strategy to explore the solution space when optimizing the learned metric for image registration. A rough sketch of both ideas follows below.
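
To illustrate the first idea, the sketch below shows a small patch-based 3D CNN in PyTorch that takes an MR/TRUS patch pair stacked as channels and outputs a scalar alignment score. The architecture, layer sizes, and naming are illustrative assumptions for this summary, not the network described in the paper.

```python
import torch
import torch.nn as nn

class SimilarityCNN(nn.Module):
    """Illustrative 3D CNN that scores how well an MR/TRUS patch pair is aligned.

    The two patches are stacked as channels; the network outputs a single
    scalar similarity value. Layer sizes are placeholders, not the paper's.
    """

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(2, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
        )
        self.head = nn.Linear(64, 1)

    def forward(self, mr_patch, trus_patch):
        # mr_patch, trus_patch: (batch, 1, D, H, W) volumes
        x = torch.cat([mr_patch, trus_patch], dim=1)
        x = self.features(x).flatten(1)
        return self.head(x)  # by convention here, higher score = better alignment
```

Such a network would be trained on patch pairs with known alignment quality, so that its output can stand in for a hand-crafted similarity measure such as mutual information.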

Results: The learned similarity metric outperforms mutual information, and the results indicate that the overall registration framework has a large capture range. The proposed deep similarity metric based approach achieved a mean target registration error (TRE) of 4.24 mm (with an initial TRE of 16 mm) for this challenging problem.

Conclusion: The metric learned with deep learning can be used to assess the quality of any given image registration and, in conjunction with the aforementioned optimization framework, can drive automatic registration that is robust to poor initialization.
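
To show how such a learned metric can plug into an optimizer, the sketch below reuses the SimilarityCNN defined above as the scoring function inside a derivative-free search over rigid transform parameters. The rigid_resample helper, the differential-evolution search, the parameter bounds, and the toy volumes are illustrative stand-ins, not the paper's composite optimization strategy.

```python
import numpy as np
import torch
from scipy.ndimage import affine_transform
from scipy.optimize import differential_evolution

def rigid_resample(volume, params):
    """Apply a rigid transform (rx, ry, rz in radians; tx, ty, tz in voxels)."""
    rx, ry, rz, tx, ty, tz = params
    cx, cy, cz = np.cos([rx, ry, rz])
    sx, sy, sz = np.sin([rx, ry, rz])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return affine_transform(volume, Rz @ Ry @ Rx, offset=[tx, ty, tz], order=1)

def negative_similarity(params, metric_net, mr, trus):
    """Objective: negated learned similarity between warped MR and fixed TRUS."""
    warped = rigid_resample(mr, params)
    mr_t = torch.from_numpy(warped).float()[None, None]
    trus_t = torch.from_numpy(trus).float()[None, None]
    with torch.no_grad():
        score = metric_net(mr_t, trus_t)
    return -float(score)  # the optimizer minimizes, so negate the similarity

# Toy volumes and a simple derivative-free global search; this stands in for
# the paper's composite optimization strategy and its large capture range.
mr = np.random.rand(32, 32, 32).astype(np.float32)
trus = np.random.rand(32, 32, 32).astype(np.float32)
metric_net = SimilarityCNN().eval()                   # from the sketch above
bounds = [(-0.3, 0.3)] * 3 + [(-8.0, 8.0)] * 3        # rotations (rad), translations (voxels)
result = differential_evolution(
    negative_similarity, bounds, args=(metric_net, mr, trus), maxiter=10, popsize=8,
)
print("estimated rigid parameters:", result.x)
```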

Reference

G. Haskins, J. Kruecker, U. Kruger, S. Xu, P. A. Pinto, B. J. Wood, and P. Yan, "Learning Deep Similarity Metric for 3D MR-TRUS Registration," arXiv:1806.04548 [cs.CV], June 2018.