Real-time diameter of the fetal aorta from ultrasound
Savioli, Nicolò, Grisan, Enrico, Visentin, Silvia, Cosmi, Erich, Montana, Giovanni and Lamata, Pablo (2019). Real-time diameter of the fetal aorta from ultrasound. Neural Computing and Applications.
|Authors||Savioli, Nicolò, Grisan, Enrico, Visentin, Silvia, Cosmi, Erich, Montana, Giovanni and Lamata, Pablo|
The automatic analysis of ultrasound sequences can substantially improve the efficiency of clinical diagnosis. This article presents an attempt to automate the challenging task of measuring the vascular diameter of the fetal abdominal aorta from ultrasound images. We propose a neural network architecture consisting of three blocks: a convolutional neural network (CNN) for the extraction of imaging features, a convolutional gated recurrent unit (C-GRU) for exploiting the temporal redundancy of the signal, and a regularized loss function, called CyclicLoss, to impose our prior knowledge about the periodicity of the observed signal. The solution is investigated with a cohort of 25 ultrasound sequences acquired during the third-trimester pregnancy check, and with 1000 synthetic sequences. In the extraction of features, it is shown that a shallow CNN outperforms two other deep CNNs on both the real and synthetic cohorts, suggesting that echocardiographic features are optimally captured by a reduced number of CNN layers. The proposed architecture, working with the shallow CNN, reaches an accuracy substantially superior to previously reported methods, providing an average reduction of the mean squared error from 0.31 mm² (state of the art) to 0.09 mm², and a relative error reduction from 8.1% to 5.3%. The mean execution speed of the proposed approach, 289 frames per second, makes it suitable for real-time clinical use.
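The CyclicLoss described in the abstract regularizes a standard regression loss with a term encoding the (quasi-)periodicity of the cardiac signal. A minimal sketch of this idea is given below; the exact formulation in the paper may differ, and the function name `cyclic_loss`, the `period` argument (cardiac period in frames), and the weight `lam` are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a CyclicLoss-style objective: mean squared error on the predicted
# diameter sequence, plus a penalty on the difference between predictions that
# lie exactly one cardiac period apart. Parameter names are illustrative.

def cyclic_loss(pred, target, period, lam=0.1):
    """MSE plus lam * mean squared difference between samples one period apart."""
    n = len(pred)
    # Data term: ordinary mean squared error against the reference diameters.
    mse = sum((p - t) ** 2 for p, t in zip(pred, target)) / n
    # Periodicity term: the prediction at frame t should resemble the
    # prediction at frame t + period if the signal is truly cyclic.
    pairs = n - period
    cyc = sum((pred[t] - pred[t + period]) ** 2 for t in range(pairs)) / pairs
    return mse + lam * cyc
```

For a perfectly periodic prediction that also matches the target, both terms vanish; a prediction that drifts between cycles is penalized even where it fits the target well, which is the intuition behind the regularizer.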
|Keywords||ultrasound; convolutional neural network; gated-recurrent unit; cyclic loss; prenatal screening; intima-media thickness|
|Journal||Neural Computing and Applications|
|Publisher||Springer (part of Springer Nature)|
|Digital Object Identifier (DOI)||10.1007/s00521-019-04646-3|
|Published online||18 Dec 2019|
|Publication process dates|
|Accepted||22 Nov 2019|
|Deposited||09 Jan 2020|
|License||CC BY 4.0|