FHSU-NETR: Transformer-Based Deep Learning Model for the Detection of Fetal Heart Sounds in Phonocardiography

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    1 Scopus citation

    Abstract

    Assessing fetal well-being using conventional tools requires skilled clinicians for interpretation and can be susceptible to noise interference, especially during lengthy recordings or when maternal effects contaminate the signals. In this study, we present a novel transformer-based deep learning model, the fetal heart sounds U-Net Transformer (FHSU-NETR), for the automated extraction of fetal heart activity from raw phonocardiography (PCG) signals. The model was trained on a realistic synthetic dataset and validated on data recorded from 20 healthy mothers at the pregnancy outpatient clinic of Tohoku University Hospital, Japan. The model successfully extracted fetal PCG signals, achieving a heart rate mean difference of -1.5 bpm compared to the ground truth calculated from fetal electrocardiogram (ECG). By leveraging deep learning, FHSU-NETR facilitates timely interpretation of lengthy PCG recordings while reducing the heavy reliance on medical experts, thereby enhancing efficiency in clinical practice.
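
The reported evaluation compares PCG-derived heart rate against a fetal-ECG ground truth. A minimal sketch of such a mean-difference (bias) computation is shown below; the beat timestamps and the `heart_rate_bpm` helper are illustrative assumptions, not the paper's actual data or code:

```python
import numpy as np

def heart_rate_bpm(beat_times):
    """Instantaneous heart rate (bpm) from successive beat times (seconds)."""
    intervals = np.diff(beat_times)  # inter-beat intervals in seconds
    return 60.0 / intervals          # beats per minute

# Hypothetical beat timestamps: fetal S1 sounds detected from PCG and
# R-peaks from the reference fetal ECG (values are illustrative only).
pcg_beats = np.array([0.00, 0.42, 0.85, 1.27, 1.70, 2.12])
ecg_beats = np.array([0.00, 0.43, 0.85, 1.28, 1.70, 2.13])

hr_pcg = heart_rate_bpm(pcg_beats)
hr_ecg = heart_rate_bpm(ecg_beats)

# Mean difference (bias) between PCG-derived and ECG-derived heart rates;
# the paper reports a bias of -1.5 bpm on its clinical recordings.
mean_diff = float(np.mean(hr_pcg - hr_ecg))
print(f"mean HR difference: {mean_diff:.1f} bpm")
```

This bias measure is the same quantity reported in a Bland-Altman style agreement analysis; beat-to-beat standard deviation of the differences would additionally capture the spread.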

    Original language: British English
    Title of host publication: Computing in Cardiology, CinC 2023
    Publisher: IEEE Computer Society
    ISBN (Electronic): 9798350382525
    DOIs
    State: Published - 2023
    Event: 50th Computing in Cardiology, CinC 2023 - Atlanta, United States
    Duration: 1 Oct 2023 - 4 Oct 2023

    Publication series

    Name: Computing in Cardiology
    ISSN (Print): 2325-8861
    ISSN (Electronic): 2325-887X

    Conference

    Conference: 50th Computing in Cardiology, CinC 2023
    Country/Territory: United States
    City: Atlanta
    Period: 1/10/23 - 4/10/23
