Underwater Fish Tracking-by-Detection: An Adaptive Tracking Approach

Divya Velayudhan, Adarsh Ghimire, Jorge Dias, Naoufel Werghi, Sajid Javed

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    Abstract

    High distortion and the complex marine environment pose severe challenges to underwater tracking. In this paper, we propose a simple, template-free Adaptive Euclidean Tracking (AET) approach for underwater fish tracking by regarding tracking as a specific case of instance detection. The proposed method exploits an advanced detection framework to track fish in underwater imagery without any image enhancement techniques. It achieves comparable performance on the DeepFish dataset, with 22% and 14% improvements in precision and success, respectively, over state-of-the-art trackers.
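
    The full paper is not reproduced on this page, so the sketch below only illustrates the general tracking-by-detection idea the abstract describes: an off-the-shelf detector proposes fish boxes in every frame, and the track is continued by choosing the detection whose centre is closest, in Euclidean distance, to the previous target position, with a search radius that adapts when no nearby detection is found. The function names, the parameters (base_radius, growth), and the greedy matching rule are illustrative assumptions, not the authors' AET algorithm.

    import numpy as np

    def box_center(box):
        # box = (x1, y1, x2, y2); return its centre (cx, cy)
        x1, y1, x2, y2 = box
        return np.array([(x1 + x2) / 2.0, (y1 + y2) / 2.0])

    def track_by_detection(per_frame_detections, init_box, base_radius=50.0, growth=1.5):
        # per_frame_detections: list over frames, each a list of (x1, y1, x2, y2)
        # boxes from a generic detector; init_box: the target box in the first frame.
        track = [init_box]
        last_center = box_center(init_box)
        radius = base_radius
        for detections in per_frame_detections[1:]:
            if not detections:
                # No detections in this frame: keep the last box and widen the search.
                track.append(track[-1])
                radius *= growth
                continue
            centers = np.array([box_center(b) for b in detections])
            dists = np.linalg.norm(centers - last_center, axis=1)
            best = int(np.argmin(dists))
            if dists[best] <= radius:
                # Accept the nearest detection and reset the search radius.
                track.append(detections[best])
                last_center = centers[best]
                radius = base_radius
            else:
                # Nearest detection is too far away: treat as a miss and expand.
                track.append(track[-1])
                radius *= growth
        return track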

    Original language: British English
    Title of host publication: Pattern Recognition, Computer Vision, and Image Processing. ICPR 2022 International Workshops and Challenges, Proceedings
    Editors: Jean-Jacques Rousseau, Bill Kapralos
    Publisher: Springer Science and Business Media Deutschland GmbH
    Pages: 504-515
    Number of pages: 12
    ISBN (Print): 9783031377303
    DOIs
    State: Published - 2023
    Event: 26th International Conference on Pattern Recognition, ICPR 2022 - Montréal, Canada
    Duration: 21 Aug 2022 - 25 Aug 2022

    Publication series

    Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Volume: 13645 LNCS
    ISSN (Print): 0302-9743
    ISSN (Electronic): 1611-3349

    Conference

    Conference: 26th International Conference on Pattern Recognition, ICPR 2022
    Country/Territory: Canada
    City: Montréal
    Period: 21/08/22 - 25/08/22

    Keywords

    • Adaptive search
    • Object Detection
    • Underwater Tracking
