Gender Recognition on RGB-D Image

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

In this paper, we propose a deep-learning approach to human gender classification on RGB-D images. Unlike most existing methods, which rely on hand-crafted features extracted from the human face, we exploit local information from the head and global information from the whole body to classify a person's gender. A head detector based on a fine-tuned YOLO model automatically locates head regions in the images. Two gender classifiers are trained separately, one on head images and one on whole-body images, and the final prediction is obtained by fusing the results of the two classifiers. The presented method outperforms the state of the art, improving accuracy by 2.6%, 7.6%, and 8.4% on three different test sets of a challenging gender dataset that covers standing, walking, and interacting scenarios.
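
As a concrete illustration of the fusion step described in the abstract, the minimal sketch below averages the per-class softmax scores of the head and whole-body classifiers and takes the argmax. The weighting coefficient, label order, and NumPy-based interface are assumptions made for illustration only; the paper does not state its exact fusion rule.

    import numpy as np

    def fuse_predictions(head_probs, body_probs, head_weight=0.5):
        # head_probs, body_probs: (N, 2) softmax scores from the head-image
        # and whole-body gender classifiers. head_weight is a hypothetical
        # mixing coefficient (the exact fusion rule is not specified here).
        fused = head_weight * head_probs + (1.0 - head_weight) * body_probs
        # Label order (e.g. 0 = female, 1 = male) is assumed.
        return fused.argmax(axis=1)

    # Usage with dummy scores for a batch of two detections
    head = np.array([[0.8, 0.2], [0.4, 0.6]])
    body = np.array([[0.6, 0.4], [0.3, 0.7]])
    print(fuse_predictions(head, body))  # -> [0 1]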

Original language: British English
Title of host publication: 2020 IEEE International Conference on Image Processing, ICIP 2020 - Proceedings
Publisher: IEEE Computer Society
Pages: 1836-1840
Number of pages: 5
ISBN (Electronic): 9781728163956
DOIs
State: Published - Oct 2020
Event: 2020 IEEE International Conference on Image Processing, ICIP 2020 - Virtual, Abu Dhabi, United Arab Emirates
Duration: 25 Sep 2020 - 28 Sep 2020

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
Volume: 2020-October
ISSN (Print): 1522-4880

Conference

Conference: 2020 IEEE International Conference on Image Processing, ICIP 2020
Country/Territory: United Arab Emirates
City: Virtual, Abu Dhabi
Period: 25/09/20 - 28/09/20

Keywords

  • Deep learning
  • Gender classification
  • RGB-D image
