TY - JOUR
T1 - A neuromorphic dataset for tabletop object segmentation in indoor cluttered environment
AU - Huang, Xiaoqian
AU - Kachole, Sanket
AU - Ayyad, Abdulla
AU - Naeini, Fariborz Baghaei
AU - Makris, Dimitrios
AU - Zweiri, Yahya
N1 - Publisher Copyright:
© 2024, The Author(s).
PY - 2024/12
Y1 - 2024/12
AB - Event-based cameras are commonly leveraged to mitigate issues such as motion blur, low dynamic range, and limited temporal sampling, which plague conventional cameras. However, there is a lack of dedicated event-based datasets for benchmarking segmentation algorithms, especially datasets that offer critical depth information for occluded scenes. In response, this paper introduces a novel Event-based Segmentation Dataset (ESD), a high-quality 3D spatiotemporal event dataset designed for indoor object segmentation within cluttered environments. ESD comprises 145 sequences with 14,166 manually annotated RGB frames, together with 21.88 million and 20.80 million events recorded by two stereo-configured event-based cameras. Notably, this densely annotated 3D spatiotemporal event-based segmentation benchmark for tabletop objects is a pioneering effort, providing event-wise depth and annotated instance labels alongside corresponding RGB-D frames. By releasing ESD, we aim to offer the research community a challenging, high-quality segmentation benchmark.
UR - http://www.scopus.com/inward/record.url?scp=85183037938&partnerID=8YFLogxK
U2 - 10.1038/s41597-024-02920-1
DO - 10.1038/s41597-024-02920-1
M3 - Article
C2 - 38272894
AN - SCOPUS:85183037938
SN - 2052-4463
VL - 11
JO - Scientific Data
JF - Scientific Data
IS - 1
M1 - 127
ER -