TY - JOUR
T1 - Sensory Substitution Device for Time Presentation
T2 - Spatial-Temporal Vibrotactile Encoding for Presenting Time on the Human Wrist
AU - Afzal, Hafiz
AU - Hussain, Irfan
AU - Zhou, Zejian
AU - Prattichizzo, Domenico
AU - Seneviratne, Lakmal
AU - Zhang, Yuru
AU - Wang, Dangxiao
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2025
Y1 - 2025
N2 - Presenting information such as alertness levels and time privately on the wrist via vibrotactile feedback is invaluable for visually impaired individuals. Additionally, in situations where the visual channel is occupied, it serves as a discreet solution for sighted users, allowing them to stay informed during meetings or tasks without overtly checking their watches, thus minimizing potential distractions. However, presenting time accurately and efficiently through the vibrotactile modality is challenging due to the perceptual limits of the human haptic channel. Inspired by the metaphors of mechanical and digital watches, which are widely used in daily life, we proposed two novel spatial-temporal vibrotactile encoding strategies. By varying the location, number, and duration of vibrotactile stimuli, these strategies can convey the exact current time through a series of encoded tactile cues. A physical prototype was developed, and fifteen participants were recruited for two experiments evaluating the encoding strategies. The results showed that the mechanical and digital encoding strategies achieved average correct rates of 90.55 ± 5.2% and 95.22 ± 4.1% in the focused state, and 95.28 ± 3.3% and 97.78 ± 3.8% in the distracted state, respectively (mean ± SD). The experimental results provide deep insights into utilizing spatial-temporal patterns of vibrotactile stimuli for developing industrial-scale wearable haptic devices that present time and quantitative information efficiently and privately to users.
AB - Presenting information such as alertness levels and time privately on the wrist via vibrotactile feedback is invaluable for visually impaired individuals. Additionally, in situations where the visual channel is occupied, it serves as a discreet solution for sighted users, allowing them to stay informed during meetings or tasks without overtly checking their watches, thus minimizing potential distractions. However, presenting time accurately and efficiently through the vibrotactile modality is challenging due to the perceptual limits of the human haptic channel. Inspired by the metaphors of mechanical and digital watches, which are widely used in daily life, we proposed two novel spatial-temporal vibrotactile encoding strategies. By varying the location, number, and duration of vibrotactile stimuli, these strategies can convey the exact current time through a series of encoded tactile cues. A physical prototype was developed, and fifteen participants were recruited for two experiments evaluating the encoding strategies. The results showed that the mechanical and digital encoding strategies achieved average correct rates of 90.55 ± 5.2% and 95.22 ± 4.1% in the focused state, and 95.28 ± 3.3% and 97.78 ± 3.8% in the distracted state, respectively (mean ± SD). The experimental results provide deep insights into utilizing spatial-temporal patterns of vibrotactile stimuli for developing industrial-scale wearable haptic devices that present time and quantitative information efficiently and privately to users.
KW - Eye-free communication
KW - haptic watch
KW - information theory
KW - spatial-temporal patterns
KW - vibrotactile stimuli
KW - wearable computing
KW - wearable haptic devices
KW - wrist-worn tactile display
UR - https://www.scopus.com/pages/publications/105001063633
U2 - 10.1109/ACCESS.2025.3548552
DO - 10.1109/ACCESS.2025.3548552
M3 - Article
AN - SCOPUS:105001063633
SN - 2169-3536
VL - 13
SP - 44385
EP - 44402
JO - IEEE Access
JF - IEEE Access
ER -