Audio Matters in Visual Attention
Document Type
Article
Publication Date
11-2014
Publication Source
IEEE Transactions on Circuits and Systems for Video Technology
Abstract
There is a dearth of information on how perceived auditory information guides image-viewing behavior. To investigate auditory-driven visual attention, we first generated a human eye-fixation database from a pool of 200 static images and 400 image-audio pairs viewed by 48 subjects. The eye-tracking data for the image-audio pairs were captured while participants viewed the images immediately after exposure to coherent or incoherent audio samples. The database was analyzed in terms of time to first fixation, fixation duration on the target object, entropy, area under the ROC curve (AUC), and saliency ratio. It was found that coherent audio information is an important cue for enhancing the feature-specific response to the target object, whereas incoherent audio information attenuates this response. Finally, a system was developed to predict image-viewing behavior under the influence of different audio sources. The top-down module of the system, which is discussed in detail, comprises auditory estimation based on a Gaussian mixture model-maximum a posteriori-universal background model (GMM-MAP-UBM) structure and visual estimation based on a conditional random field (CRF) model with sparse latent variables. Evaluation experiments show that the proposed models exhibit strong consistency with human eye fixations.
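The abstract names a GMM-MAP-UBM structure for the auditory estimation module. As a rough illustration of how that class of method typically works, the sketch below implements classical relevance-MAP adaptation of a universal background model's component means to a target audio clip's features. This is a minimal sketch under stated assumptions, not the authors' implementation: the use of scikit-learn's GaussianMixture as the UBM, the function name map_adapt_means, and the relevance factor value are all illustrative choices.

import numpy as np
from sklearn.mixture import GaussianMixture

def map_adapt_means(ubm: GaussianMixture, X: np.ndarray,
                    relevance: float = 16.0) -> np.ndarray:
    """Relevance-MAP adaptation of UBM component means to data X.

    ubm       -- GaussianMixture already fitted on pooled background audio features
    X         -- (N, D) feature matrix (e.g., MFCCs) of the target audio clip
    relevance -- relevance factor controlling how far means move (assumed value)
    """
    resp = ubm.predict_proba(X)                          # (N, K) component posteriors
    n_k = resp.sum(axis=0)                               # soft counts per component
    e_k = resp.T @ X / np.maximum(n_k[:, None], 1e-10)   # per-component data means
    alpha = (n_k / (n_k + relevance))[:, None]           # adaptation coefficients
    # Interpolate between the clip statistics and the UBM prior means.
    return alpha * e_k + (1.0 - alpha) * ubm.means_

# Usage sketch (hypothetical variable names): fit a UBM on background
# features, then adapt its means to a specific clip's features.
# ubm = GaussianMixture(n_components=64, covariance_type="diag").fit(bg_feats)
# adapted_means = map_adapt_means(ubm, clip_feats)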
Inclusive pages
1992-2003
ISBN/ISSN
1051-8215
Copyright
Copyright © 2014, IEEE
Publisher
IEEE
Volume
24
Peer Reviewed
yes
Issue
11
eCommons Citation
Chen, Yanxiang; Nguyen, Tam; Kankanhalli, Mohan; Yan, Jun; Yan, Shuicheng; and Wang, Meng, "Audio Matters in Visual Attention" (2014). Computer Science Faculty Publications. 75.
https://ecommons.udayton.edu/cps_fac_pub/75
Comments
Permission documentation is on file.