Journal of Information Systems Engineering and Management

Caring for Special Participants in the Digital Media Era: A Study on Enhancing the Blind User Experience on Short Video Platforms Through Auditory Cues
Xin Wang 1,2, Anping Cheng 3, Kiechan Namkung 4, Younghwan Pan 5,*
1 Ph.D. candidate, Department of Smart Experience Design, Graduate School of Techno Design, Kookmin University, Seoul, Republic of Korea
2 Lecturer, Shandong Vocational College of Special Education, Jinan, China
3 Doctor, Department of Smart Experience Design, Graduate School of Techno Design, Kookmin University, Seoul, Republic of Korea
4 Professor, Department of AI Design, Graduate School of Techno Design, Kookmin University, Seoul, Republic of Korea
5 Professor, Department of Smart Experience Design, Graduate School of Techno Design, Kookmin University, Seoul, Republic of Korea
* Corresponding Author
Research Article

Journal of Information Systems Engineering and Management, 2024 - Volume 9 Issue 3, Article No: 28013
https://doi.org/10.55267/iadt.07.14774

Published Online: 05 Jul 2024


ABSTRACT
The screen readers used by blind and visually impaired people conflict with the interaction design of short video platforms. In particular, blind users encounter information-access barriers when browsing for video content, which degrades their user experience. We embedded auditory cues, matched to a video's content, at the beginning of each short video to help blind users identify the video type. The experimental design and evaluation results reveal a significant effect of these auditory cues: embedding them significantly improves usability, recognition efficiency, and emotional experience compared with conventional short videos. Speech cues yielded the shortest response times and the highest accuracy, whereas auditory icons provided a better emotional experience. In addition, some participants expressed concerns that speech cues could raise social privacy issues. This study provides auditory cue-matching solutions for a wide range of short videos and offers a path toward a better short video platform experience for blind users. In doing so, it contributes to the well-being of people with disabilities and provides broadly applicable user experience design recommendations for digital media platforms.
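The comparison the abstract describes, contrasting response time and accuracy across cue types (speech, auditory icon, earcon), amounts to a per-condition aggregation of trial records. A minimal sketch follows; the cue names, trial values, and the `summarize` helper are illustrative assumptions, not the study's actual data or analysis code:

```python
from statistics import mean

# Hypothetical trial records: (cue_type, response_time_seconds, correct)
trials = [
    ("speech", 1.2, True), ("speech", 1.0, True),
    ("auditory_icon", 1.6, True), ("auditory_icon", 1.9, False),
    ("earcon", 2.1, True), ("earcon", 2.4, False),
]

def summarize(trials):
    """Group trials by cue type; report mean response time and accuracy."""
    by_cue = {}
    for cue, rt, correct in trials:
        by_cue.setdefault(cue, []).append((rt, correct))
    return {
        cue: {
            "mean_rt": mean(rt for rt, _ in rows),
            "accuracy": sum(c for _, c in rows) / len(rows),
        }
        for cue, rows in by_cue.items()
    }

result = summarize(trials)
# result["speech"]["mean_rt"] ≈ 1.1, result["speech"]["accuracy"] == 1.0
```

In a real evaluation the per-condition means would then feed a significance test (e.g., a repeated-measures comparison) rather than being read off directly.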
LICENSE
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.