Multimodal reading comprehension competence in university students from social sciences and engineering

Authors

DOI:

https://doi.org/10.18488/61.v14i2.4884

Abstract

This study investigates multimodal reading comprehension competence among university students in social sciences and engineering programs, focusing on the influence of cognitive, attitudinal, and academic variables. Adopting a quantitative, non-experimental, correlational design, the research employed Poisson regression analysis to examine how different semiotic modes (verbal, graphic, and combined) affect comprehension outcomes. The study also considered prior experience with text formats, self-reported comfort levels, and academic progression as predictor variables. The sample consisted of 426 students from a private Peruvian university, stratified by discipline and semester level. Results show that students who received multimodal texts achieved significantly higher scores, suggesting that multimodal integration enhances comprehension through dual-channel processing. Additionally, positive perception of text comfort and previous exposure to verbal formats were positively associated with performance. Conversely, students with stronger mathematical or graphic backgrounds and those in advanced academic semesters performed less well, indicating potential gaps in multimodal literacy development across disciplines and stages of study. These findings underscore the importance of integrating multisemiotic literacy as a transversal academic skill, not only as a technical reading competence but also as a critical component of epistemic participation in higher education. The study highlights the need for curricula to explicitly address multimodal comprehension strategies and to provide students with guided practice in interpreting and producing complex multimodal artifacts.
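The kind of Poisson regression analysis described in the abstract can be sketched as follows. This is a minimal illustration with synthetic data, not the study's actual dataset or code: the variable names, the single "multimodal text" predictor, and the assumed effect size are all assumptions made for demonstration.

```python
import numpy as np

def fit_poisson_regression(X, y, n_iter=25):
    """Fit a Poisson regression (log link) by Newton-Raphson.

    X: (n, p) design matrix including an intercept column.
    y: (n,) array of non-negative count outcomes (e.g. test scores).
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)            # predicted mean counts
        grad = X.T @ (y - mu)            # score vector
        hess = X.T @ (X * mu[:, None])   # Fisher information matrix
        beta += np.linalg.solve(hess, grad)
    return beta

# Illustrative synthetic data (NOT the study's sample): a binary
# indicator for receiving a multimodal text, predicting a
# comprehension score treated as a count outcome.
rng = np.random.default_rng(0)
n = 426                                  # matches the reported sample size
multimodal = rng.integers(0, 2, n)
X = np.column_stack([np.ones(n), multimodal])
true_beta = np.array([1.5, 0.4])         # assumed positive multimodal effect
y = rng.poisson(np.exp(X @ true_beta))

beta_hat = fit_poisson_regression(X, y)
rate_ratio = np.exp(beta_hat[1])         # multiplicative effect on expected score
```

In a Poisson model the exponentiated coefficient (`rate_ratio` above) is read as a multiplicative change in the expected count, which is why such models suit score or frequency outcomes better than ordinary least squares.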

Keywords:

Academic performance, Disciplinary literacy, Multimodal reading comprehension, Multisemiotic competence, Semiotic modes, Text perception, University students.


Published

2026-04-03