Search

Showing results 11–20 of 24 for Montero Perez

The integration of auditory and textual input in vocabulary learning from subtitled viewing: An eye-tracking study
...Montero Perez, 2022). The provision of both auditory and textual input allows learners to link auditory and written forms (or L1 meanings) of unknown words during viewing, which could potentially fa...

by Andi Wang
in Volume 29 Number 3, October 2025 Special Issue: Multimodality in CALL

How captions help people learn languages: A working-memory, eye-tracking study
...Montero Perez, M., Peters, E., & Desmet, P. (2014). Is less more? Effectiveness and perceived usefulness of keyword and full captioned video for L2 comprehension. ReCALL, 26(1), 21–43. Montero Perez...

by Susan Gass, Paula Winke, Daniel R. Isbell, Jieun Ahn
in Volume 23 Number 2, June 2019

Announcements and acknowledgements
...Montero Perez, Liam Murray, Phuong Nguyen, Elke Nissen, Breffni O’Rourke, Ana Oskoz, Ana Padial, Luisa Panichi, Magali Paquot, Youngmin Park, Mark Pegrum, Carmen Perez-Llantada, Lucy Pickering, Arja...

in Volume 22 Number 1, February 2018

Announcing the 2024 Dorothy Chun Best Paper Award
...Montero Perez! Their article, Audiovisual Input in Language Learning: Teachers’ Perspectives, has been awarded this year’s best paper award! Award Committee Chair, Sangmin-Michelle Lee, of...

in News & Announcements

Attention and learning in L2 multimodality: A webcam-based eye-tracking study
...Montero-Perez et al., 2014), however, mainly focused on evaluating learning outcomes from multimodal input. Very limited research has explored how L2 learners, especially young learners (Pellicer-Sá...

by Pengchong Zhang, Shi Zhang
in Volume 29 Number 1, 2025

Learning pronunciation through television series
...Montero Perez, M. (2020). Multimodal input in SLA research. Studies in Second Language Acquisition, 42(3), 653–663. https://doi.org/10.1017/S0272263120000145 Montero Perez, M., Van Den Noortgate, W...

by Paweł Scheffler, Karolina Baranowska
in Volume 27 Number 1, 2023

Ecological semiotics: Multimodality, multilingualism, and situated language learning in the AI era
...Montero Perez et al., 2014), help learning collocations (Puimège et al., 2023), develop intercultural competence (Tinedo-Rodríguez, 2025), and acquire grammatical structures (Cintrón-Valentín et al....

by Robert Godwin-Jones
in Volume 29 Number 3, October 2025 Special Issue: Multimodality in CALL

Announcements and news from our sponsors
...Perez, Marina Orsini-Jones, Ines A. Martin, Elena Martin Monje, Shannon McCrocklin, Mairi McLaughlin, Joanne Meredith, Fanny Meunier, Haitham Mohamed, Maribel Montero Perez, Colleen Moorman, Charles N...

in Volume 25 Number 1, February 2021 Special Issue: Big Data in Language Education & Research

Promoting grammatical development through multimodal digital recasts in video-conferencing tasks
...Montero Perez, M. (2020). Multimodal input in SLA research. Studies in Second Language Acquisition, 42(3), 653–663. https://doi.org/10.1017/S0272263120000145 Montero Perez, M., Van Den Noortgate, W., ...

by Yeonwoo Jung, Andrea Révész
in Volume 28 Number 1, 2024

First- and second-language subtitles and cognitive load: An EEG study
...Montero Perez et al., 2014; Vanderplank, 1988). Studies have consistently demonstrated that both L1 and L2 subtitles enhance viewers’ understanding of videos (e.g., Hayati & Mohmedi, 2011; Lwo & Lin...

by Taegang Lee, Yoohyoung Lee, Sungmook Choi
in Volume 29 Number 1, 2025