How Captions Help ESL Learners Improve Their English
Updated: June 3, 2019
In 2013, the number of individuals in the United States who spoke English with limited proficiency was estimated at 25.1 million, or 8% of the population. And according to the US Department of Education, over 9% of children in the national public school system participate in English language learning programs, while in the state of California that number is over 22%.
These numbers grow steadily every year, meaning there are vast opportunities in helping English language learners (ELLs) succeed and in reaching wider audiences with video content.
Traditional English-as-a-Second-Language (ESL) classes provide a great foundation in basic vocabulary, grammar, syntax, and other linguistic features of a language. However, watching videos with captions or subtitles over the audio of native speakers is a great way for English students to improve their vocabulary, pronunciation, and inflection, and to pick up on more nuanced features of English such as slang terms, phrases, and colloquialisms.
First, let’s quickly clarify some key terminology:
- Subtitles: time-synchronized text on a video that translates the spoken audio to another language.
- Dubbing: a voice-over, or time-synchronized spoken audio translated into another language from that of the video, replacing the original speaker’s voice.
- Captions: time-synchronized text on a video in the same language as the spoken audio.
- Closed captions: captions that can be turned on and off.
- Open captions: captions that are “burned” into the video; cannot be turned off.
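To make "time-synchronized text" concrete, here is a minimal illustrative caption file in WebVTT, the caption format accepted by YouTube and most web browsers (the cue text and timings here are invented for illustration):

```
WEBVTT

00:00:01.000 --> 00:00:04.000
Welcome back, everyone.

00:00:04.500 --> 00:00:08.000
Today we're talking about regional accents.
```

Each cue pairs a start and end timestamp with the text spoken during that interval. A closed-caption track like this is delivered as a separate file the viewer can toggle on or off, while open captions are rendered permanently into the video frames themselves.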
English Captions Improve Language Retention
New ELLs listening to a native English speaker talk often find it difficult to identify which words are being spoken, how they are spelled, and in what order they are arranged (syntax). That’s why, for anyone learning a new language, it is extremely helpful to read the words one is hearing at the same time.
Even if the viewer cannot fully understand what they are reading on screen, captions can provide some helpful context, encouraging the viewer to stay engaged with the video. Time-synchronized captions focus the ELL’s attention on the words being spoken in real time, which helps with retention of vocabulary, spelling, pronunciation, grammar, and other valuable linguistic qualities one must understand to speak a language properly.
In 2009, a study conducted with Dutch ELLs concluded that watching English-language video content with English captions led to higher scores on tests of aural word recognition, while watching English videos with Dutch subtitles led to lower scores on those tests. This suggests that reinforcing English speech with English text helps ELLs memorize spoken and written words in the language, leading to stronger vocabulary skills.
English Captions Help Decipher Accents and Dialects
Many Americans have a hard time understanding certain accents and dialects from the UK, Ireland, Australia, and other places where English is spoken. So, imagine what ESL learners have to go through in the same scenario.
Accents tend to go hand in hand with dialects, or regionally-exclusive ways of speaking. Captions can help ELLs learn words and phrases from different dialects by helping them process the audio in the videos they watch.
In the previously mentioned study with Dutch ESL students, it was found that adding closed captions to videos with Scottish and Australian actors speaking in native accents and dialects helped the students identify the words spoken. Interestingly, it was also found that watching those same videos with Dutch subtitles diminished students’ success in word recognition:
“If an English word was spoken with a Scottish accent, English subtitles usually told the perceiver what that word was, and hence what its sounds were. This made it easier for the students to tune in to the accent.
“In contrast, the Dutch subtitles did not provide this teaching function, and, because they told the viewer what the characters in the film meant to say, the Dutch subtitles may have drawn the students’ attention away from the unfamiliar speech.”
In 2008, an academic study involving 20 Chinese ESL students found that video content with captions helped students learn new words and expressions better than watching the same content without captions. Specifically, the study revealed that “the use of video plus captions can help students learn colloquial language [including] how and when native speakers use it.”
This means that by adding captions to their videos, English-speaking online video providers on YouTube and elsewhere can attract viewers anywhere in the world who want to improve their language skills and understand regionally varied English as well as a native speaker does.
3Play Media’s round trip integration with YouTube provides an automated workflow for adding captions and subtitles. Your YouTube videos can be processed in a matter of hours, and captions will be automatically sent to YouTube and added to your videos.
‘Subbing’ vs. ‘Dubbing’
If you’ve ever seen a foreign film in which the actors speak another language, it was either ‘dubbed’ or ‘subbed’ (subtitled) so that viewers could understand what was being said. Everyone has their preference, but for students of a second language, subbing tends to be much more helpful.
Subbing is better for ELLs because the translated text reinforces the speech, helping the viewer learn by encouraging them to match the foreign speech with words from their own language.
Hearing English speakers talk naturally on video helps the viewer tune their ear to the unique sounds of spoken English, which is essential when learning any new language. With dubbing, by contrast, all of that exposure to the original language is lost.
Other Benefits of Captions and Subtitles for ELLs
- Control: You can pause and rewind whenever you need to, so you can go to “ESL class” whenever you want!
- Subject-specific vocabulary: It helps broaden vocabulary about specific subjects (e.g. YouTube videos about science, cooking, politics, business, pop culture, etc.).
- Mouth movement: In most cases you can watch the mouth of the person speaking, which helps with lip-reading and the pronunciation of difficult sounds unique to a language.
- Situational context: Watching foreign films and TV shows with subtitles is great for understanding when to use formal versus casual language, and knowing when and when not to use certain words.