Student Closed Captioning Use Survey: A Deep Dive

December 6, 2016 BY PATRICK LOFTUS
Updated: January 4, 2018


When results from the nationwide student survey on closed captioning use were released, a variety of student demographics were identified in the study.

Each one of these subgroups, including deaf students, hearing students, and students whose native language is not English, relies on closed captioning for different reasons and to different degrees.

We took a deep dive on the student study and explored these enlightening results in a webinar entitled, National Research Results: How Different Student Subgroups Use Closed Captioning.

Continue reading for highlights from the Q&A (and be sure to check out the institutional study report to see how colleges and universities all over the US are handling closed captioning):

According to the study, 71% of students who have difficulty hearing said captions were helpful. Do you have information about the other 29%?


KATIE LINDER: We asked if captions were very or extremely helpful. So, these were the respondents who rated captions as very or extremely helpful.

They did not necessarily rate it as not helpful. There were actually two other categories. I think it was “somewhat helpful” and then it was “not helpful at all.” And so for this population — for students who have difficulty with hearing — a little over 70% said they were very or extremely helpful to them. But they may have also answered that they were helpful to other degrees.

The overall percentage of students who found [closed captions] to be not helpful at all was only 1%. So I would say that this population is probably close to 100% finding them helpful to some degree. But 70% found them to be very or extremely helpful.

Do you have any thoughts on why the percentage of students with visual impairments rated the usefulness of captions so highly?

KATIE LINDER: We did not distinguish between types of visual impairments: that is, between students who were blind, wore glasses, or had visual impairments that required some kind of academic accommodation.

So I think that may be part of it — that we [may have] had a relatively large number of students in one group who weren't necessarily seeking any kind of accommodation, or didn't have such a low level of vision without the use of glasses that it made a difference to them. But it also wasn't something that really came out in the qualitative comments. And I think you'll see in the numbers that the transcripts were more helpful to that group than the closed captions were, which may indicate use of screen readers.

But again, that was not something that came out in a particular way in the study. I don’t think anyone in qualitative comments mentioned screen readability when they were talking about the transcripts and how they could be helpful to them.

You said about 700 responses were disqualified from the data. What did you learn about the questionnaire vehicle or any other accompanying documents that made you disqualify these students?

KATIE LINDER: As we were cleaning the data, the main thing we were looking for was lack of response. And we had a certain cut-off point — students who maybe answered the first couple of questions, but then didn’t respond to anything else. We just couldn’t use their data. It was not useful to us.

And there were certain areas where we would say, at the very least, if we had information about their demographics and they answered some questions about their use of closed captions and transcripts, we kept that in the pool. And so you can see in the report we have different n's for different questions, just based on how many students responded to things.

But the main thing, I think, is that this was a longer survey. It was 46 items. We did really try to get at both the closed captions and the transcripts. So it wasn't too surprising to me that we had students and respondents who just weren't finishing it. And the biggest reason we cleaned the data and removed that many responses is that they weren't complete.

The other piece, too, is we certainly had students, I think, who gamed our system a little bit and wanted to be in the raffle, but didn’t necessarily want to respond to the entire survey. So they made their way through the survey so they could enter into the raffle, but only answered the questions that were required. And we did not require answers to absolutely everything.

So that’s, I think, a couple of the reasons. But the main thing was incomplete data, which wouldn’t have allowed us to do the kind of analysis that we needed to.

Were the learning disabilities claimed by the students reported individually, or did you get this info from the Office of Disability Services?

KATIE LINDER: All of this data is self-reported. So what we asked on the learning disabilities question was, “have you ever been diagnosed with a learning disability?” Not just, “do you have one?”

The students responded with whether or not they had been diagnosed with a learning disability. None of this information that we requested from students was cross-checked in any way, in part because we wanted anonymity with the study.

Even with some of our qualitative data, we were very careful with students who self-identified as having very particular kinds of disabilities, because some of this data was returned to the schools. That was part of our agreement with them for helping us recruit.

And so we had to clean some of the qualitative data to make sure that students’ identities were protected. So for that reason — one of many — we did not connect it back with the institution. But we did ask for a formal diagnosis.

Did you see any crossover between students who struggled with maintaining attention and students who identified as having other disabilities?

KATIE LINDER: We did not. I worked with a couple different stats consultants on this particular piece of this study. And we basically all played with the data in a range of different kinds of ways. And that was not something that jumped out to us.

I’d have to go back and look at our specific analyses to see what are the kinds of things that we ran that would really get at that question. But it was not something that was significant enough that it made it into the report.

What was the percentage of students under 25, and why do you think they were less likely to use closed captions?

KATIE LINDER: So we had 36.3% who identified as adult learners, which means that we have about 63.7% who identified as students who were under the age of 25.

[Why were they less likely to use captions?] That's something I don't know. I can hypothesize some different things.

Some of the adult learners that we heard from said things like they are raising small children. And so they use things like closed captions and transcripts to watch videos when [children] are sleeping in the same room. Or they’re often in spaces where it’s very quiet, and they can’t play video without a headset or something like that.

So there are definitely some environmental reasons that we might see a difference between those two groups. But it was not something that was very clearly coming out in the qualitative data that we looked at as to why that particular subgroup was not necessarily utilizing them as much.

Could you explain what you meant by hindrances? I’m not clear on how captions and transcripts would be hindrances, so I didn’t understand the study’s questions.

KATIE LINDER: Because we asked them about helpfulness, we also wanted to ask them if there was any reason why closed captions or transcripts would be hindrances for them.


For closed captions, we had students who commented a lot about things like captions blocking things on the screen — that’s a hindrance. So if they take up too much room on the screen and they block important information, or if there are typos, if they’re improperly synced with the audio… those kinds of things were what students noted as hindrances for closed captions.

For transcripts, students noted things like “they weren’t available at all,” typos, or that the kind of formatting of the transcript was not helpful to them because it was just these huge blocks of text.

And they noted things like it was difficult to print them out and to find time and resources to go and print transcripts if they wanted to have them in print form. So we actually allowed students to identify a range of hindrances in their qualitative responses. And those things are all broken out pretty specifically in the student report.

The other thing that’s interesting, though, is that those hindrances were really significantly less. We had a lot more comments about the helpfulness.

Qualitatively, I think we had 900 comments or more, maybe 1,200, about the [helpfulness of] closed captions. And then for the hindrances, it was around 100, fewer than 150 comments. And some of those comments said things like, "it's not a hindrance."

In general, we saw students finding closed captions and transcripts to be significantly more helpful than they found them to be a hindrance.

We talked in the first webinar, and this is also included in the report, about the good thing about the hindrances the students describe: a lot of them are fixable through some kind of quality assurance.

So if you include the creator of the transcript or the closed captions in reviewing what's been created, they're going to notice if there are typos, if people's names are spelled wrong, if the captions aren't synced properly with the audio, or if they're blocking important information. If you have a quality assurance measure of any kind, it might catch some of those things. And they're definitely fixable.

So I would say if you’re interested in more specifics, and especially on sample comments related to hindrances, check out the study. And then you might also want to look at the first webinar recording as well.

Did students comment on the usefulness of being able to download the transcript versus having captions?

KATIE LINDER: They did comment on that, and in a couple particular ways. The one that immediately comes to mind — and again, I would definitely recommend looking at the study to get in more detail about this — was the searchability of transcripts and how that was not available for closed captions.

When we asked about helpfulness, we had some qualitative comments that just said, “Ctrl-F” — the idea that they could search for a particular word or phrase — and that really helped in terms of efficiency.

But we also heard that same thing in terms of the [option to replay video to review the] closed captions, so that students could very quickly scan through a video.

So, we saw that in a couple different ways. But definitely the searchability of the transcripts was something that was a big bonus for students.

Download the full research study on student closed captioning use in higher education below:


