
Do Captions & Transcripts Improve Student Learning? [TRANSCRIPT]

SOFIA LEIVA: Thanks for joining this webinar entitled “Do Captions and Transcripts Improve Student Learning?” I’m Sofia Leiva from 3Play Media, and I’ll be moderating today.

And today I’m joined, from the University of South Florida St. Petersburg, by Karla Morris, Manager of Instructional Design Services; Lyman Dukes, Professor of Special Education; and Casey Frechette, Assistant Professor in the Journalism and Digital Communication department. And with that, I’ll hand it off to Karla, Lyman, and Casey, who have a wonderful presentation prepared for you all.

LYMAN DUKES III: Thank you. Good afternoon to everyone. We’re excited to have the opportunity to participate in the 3Play Media webinar series. This afternoon we’ll share the results of a study we recently completed, in which we examined the use of closed captioning and interactive transcripts in a set of online courses at our university, which is, as mentioned, the University of South Florida St. Petersburg.

The question and answer part of the presentation, as mentioned, comes at the end of the slide deck. But again, please submit those questions as you have them. Our webinar is titled “How Closed Captioning and Interactive Transcripts Impact Student Learning.”

Our webinar objectives are listed here. And I’ll take a moment to read them. First, we’ll introduce the study details. Next, we’ll share how students use captions and transcripts to support their learning. Third, we’ll address the learning differences between students who use captions and transcripts at a higher level and those who use the tools at a lower level. And last, we’ll share recommendations and future directions.

We’ll begin by giving some background information related to this study that will provide context for the conditions that led to the study being completed. And we’ll also share a bit about the demographic makeup of our institution, USF St. Petersburg.

Prior to sharing those details and results, we’ll introduce ourselves. My name is Lyman Dukes. And I’m a professor in the College of Education at USF St. Petersburg. I also have experience as a university administrator and middle and high school teacher. And one of my research interests is the design and delivery of courses employing concepts of universal design.

CASEY FRECHETTE: And good afternoon, everyone. I’m Casey Frechette. And although I teach and do research in our department of journalism and digital communication, my background is in education and instructional technologies.

And I’m particularly interested in my research in exploring how to create inclusive online learning environments. Some of that research concerns cultural inclusivity, and some of it deals more with accessibility. And the study we’ll be sharing with you today on closed captioning and interactive transcripts touches on the latter.

KARLA MORRIS: And I’m Karla Morris. I’m the manager of instructional design services in our online learning unit. My background in research has been in accessibility, web accessibility, and assistive technologies. And my primary approach to this is helping institutions make sure that they have the processes and the support in place to put out e-learning that is accessible for everyone.

LYMAN DUKES III: So we’ll begin by sharing some demographic information about USF St. Petersburg in order to provide some context for our study design and subsequent data that we’ll share. So we’re regarded as a smaller postsecondary institution. The campus student population, including both full- and part-time students, is approximately 6,500.

We’re based in downtown St. Petersburg, which is considered an urban setting. And the majority of our students come from the greater Tampa Bay region, of which St. Petersburg is a part. Many of our students, including many who live in campus housing, are choosing online courses over face-to-face course sections.

Currently, approximately one third of the campus SCH, or what is probably better described as Student Credit Hours, are generated through online courses in the fall and spring terms. And about two thirds of the campus SCH is provided through online course enrollment in the summer term. Clearly, many of our students are selecting online course options. And that number has been consistently increasing year over year.

So our institution has an accessibility committee that has had an opportunity to address a number of academic accessibility issues. And online instruction has been a particular focus. So we decided we were interested in examining a set of online instructional practices we believe will benefit all students participating in online coursework.

A few years back, we examined the use of closed captioning in the videos used in two of our larger enrollment online courses. And then more recently, as a pilot for the current study, we looked at the use of interactive transcripts in, again, two larger enrollment online courses on campus.

So our university system, which is currently made up of three separate universities, has determined that instructor-developed media or third-party videos included in any online course at the three institutions must be captioned no later than 2021. Additionally, faculty are strongly encouraged and supported to employ the Quality Matters guidelines for online instruction.

Quality Matters is also referred to as QM. For those who aren’t familiar with it, it is a program and process for developing and measuring the quality of an online course. And indeed, one of the QM guidelines references captioning in online courses.

I mentioned universal design a moment ago. And I’d like to take a moment and share some remarks about universal design and how employing closed captioning or interactive transcripts in online courses reflects universal design principles. Universal design stipulates that products, spaces, and experiences, which would include education, be designed to maximize accessibility. So instead of providing alternatives for certain populations, universal design involves inherent accessibility, with the potential to benefit everyone.

In the context of higher education, universal design for learning provides guidelines for developing and delivering instruction that promotes access to and understanding of content for all the students in a course. And central to UDL is the principle that multiple ways to interact with content and people are provided.

Because more students are enrolling in online courses, the potential for UDL principles to enhance higher education is definitely on the rise. Colleges and universities, for example, are seeing an increase in the number of students with disabilities enrolling. And in many cases, these students are not registering with the campus office that supports students with disabilities. In addition, students with disabilities participate in online courses at disproportionately high rates.

As mentioned a moment ago, we conducted two earlier studies that led us to the current examination. The original pilot study examined the cost and benefit of captioning all online course video content. And in this study, students benefited in a number of ways.

First, clarification became an option. Students reported difficulty hearing the instructor at times for various reasons. And captioning allowed them to understand the lectures more fully, even in cases where the audio wasn’t discernible. Second, comprehension improved. Some students found the option to both hear and see content more consistent with the way in which they learn.

Third, students noted the spelling of keywords as helpful. They appreciated the chance to see how unfamiliar words were spelled, for example. And fourth, note taking was enhanced. More generally, students reported that using captions as a note-taking tool was a benefit.

Next, a pilot study on the benefits of the use of interactive transcripts was completed. And it included a comparison to closed captioning. It was similar to the study we’ll describe in detail in a moment. But overall, students found both options to be helpful. In each case, student course performance improved.

So in the earlier pilot study, both captioning and interactive transcripts made a difference in student learning. In fact, scores from the pre- to the post-test increased by a letter grade in both treatments. However, the interactive transcript group showed twice as much improvement as the captioning group did.

Students in the interactive transcript treatment found access to the transcript particularly helpful, with nearly half saying that it was extremely beneficial and virtually all students indicating that there was some benefit to having transcript access. And of course, this led us to the current study, in which our goal was to examine the benefit of captioning and interactive transcripts with a larger sample and the inclusion of some additional research questions.

So next, we’ll dig into that study. We’ll share and then discuss the study goals, methods, and findings. And as you may already know, you can find a recently completed report detailing the study and its outcomes on the 3Play Media website.

We’d like to take a moment and recognize 3Play Media, as they fully supported our interest in examining the use of captioning and interactive transcripts in our online courses. Education and industry collaboration is a particularly effective means of completing the important work around examining how to– in our case, at least– best provide quality instruction to the university students in our courses. And we believe that other institutions will likely find benefit in the use of valuable tools, such as the interactive transcript or closed captioning.

As far as the study itself is concerned, it evolved over time. And the 3Play team remained flexible throughout necessary changes to our procedures. And lastly, we’d encourage other researchers to consider such partnerships, as they can provide opportunities to conduct important research and help spread awareness of research outcomes. These partnerships can also aid in getting new ideas out to larger audiences that can subsequently apply the valuable outcomes of this work.

So the study was designed to shed light on the educational value of both closed captioning and interactive transcripts in lecture-based online courses. We explored the effectiveness of closed captioning and interactive transcripts through a correlation study that examined the relationships between captioning, interactive transcripts, student demographics, student behaviors, and student comprehension of course material in an applied context.

So the study itself consists of three questions. And I’ll simply share those. First, how do students use captions and transcripts to support their learning? Second, do students who use captions and transcripts at a higher level learn more than those who use the tools at a lower level?

And third, do students who use captions and transcripts at a higher level comprehend the content better than those who use the tools at a lower level? This question addresses the depth of learning along three distinct levels, which are noted here: retention, comprehension, and application.

So next you’ll hear from Karla, who will begin explaining the study methodology.

KARLA MORRIS: Great. Thank you, Lyman. So we implemented this study within seven courses from the College of Education, the College of Arts and Sciences, and the College of Business. And it was really important for us to capture courses and students from all of these different disciplines, so we could examine whether what we were finding was common across the disciplines or differed among them.

Students were able to opt into the study. When they opted in, they were sorted into two groups. One group received just closed captioning on their first module video, while the other group had the option of both closed captioning and an interactive transcript on their video. Once sorted, students were given an orientation video that showed them how to use either or both of those tools, depending on their group, with tips for using the player and for turning the tools on and off– i.e., if they got distracting or anything like that.

The study treatment was only applied to the first module video. So this is the first unit of content within the 16-week or the 10-week course being studied. And after that first module of content, all of the videos had just closed captioning applied, so that the students didn’t lose that support if they needed it. But those later videos weren’t essential to what we were looking at, which was that first module of content.

Students completed a pretest prior to getting to their first module. The pretest consisted of questions related to that content, to gauge their baseline on recall, application, and comprehension. Then at three regular intervals throughout the semester, they completed checkpoints, again with the same categories of questions– recall, transfer, comprehension– but with different questions addressing those skill sets.

And then they took a post-test, which again looked at the same levels of learning, and we also administered a demographic and student attitudinal survey. We were also able to observe how students interacted with the videos– how often they turned the treatment on and off, how they used the interactive transcript through searching and clicking through the highlighted elements, how long they spent in the videos, and any other behavioral interaction they had with that module 1 video– to give us some information about that as well.

And we wanted to show on screen here– and I’ll describe it as well– what the students saw when they were sorted into each group. So the first one you see is an example of a video for the group that just received the closed captioning as a support. So the player itself functions just as you would expect a typical video player to function, with the play button where it should be, as well as the closed captioning button down in the lower right by the volume button.

For us, we had the closed captions appear below the video and the player controls. That is because, typically in academia and in lecture-based courses, there is a visual aid that the instructor will put on the screen. And we didn’t want to block any text on a PowerPoint slide that could carry important information for those students.

The next slide shows what students who were sorted into the closed captioning and interactive transcript group saw. Again, they had the typical video player. It’s not fully shown here on the screen. But the play button and the closed captioning button, again, are where they should be, in the lower left and lower right, respectively.

Again, if they turned on closed captions, those appeared below the video so as not to block any important information on the screen. And below that is the interactive transcript. What you see there is an option to search the video. When they type a term into the search box and click the search icon, it will highlight every instance of that term throughout the transcript. And the student could then click on that term and be taken to that exact point in the video.

Also, as the video plays, the words in the interactive transcript are highlighted as they are spoken. And then below the transcript itself, you see the option to toggle the transcript on and off, as well as to turn the highlighting on and off.
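To make those mechanics concrete, here is a minimal sketch, in JavaScript, of how a player could implement this kind of searchable, time-synced transcript. The cue data, element IDs, and CSS class names are illustrative assumptions, not 3Play Media’s actual player code.

```javascript
// Minimal sketch of interactive-transcript behavior as described above.
// Cue timings, element IDs, and class names are illustrative assumptions.
const video = document.querySelector("video");
const container = document.getElementById("transcript");

// A transcript is a list of timed cues, one span per word or phrase.
const cues = [
  { start: 0.0, end: 1.4, text: "Welcome" },
  { start: 1.4, end: 3.0, text: "to the lecture." },
  // ...remaining cues would be generated from the caption file
];

// Render each cue; clicking a cue seeks the video to that moment.
cues.forEach((cue) => {
  const span = document.createElement("span");
  span.textContent = cue.text + " ";
  span.addEventListener("click", () => {
    video.currentTime = cue.start;
  });
  container.appendChild(span);
  cue.el = span;
});

// Highlight the cue currently being spoken as playback advances.
video.addEventListener("timeupdate", () => {
  cues.forEach((cue) => {
    cue.el.classList.toggle(
      "speaking",
      video.currentTime >= cue.start && video.currentTime < cue.end
    );
  });
});

// Search: mark every cue that contains the query term.
function searchTranscript(term) {
  const query = term.toLowerCase();
  cues.forEach((cue) => {
    cue.el.classList.toggle("match", cue.text.toLowerCase().includes(query));
  });
}
```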

Now, we have both the closed captioning and the interactive transcript enabled here just for visual purposes. However, students could choose to turn either or both on or off, depending on what they needed. So now I’m going to turn things over to Casey to talk about our findings from the study.

CASEY FRECHETTE: All right. Thank you so much, Karla. And thank you, Lyman, for setting the stage here. I’m going to talk about our findings. And we’ll take a look in three categories reflecting how we thought about our analysis this time around.

As Lyman mentioned, our earlier pilot study was quite encouraging and did show that there were advantages both to closed captioning and interactive transcripts. In this study, our hope was to expand on that analysis. And we got many more participants this time around. We also wanted to dive deeper into exactly under what conditions each of these tools is most helpful.

So to begin with, we looked at a basic question. How do students use closed captioning and interactive transcripts to support their learning? And to provide answers to this question, we had a number of items on the survey that we administered to get a sense of how participants handled the tools that we gave them. So first of all, we asked our participants how often they used the tools.

And we found in the closed captioning group that 55% of participants answered at least “sometimes” to that question. So they either sometimes, often, or always used the tool. And 29%, nearly a third, of those participants either often or always used the tool. For the interactive transcript group, the numbers were similar. 47% sometimes, often, or always used the tool. And 27% often or always used the tool.

We also asked all of the participants if they found the tool that they were assigned to helpful. And here again, the numbers were quite similar. 45% of participants in the closed captioning group said that they found the tools at least moderately helpful, if not extremely helpful.

And that number expanded to 60% when we included responses at the slightly helpful level or above. So 60% of the closed captioning group found the tool slightly helpful or better. And in the interactive transcript group, it was 53% of the participants, a little over half, who found the tool at least moderately or extremely helpful, and 59% who found the tool at least slightly helpful.

We also wanted to get a sense of why participants used the tool. And we were curious to see if those responses stacked up or were different when we compared closed captioning to interactive transcript participants.

The top three reasons for the closed captioning group were that the tool helped that group to focus. 42% said that was the case. 37% said that the tool helped them with information retention, or recalling information from the lectures. And 28% said it helped when the audio quality was a little bit poor.

Now, for the interactive transcripts group, one of those answers was the same. And that was helping with information retention. But two of the answers were different. So 38% of the interactive transcripts group said that the tool helped with information retention. And then about a third, 29%, said that they used the tool as a study guide. And a similar share, 29%, said that the tool helped them to find information.

And we believe that these responses begin to show how participants used these tools a little bit differently. Interactive transcripts, not surprisingly, are used in a more active and interactive fashion, as a study guide and as a way to search for information, whereas closed captioning is more of a supportive or supplemental tool, used to make information clearer and easier to obtain despite environmental limitations or other factors.

Now, we were also curious to understand how helpful the participants found it to be to have video-based lectures built into their courses. As Karla mentioned, we studied seven different courses and nine sections total. And these courses, as Karla noted, spanned all of our colleges on campus and a variety of different topics and subject matters. The videos also varied quite a bit in length. But we asked everybody how helpful they found those video lectures to be. And the responses were quite definitive.

For the closed captioning group, 80% found the video-based lectures at least moderately helpful. And 62%, a little less than two-thirds, found the videos very or extremely helpful. Those numbers were a little bit lower but about on par for the interactive transcripts group, where 77% found the video lectures at least moderately helpful. And 56% found the videos very or extremely helpful.

We also had an open-ended question to get additional insights into why the tool was helpful if they found it to be so. And this is a compilation of some of those responses from both the closed captioning and interactive transcripts group.

So just to read through the list here, one participant said, “I can focus better when I read what is being said.” That’s an issue that’s been studied for some time. And that dual-coding approach of having content in both auditory and written or visual form has been established as effective. We’re actually incorporating it right now in this presentation.

Another participant said, “I’m not exactly sure how to explain how it helps, but I know for certain that it does. I felt as though I was retaining information a lot more readily than just following a PowerPoint.”

Another participant talked about those environmental limitations that I mentioned. “Useful in noisy households,” they said, “where it can be a struggle to hear the videos, and quiet environments where I didn’t want to disturb others.” I thought that was quite interesting because it’s not just the noisy environment sometimes that’s an impediment to using audio, but also very quiet environments. And captioning can be helpful in both circumstances.

And then finally, a participant said, “it was because the class use technical law terms.” And the transcripts help to “grasp what the professor was talking about.” And we find that that’s the case in a lot of our curricula. We’re introducing new concepts, new theories. The vocabulary may be unfamiliar. Understanding exactly what the words are and how they’re spelled can be very advantageous to a wide swath of students.

So that established the first pillar of our inquiry as far as our analysis was concerned. The second piece that we wanted to look at– and this was taking us a big step beyond our pilot study– was to see whether the amount of usage of these tools, the degree to which participants were engaged with them, was a deciding factor in terms of the learning outcomes that they experienced. So we asked the question, do students who use tools at a higher level learn more than those who use the tools at a lower level?

And to establish different levels– and we decided that we’d focus on low usage, medium usage, and high usage– we relied on participants’ responses to a survey question that asked them to self-report how often they used whatever tool they received, be it closed captioning or interactive transcripts. And we grouped them three ways according to their responses to that question. So we had 46 participants who said they never or seldom used the tool, 21 who said they sometimes used it, and 31 who said they often or always used it. And that established our low-, medium-, and high-usage groups.
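Expressed as code, the grouping rule amounts to a simple mapping from the survey scale to three buckets. This sketch illustrates the rule as described; it is not the study’s actual analysis code.

```javascript
// Map a self-reported usage response to its analysis group,
// per the grouping described above.
function usageGroup(response) {
  switch (response) {
    case "never":
    case "seldom":
      return "low";
    case "sometimes":
      return "medium";
    case "often":
    case "always":
      return "high";
    default:
      throw new Error(`Unexpected survey response: ${response}`);
  }
}

usageGroup("seldom"); // "low"
usageGroup("always"); // "high"
```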

With that classification in hand, we took a closer look at the results. And this table captures the information here, looking at low, medium, and high usage for closed captioning and interactive transcripts. So we’re comparing here pretest to post-test scores. And we’re also looking at the differences between the two.

Now, you’ll notice that the pretest scores vary a little bit across the different categories here. And one thing that we found interesting is that, on average, medium-usage participants performed better on the pretest than low-usage participants. And high-usage participants performed better still.

So this raises an important factor for us to consider, which is that students who are inclined to use these tools more, who are predisposed to take full advantage of them, may have better study habits in general, may be more motivated, may be more inclined to fully engage in the online experience as a whole. And that could account for discrepancies that we see in the pretest scores here.

So comparing, for example, the pretest for the low-usage closed captioning group, they scored 57.64 on average. This is on a scale of 0 to 100. The high-usage closed captioning folks, on the other hand, scored 74.5 on average. So that’s a significant jump.

What we also see across the board, though, is an increase in scores from pre- to post-test in all of these groups. So the low-usage closed captioning cohort improved by 7.26 percentage points. The low-usage interactive transcript cohort improved a little bit less, by 5.75 points.

Going down the line here, the closed captioning group for the medium-usage category improved by 6.4 points. And then we saw a significant spike here, where for the interactive transcript group, medium- and high-usage folks did particularly well. So the medium-usage group improved by 12.36 points. And the high-usage group improved by 8.70 points.

So one conclusion that we might reach based on these findings is that interactive transcripts are particularly useful when they’re employed, when they’re put to good use. And we see that when comparing medium- and high-usage differences to the low-usage differences. For the closed captioning group, on the other hand, there was actually a slight dip from the low to medium to high usage in the gains.

But again, we have to consider that in light of the fact that the high- and medium-usage groups performed a little bit better on the pretest. And it’s more difficult to improve when you’re already scoring higher. So that’s a look at overall learning at different levels of usage. And we can move on to the next slide now.

So the third category we wanted to look at here is different kinds of learning. And we talked about that briefly earlier. We classified learning in three different ways for this study.

So we looked at a basic-level recall of information, the ability to remember ideas that were conveyed in lectures. We also looked at learning in terms of comprehension, or understanding. We consider this a more substantive form of learning because it goes beyond memorization into the realm of comprehension. And then third, we looked at an even more advanced kind of learning, what we’re calling “application.” And that concerns the ability to transfer learning from one situation to a novel situation that hasn’t yet been encountered.

So let’s take a look at how the results stacked up when you considered these different kinds of learning. And we have a few more charts to work through here. And I’ll highlight some of the key findings because there are quite a few numbers here to go through. And these slides will be available as well. And all of the data is also available in our report.

So to begin with, let’s look at recall, that basic level of learning, the ability to memorize information. And we’ll look at recall at different usage levels– low usage, medium usage, and high usage. And we’ll also look at it for both the closed captioning and the interactive transcript groups.

So we had a couple of surprising findings that I’ll highlight here as we consider these results. The first one that I want to note is a case where we saw a little bit of a dip from pre- to post-test. Now, that’s never a good sign when the scores are going down after the instructional intervention. But we did see that happen in one or two cases as we parsed through the data.

One of those circumstances was the low-usage interactive transcript group, here again suggesting that interactive transcripts may be most effective when they are actively employed. On the pretest, that low-usage cohort scored 75.93. And they dipped down to 70.37 on the post-test, a 5 and 1/2 point dip.

Now, they scored fairly high on that pretest, particularly relative to the closed captioning group, who averaged only 57.74 points. And in their case, the closed captioning group actually increased by 6.63 points on the post-test. So the starting score, again, has an influence here on what we see in terms of overall differences.

Now, for both medium and high usage, the results were pretty consistent for recall. On average, participants scored about 6 points better in both the medium- and high-usage groups. And that applied to both closed captioning and interactive transcripts.

But let’s take a look at the second level of learning that we considered. And that was comprehension. So this is the ability to understand information. I do want to mention from a methodological perspective that all of the types of learning that we studied– recall, comprehension, and transfer or application– we measured these via multiple-choice questions. But the structure of the questions was a little bit different for each type of learning.

So for the recall questions, we essentially listed out groupings of terms. And we asked the participants to identify which grouping was relevant to the content that they had received. For the comprehension questions, we asked questions that required a demonstration of understanding. And for the transfer questions, we presented scenarios and asked the participants to apply what they had learned to these unique scenarios.

So I just wanted to point out that the format of the questions was consistent. But how we asked the questions and the exact structuring was a little bit different there. So overall, we saw bigger gains when we looked at comprehension versus recall. That suggests that these assistive technologies may make more of a difference for comprehension-level learning than for recall-level learning.

So we were looking at increasing understanding. And we’ll see in a moment, when we look at increasing the ability to apply knowledge, that these tools are particularly helpful there as well. And this was true at all usage levels. So there were no cases where a group dipped from pre- to post-test, even though, as we’ll see, some groups scored fairly high on the pretest.

So for the low-usage group, looking at comprehension-style questions on the pretest, closed captioning– the average score was 57.17. And that increased to 64 and change on the post-test, a gain of about 7 points. The interactive transcript group performed at a similar level. But they gained a few more points. So they started off at 55.57, increased to 64.29. That’s a little less than 9 points.

For the medium-usage group, here again we see those pretest scores coming in a little bit higher– and the post-test scores as well. So the closed captioning medium-usage group averaged 64.9 on the pretest. They increased to 69.22 on the post-test. And the interactive transcript group moved from 66.67 to 73.33. In all cases, they’re increasing by about 5 or 6 points. So this is about half of a letter grade.

And then for the high-usage group, for closed captioning, they improved from 73.83 to 77.77 points. That’s about 4 points overall. And the interactive transcript group had a big improvement. They went from 58.61 to 71.94, a little more than 13 points, or a full letter grade. So that high-usage interactive transcript group, when we’re looking specifically at comprehension, really performed well overall in the study.

So then finally, we looked at application at different levels of usage. And here we found a few other really striking outcomes when we looked at these results– so again, the application, the highest level of learning in our study, involving the ability to apply knowledge to novel settings and novel scenarios.

So in this case, again, we see an overall trend where, as usage increases among our participants, so do both their pretest and post-test scores. There was one case here where the post-test score dipped down a little bit. That was a special situation that we’ll talk about in just a second. But let’s go back to our low-usage cohort and start with the closed captioning group.

So here the results were very similar to what we saw in all of the other ways that we analyzed the data. Scores increased by about 4 points, from 62.62 to 66.67. The interactive transcript group improved by about the same amount, a little bit less, at 3.33 points. But they also started higher. Their average pretest score was 71.11. And the post-test score was a little higher still.

What we found really interesting was for the medium-usage group. And for both the closed captioning and interactive transcript cohorts, the medium-usage group at the application level of learning really improved quite a bit. We saw the highest improvement in this category and also the third-highest overall improvement.

So the highest improvement overall in the study was for the interactive transcript group, the medium-usage group, at the application level of learning. And they actually improved by 28 points, which was a big jump. They went from 63.34 points on the pretest to 91.67. So they have the distinction of being the group that scored in the A range on the post-test here.

The closed captioning group also did better. They scored about the same on the pretest, 62.74. And they increased to nearly 75 points on the post-test. That was about a 12-point gain– so the interactive transcript group did more than twice as well here compared to the closed captioning group.

Now, what’s interesting and a little confusing to sort out when we look at the high-usage group is that the interactive transcript cohort didn’t do as well. This was a case again where those post-test scores dipped down a little bit. But what we found interesting here is how high their pretest scores were.

So the interactive transcript group at the high-usage level scored an average of 81.67 on the pretest. And then they did dip down to 78.89 on the post-test. And for the closed captioning folks, they started at 76.19 and then jumped up to 81.75– so again, a difference of about half of a letter grade, or 5.61 points.

KARLA MORRIS: Great. Well, Casey has done a really great job of sharing with you our findings and a lot of the implications and opportunities that come out of them. So what I’d love to do now is summarize some key takeaways that may be helpful for you, for your institution, or for implementing something like this at your organization.

So the implications for students– this is a lot of what Casey has shared, so I’ll go over it briefly here. What we found is that when students have the option of turning off the tools, such as the captions or the transcripts, most students don’t. They leave them on because they find them helpful or useful in some way.

And as Casey pointed out, there were some variations in how students performed based on their level of usage and the tool that they used. But overall performance improved under both conditions. And generally, the higher the usage of the tools, the greater the learning outcome. So really the implication there is to provide both of those opportunities for students.

And so, again, as instructional designers, as we talked about earlier, all of us have an interest in making online learning accessible for our students. Coming into this study, we really wanted to look at what the implications could be for making e-learning an even more universally designed experience. What we found was that students benefit from both of the tools that we have available. Therefore, whenever possible, we feel you should make both tools available to users.

What’s important to note also is that they perform better when they use the technologies more fully. We found that the students in the interactive transcript group who were high users did a lot of searching, did a lot of clicking on the terms that they found to go to those parts in the videos. So when possible, faculty could highlight the benefits and the use of those tools and how to employ those tools as study aids. And that could have a positive impact on student achievement in the course, as well as their level of learning.

We talked about universal design. Lyman did a great job of introducing that concept and how we apply it to our online courses. And this tool is just another great example of employing universal design in our online courses.

And what we also found was that involving instructional designers in the creation, design, and development of all of these courses, along with our faculty members, really allowed us to bring learning design expertise to how these tools are used. So it’s always a great idea to involve your instructional designers when you’re talking about how to employ accessible and assistive technologies in your online courses.

There are also some implications for institutional commitments, and some opportunities that we found through this study, as well as through the work we’ve done in the past– our involvement with the online learning accessibility committee, our pilot study, and our previous studies. We really feel there’s a strong case for connecting these types of conversations with your vision for innovative teaching.

There’s a lot of work around innovating in the online learning space in accessibility and universal design, and tools like these are a good place to innovate from the ground up. We feel that strong support from our institution is what made this so successful, and it can make similar efforts successful at other institutions and organizations– building in strong support for committees, task forces, and cross-functional collaboration.

One of the things that made our accessibility committee and this study so successful was the involvement we had from faculty in each of the colleges, the learning designers from our online learning unit, our Student Disability Services office, and the buy-in we had from administration. Bringing together all of these folks from different departments to collaborate builds a very robust framework around accessible online learning.

And then, articulate and support guidelines for quality online learning. As Lyman mentioned, we employ Quality Matters as our online course quality process. And it heavily stresses the accessibility of multimedia, including captions and transcripts. So having these set guidelines in place will help make sure that accessibility is part of the design and the development process.

And some future opportunities based on the findings from this preliminary work: we really need to explore the connection between student success and the length of the video being watched, because that may have implications for how long we make video lectures and instructional videos for our online courses. That’s a really important connection to examine. We also need to examine student success and the use of the tools, whether closed captions or interactive transcripts, by demographic profile, because that may tell us more about the students who are taking our courses, how they’re using the tools, and how that affects their learning.

So if you’d like to learn more, on 3Play Media’s website under their Webinar Recording section, you can find the previous webinar that we offered in June of this year, where we went into more detail about the study methodology and our preliminary findings that led us to this full analysis of our data that we’re presenting here today. And you can also find the full research report on their website as well. And I have the URL up. And I’m sure we can get that out after the webinar as well. And so now what we’d love to do is open it up for any questions in the time that we have remaining.

SOFIA LEIVA: Thank you so much, Lyman, Casey, and Karla. We’re about to get started on the Q&A. And we encourage you to keep asking your questions through the Q&A window or the chat window. And we’ll compile them and get to as many as possible. So the first question that we have is, what was the most surprising finding of the study?

CASEY FRECHETTE: Well, I’ll jump in first. And I would say for me, the most surprising finding was that it was the comprehension category where we saw some of the biggest learning gains. I was thinking that perhaps it would be the application category– that is, that the higher the level of learning, the more helpful the tool. But it turned out, at least in terms of how we classified the questions, that it was the comprehension group that performed the best.

SOFIA LEIVA: Thank you. The next question we have is, curious about how student viewing behavior was measured. Could you expand on that, please?

CASEY FRECHETTE: Yes. And this is an area, I should mention, where we’re hoping to do some additional analysis. So we haven’t incorporated that piece of our data set into our findings just yet. But we do have a wealth of information about behaviors within the courses. Using JavaScript, we tracked interactions with the interactive transcript and closed captioning tools. And we also measured interactions with the video player itself.

So we were able to tabulate when the video started and stopped, how many sessions a participant took to watch a video, and also how long the tools were either visible or turned off. One area that I’m particularly interested in looking at is whether the self-reported information that we used for this analysis, in terms of usage levels, matches up against what we see when we look at that behavioral data.
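As a rough illustration of this kind of instrumentation, the sketch below logs standard media events and tool toggles in JavaScript. The element IDs and the /collect endpoint are assumptions for the example, not the instruments the study actually used.

```javascript
// Minimal sketch of in-page behavior tracking like that described here.
// Element IDs and the /collect endpoint are illustrative assumptions.
const video = document.querySelector("video");
const events = [];

function log(type) {
  events.push({ type, videoTime: video.currentTime, at: Date.now() });
}

// Player interactions: starts, stops, and seeks.
video.addEventListener("play", () => log("play"));
video.addEventListener("pause", () => log("pause"));
video.addEventListener("seeked", () => log("seek"));

// Tool interactions: caption and transcript toggles.
document
  .getElementById("cc-toggle")
  .addEventListener("click", () => log("toggle-captions"));
document
  .getElementById("transcript-toggle")
  .addEventListener("click", () => log("toggle-transcript"));

// Send the accumulated log when the page is hidden or closed;
// sendBeacon is designed to survive page unload.
window.addEventListener("pagehide", () => {
  navigator.sendBeacon("/collect", JSON.stringify(events));
});
```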

LYMAN DUKES III: So I’ll just add to that. Obviously, from the perspective of a faculty member, the more we know about student behavior in these online environments, the better we’re going to be able to craft a course design and delivery method that will best serve the students. And I think it also speaks to a point that Karla made a moment ago about the importance of including instructional designers very deeply in this process.

Anecdotally, what some of that student behavior data is telling us is that students in some cases aren’t watching lengthier videos. They’re not completing the viewing process. Again, as Casey mentioned, we haven’t done a deep dive to really get a sense of exactly what we mean when we say that.

But at least an initial look at that data tells us that that’s something that we should be concerned about. And instructional designers, given the work that they do, would be in a position to inform faculty who are building courses about important information of that type.

SOFIA LEIVA: Thank you. The next question we have is, have you considered doing this study in a more controlled environment where the students cannot choose to turn off the closed captioning and interactive transcripts?

KARLA MORRIS: It’s something we definitely discussed as we were setting up the studies. But one of the things that we felt about keeping the tools permanently on was that, for students who find them distracting, we didn’t want to take away from their learning and their experience. So it was important to us to keep the tools optional, so that we’re supporting students who needed them at various levels. Do you guys want to elaborate on any of that?

CASEY FRECHETTE: I think we also felt as long as we were able to track that behavior that we could account for it in our analysis. And although we haven’t reached that point yet, we still do have the option of checking to see whether there is a correlation between turning the tools off or leaving them on and the learning outcomes.

But overall, our approach has been to study these tools in the most natural kind of environment that we can. And so our guiding star has been: if this is the way that a student would typically experience this in an online class, that’s the method that we want to follow.

SOFIA LEIVA: Thank you. The next question we have is, did any of the seven online classes in this study include language courses, like Italian or Spanish? At my institution, we’re discussing the tension between providing captions versus not providing captions when students are expected to develop listening skills.

KARLA MORRIS: That’s a really important question and definitely something that I think we should be looking into. None of the courses in our study involved foreign language.

SOFIA LEIVA: Thank you. The next question we have is, can you please explain an interactive transcript versus a regular transcript?

KARLA MORRIS: Sure, absolutely. So a regular transcript is a set of text that could either be provided with a video as a downloadable document or on the page. But it’s essentially static text. An interactive transcript, by contrast, displays below or beside the video. And it contains the full set of text spoken throughout the video. As each word is spoken in the video, the same word is simultaneously highlighted in the transcript, which moves along in time with the video.

It also typically includes a search function, so that if you hear a term mentioned or need to go back and find a specific term, you can type that into the search box and click Search. It will highlight all the instances of that term within the text. And clicking on that text takes you to that same point in the video. So if you click on a term a few sentences ahead of where you are, it will automatically forward you to that point in the video and start from there.

SOFIA LEIVA: Thank you so much. The next question we have is, was there a group of students in this study that was not provided closed captions or the interactive transcripts? And were they measured in the pre- or post-assessments?

LYMAN DUKES III: So the answer to that is, they were provided either captioning or interactive transcripts. And the reason for that, quite frankly, is at our institution, captioning is business as usual, so to speak. In other words, that’s what would typically be provided in a course.

So in our case, while we were very interested in learning about how students behave with regard to each of these accessibility tools, captioning is the way in which a course is typically offered. So the interactive transcript would be the one that would be unique to the way in which a course is provided to a set of students in a course at our institution.

SOFIA LEIVA: Thank you. The next question is, do you have the percentage of students who chose to turn off the captions?

CASEY FRECHETTE: That is part of the data we’ve collected. But outside of the self-reporting survey question, we haven’t analyzed that data yet.

SOFIA LEIVA: Thank you. The next question we have is, how are you and your university using these findings in your courses?

KARLA MORRIS: So from the Department of Online Learning perspective, in our prior study, we found interactive transcripts to be overall more helpful. And it’s something that we want to look at implementing. But it does require a little bit more time to set up on our end because right now it doesn’t organically display within our learning management system.

So what we hope to use this for– and especially when we can connect it with the data on the length of videos– is determining in which situations this may best be implemented by the type of lecture videos that are within a course. But also, now that we know that these benefits are in place, we’re going to be advising that faculty members consider including it as part of their design.

SOFIA LEIVA: Great. Thank you. Well, those are all the questions we had. Thank you so much, Lyman, Casey, and Karla for such a wonderful presentation.