« Return to video

Impact of Captions & Transcripts on Student Learning & Comprehension [TRANSCRIPT]

SOFIA LEIVA: Thanks for joining the webinar entitled Impact of Captions and Transcripts on Student Learning and Comprehension. I’m Sofia Leiva from 3Play Media, and I’ll be moderating today.

And today, I’m joined from the University of South Florida St. Petersburg by Karla, manager of instructional design services; Lyman [AUDIO OUT]; and Casey, an assistant professor. And with that, I’ll hand it off to Karla, Lyman, and Casey, who have a wonderful presentation prepared for you all.

LYMAN DUKES: We’re pleased to have the opportunity to participate once again in the 3Play Media webinar series. Our session today addresses the use of both closed captioning and interactive transcripts in online university courses at our institution, which is the University of South Florida St. Petersburg.

As our host mentioned, please feel free to submit questions, and the question and answer portion of our time together today will be once we’ve completed sharing our slides with you.

Again, our session is titled, The Impact of Closed Captioning and Interactive Transcripts on Student Learning and Comprehension. And the essence of the session is about online course features that we believe provide the best access and understanding of course content for the greatest number of students.

This slide spells out our session objectives, or what we might call our roadmap, for the next 45 minutes. We’ll look at findings from this study to date, as the study isn’t fully complete, but we’re close. We’ll chat a bit about the potential value of the use of both closed captioning and the use of interactive transcripts for students, and we’ll also share strategies we’ve developed at our institution that may be of benefit to webinar participants.

We’ll begin by sharing some background relevant to our study that we hope will provide a foundation for making sense of the study outcomes to date, as well as how these outcomes and relevant strategies might be a benefit to you also. Prior to getting started, we’d like to take a moment to once again introduce ourselves. My name is Lyman Dukes, and as mentioned, I’m a professor of special education at USF St. Petersburg.

CASEY FRECHETTE: And good afternoon, everyone. I’m Casey Frechette, an assistant professor in our journalism and digital communication departments.

KARLA MORRIS: And hi. This is Karla Morris, manager of our online learning department at USF St. Pete, working on course design and professional development.

LYMAN DUKES: So each of us has a unique background that’s contributed to both conceptualizing and completing the study that we’re going to share with you today. I’m currently developing guidelines for research that are intended to guide the field, examining access to higher education for students with disabilities. And additionally, I’ve conducted research around the use of learning strategies and universal design for learning practices in higher education settings.

CASEY FRECHETTE: This is Casey, and my research interests center around culturally inclusive instructional design. I’ve been working on a framework for the past 10 years or so called “Wisdom Communities” that helps instructional designers and instructors create culturally inclusive online experiences. And as part of that research, I’m also interested in the design of learning management systems, particularly for accessibility and cultural inclusivity.

KARLA MORRIS: Great. And I’m Karla again, and my research is mainly focused around online learning, looking at the accessibility in our course design, learning analytics, and then learner engagement for our online students.

LYMAN DUKES: So before digging into the study itself, we’d like to share just a bit about USF St. Petersburg in order to provide some context for our study explanation. We’re a relatively small institution, as our student body in total, and that is both full- and part-time students, is approximately 6,500.

We’re an urban campus, with most of our students coming from our region, though that’s slowly changing. And like so many higher education institutions, many of our local students are choosing online courses over face-to-face course sections. And as you see here, our online course offerings are on the increase. And currently, approximately a third of our student credit hours are generated through online courses in the fall and spring terms, while closer to 2/3 of our student credit hours are generated online during the summer terms.

Our institution convened an accessibility committee about nine years ago, and its composition has changed over time, and the three of us are currently members of it. We’ve been given relative freedom to address a number of academic accessibility concerns, particularly with regard to online instruction. And a portion of what we’ve chosen to do is take a closer, albeit scientific, look at various online instructional practices that we believe will benefit not only students with disabilities, but all students participating in online courseware.

About eight years ago, we examined the use of closed captioning in the videos used in two of our larger online courses here at USF St. Petersburg. Then, more recently, as a pilot for the current study, we looked at the use of interactive transcripts, again in two large online courses.

Our university system, which is actually made up of three separate universities currently, has determined that media used at the three institutions– that is, instructor-developed or third-party videos– that are included in any online course be captioned no later than 2021. Additionally, our faculty are strongly encouraged and supported to employ the Quality Matters guidelines for online instruction. Quality Matters, or QM, as it’s also known, for those that may not be familiar, is a process for developing and measuring the quality of an online course. And in fact, one of the guidelines references captioning in online courses.

So our belief is that merely providing access doesn’t necessarily mean that an institution has met its compliance obligation. The access provided must be effective, and specifically, that means the timeliness, accuracy, and appropriateness of the information provided electronically should be addressed. Only then do we think that we’re meeting the spirit of our legal obligation, and we believe that interactive transcripts move us closer to that place. Of course, the experience should be the same for students with or without a disability. At least that’s one of the goals of our committee as we work to help faculty more effectively meet the academic needs of their respective students.

It’s worth pointing out that making closed captioning standard in online courses with audio or video content is consistent with principles of universal design. Universal design stipulates that products, spaces, and experiences, such as education, should be designed to maximize accessibility. So instead of providing alternatives for certain populations, universal design speaks to inherent accessibility with the potential, ideally, to benefit everyone.

In the context of higher education, Universal Design for Learning, or UDL, provides guidelines for designing instruction that promotes access to and understanding of content for all learners. Central to UDL is the principle that there are multiple ways to interact with content and with people.

Because more students are enrolling in online courses, the potential for UDL principles to enhance higher education is certainly on the rise. Law schools, for example, are seeing an increase in the number of students with physical and cognitive impairments. And across disciplines, students with disabilities are participating in online courses at disproportionately high rates. Therefore, it makes sense that particularly in this era of accountability and a focus upon student success in higher education that courses be built in advance to the benefit of all learners.

We’ve conducted two earlier studies that led us to the current examination of captioning and interactive transcripts. As you see here, the original pilot study examined the cost and benefit of captioning all online course video content. And in this case, students benefited in a number of ways. Clarification became an option. That is, students reported difficulty hearing the instructor at times for various reasons, and captions allow them to understand lectures fully, even if the audio was not discernible.

Second, comprehension was improved. Some students found the option to both hear and see that content more consistent with their learning styles. Third, spelling of keywords was noted by students as helpful. Students appreciated the chance to see how unfamiliar words were spelled.

And lastly, note-taking was enhanced. More generally, students reported using captions as a note-taking tool. And in the next study noted here, which examined interactive transcripts in comparison to closed captioning, students found both options to be helpful. And in each case, student course performance was improved.

So in our earlier pilot study, as I just mentioned, both closed captioning and interactive transcripts made a difference– a positive difference– in student learning. In fact, the scores from the pre-test to the post-test increased a letter grade in both treatments. However, the interactive transcript group showed twice as much improvement as the captioning group.

Students in the interactive transcripts treatment found the access to the transcript especially helpful, with nearly half of them saying that the transcript was extremely beneficial, and virtually all of them communicating some benefit to transcript access. So this led us to the current study, in which our goal is to examine the benefit of captioning and interactive transcripts, albeit with a larger sample.

Next, I’d like to hand off our discussion here to both Casey and Karla, who will spell out the study goals, methods, and findings to date.

CASEY FRECHETTE: Thank you, Lyman. Our study goals, as Lyman just mentioned, were really to build on our pilot work, and we did that by examining learning in three different ways– content retention, or recall; comprehension, or understanding; and application, or the ability to take information and apply it in novel scenarios.

So we looked at learning in three different ways, and we did it by measuring three kinds of variables. We considered outcomes on pre- and post-tests. We also administered a survey to capture attitudinal responses. And lastly, because our research is inherently online, where we can understand student behavior, we also captured interactions with videos and closed captioning tools and interactive transcript tools. So we have those course behaviors as well.

To give you some more insight into how we’ve conducted our current research, one goal was to expand on our work in our pilot studies. And to that end, I would say we were quite successful. We examined seven courses in nine distinct sections across our three colleges at the University of South Florida St. Petersburg– College of Education, our College of Arts and Sciences, and our College of Business. And in each of these sections, we gave students the opportunity to opt into our research early on in the semester.

We captured the bulk of the data over this past spring semester, which began in January and ended in May, and some additional data in our summer session, which just wrapped up. After opting in, students were randomly assigned to one of two groups– one that received closed captioning, and one that received closed captioning and interactive transcripts. They saw orientation videos from their respective professors explaining how to use these tools. And in all cases, the videos were tied to the first modules in the classes. These classes had a number of videos embedded in them throughout the semester; we focused on the learning material from just the first module videos.

The setup for our instruments was a straightforward kind of design. Before watching any of the videos, students took pre-tests to establish their baseline knowledge in each of the classes. After watching the video, at different points, they took checkpoint quizzes. These were short quizzes to measure their learning several weeks out and then several weeks beyond that. And then, at the end of the study, the students took a post-test, which mirrored the pre-test, and along with that post-test, also completed the attitudinal survey. All the while, we were capturing observational data in terms of students’ usage of the videos and the assistive tools.

This is an example of what a student in the closed captioning group would have seen. Here we have the professor in the middle of a lecture, and we can see the closed captioning beneath the video. Following best practices, we have high-contrast white text against a black background, and the captions appeared below the video.

And in the case of our interactive transcript group, again, we have the video and the captions, and now the addition of the transcript. The current word highlights as it’s read, and as you can see, we also have a search capability here to use the transcript as a navigational tool. Clicking on text in the transcript updates the current position of the video itself.

KARLA MORRIS: And if I can add a little something here, Casey?

CASEY FRECHETTE: Sure.

KARLA MORRIS: The player controls don’t show up on the screen, but we did provide the opportunity to turn the closed captioning on and off, as well as toggle the interactive transcript on and off, so that we weren’t distracting students who found those features to be a distraction. And we were able to track that through the student behaviors and use of the video at that time. So that was helpful.

CASEY FRECHETTE: That’s a great point. And so we do have the ability to go back in and see for exactly how long each individual student had, say, closed captioning enabled or the interactive transcripts.

Now, we administered this study across the nine sections over several months and in actual classes that were running online, and that was really important for us because we wanted to see how real students interact with these tools. The flip side of that is it leads to some complications in collecting data. It’s not as clean cut and straightforward as bringing students into a lab setting, for example, where we have very strict controls over when and how they’re experiencing our treatment conditions.

And so to help us keep everything sorted out and manage the process, we built some custom tools to see the data we were collecting along the way. One of our primary tools was what we called our Spring 2019 Study Dashboard, and you can see a screenshot of it here. This was a way of tracking students as they progressed through the study. And so you can see whether they’ve opted in or out.

And it’s not clear from this screenshot, but we have different icons depending on which treatment condition the students were assigned to. And then, we can also see as they work their way through the study and complete each of the instruments along the way.

Along the way, we were also tracking what we call our encounter rate. This is the percentage of students enrolled in the class who made a decision to either opt in or opt out. And then, we also have our opt-in rate tracked there as well so we could understand how things were progressing as the study unfolded.

Another tool that we built handled communications with our study participants. So after a participant had opted in, we needed a way of following up with them and encouraging them to complete the checkpoints and the post-test, and we did that primarily through an automated email system that we built. And as part of our dashboard, we tracked when the emails went out and how many emails we had sent. We had the ability to send follow-ups and reminders with custom text, and we could do that either on an automated schedule or on an unscheduled, case-by-case basis.

And this is an example of what a study participant would receive when we sent our emails out. This is a final update for the class that we ran over the summer, encouraging the participant to complete the post-test for that research. And included in the email is a link with a number of unique identifiers that matched that student to the previous records in our study so we know that that student’s pre-test maps to that post-test. And of course, all of the links and information are encrypted and kept confidential.

This is an instrument that we built, also custom developed for our study. This is the first part of the survey instrument that we administered at the end of each section that participants were enrolled in.

So that, in a nutshell, is how we went about the methods for this study, and now we’re very pleased to share some preliminary findings from the spring and summer classes. And we’re very excited about what we’re finding so far, and in particular, how it extends the results that we’ve already seen in our pilot studies.

I thought it would be helpful to begin with some of the qualitative remarks that we captured from students as part of that attitudinal survey, and I want to emphasize that a lot of the themes that we’ll see here reflect the points that Lyman raised earlier in terms of the broad range of benefits that students see in using either closed captions or interactive transcripts.

So this student, for example, as some do, identified themselves as a visual person, a visual learner, and said, “so having the text there helped me memorize better the material. Also when taking notes, it was extremely helpful.” So students are identifying that captions and interactive transcripts can help with recall, and also with note-taking.

A number of students talked about how having on-screen text in addition to audio helped them focus, this student saying, “I can focus better when I read what is being said.” Another student said, “I’m not exactly sure how to explain how it helps, but I know for certain that it does. I felt as though I was retaining information a lot more readily than just following a PowerPoint.” And this student said, “Just having a visual of what is being said helps me to better comprehend.”

What’s interesting here is that our student participants are using some of the same language that we did in constructing the design of the study, differentiating between retention and comprehension, for example. We never presented those terms to them, but they were using them in their descriptions of their experiences.

Another student in our closed captioning group said, “The closed captioning helped me, since English is not my first language, to better comprehend the content of the class and follow the professor.” We had a number of international students across the various sections who echoed similar points.

This is one of the comments that I found most interesting in terms of the situational factors that can affect learning, the student saying that the captioning was “useful in noisy households, where it can be a struggle to hear the videos, and in quiet environments where I didn’t want to disturb others.”

And here a student saying, “Because the class uses technical law terms, the transcripts help grasp what the professor was talking about.” This is the case with many of our classes, be it law or biology, any number of classes where we’re introducing new vocabulary. And at a basic level of comprehension, understanding what those words are, how they’re spelled, how they’re being used is essential to learning. Captioning and interactive transcripts can go a long way toward helping that.

And now we’re getting to some comments from the participants in our interactive transcript group. One of the things that you’ll find noteworthy is the qualitative remarks from both groups are quite similar, and the advantages that students, participants, are perceiving are lining up very closely between these two groups. This participant is saying, “Having the information provided in two formats, audibly and visually, helps with retention and even understanding.”

“I prefer reading over listening because it’s more convenient if I’m in a public space and have no headphones.” So those practical considerations. And again, working with students whose first language is not English– “I’m not an English– a native English speaker, excuse me– so it helps me to understand better.” This participant is saying, “It made it easier to keep track of what was going on and helped me to focus on what was being said.”

And this participant said, “It’s helpful to visually see what the professor is saying. If I can’t understand them or miss something, I can find it on the screen.” Comments like this lead us to believe that both of these tools, and particularly interactive transcripts, encourage a more active approach to learning, not only a more visual approach, but a more tactile or kinesthetic approach.

So those are the qualitative remarks– a sampling of them, but very much a representative sampling from what we’ve seen across our participants. Let’s take a look at some of the quantitative numbers that we’re seeing so far. Now, these results are preliminary, but they point to advantages in both groups, and particularly so in our interactive transcript group.

So across the board, we compared results from pre- to post-tests for the closed captioning group. We saw an increase in scores of 3%. And for the interactive transcript group, we saw that increase was 8%. Now, not in every case, but in most cases– 2/3, to be precise– participants benefited from the interactive transcripts that they received.

Let’s take a closer look at the numbers, including the sample sizes that we were working with here. Now, participants were randomly assigned to one of the two groups, and we had a number of students who signed up for the study and took the pre-test, but didn’t follow all the way through to the post-test.

The results that you’re seeing today only include those participants who completed all aspects of the study, pre- and post-test. And because of the random assignment and the fact that certain participants dropped out and didn’t make it all the way through, we have slightly different n values here for our captions group and our interactive transcripts group: 63 participants completed the study in the captioning group, and 34 in the interactive transcript group.

As you can see, their pre-test scores were quite similar, about 64% out of a possible 100%. And then we see the increases there on the post-test– nearly an entire letter grade for the interactive transcript group, improving by 8%, from 64% or so to about 72%.

Now, what’s interesting is when we drill down to specific sections, and I want to show you two sections that had some of the largest participation. The results are even more striking. This was a course on early childhood literature, LAE4414. We had 12 participants in that captions group and 6 in the interactive transcript group. Again, the pre-test scores, quite similar, but a big difference there in the jump to the post-test, with the interactive transcript group improving by about 15%, a letter grade and a half.

Another class that had some of our highest participation was a class called Intro to Blogging. This was an undergraduate class that had a graduate add-on section with about six graduate students. Overall, 28 students participated in this particular section. Again, similar pre-test scores between the two groups– 55% average for the captions group and 60% average for the interactive transcript group. And we see those jumps on the post-test, with the interactive transcript group improving by about 16% on average.

KARLA MORRIS: Great. So as Casey mentioned, these are the preliminary results that we have at this point. And what we also wanted to do was think about who’s attending the webinar today and what we can share for you to take back that may be helpful for your institution, based on what we’ve learned so far from our findings, as well as our experience doing a few research projects at this point, and give you some opportunities to maybe replicate some of this at your institution.

So here’s some food for thought. One of the things that we wanted to look at first was strategies for implementing interactive transcripts in terms of implementing them in your online courses. Some of the things that we found helpful– the first thing was creating an orientation video or a set of materials for students.

Most folks who are in instructional design, as well as our faculty members, know that when you’re introducing a new tool to students, giving them the opportunity to understand how they can use the tool to benefit their learning and how to use all of the functions of the tool is incredibly helpful. So we wanted to make sure that the students that were going to be put into this group had everything they needed to know to be successful with the tool and to be able to use all of the features.

In those materials, highlighting how to capitalize on each of the tools for learning was essential– so explaining how to use the search function to search for keywords, how to pause the video when you get to a certain point and click on other words to get back to where the professor was speaking about that, and how to use the tools for note-taking, comprehension, definitions, spelling, and things like that.

One of the other strategies that we found really helpful, and this also came up as an aside out of one of the courses, is encouraging the faculty to emphasize this tool as a study aid. A lot of what we hear from our TAs in these large online courses, where there’s a lot of emphasis on the exams, is that students, when they don’t do well on an exam, come with questions like, how can I study better? You know, I watched the videos. I read the textbook. How can I do this better? And adding this as a strategy for organizing your note-taking, or as a supplement to traditional paper-and-pencil note-taking, can be a very helpful strategy as well.

One of the things that we often get asked about as we’ve been around speaking at conferences and webinars about the research that we’re able to do here and the initiatives that we’re able to get into place is, how do you get that institutional commitment to really see this as an important initiative to pursue? And so we’ve had a lot of success with the strategies that we have up here.

The first one is finding your champions on campus. This is really getting to know the faculty, the staff who support our students with disabilities, the instructional design and online learning teams, and the media team. Who in those areas is really passionate about this topic and would really jump on board with something like a committee or a task force and help dig into, how can we implement this here? What does this mean for online learning at our institution? And from there, you can form interdisciplinary committees, task forces, and work groups that are charged with this type of work, and that’s how you can get a lot of that done.

Another strategy we’ve had a lot of success with is highlighting folks who are already leading in that area and giving them the opportunity to share how they’re leading with the other faculty, other instructional designers, other media folks, sort of lifting them up and spotlighting all the great work that they’re doing and having them share their knowledge so that everybody understands how it can be implemented and how it supports the students.

And within that category as well, I would also emphasize highlighting emerging leaders. Think about the people who may not be leading in that research area or implementing that initiative yet, but really want to be involved and just want to spark ideas and conversations.

The other success that we’ve had is the pilot study approach. And I’m going to say something that I know everyone can identify with– is that higher education is notoriously very tight with the money. We don’t have lots and lots of money to work with, and so when we spend money on initiatives, we want to make sure that we’re getting a good return on investment.

So touching back onto the pilot study that Lyman mentioned earlier, the way that we were going to get the institution to put money behind captioning and interactive transcripts was to prove to them that it was important and beneficial to our students so that they could see the significant jump in achievement, as well as understanding the students’ perception and the students’ perceived benefits to them in the course.

In our previous study, we found that the student assessment of instruction and the ratings of the instructor in our courses where we implemented these tools jumped very, very high, and that is very significant to retention, not just success, in online learning. And so having a pilot study to create those initial numbers and show those initial benefits really helped with getting institutional buy-in and commitment. And then that becomes previous research that you can draw on and use to spark ideas with your administrators, with the people who are in charge of funding and research at your institution so that they can help you identify avenues to explore that further.

So the next thing is we thought people may attend this webinar who are really interested in exploring research at their institution and replicating maybe a study similar to this or something else. And so these are the things that we found really successful in terms of implementing research, and the first one is interdisciplinary committees.

So the accessibility committee that Lyman mentioned earlier is made up of faculty from each of our colleges, staff who are interested and/or have a role related to this type of work at the institution. And having all of those differing perspectives on one committee can really help spark ideas because you have the folks who have the research expertise, the folks who have the practical application. And that coming together really sparks new research that may have not been done in the field yet because these two areas may not necessarily have come together. And so having those committees really helps explore from all angles.

Another key to this success is partnerships. And as we may have mentioned earlier, this research is being done in partnership with, and is sponsored by, 3Play Media. Having an organization external to the institution that has just as much of a commitment to student success, and that is really interested in how its tools can support students, partner with an institution that has the access to conduct that research– that’s a really good opportunity to build that relationship and make it successful from both sides.

You can also explore internal partnerships. There’s a lot of funding available through centers for innovative teaching and learning, other centers for excellence within the institution that may be able to fund projects, or at least help guide you in the right direction to find those external partnerships.

And another office that can play a similar role for internal partnerships is the research support office. The individuals in these offices can help identify funding, help you structure grants and budgets, and help you understand the whole administrative side so that you can be successful with your research.

On another note, I do want to revisit the interdisciplinary committees piece. The real benefit to this– the person that you saw in the image earlier of the closed captioning and the interactive transcript video was one of our psychology professors, Dr. Tiffany Chenneville. And the real benefit to doing things like getting interdisciplinary committees and finding champions on campus is you get to work with a lot of wonderful people.

This project that we’ve done wouldn’t have been possible without the other faculty that we had out in the College of Education, Arts and Sciences, and the College of Business, who have a history of really caring about the success of our students and partnering with us on that. And that has been an essential relationship to nurture to make projects like this successful.

So what we want to talk about is, so what’s next? Because we’ve given you a little bit of a teaser of the preliminary results, some of the strategies that have come out of the work that we’ve been doing, but we’ve really– we’ve got a lot more to share.

So over the summer of 2019, what we’ll be doing is we’ll be completing the data analysis. We’ll be exploring from different angles and different connections that– you know, how does the student who uses interactive transcripts at a much higher user rate than someone who just passively watches it– what is the difference that we see in their student achievement, and what is the connection that we have there? We’ll also be looking at, what does this mean for online course design at USF St. Pete? If we’re seeing a significant enough impact, we will be making changes to our instructional design processes for online courses, and then maybe we’ll have things to share in that regard as well.

Looking in fall 2019, we’ll be pulling together all of that information into a full report that is going to be hosted on 3Play Media’s website. So you’ll be able to work through, see all of our findings, get some more practical application strategies and ideas that could be useful for your institution.

We’ll also be offering a second webinar where we’ll go over all of that information in great detail and again take questions about how this could be helpful for you. And then, last but not least, we will be presenting all of this at the Access Summit in Boston on October 3, and so I hope that we will see some of you on this webinar at that session as well.

So at this point, we would like to open up this session for any questions that have come in from the audience.

SOFIA LEIVA: Thank you so much, Karla, Casey, and Lyman. We’re just gathering the questions on our end, and we encourage you to keep chatting your questions through the Q&A window. Just to mention really quickly about Access, we do have a special code for webinar attendees, so we’ll put it in the chat. You’ll have the link to learn more about the conference and also the code, which is “webinars,” for 10% off.

OK, so the first question we have is, what was the most alarming and most surprising aspect of the study?

KARLA MORRIS: Hmm.

CASEY FRECHETTE: Well, I could take an initial swing at that question. This is Casey. On the alarming side of things, I would say a real challenge that we faced was retention across the entire semester that we were collecting data.

So it was one challenge in the beginning to encourage students to opt in. I think we had a pretty strong opt-in rate on the whole. But getting students to follow through throughout the semester, and particularly come back in and take that post-test, was a real challenge, and I think in some ways, an inescapable reality of doing this kind of research in real-world online classroom settings.

I would say– I don’t have the numbers in front of me, but we had at least two times as many students start the research as we did finish it all the way through. So something on the order of 200 students signed up, opted in, took the pre-tests, but they just didn’t finish it all through and take the post-test.

SOFIA LEIVA: Great. Thank you. The next question we have is, you showed the 3% and 8% test score increases, but how do you define the benefits that students received? How were those benefits defined and measured?

CASEY FRECHETTE: Those percentage increases include recall, comprehension, and transfer questions. And so those particular changes were basically saying student learning increased, and student learning increased more so in the interactive transcript group.

What we’re not able to say just yet, because we haven’t done this part of the analysis, is, are there differences when we drill down into those three types of learning? Do we see differences in recall, for example, versus comprehension?

And we’re also at a point now where we’re about to do a dive into the attitudinal data that we’ve collected to see if student perceptions of their experiences changed or may have been enhanced. Were they feeling more motivated? Were they feeling more focused? Those factors are not yet accounted for because we haven’t done that part of the analysis just yet, but that’s on our radar next.

SOFIA LEIVA: Thanks. The next question we have is, will you share data on how many students turned off the captions?

LYMAN DUKES: We do have that data, and that’s data that will certainly be included in the 3Play report that we’ll complete sometime in the fall of 2019.

SOFIA LEIVA: Thank you. The next question we have is, did you offer any incentives to the students for using captions?

KARLA MORRIS: Yeah, actually, we did. Through 3Play Media, we were able to acquire funding to purchase $25 Amazon gift cards, and we had a finite number of those. I think the total was 50 for all of the groups. And we told them upfront that if they participated and completed all of the checkpoints and the post-tests, they would be entered into a random drawing to win one of those gift cards.

SOFIA LEIVA: Thanks. The next question we have is, you spoke about the power of captioning and interactive transcription for higher education. Do you think that the results of the study could be recreated in the high school sphere?

KARLA MORRIS: Yeah, I wouldn’t see why not, especially for our virtual schools. I certainly think as long as the students know how to use the tools and the faculty know how to advise students to use the tools to benefit their learning that we could certainly see that result.

CASEY FRECHETTE: And just to piggyback on that, I would also say that there could be some distinct advantages in exposing students to these tools earlier on. There is a certain learning curve and a certain acclimation process that we’ve observed across our pilot studies and our current research. And I think that the earlier that students are familiar with using these tools to become more active in their own learning, the better.

LYMAN DUKES: Just to add to what Casey said, which was actually a great point, it was one of the reasons that we also wanted to orient students to the tools, particularly the interactive transcripts, prior to the students actually employing those tools in a live course environment.

SOFIA LEIVA: Great. Thank you. The next question we have is, was there a group without closed captions and the interactive transcript which shows their percentage increased from pre- to post-assessment?

LYMAN DUKES: So there was not. While in an ideal study situation one would have a control group that didn’t receive either of the treatments, because these are live online courses, it’s important– in fact, probably ethically appropriate– to ensure that each student had reasonably equitable benefit of tools that would allow them to perform at their highest level within the course.

So we made the decision to not include a control of that nature, and frankly, I’m not sure that our institution would have given us permission to do so if we had asked. They would have, I suspect, wanted all students to have access to either the captioning or the interactive transcript tool.

SOFIA LEIVA: Great. Thank you. The next question we have is, so this study focused on online courses, but how do you think this could work for a regular classroom space?

KARLA MORRIS: I think it would be interesting to examine this in a face-to-face course. If you’re going to caption or transcribe the live lecture that’s happening, that’s something that could be shown on the screen for all to see rather than just on a laptop for the student who needs the captioning. I’m not sure about the interactive piece of that and how that would work.

LYMAN DUKES: I agree.

CASEY FRECHETTE: And I think that as live transcription software becomes better and becomes more accurate, this is a really interesting possibility for us to be thinking about live classroom settings. And also, just to underscore it, I would say in any case where there is a hybrid kind of class, where maybe it’s an in-person class but there’s material posted online, of course, in those cases as well, we see tremendous value in these tools.

Our focus was on exclusively online classes, but anytime material was posted or perhaps shown in class– within my department, one of my classes is video storytelling. We watch a number of examples in class. And even as we’re discussing it today, I’m realizing I’m not always thinking about enabling closed captioning or showing examples with closed captioning when we’re reviewing media in class. So I think it’s really great to be thinking about how all of this could apply in that face-to-face setting.

SOFIA LEIVA: Thank you. The next question we have is, how are you using the findings to talk to faculty to get buy-in from them?

KARLA MORRIS: I think that’s definitely something we are going to be thinking about as we compile our final report for 3Play Media. The good thing here at USF St. Pete is because we have faculty that know that we’ve been working on these areas– I just, for example, this morning had a faculty member stop by and say, hey, what did you find? Does this mean that we’re doing interactive transcripts starting in the fall?

And so I think we’ll probably plan to share out the report through formal communications. We’ll probably offer professional development workshops, maybe partner with our Center for Innovative Teaching and Learning, and really just include it as a part of the conversation when we do course design and instructional design intakes with our faculty members, and also reach out to our network of champions and make sure that all of our faculty, even if they’re not teaching online and they’re just thinking about teaching online, are getting the information. Other avenues?

CASEY FRECHETTE: To add to that, one thing that I think we’ve tried to focus on is the broad range of benefits that we’re seeing in these assistive tools. An earlier study that we published was subtitled “Examining the Value of Closed Captioning For All Students,” and this reflects some of those qualitative comments that we saw, where students who may not have English as their first language, students who find themselves studying in a variety of different environments, students who see themselves as visual learners all benefit from these tools. And in encouraging our colleagues to consider their adoption, I think a key theme that we underscore is how this is benefiting a wide swath of their students.

LYMAN DUKES: Just to add to what Casey shared a moment ago, it’s also worth pointing out that recent data indicates that 7 out of 10 students with documented disabilities leave high school and actually choose not to disclose upon entering college. So while we believe, and our data demonstrates, that these tools certainly help all students, people affiliated with institutions of higher ed should be aware they’ve got a large percentage of students with disabilities on their campuses who aren’t disclosing, and therefore aren’t receiving services. So universal design tools of this nature are actually benefiting them, even though they’re not officially documented students with disabilities. So having these tools, for many reasons, is quite important.

SOFIA LEIVA: Thank you. The next question we have is, what percentage of students in your study had a disability?

CASEY FRECHETTE: That is a good question that I’m afraid we don’t have an answer to at this moment. We do ask that question. It’s not part of the data that we’ve analyzed just yet.

SOFIA LEIVA: Great. Thank you. The next question we have is, where were students watching videos? Was it mainly on desktop or on their mobiles?

CASEY FRECHETTE: That’s another really interesting question, and we don’t explicitly ask that in our survey. If I’m not mistaken, I don’t think that was one of our questions, although I do believe that we have that as part of our tracking data. And it’s really a good prompt for us as we move into the next phase of our analysis here to examine things like screen size and operating system and browser, perhaps, because I believe that we’ve captured that as part of the observational data, and it may be another interesting covariate to look at as we get deeper into the results.

SOFIA LEIVA: Great. Thank you. The next question we have is, are you providing your students with verbatim captions? And if you are providing verbatim captions, have you faced any resistance to them?

KARLA MORRIS: We are providing verbatim captions, and we haven’t had any resistance. I don’t think so.

LYMAN DUKES: If I recall– and this is just anecdotal information based upon my own conversations with other faculty members– if there were resistance, it would be to captions that are not verbatim, as opposed to verbatim captions.

CASEY FRECHETTE: Another issue that has come up for us in conversations with our colleagues, and I know it’s a concern sometimes, is whether to allow for printing of full transcripts. That’s something that we haven’t really formally dealt with in our research, but it is a factor that we know has come up, and some faculty do have concerns about their content and maintaining control of their content. It hasn’t emerged as a show-stopper by any means, though. And certainly, in our research to date and all the faculty that we’ve worked with, everybody has been happy to provide full transcriptions of the lectures that we have included in our research. We should also note that although we only studied the first module video for each section, all of the videos in these classes were captioned, and transcriptions were provided for all of them.

SOFIA LEIVA: Thank you. The next question we have is, if you’re working with a limited platform, what sort of advice do you have for providing captions and/or interactive transcripts?

LYMAN DUKES: One initial answer to that would be I think there is good news when we look at all of the major video platforms, from YouTube to Vimeo. From what I’m observing, their closed captioning and interactive transcript options are getting better over time. Vimeo, in particular, has made a lot of strides in recent years to increase the functionality of their captioning.

And so I think that there’s good momentum in general, and one of my tips would be to keep an eye on third-party platforms that can be used if you are facing limitations within whatever software you’re using to deliver your content. There’s probably an option out there that’s free or low cost that does enable the kinds of tools we’re talking about today.

SOFIA LEIVA: Great. Thank you. I think we have time for a couple more questions. The next question is, was downloading transcripts an option in your study? And if so, what is one benefit of providing the download option?

KARLA MORRIS: So we did disable that option in our study because of the concern that Casey had brought up– the concern for redistribution of that content.

CASEY FRECHETTE: I think that one implication of this is around note-taking. Downloading transcripts could be used as part of a studying process, part of a note-taking process, particularly for students who like to be more tactile in their learning– highlighting content, taking notes in sidebars. I think alongside concerns around copyright or protection of content, it’s also important to keep looking at what the learning benefits might be of being able to print out transcripts or partial transcripts and using them as a study or note-taking tool.

LYMAN DUKES: So myself, I’ve used a very simple means of providing students access to printable content that’s included in an online PowerPoint lesson, in this particular case. What I do is provide very rich descriptions in the notes view of PowerPoint, offering a deeper dive into what’s included on a PowerPoint slide itself. So that doesn’t necessarily address the specific question here, but it is the means by which I provide students an opportunity to print information that may be talked about over a PowerPoint presentation so that they can use it for study purposes.

SOFIA LEIVA: Great. Thank you. I think we have time for one more question. How did the university respond to the findings from the study?

KARLA MORRIS: So they are so preliminary that we haven’t shared the findings with them just yet, but I think once we have all of the data analyzed and the report prepared, we’ll be sharing that out, and I think that their response in general to all of our work has been really positive. It’s really advanced the profile of the institution, its support for our students, and really shown the initiatives that we have in place.

LYMAN DUKES: So I would add to that, just in closing, that accountability metrics are very much in play in our state. Our funding as an institution is tied to a set of accountability metrics that apply to all the public universities in the state of Florida.

And for those reasons, student success– and this is not unique to us. This is a nationwide sort of movement. Student success is really job one here at this time, so my expectation is that the institution will be very interested in the findings, and we hope that they do provide some benefit in the near term to students on our campus.

SOFIA LEIVA: Great. Well, thank you so much, Karla, Casey, and Lyman, for such a wonderful and insightful presentation. And with that, I hope everyone has a wonderful, great– wonderful rest of their day.