Captioning for Lecture Capture: Georgia Tech and Indiana University Share their Video Accessibility Experiences
LINDA BRIGGS: Hello, everyone. Welcome to our Campus Technology presentation today. My name is Linda Briggs, I’m a contributing editor here at Campus Tech, and I’ll be your moderator today. Our event is entitled Captioning For Lecture Capture: Georgia Tech and Indiana University, Purdue University, Fort Wayne Share Their Video Accessibility Experiences.
We have speakers from our two guest schools, who are going to discuss their lecture capture and closed captioning solutions, including some interesting lessons learned, as well as pitfalls that you’ll want to avoid when implementing video and closed captioning. It’s a good presentation on an important topic for higher ed, as you know if you’re joining us here today, so glad you could be here.
And also just, to note to the audience, as is appropriate for this topic, the recorded version of today’s webinar will be captioned and available to you later. OK, before we get started, just a couple of brief items for the audience.
First of all, we suggest that you maximize your browser for optimum viewing. We are recording today’s event and will be sending each of you an email when the archive is available. And this is an interactive event, so we encourage your questions and comments.
You can type them in at any time during the event into the chat box there in your console and send them over. We’re going to set aside a portion of the event at the end of the presentation to answer your questions, so do send them over. If you’d like to download a PDF of the slides before the presentation even begins, you can do that.
There’s a download slides link there in your resource box. We do suggest that you disable any pop-up blocking software you might have running. That can interfere with the presentation management system. If you’re having trouble hearing me, please do go ahead and adjust your volume before we get started with the main event.
And if you do run into any trouble during the event, if the slides freeze up or the audio stream ceases, you can press Ctrl+F5; we find that often helps by refreshing your console. And if you still need help, if there’s something I haven’t touched on here, there’s a help widget there on your console; you can click on that for some technical assistance.
OK. Our agenda for today: after this brief introduction, we’ll move right into our two case studies from IPFW and Georgia Tech. And then, as I said, we have a Q&A session at the end, so please send questions over as they occur to you throughout the event.
At this point, let me go ahead and introduce our guest speakers. Joining us from Indiana University, Purdue University, Fort Wayne is Michael Phillips. Mike is a multimedia services technologist currently working on his MBA at IPFW. Mike comes to the ITS team with a wealth of corporate media and technology experience.
Prior to joining the IPFW team, Mike ran his own business where he managed, produced and delivered media projects for a wide variety of organizations. His background in media development and delivery provides him with the knowledge and tools to move IPFW’s media delivery forward and beyond.
And joining Mike today is Matt Lewis. Matt is instructional media services coordinator for Georgia Tech Professional Education. Matt has worked in distance learning and professional education for almost 10 years at Georgia Tech.
He and his team are responsible for processing close to 20,000 hours of rich media content annually for graduate, undergraduate and professional students enrolled in Georgia Tech’s Professional Education program. The department supports a number of remote Georgia Tech campuses, delivering synchronous and asynchronous content to students around the globe.
And as I said, I’m Linda Briggs with Campus Technology. Also, just a quick note that we have folks from Tegrity and 3Play Media on the line here as well to address your questions.
So if you have questions that are more specific to the products, please go ahead and send those over as well. As I said, we have speakers here to answer those questions.
Just a quick note about who we are. Campus Technology is a magazine and a website. You can sign up for the magazine, if you don’t receive it now, at campustechnology.com. Our annual summer conference is in July in Boston, Massachusetts again this year.
We also have a wide range of email newsletters available to you on specific topics in technology and higher ed. You can sign up for any of those on our website. You can also join us on Facebook and Twitter if you’re inclined to do that.
As I said, our sponsors today are Tegrity and 3Play Media. Many of you are probably familiar with Tegrity Campus; it’s a lecture capture solution used in traditional, hybrid, and online courses to record lectures as well as supplementary course material.
Tegrity has been shown to improve course completion rates, keeping students enrolled and on the path to graduating successfully on time. Tegrity’s co-sponsor today is an interesting company as well that you may not have heard of: 3Play Media.
3Play was founded by four MIT graduate students who were looking for a lecture captioning solution. Their solution draws on research from the MIT Spoken Language Systems group. It combines automatic speech recognition with human editing and cleanup. It’s an interesting approach, and you’re going to learn more about it today.
On that note, let me go ahead and turn things over to our first speaker. Please welcome Mike Phillips with IPFW. Mike, welcome to the program today.
Presentation by IPFW
MIKE PHILLIPS: Thank you very much, Linda. A little bit about IPFW. IPFW is the fifth largest university in the state of Indiana, with an FTE of slightly over 10,000. But we are unique in that we represent both Purdue University, West Lafayette, and Indiana University, Bloomington, while still having our own autonomy within the Fort Wayne community in the northeast Indiana region.
And what really makes that special is our students have the opportunity to earn degrees from either Purdue, Indiana, or IPFW. Our approach to closed captioning really came out of an immediate need.
About a year ago we had a student who came to us with a closed captioning need, and we needed to fill it, and we needed to fill it very quickly. Well, in doing our research as quickly as we could, we set out to create some very specific vendor selection criteria.
Number one, it had to work with our cloud-based lecture capture service, which is Tegrity. It also had to work seamlessly with our cloud-based media storage, which is Kaltura, as well as some older technologies we have here on campus, which would be Echo 360 and Mediasite.
After we started delving into closed captioning vendors, we really saw that there were two rising to the top: one being 3Play Media, and the other being Automatic Sync Technologies. And for a variety of different reasons, we really felt most comfortable with the folks from 3Play Media.
Beyond pricing and things like that, 3Play Media really worked seamlessly in the background with all our existing systems. The workflow gets our media directly from wherever we’re storing it. In our beginning cases, things were stored in what we know as Media Vault; you may know it as Kaltura. We kind of re-branded it here at the university.
But we really wanted a system that I didn’t have to load media into multiple times. So with 3Play Media, I can have my media, in this instance, in Kaltura, and then push links directly to 3Play Media. And that was a real time saver in my workflow process.
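The "push links" workflow Mike describes can be sketched roughly as follows. This is a hypothetical illustration only: the function name, the example.edu URLs, and the order fields are all made up, and do not represent the actual 3Play Media or Kaltura APIs.

```python
# Hypothetical sketch of the "push links" workflow: rather than
# re-uploading media, we hand the captioning vendor direct URLs to
# files already hosted in the media store. All names here
# (submit_for_captioning, the example.edu URLs, the order fields)
# are illustrative, not the real 3Play Media or Kaltura API.

def submit_for_captioning(media_urls, turnaround_days=4):
    """Build one captioning order per hosted media URL."""
    orders = []
    for url in media_urls:
        orders.append({
            "source_url": url,  # the vendor fetches the file itself
            "turnaround_days": turnaround_days,
            "output_formats": ["srt", "transcript"],
        })
    return orders

lecture_urls = [
    "https://mediavault.example.edu/lectures/econ201-week01.mp4",
    "https://mediavault.example.edu/lectures/econ201-week02.mp4",
]
orders = submit_for_captioning(lecture_urls)
print(len(orders))  # prints 2: one order per lecture, no re-upload step
```

The point of the design is the single load Mike mentions: the media stays put in Kaltura, and only links travel to the captioning service.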
Our first project with 3Play Media truly went flawlessly. Here you can see a little screen capture of one of our instructors. This media was taken from an older Echo 360 lecture that we had archived; we pulled it out, put it into Kaltura, and then linked it directly with 3Play Media.
And three days later we had 29 lectures, an hour and 20 minutes apiece, done, and everything just happened in the background for us. And really, all we had to do was go back through and do some minimal spot checking. The transcription itself took, I think, about one and a half to two days.
The total project turnaround, from the time we located our media (because here at IPFW, our media can be located in several different areas; once we found that media, in this particular case, we loaded it into Kaltura and then directly linked it), was about six days, which, for our first time out of the gate, really beat everyone’s expectations. And since then, we’re continuing to refine our workflow to take time out of our process.
Once the first project was completed, IPFW really started looking hard at the solution we came up with, and at how we could minimize costs as much as possible. We looked at doing it in house, as well as at other vendors, where maybe we’d do the transcription in house and then send it out for the timing someplace else.
And every time we went back to look at it, we really wanted zero latency and the highest accuracy we could possibly find. And in looking at that very critically, we really felt that 3Play Media was providing everything we wanted, and they were willing to work with us on a variety of different levels to provide us the service, the right size service we needed for IPFW.
As far as minimizing costs, what we did, we worked with IPFW’s purchasing department, and they helped us take our relationship with 3Play Media to the next level.
Once we discussed our needs with purchasing, as far as system integration, functionality, workflow, and ease of use, our purchasing department went back and double-checked our research to make sure that they couldn’t find another vendor that maybe we didn’t find.
And they came back, and 3Play Media stood out. So what IPFW did to minimize our costs was to purchase what 3Play calls a bucket of hours. And that way we can draw, and contractually we can draw from that bucket over the next two years.
Purchasing in this way dropped the price to a point where we felt more comfortable making that investment, and we didn’t have to chase down funding for each captioning project as it arose. Our bucket of hours was already paid for, and then, once a job is done, we go back to that department and bill it back internally.
That helps us keep our workflow moving quickly to get that student the content as quickly as we possibly can. I know in creating our bucket contract, there were some minimal contract modifications made to best suit our needs, but all in all, I understand the process went quite smoothly.
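As a rough sketch of the bucket-of-hours bookkeeping Mike describes (prepaid hours drawn down per job, with each job billed back to the requesting department), here is a minimal illustration. The rate, bucket size, and department names are invented for the example, not IPFW’s actual contract terms.

```python
# Minimal sketch of "bucket of hours" accounting: hours are prepaid,
# jobs draw the bucket down, and costs are billed back internally.
# PREPAID_HOURS and RATE_PER_HOUR are hypothetical figures.

PREPAID_HOURS = 200.0
RATE_PER_HOUR = 150.00  # assumed discounted bucket rate, not a quote

def bill_back(jobs):
    """Given (department, hours_used) jobs, return the hours left in
    the bucket and the internal charge-back per department."""
    remaining = PREPAID_HOURS
    charges = {}
    for dept, hours in jobs:
        remaining -= hours
        charges[dept] = charges.get(dept, 0.0) + hours * RATE_PER_HOUR
    return remaining, charges

# 29 lectures at 1 hour 20 minutes apiece is roughly 38.7 hours
remaining, charges = bill_back([("Economics", 38.7), ("Marketing", 0.5)])
print(remaining)  # hours still available in the bucket
print(charges)    # what each department owes internally
```

The appeal of this arrangement, as Mike describes it, is that captioning can start immediately from the prepaid pool, with the internal billing settled afterward.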
So given the position we were in a year ago, in a reactive situation, we really had ourselves postured in the most proactive way possible to meet the needs of our hearing impaired students, while reflecting positively on IPFW being not only fiscally responsible, but having a very functional plan, foundation, and workflow moving forward.
We’ve used captioning very well with Kaltura, our cloud-based media storage and media access entity. And really, it works flawlessly. We are also poised with Tegrity, Echo, and Mediasite, our other lecture capture instances; we’re just waiting for those requests to come in so we can fulfill them. A lot of the requests that we’ve had up till now are for lectures that were captured, say, over the last several years, where instructors are continuing to use that same lecture that was originally captured in another system.
Currently our captioning projects have come from, as I said earlier, the older Echo 360 systems, but because those recordings are archived all throughout the campus, that’s one of the points in our process that we’re continually trying to shorten up, because when a request arises, we have to chase down where that media may live.
And that is typically about a two day process, to chase that down and figure out where it is. But once I go back and check all my media, and everything is done, at the end of the day, 3Play Media, Kaltura, and Tegrity, it just works. It really just works.
And I think if you’re in a position like I am, you want the reliability that your vendor and your technology is going to be there, it’s going to work for you. I have a lot of confidence in 3Play, Tegrity, and our cloud-based media storage entity, Kaltura. It just works.
With that said, Matt, I’m going to hand it over to you.
Presentation by Georgia Tech
MATT LEWIS: Well, OK. Thank you very much, Mike. Wow, and thank you, Linda, for the introduction. Mike, a lot of this I could have copied and pasted, and basically said a lot of the same stuff. But I won’t.
So my name is Matt Lewis. I’ve been with Georgia Tech’s Distance Learning for 10 years now, and I started out as one of the guys who would go out and record these courses, and moved my way up to one of the content managers here.
And so my experience with Tegrity began in 2004, when we started using it here at Georgia Tech’s Distance Learning. And the very next year, we started using it to distribute the calculus class that we send out to several schools around the Atlanta metro area, and it’s expanded every year since.
And this is kind of where the closed captioning comes into play here in a second. We also use Tegrity so that a lot of our remote professors around the world can easily record wherever they are, and distribute those lectures to students who are here on campus, and to students, like the high school students, around the state and also around the world.
So to any campus or any location that the student wants to take a Georgia Tech class from. We also have a lot of faculty who are starting to embrace the flipped classroom environment, where they’ll record some main points, only five minutes or so, and then meet in the classroom to discuss them and go into depth on questions, and use that kind of as a lab time.
So that’s kind of a brief, really brief overview of our history here at Georgia Tech with Tegrity. Now this past fall in August, right before the semester started, we were facing our largest high school calculus program so far.
That’s where we were recording five lectures a week, delivering to 23 of our area high schools and 355 or more students, who were using Tegrity to review for quizzes, or to review some of the concepts they weren’t getting offhand.
And like I said, this was our eighth year preparing to deliver this program, when we were approached by one student who had a hearing impairment, and had a special need for closed captioning.
The student made himself known a few weeks before the semester, so we didn’t really have a lot of time to go out there and prepare and look at other options.
We knew about Tegrity’s automatic captioning that had been released a few months before, and we had heard good things about 3Play Media. And I believe we had used them on a trial basis a couple of times, and had a really great experience with them. They’re very approachable, and I don’t think they’re ever away from their email. They’re always responding to questions. So, the second this need came up, they were on top of it with answers. And really, no one else compared to them cost-wise or as far as customer service for us.
So we decided to go with 3Play and Tegrity, since we had a nice, longstanding relationship with both. Some of the stuff that made 3Play really stand out was that you got everything in one bundle. You didn’t have to hem and haw over what kind of formats you need, which is extremely important for us, and I’ll get into this in a second as far as backups go.
Because we at Georgia Tech have kind of a unique setup in the way that we record our classes. We have hired technicians at some points through the year. We have 20 full-time staff who go out on campus to record these lectures and make sure that everything gets taken care of right. But sometimes the professor will show up with a different laptop that doesn’t have Tegrity installed.
So there’s elements of human error, and we always record on backup systems. And 3Play made that easy to get around those human errors.
So, so far this academic year, for this course alone, we captioned 100 hours. So it’s a good bit of work they took care of for us. And the process, and Mike talked about a little bit of this, was very, very easy. We would send out laptops to the rooms where we knew these lectures would be given for this calculus program.
So, Monday, Wednesday, Friday was the main session, so it was all in one room. And then we would have five breakout sessions that were split all over campus, and we would send five different technicians out to capture those. And we only caption the ones that the student with closed captioning needs was in, so that made it a little easier.
Like I said, we needed something at the last second, and 3Play was there, right on the dot, along with Tegrity. So, some of the backups. Like I said, we always have human error. We made sure that we could manually caption the files from 3Play; we could manually add these to Tegrity recordings by sending an mp3 of the recording.
We also had other rich media that we would post these to, and flat files like QuickTime and Windows Media. And we only had to turn to the backups, I’d say, two or three times during the academic year, and each time it was human error on our part, which happens. And overall, the Tegrity 3Play integration is extremely intuitive, and extremely easy to teach others how to use.
So, once I learned how to use it, I didn’t have to spend my time on it. I taught three other staff members here how to take care of it, and they rarely had any questions for me. If they did, it was just because they forgot.
So, I just wanted to give you a screen shot of the 3Play admin console, which is really easy to use if you do have to go in and manually take care of things. And the only reason I’m talking about the manual side of this is because the automatic captioning is so easy, you hardly see it. You can go in and change the name, and you can request what kind of file types you want, and manage your assets here.
Here’s another screen shot of the file formats they offer. So, really, I think Mike took care of most questions. And I’m more of an interactive person. I want to hear your questions and talk about– you send me your concerns and what you think you’ll face, or what you want to hear of that we faced, and we’ll answer those. So Linda, I think that does it for me. Now I guess we’ll go over to the Q&A.
LINDA BRIGGS: All right, thanks very much Matt, Mike, both of you. Great job on those presentations. Yeah, let’s move into our Q&A session. We’ve got lots of time to answer audience questions. Let me remind everyone that if you have questions that are specific to either Tegrity or 3Play Media, along with Matt and Mike, we have folks from those companies on the line here as well to address your questions.
From Tegrity I believe we have Mike Berger, who is the senior director of marketing, and from 3Play Media, we have Josh Miller, who’s one of the co-founders of the company. So, as I said, you can get as technical as you want. We have a lot of expertise here on the line for your questions today.
On that note, let me start with– let’s go back to some of your comments Mike. Was your first project one lecture or one whole course? I think you said you started with a single lecture, is that right?
MIKE PHILLIPS: No actually, that’s not true. It was one class that encompassed, I want to say, 29 lectures that were an hour and 20 minutes apiece, and once we got them to 3Play, we had everything flipped in three days.
And if you have an immediate need where you have to have something flipped within 24 hours, 3Play also has an option for that. There is an upcharge for 24-hour service, but it’s actually minimal, and if you really have to have it, it’s well worth the money. We’ve used it a couple of times.
MATT LEWIS: Right, and to piggyback on that, Mike, as we would approach exams throughout the semester, our student would speak up and say, hey, can I get these a little faster towards the end of the week, because I have an exam on Monday? And, like you said, the cost of doing that was minimal, and it was well worth it. And the student was very appreciative.
MIKE PHILLIPS: Yeah, in our case, in our first experience with 3Play, the student waited almost nine days into a summer semester before we were notified that the student had to have this. So, what we did was, once we found that media, we actually did the first three classes for him in 24-hour service so we could get him some content right away.
And that gave us a three day pad to get everything else done.
MATT LEWIS: Right. The good news is, now the word is out on your campus that there’s a process in place to take care of these students, and the same for us.
LINDA BRIGGS: Sounds like you both have some of the same challenges. Speaking of the process, Yeah, Mike, back to you, what department are you with at IPFW?
MIKE PHILLIPS: I work in ITS, Information Technology Services. I’m kind of an anomaly in our department, because my specialty is media. And one of the biggest parts of my job is to go out and find the new technologies that are going to support the pedagogical effort here at IPFW in supporting students, faculty, and staff.
LINDA BRIGGS: OK, and so that’s how that process works for you. What about you Matt? Are you also on the IT staff?
MATT LEWIS: That’s a very good question. Sort of. I’m kind of the bridge between our IT staff and our admin staff. So, I kind of do a lot of translating. They tell me something they need, and they know that there’s an answer out there, and I could find it for them, and work with our IT department to kind of deploy that solution.
LINDA BRIGGS: OK. So I think the questions there are trying to get to the essence of how the flow works with the captioning solution.
MIKE PHILLIPS: At least at IPFW, I can walk the media walk, and talk the media talk, where the IT professionals lean on me for the media knowledge. Whereas, I tend to lean on the IT professionals for the technical savvy in getting everything to work the way we want it to work on the backside.
LINDA BRIGGS: Good summary on that. Let me go over to Matt with Georgia Tech with this question. Have you done any captioning to math or science courses is the question, and how effective has the captioning been for that? And then I want to go to Mike as well with that question.
MATT LEWIS: That was a big concern, because one of the things we threw out there at the beginning was, we should get a student assistant to come in here since they’re going to know the language. And we couldn’t really find any student assistants who wanted to caption five hours of video a week. I don’t know why.
But that was a concern going to 3Play, and we talked to them about that, and they said, no problem. Don’t worry. And we took them on their word, and I have no complaints. Our student has had zero complaints, only praise. And I watch these lectures, and they’re spot on. I haven’t found them missing anything yet.
Also, this professor, while he doesn’t have a thick accent or anything, he can mumble a bit, and be a little difficult to understand at times. Because he gets really into what he’s talking about, and he’ll speak faster, and faster, and faster. And they’ve kept up with him as well, with no problem. So I’ve been very impressed with that. I wouldn’t be able to do it.
LINDA BRIGGS: That’s an interesting point, and Mike, I want to go over to you in just a minute. But Matt, on the speakers with accents or who speak very quickly, no problem with captioning those kinds of speakers?
MATT LEWIS: OK, so, the Monday, Wednesday, Friday session for this math course, it was all the main professor. And then, we broke it out to five TAs who could divide up 355 high school students, and maybe answer some of their questions on Tuesday, Thursday sessions.
The student assistant that was given the class that needed to be captioned had a very thick Indian accent, and he spoke very softly. And again, we had no problems. We got our files on time. 3Play never barked at us. Not once. I even called and asked, and they’re like, nope. No problem, we’re good.
LINDA BRIGGS: That’s a great story.
MIKE PHILLIPS: Well, even for hearing impaired students, being able to use Tegrity Connect, the virtual interactive space within Tegrity, is valuable. Our instructors here at IPFW, when they’re using Tegrity, many of them actually set up virtual office hours, which, for the hearing impaired student, is great, because they can do their dialogue back and forth in that virtual blog-type space.
And everything will be there for the student in print.
MATT LEWIS: Right. One thing I wanted to mention is, because we have a lot of foreign students, we have an English as a Second Language program on campus (and apparently I needed to take it). And we’re going to begin testing with them.
They showed a lot of interest in this. Even though the students can speak and understand English, being able to see the words on the screen as someone gives them instruction in the English language helps. It makes a big difference for them.
LINDA BRIGGS: Here’s an interesting question for both of our speakers. Mike, I’ll hand this one to you first. Have you dealt with courses with caption requests where the content hasn’t already been captured? Has the turnaround proven to be fast enough to make that effective?
MIKE PHILLIPS: I have not, at this point. Again, we are poised. We are locked, cocked, and ready to rock, as far as sending those off. We’ve tested the system. We know the system works just as Tegrity and 3Play have documented it to work. But we have not had to do it in real time yet. What we’ve had to do is pull older lectures that have been recorded.
Many of our instructors may record a lecture, and then use that lecture for two or three years. And that’s what I have had to do up until this point, as well as caption a few small video pieces for marketing purposes.
LINDA BRIGGS: So you haven’t tried it yet, but you’re ready to go with that.
MIKE PHILLIPS: Yeah, we are. We’ve tested it, and everything seems to work well. You just follow the documentation, and everything is, as I think you called it earlier, Matt, spot on.
LINDA BRIGGS: How about Georgia Tech, Matt? Have you done any non-recorded courses yet?
MATT LEWIS: As far as we record it, caption it, and post it pretty much as soon as we get it back from 3Play?
LINDA BRIGGS: Well, the question is, have you dealt with courses with caption requests where the content hasn’t already been captured?
MATT LEWIS: Yes, that was the nature of that calculus class. We had to follow it as we were going. It was a handful. Actually, it wasn’t a handful. It was really easy to keep up with. I was just worried going into it, but for no reason, really.
LINDA BRIGGS: Sounds like it worked well. At this point, let me bring in our speaker from 3Play Media, Josh Miller. He’s one of the co-founders of the company. Josh, what about that? When a course caption request comes in and the content has already been captured, what is the turnaround time? How does that work?
JOSH MILLER: So it’s a great question. We basically have a self-service type model, so that it can be used any time on demand, whether it be through the Tegrity integration or manually through our account system. Users can request that captioning anytime. It does not have to be scheduled ahead of time. And the turnaround will range anywhere from four business days on down.
So it could be as fast as one business day if that’s what’s requested. And that request, again, can be made on demand as well. So whenever the captioning request is being made, you can decide what type of turnaround you need.
LINDA BRIGGS: OK. And I think this question could go to you as well Josh, while I have you here. The question is, can these systems be used for live captioning, or does the student have to wait to get the captioned content?
JOSH MILLER: So we don’t offer any live services. Everything we do is based on recorded content, but the possibility of getting it back the next day is very real if that’s what’s requested. And it’s definitely safe to say that shorter files, so if it’s a section of a class, will go faster than a longer two-hour lecture. So they can be done very quickly if necessary.
LINDA BRIGGS: OK, thanks Josh. Question back to, let’s go with Matt with Georgia Tech on this one. Have you ever had to make any corrections to the automated captioning? You were talking about that very accurate transcript from a very soft spoken, accented speaker. Any problems with some of the captioning?
MATT LEWIS: No, I have not. And I actually looked for problems, especially in the beginning; I would scan through just to see if I could gauge how accurate this was going to be. No, I didn’t find any problems with it at all. Now, one of the quick tests that we did before, as we were getting ready to go for the semester, was trying to use a speech-to-text program, and it was erroring like crazy.
I forget what program it was, but it was Evernote or one of those that you could talk into and it would try to interpret, but it would cause problems. With 3Play, no, I never found any messed-up words or missed words or anything like that.
LINDA BRIGGS: That’s quite a testimonial. How about you Mike? Have you gone through a transcript and found any problems?
MIKE PHILLIPS: No, I haven’t. I think we may have had maybe one lecture where somehow there was some link confusion. But I go through and spot check everything to make sure the text is matching the content, and the one time we found an issue, within a matter of a few hours, 3Play had it dialed back in.
MATT LEWIS: And I think you can go– I know you can go into the 3Play console and you can edit this yourself.
MIKE PHILLIPS: Yes you can, exactly.
MATT LEWIS: It is pretty easy. All you do is just export the file again. As a matter of fact, at the end of the semester, we went through and exported transcripts of every single lecture and kind of gifted them over to the instructor. So we gave him a hundred hours’ worth of his lecture material, because my boss had run across an article where other instructors had been using transcripts to help them update their textbooks, or in writing new textbooks.
LINDA BRIGGS: Yeah. Here’s a question directed to you, Matt, at Georgia Tech. Is any effort being made at Georgia Tech to use automatic speech recognition technology for captioning?
MATT LEWIS: Not at this moment. We’re happy with what we have. Like I said, we tested it with some simple apps that you can find, and it wasn’t very good, but who knows what the future holds?
MIKE PHILLIPS: I resemble that remark exactly. Same here. We looked at it real hard here as well, and with the latency issues and the inaccuracies that we were finding, the speech recognition technology just wasn’t at the level of quality that we wanted to deliver to our students.
LINDA BRIGGS: So those were apps you were describing. That was the automatic speech recognition.
MATT LEWIS: Right. That’s some of the stuff we tested. We also came across an option to use student assistants to do our captioning for us, and it was about as accurate as that voice-to-text, and less reliable, because come midterms and finals, where do most student assistants go?
They go back to their dorm and they study, and they don’t work. That’s another plus in using someone like 3Play, who, they finished their degrees years ago, and they’re there for us 24/7, it feels like.
LINDA BRIGGS: Yeah, they’re pretty focused. Mike, have you had any experience adding captions to a clip, perhaps from a video that is being shown to a class?
MIKE PHILLIPS: Actually, we have. Down in our CASA area, we captioned some video for them, which was very nice. They were short little clips, which made it extremely affordable, because 3Play prorates.
If I’m only captioning three minutes of video, they’re prorating based on those three minutes, not on X amount of dollars an hour. So yeah, we’ve done shorter stuff. And in fact, with our bucket of hours, as we draw to the end of, say, the second year of our contract, we’ll look and see what our usage has been.
And so then we’ll go out and we’ll do some shorter pieces of media that may live on the public side of our website, to continue to make IPFW’s web presence compliant as well.
LINDA BRIGGS: Well, that kind of touches on this question. Have you found that other students are interested in obtaining the transcripts and the captioning? Matt, you touched on this, too. How do you handle that?
MATT LEWIS: We haven’t had any requests outside of our student who needed the closed captioning and the instructor who loves the transcription. I think that would be up to him; if students asked for them, we would certainly provide them. I’d see no problem with that, as long as he was OK with it.
LINDA BRIGGS: Yeah, I think sometimes these transcripts can be useful to English as a second language students, for example. So you haven’t seen any of that yet.
MATT LEWIS: I know that they are very interested, and they keep coming by, knocking on our windows asking for more examples, and how we could scale this to fit their program, which would be a nice undertaking.
LINDA BRIGGS: How about you, Mike? Anything to add on that? Students interested in obtaining the transcription and the captioning who aren’t hearing impaired?
MIKE PHILLIPS: For the classes that we’ve done, it’s my understanding that more students watch the captioned version as opposed to the non-captioned version. Have we gotten a lot of requests for it? I have instructors requesting that their entire class be captioned.
But at this point in time, we would love to caption everything we produce here at IPFW. Unfortunately, we are in more of a posture of accommodation, as opposed to captioning everything, simply because of the costs involved.
3Play has worked well with us as far as pricing structure and things like that with our bucket of time, but to caption everything at this point in time, it would be very difficult.
MATT LEWIS: Yeah, and I wish we could caption everything. Like in the introduction, Linda, I think you mentioned that we record and process 20,000 hours of lecture material every academic year. To have that captioned would be extremely valuable, and it would set us apart from anyone else who’s doing this, especially with our content.
We have a lot of engineering degrees, and a lot of professors who are, like I said, from France, from Singapore, Shanghai, all over the world. The captioning would definitely help.
LINDA BRIGGS: Interesting. Back to a question here on the process. The question is whether both of you have dedicated staff to implement the video capture itself and the captioning of the video. Is that a correct statement? Mike, first to you.
MIKE PHILLIPS: No. Actually, with our Tegrity, for example, we have our instructors initiate and start their own Tegrity classes here. If they’re using one of our other lecture capture solutions, those start basically on a clock.
Like at 10:00 every morning, it starts at 10:00 and stops at 11:00, or whatever that class time would be. And actually, we find that with the Tegrity solution, the instructors like having the control of starting and stopping it, because then they don’t have this enormous leader at the front of the recording before their lecture actually starts.
So, as far as our Tegrity solution, we like to give our instructors control to start it, stop it, and pause it, and then upload it as they see fit.
LINDA BRIGGS: Now Matt, anything to add there?
MATT LEWIS: We do have a dedicated staff whose whole job is capturing course recordings. And we kind of take a different approach. There are some cases, obviously the satellite instructors, where they start and stop when they’re ready to go, and they use webcams and microphones that work with the system.
But here on campus, we have dedicated staff who start the recordings and can troubleshoot anything that comes up. Kind of our philosophy is, we are in the background: they teach, we capture it. We’ve tried scheduled starts and stops, and at a research institute, with professors who are very into what they’re researching at the moment and realize they’re 10 minutes late to class, that created a lot of unused material that we had to go back and trim up later.
So it looks like, either way, you would still be able to use Tegrity with this.
MIKE PHILLIPS: Well, and I have to say, Tegrity is so simple to use, so simple to launch. I give our instructors a 20-minute overview of the system, and they take off; they’re recording within 10 minutes of my leaving their office.
MATT LEWIS: And now you’re falling behind them cleaning up all their tests?
MIKE PHILLIPS: No, we don’t have a lot of problems with it at all. We really, really don’t. Our instructors have been great.
LINDA BRIGGS: Here’s a good question for Josh Miller from 3Play Media. Josh, let me hand you this one so you can explain the process a little bit. The question is, are the captions added by human hand, or are they automated?
JOSH MILLER: So we have a unique process where every file will go through an automated step, but a lot of the technology we’ve developed is around this editing platform that we’ve created.
It’s really focused on having a human edit that draft from the speech recognition very, very efficiently, very carefully, and then it goes through another QA check. So, when it’s all done, we have a completely synchronized document that can be used for many different caption formats and transcripts.
But, more importantly, it’s often more accurate than manual transcription, just because we’re starting from a draft that gets edited. And the editors can actually be more careful and more thoughtful about what they’re doing, as opposed to trying to capture everything from scratch. So that’s how it all works.
LINDA BRIGGS: Interesting, so the answer is really both. It’s automated and then done by human hands.
JOSH MILLER: Yep.
LINDA BRIGGS: Another couple of questions for you, Josh, while I have you here. One of the questions is, what’s the accuracy rate? Any number that you can throw out there?
JOSH MILLER: Yeah, I mean, it’s definitely over 99% accurate. I think we measured it recently and it was something like 99.5%. It’s really tough to measure perfectly, because we’re scrubbing out things like ums and uhs and false starts, so you can’t perfectly map what someone says to what’s written. But based on the way we’ve measured it, that’s how it comes out. So it’s very accurate.
LINDA BRIGGS: Sounds like it. And one last question for you, Josh. Are 3Play Media’s services available in Canada?
JOSH MILLER: Absolutely. Everything we do is web-based, so you can access it from anywhere. We have customers actually, really, all over the world at this point. So Canada, UK, and Australia are some of the most common non-US countries that we serve, but definitely all over.
LINDA BRIGGS: OK, thanks. Back to Matt with Georgia Tech with this one. Do you have any technical recommendations for a purely online certificate program aimed at hard-of-hearing adults? Matt, anything you could suggest to that questioner?
MATT LEWIS: As far as course material delivery?
LINDA BRIGGS: Yeah, purely online certificate program. I mean, based on your experience with Tegrity and 3Play, any issues you can see there?
MATT LEWIS: Not at all. No. As a matter of fact, when you asked that question, in my mind I had already moved on to the communication between a professor and a student, because there is no issue with the Tegrity server or with 3Play’s captioning. So that would be easy.
So that would be it; that would really be all you need, because Tegrity will hold other files like exams and PDFs and documents, and it will deliver the captioned recording, so I don’t see any problem with that.
LINDA BRIGGS: OK, how about this one. Does a student at Georgia Tech have to identify themselves as needing captioning, or is that just done as part of web accessibility efforts?
MATT LEWIS: At this point, they need to identify themselves. I would love for everything to be captioned, but this is on a request basis. And as soon as we get that request, the ball is rolling and we’re going to take care of them.
LINDA BRIGGS: Let’s see. Question here, is captioning used for strictly internet-based classes? I think the question is, is it only for internet-based classes? Mike, how are you using the program?
MIKE PHILLIPS: No, I would have to say no. I mean, any piece of media that resides in, for example, Kaltura, our cloud-based media storage and accessibility instance, can also be captioned. And that can be delivered in a myriad of different ways. So no, not strictly web-based classes at all.
Even for our marketing side, I mean, it can be used for any marketing piece that might be on our public website.
LINDA BRIGGS: Here’s an interesting question, Mike. Do you capture what students say in class as well as the instructor?
MIKE PHILLIPS: Yes.
LINDA BRIGGS: And you also do that Matt?
MATT LEWIS: Yes we do.
LINDA BRIGGS: That leads to another question, which I think I’ll hand to Josh again, our 3Play Media guest. Josh, when there are multiple speakers in a lecture, does 3Play Media indicate speaker one, speaker two, and so forth? Is that how that works?
JOSH MILLER: Yeah, exactly. In fact, we give the user, our customer, the power to select how they want people to be identified in files. So, for example, if someone is identified with their name, and it’s pretty clear who they are, we’ll use the name to identify that person in the text.
However, if it’s not as obvious, and there’s no real identification throughout the file, maybe it’s most appropriate to use a setting like Professor and Student. In other cases it might be more appropriate to use Interviewer and Interviewee, or Interviewer and Subject. So we give you the ability to choose the different settings beforehand for every file.
And you can actually change those at any time so that you’re getting appropriate speaker IDs.
LINDA BRIGGS: And what does 3Play send back? Is it a completed video, ready to go?
JOSH MILLER: We basically keep everything separate, as a separate transcript or caption file. So in the case of a Tegrity integration, we would post those captions back to the right place so that they show up with your Tegrity video and Tegrity presentation, ready to go. But technically, it is a separate caption track that will always remain a separate caption track.
And that’s pretty much how we handle any of our captioning. It’s always a separate track that can be added. Most web video operates that way at this point: a separate track that gets referenced by the video player.
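[As an illustrative aside, not from the webinar itself: the "separate caption track" Josh describes is typically a sidecar file such as WebVTT, which most HTML5 video players can reference alongside the video. The sketch below generates a minimal WebVTT file; the cue text and timings are invented for the example, and the actual formats and integration details used by 3Play and Tegrity are not shown here.]

```python
# Illustrative sketch: a sidecar caption track (WebVTT here) is just
# timestamped text kept separate from the video file itself.
# The cue data below is made up for demonstration purposes.

def format_timestamp(seconds):
    """Format a time in seconds as a WebVTT timestamp (HH:MM:SS.mmm)."""
    hours, rem = divmod(int(seconds), 3600)
    minutes, secs = divmod(rem, 60)
    millis = round((seconds - int(seconds)) * 1000)
    return f"{hours:02d}:{minutes:02d}:{secs:02d}.{millis:03d}"

def build_webvtt(cues):
    """Build a WebVTT document from (start_sec, end_sec, text) tuples."""
    lines = ["WEBVTT", ""]  # required file header, then a blank line
    for start, end, text in cues:
        lines.append(f"{format_timestamp(start)} --> {format_timestamp(end)}")
        lines.append(text)
        lines.append("")  # blank line terminates each cue
    return "\n".join(lines)

# Hypothetical cues from a captioned lecture recording
cues = [
    (0.0, 3.5, "PROFESSOR: Welcome back to calculus."),
    (3.5, 7.25, "Today we cover integration by parts."),
]
print(build_webvtt(cues))
```

A player then loads this file as a track separate from the video stream, which is why, as Josh notes, the captions can be swapped, edited, or re-exported without touching the video itself.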
LINDA BRIGGS: And Josh, another question for you. How clean does the audio you receive have to be, and does the pace at which an instructor talks make a difference?
JOSH MILLER: Not really. Is it going to be easier to transcribe if the person speaks slower? Sure. But we can handle pretty much anything. Really the only issue would be if there’s a lot of background noise, or someone’s really far away from a microphone. I mean, if it’s hard to hear just as a human listening to something, it’s going to be hard to transcribe.
So we can handle a lot at this point. But it’s really not going to make a huge difference if someone is speaking quickly, or if it’s not perfect broadcast quality.
LINDA BRIGGS: Sounds like you can deal with anything. Back to Matt and Mike with this question. Have you guys captioned any podcasts? Mike.
MIKE PHILLIPS: No, I have not.
LINDA BRIGGS: How about you Matt, at Georgia Tech?
MATT LEWIS: Inadvertently, yes. All of the Tegrity captures are available as podcasts, so they should have the captioning on there. And yes, I did do one separately, as a test, for a webcast we did a couple of years ago.
LINDA BRIGGS: And that worked fine?
MATT LEWIS: Oh yeah. That’s the thing. That’s the easy answer, please keep asking me that, because yeah, it worked fine.
LINDA BRIGGS: Well how about this one? Has the video access by students increased after captioning? Have you seen that happen?
MATT LEWIS: That’s a great analytic I need to run. I would imagine so, because we don’t just make the captioned recording available to the one student who requested it. The decision was to keep it available to everyone. I haven’t looked it up to confirm, but yeah, I would bet it has.
LINDA BRIGGS: So the captions are available to anyone, once you’ve done the captioning?
MATT LEWIS: To any student enrolled in that course, yes.
LINDA BRIGGS: Gotcha, OK. How about you, Mike? Have you tracked video access by students after captioning?
MIKE PHILLIPS: We haven’t, but the way they’re posted, any student in that class can see either the captioned or the non-captioned version, and I would tend to believe that the captioned versions are probably getting used more heavily.
LINDA BRIGGS: Here’s an interesting question for both of you again. Have you heard any conversation from colleges who show an interest in sharing the captioned videos so other hard of hearing students from other colleges can see the captioned videos? I don’t know, have you guys talked with colleagues who are interested in sharing, or maybe between your two institutions?
MIKE PHILLIPS: That really hasn’t come up here at IPFW, because the posture of IPFW is that that content is the property of the instructor. So to then share it with another organization could, I think, easily become an issue.
MATT LEWIS: Right. Same for us, but I wouldn’t doubt we’ll get that request. Our instructor for the calculus program is very open to sharing and communicating with other universities. But we would definitely wait on him to make that request; we’d let him know the option is out there. I think it would be a great way to collaborate with other schools.
Get other lecture material, share ours. Who doesn’t want a calculus class from Georgia Tech?
LINDA BRIGGS: Exactly. I think that sounds like a great idea. And, on that note, we’re just about out of time. Great questions everyone. Thank you so much to the audience for sending in a lot of really good questions. And of course, thanks to our speakers, Mike Phillips with IPFW, Matt Lewis with Georgia Tech. Wonderful job with your presentations and answering all these audience questions.
And Josh Miller as well, with 3Play Media. Thank you to all of you. Also, a big thank you to Tegrity, a division of McGraw-Hill Higher Education, and to 3Play Media for underwriting today’s event. We very much appreciate your sponsorship. Again, for questions that we didn’t get to today, our sponsors will be following up offline. So do continue to send in last-minute questions while I’m wrapping up here, if you’d like.
You will get answers to your questions via email. Again, we did record today’s event, and we’ll make it available for on demand viewing. Again, after the event, we’ll be sending each of you an email with a link to the slides and the audio within the next 24 hours.
And finally, from all of us here today, thank you for taking time out of your day to attend today’s event and for your many questions and comments. This concludes today’s presentation.