« Return to video

Video Accessibility Essentials with Wistia [TRANSCRIPT]

EVANNA PAYEN: Hello, everyone, and thank you for joining us, and welcome to today’s webinar on video accessibility essentials, hosted by Wistia and our friends from 3Play Media. My name is Evanna, and I’m a Growth Marketing Manager here at Wistia, and I’ll be your moderator for today.

LUKE POMMERSHEIM: Hey, everybody. My name is Luke, and I’m a software engineer here at Wistia. I spend a lot of my time working with video playback and accessibility on the web.

KELLY MAHONEY: And my name is Kelly. I am a Content and Partner Marketing Specialist, which means I create content on all things accessibility at 3Play. I work alongside our partners to share what we know, so excited to be here and happy to dive into it.

LUKE POMMERSHEIM: Wistia is an all-in-one video platform that hosts videos, made for businesses and marketers, for folks like you. And we’re co-hosting this webinar today with 3Play Media. They’re our accessibility partner and in-app captions provider. Some fun facts about our partnership: we’re both headquartered in the greater Boston area, not too far from each other. We’ve been working together for over 10 years now, and we’re a top partner on their site, as well. When you order captions within Wistia, you’re actually ordering them through 3Play Media, and we use their services internally, as well.

So you know, who is this webinar for? If you’re producing online video, looking to grow your audience, or just want to know some best practices out there to make your marketing materials more accessible, we recommend that you stay with us for the next 45 minutes or so while we go through all these techniques and tips and tricks.

You know, no matter where you’re at in your video marketing journey, today’s presentation is going to give you the essentials that you need to make your content accessible on the web. We’re first going to cover why accessibility matters, then we’ll move on to making your videos accessible. We’ll touch on accessibility services and how they can help during the production phase and process. And we’re also going to just chat about fostering a culture of accessibility within your organization, and hopefully end with some action items and open things up for a Q&A. So yeah, let’s get started.

If there’s one thing to take away from this presentation today, it’s that accessibility matters. So video accessibility gives people equal ability to consume your content in a way that’s inclusive and approachable to their needs. Basically, in other words, different people consume video differently. Accessibility ensures that all those different needs can be met by those who want to access your content.

So here’s something you might not have considered. Making content accessible is a brand value statement to your audience and the deaf or hard of hearing and blind and low vision communities. When you make content accessible, you’re communicating to your audience and these communities that they’re valid viewers and that they deserve an equal experience to all of your content.

Accessible video can also boost your SEO and extend the reach of your content to the broadest audience possible. Google, in particular, is very good about reading a page and getting metadata from a video. Transcript information can be included in that SEO metadata, so Google has full access to the transcript and can use it when analyzing your page.

They’ll also analyze the video player itself, and really all content on the page, to make sure it’s accessible, and they’ll rank it higher if it’s more accessible than not. So if you have a web page that has all the correct ARIA accessibility attributes and has good color contrast and keyboard-accessible content, you’re going to rank higher in search results than a web page that doesn’t, even if it has the same content, just structured differently.

So there are also legal requirements that online videos meet accessibility standards and guidelines, depending on what industry you are in. And I think Kelly here is a little bit more of an expert on this topic, so she’s going to take it from here.

KELLY MAHONEY: All right, thank you, Luke. So yes, we’re going to get into Digital Accessibility 101. We’re going to go over the three most important documents that you need to know about that impact accessibility, specifically online accessibility.

So the first one we’re going to talk about is the Web Content Accessibility Guidelines, commonly abbreviated as WCAG. And 2.1 is the most recently published version, 2.2 is the current working draft, and 2.0 is the legally referenced version. So we don’t have to get too bogged down in the details. The point is just that this is a constantly evolving document. It’s the one that is most relevant to digital accessibility. Then we’ll also talk about how the Americans with Disabilities Act and the Rehabilitation Act of 1973 touch digital accessibility, as well.

So like I said, we don’t have to get too bogged down in the details, but 2.1 is the most recently published version. This document is constantly being updated and improved by the World Wide Web Consortium, abbreviated as the W3C. This organization is considered the international authority on web accessibility, and they wrote WCAG as a standards manual, basically.

Their definition of web accessibility is that websites, tools, and technologies must be designed and developed so that people with disabilities can use them. That’s a pretty big task, so to make it comprehensible, they outlined success criteria and divided them into levels.

So Level AA is the happy medium, and that’s what most businesses should strive for. We’ll get into a little bit of a couple of examples in just one second. But generally, accessibility and usability are measured by the POUR principle, which stands for perceivable, operable, understandable, and robust, and so those are the sort of methods by which people measure the accessibility of their content.

So for some examples, I’ll talk about Level AA first, since that’s the happy medium. It’s what most people should be striving for; it should be considered the baseline. This is what is referenced in Section 508 of the Rehabilitation Act of 1973, but we’ll get into that in just one moment.

It includes all of the requirements listed in Level A, which are listed on the screen here, a few examples: alternatives to audio and video-only content, captions on pre-recorded content. But the primary difference with Level AA is that it also requires captions on live content, as well as recorded content, in addition to audio description requirements for pre-recorded video and things like that.

Level AAA is the highest level of accessibility, which requires more work, but it would make sure that you are completely compliant. The thing is, this requires a higher level of skill to sometimes produce the necessary accommodations, such as sign language interpretation, as well as extended audio description, which is a little bit more extensive than standard audio description, so you would need a bit more expertise to meet this level of accessibility.

So that’s just a brief overview of some of the WCAG requirements. Now, we’re going to get into the Americans with Disabilities Act. This is the most important legal document related to accessibility in the United States. It’s a broad anti-discrimination law for people with disabilities in all facets of their life, including employment, education, transportation, as well as public and private business organizations.

So an example of this is a restaurant or a hotel. Although they may be private business organizations, they provide what is considered a public accommodation, and I’ll explain that in just one second. Titles II and III of the Americans with Disabilities Act are the most pertinent to web accessibility. They effectively extend the physical access requirements written in the ADA to the digital world, because they include the internet in the ADA definition of a place of public accommodation.

So something that is considered a public accommodation has to be generally and publicly accessible. And a really great example of the internet being included in this is a lawsuit between the National Association of the Deaf and Netflix. The media giant was sued for failing to provide captions because their online streaming platform, their entire catalog, was considered a place of public accommodation. And they now provide captions, of course, to be publicly accessible.

So next, we’ll talk about the Rehabilitation Act of 1973. Once again, two sections here are really important. Section 504 applies to federal and federally funded programs. Section 508 requires electronic communications and IT be accessible to people with disabilities. So this includes emails, websites, and web documents, and it sort of just takes Section 504 a step further, a step more specifically in the direction of digital accessibility.

In January of 2017, Section 508 was refreshed to include a direct reference to the Web Content Accessibility Guidelines, version 2.0. So this is what we mean when we say that’s the version that is legally referenced, because that was the version that was most up to date at the time. And a lot of states in the US follow this legislation as their sort of standard.

So that’s a lot of legal information. What does all this mean for marketers? It essentially means that your organization, your business, and the content that you’re putting out there is a public space. You should consider your website or your social media channels a public forum, just like you would a movie theater or a park or a mall. Just like you would make your physical space accessible to more customers, you want to do the same with your content and your digital properties. You should think about it in the same way.

So that’s a little bit about why accessibility is important. I think next, we can get into how to actually make your videos accessible. What are some actionable steps that you can take towards doing that? So some examples. Some of the most common accessibility features when it comes to video accessibility are captions, subtitles, transcripts, audio descriptions, and of course, an accessible video player is most important for all of these.

I’m going to give you a landscape idea of the captioning industry before we get into captions specifically. 3Play Media conducts an annual survey called The State of Captioning, and it gives us an outlook on how our customers are using captions, where they’re publishing captions, and what their strategy around captioning is like.

In 2022, we found that 88% of businesses are captioning all, most, or some of their video content, and in 2021, Wistia saw 11% growth in videos using captions. So businesses are using captions, especially in a world that was forced to go digital given COVID. A lot of businesses have started integrating captions into their strategy, not only to make their content accessible, but because without them, you could be missing out on a greater potential audience.

We’ll get into some statistics in just a second, but a lot of people prefer to watch video without sound, or they enjoy the flexibility of captions. It makes it possible to watch your videos on the go, in a loud environment, or in a quiet environment, if you forgot your headphones, for example. And also, of course, captions can be used to reach an audience that may be deaf or hard of hearing, because they require this as an accommodation.

So to sort of contextualize this usage, approximately 48 million Americans have some degree of hearing loss, and about 11 million of those people consider themselves actually deaf or hard of hearing. These people need captions. But in addition to that, a 2019 Verizon study found that 92% of people watch videos with the sound off, and about 80% of people are more likely to finish watching an entire video if captions are available. So there’s also an audience that simply prefers captions.

Consider yourself. How often do you watch video on mute, especially on social media, where sometimes videos autoplay with no audio? Captions provide an easy way to make your content more accessible to a wider audience, but also to catch the attention of users on mobile devices where sound might be turned off automatically. But depending on where you’re posting your videos, you might want to think about the types of captions that you’re using.

So closed captions are what most people are familiar with. These are standard in broadcast and streaming, like your Netflix, your Disney Plus. They allow the user to control whether they’re turned on or off. They support multiple language options, but each language option requires an individual sidecar file in order to display that translation.

On the other hand, open captions are always on. They can’t be toggled by the user. They only display one single language, and they don’t require a sidecar file because they’re burned into the video or just placed directly on the video itself. Because of this, they’re commonly used in foreign film as well as social media. So certain platforms like Instagram and Twitter don’t support sidecar files, so sometimes writing the captions directly on the video is a great way to circumvent that.

Subtitles are very similar to captions, but there’s one important distinction. They merely translate spoken dialogue, whereas captions include non-speech elements like sound effects, because that’s what’s required to be an accommodation. Subtitles are primarily a translation, which means they’re great for people who may not speak the language of the source content, or they may be listening and reading, trying to learn a new language by having two types of exposure at one time.

Transcripts are simply a text version of spoken dialogue in audio or video content. These are good for podcasts. They’re also great for people who like to read at their own pace, or sometimes they use an accommodation like a screen reader, so it’s much easier for a screen reader to parse information this way than to watch a video. And it’s also good for people who might be searching for specific information in the text. And we’ll get into one example of how that can be helpful in the video production process.

But this can also be helpful in higher education environments. 3Play actually did a study with the University of South Florida, St. Petersburg, and students reported that it was really helpful using transcripts from their lessons as study guides. It can also be helpful to marketers to chop up derivative content for social media bits, blog posts, et cetera, et cetera.

And finally, we have audio description. It originated as an accommodation for people who are blind or low-vision, but essentially, it provides a verbal description of the visual context so that everyone watching has the same level of situational awareness. And this can also be helpful to people who are listening but not watching a piece of content. So if you’ve ever put headphones in and then put your phone down and walked away from it– I do that when I’m cleaning the house– it’s really helpful because you still know what’s going on, but you don’t have to have eyes on your screen.

So that was a lot of information. It may seem like a lot of complex services, and you may not know where to start. But 3Play and Wistia like to make it simple. With our technology and our network of professional transcriptionists, we make it easy for Wistia customers to make sure that their videos are accessible.

Like I said, we integrate with Wistia, which makes it easy to plug these services into your branded video player. An accessible video player is arguably one of the most important parts, because it ensures that you can use all of the features I just explained. But to get into that, I’ll hand it back to Luke, who is far more qualified than I to speak on that.

LUKE POMMERSHEIM: Great, thanks. Yeah, it’s amazing to hear all of the professional accessibility services that 3Play has to offer. Here at Wistia, I’ve been working on a video player that tries to support all of those services, displayed or heard, as well as some extra customizations that the Wistia player offers that customers can make use of.

So basically, we’ll get into some more detail on some of these. But obvious player controls are a big thing with an accessible video player. Your player controls basically need to adhere to those POUR standards we heard about earlier. They need to be visible, they need to be interactive, they need to be understandable, and they need to be keyboard-accessible and screen reader-available.

So a user will be able to tab through the web page, an outline will go around the currently focused element, and they need to be able to focus and tab through all of the elements in the video player and interact with them that way. We usually do that through HTML5 and some JavaScript, using WCAG 2.1-compliant ARIA labels that basically tell a screen reader what an element is, how it behaves, and how to control it.
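As a rough sketch (not Wistia’s actual player code; the function names here are made up for illustration), the ARIA side of a play/pause control might look like this:

```javascript
// Compute the ARIA attributes a play/pause toggle should expose.
// "aria-label" tells a screen reader what the control does right now;
// tabindex "0" makes the element reachable with the Tab key.
function buildToggleAttrs(isPlaying) {
  return {
    role: "button",
    tabindex: "0",
    "aria-label": isPlaying ? "Pause" : "Play",
    "aria-pressed": String(isPlaying),
  };
}

// In a browser, you would apply these attributes to the control element
// and re-apply them whenever playback state changes:
function applyAttrs(el, attrs) {
  for (const [name, value] of Object.entries(attrs)) {
    el.setAttribute(name, value);
  }
}
```

A real player would also listen for Enter and Space keydown events on the control so keyboard users can activate it, which is part of what makes the player fully keyboard-operable.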

The other thing an accessible video player will need is high-contrast colors as a part of WCAG 2.1. For AA, there are different levels of color contrast you need based on the size of different elements. So when things are on mobile, you actually need a little bit higher contrast than if things are on desktop or you have a little bit more screen real estate.
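The contrast ratio Luke mentions comes from the WCAG relative-luminance formula. Here’s a small sketch of that calculation (my own helper names, not part of any player API):

```javascript
// WCAG 2.x relative luminance of an sRGB color given as [r, g, b] in 0-255.
function luminance([r, g, b]) {
  const chan = (c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * chan(r) + 0.7152 * chan(g) + 0.0722 * chan(b);
}

// Contrast ratio between two colors: (lighter + 0.05) / (darker + 0.05).
// It ranges from 1:1 (identical colors) up to 21:1 (black on white).
function contrastRatio(fg, bg) {
  const [hi, lo] = [luminance(fg), luminance(bg)].sort((a, b) => b - a);
  return (hi + 0.05) / (lo + 0.05);
}

// WCAG 2.1 AA asks for at least 4.5:1 for normal-sized text and
// 3:1 for large text and user interface components.
contrastRatio([255, 255, 255], [0, 0, 0]); // maximum ratio, 21:1
```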

An accessible video player will also need your basic controls for playback and volume, and this ties directly into auto playback. Auto playback can be really disorienting for certain viewers, and so it’s generally best that autoplay behavior be avoided. And if you are going to have autoplaying behavior, it does need to be pausable, and I believe audio shouldn’t autoplay for more than three seconds without a way to stop it, if I’m correct. And of course, we have the accessibility features we discussed earlier, like captioning and audio descriptions, as well.

So how you can do this inside of Wistia itself is, when you’re on the media page, you can go in and customize each individual media, and go into that menu and kind of interact with all the different things we’ve got in there. So you can go into the Appearance section and you can choose the player color that matches your brand guide. But we’ve also given you the ability to limit the color spectrum to one that’s going to meet those WCAG guidelines. So you’ll only be given a range of colors that should give that high-contrast ratio for the controls in the player.

From the customize menu, you can also upload your captions. You can upload a transcript. You can upload audio description files. You can also order captions and transcripts from 3Play Media directly in the app. In Wistia’s own State of the Video report, we saw that accessibility became much more of a priority in 2021 for all of our users, with more of their videos meeting more of the accessibility criteria.

So how can you make sure that your videos are the most accessible to the most customers? For starters, we’ve got a great checklist that appears in-app that makes it easy to ensure each of your videos are optimized for accessibility inside the player.

Inside of Wistia, every video comes with an accessibility checklist to help you make your videos more accessible for your audience. So this will give you a nice overview to say, hey, it looks like I picked a player color that isn’t exactly meeting the contrast guidelines I’m going to need to meet if I want to be WCAG 2.1 AA-compliant.

Some items are already enabled by default, so customers don’t have to do anything about them. Mainly, that’s the screen reader-legible and the keyboard-compatible items. The player, out of the box, should work with most major screen readers, like Apple’s VoiceOver for iOS and desktop Safari; TalkBack, which is Android’s screen reader; NVDA, which is an open-source screen reader primarily for Windows; and JAWS, which is also a major screen reader technology.

And that ties directly into the keyboard-compatible player, where the player is fully operable with the keyboard alone, so you can navigate and control the player just by using your keyboard, which supports various assistive technologies.

Yeah, the high-contrast player color. The default player color is accessible, but we obviously know that many brands don’t have that color as a part of their brand identity. So we want to make sure that customers can choose a player color that’s exact to their brand.

Visible player controls. Wistia does have the option to disable certain controls. So what this checklist item is going to do is tell you that you’ve got the right ones enabled to ensure that you are WCAG 2.1 AA-compliant. The main ones are the Play button, Volume, Settings, and Captions, and that they’re persistent in the play bar.

So if you hover away from the video, you will actually see that, visually, the control bar will go away, and you’ll see that in most major video players. But if someone is using the keyboard to operate, those controls actually won’t go away. They will stay focused. The focus won’t shift around the page, and they’ll be able to continue using the player, using the keyboard alone, even if visually, the player controls were to fade while the video is playing.

Videos also have a thumbnail that appears before you play, and we have the option to give that thumbnail alt text. And alt text is a way to describe what’s going on in an image, which a screen reader will then read back to the user.

So obviously, if you’re having trouble viewing the image, a screen reader can tell you what is displayed in the image. You might have seen this, actually, on the web. If, for whatever reason, an image didn’t load, you might see text appear instead. That is the alternate text of that image being displayed.
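In plain HTML, alt text is just an attribute on the image element. A quick made-up example (the file name and description are hypothetical):

```
<!-- Screen readers announce the alt text; browsers also display it
     if the image itself fails to load. -->
<img src="webinar-thumbnail.jpg"
     alt="Thumbnail of the webinar video: a presenter at a desk with a play button overlaid">
```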

For transcripts and captions, we integrate directly with 3Play. So you can order your captions and transcripts directly from 3Play Media inside your Wistia video library. You can do this on a per-media level, or you can even turn it on for the entire account, so that whenever you upload a video, you’ll order a transcript and a caption file.

Audio descriptions. Those are those alternate audio tracks that are used for narrating important video content. These can be ordered from 3Play Media outside of Wistia and then uploaded into Wistia. And we touched briefly on the different kinds of audio descriptions.

The ones supported in Wistia are standard audio descriptions, which are the exact same length as the video, where the narrator sort of waits for a moment of low audio action to describe what’s going on. There are also extended audio description files that will pause the video, and then the narrator will speak, and then the video will resume, and those go past the length of the video. Those are not the ones that Wistia currently supports.

And just to recap, here’s a list of the accessibility features and services that will help you extend the reach of your video content. We’ve got captioning to display the text of dialogue or other auditory information on screen, and subtitling to display spoken dialogue in the viewer’s language. A transcript is a textual document of all spoken language, often encompassing speaker labels and time codes.

3Play Media can also do automatic speaker identification. So you can go into your 3Play Media account, and they have a bunch of categories for the kind of content you might be uploading, and they can give speaker identification tags there that are more contextual. Audio descriptions, where visual content is verbally described by a narrator, often in non-speaking parts. And the accessible video player itself. So accessible in color, contrast, controls, and all the other listed accessibility features. Kelly is going to talk a little bit about some next steps.

KELLY MAHONEY: Perfect. So at 3Play Media, we always like to say that it’s easier to bake in accessibility from the beginning than to try and build it in later. So let’s talk about how you can start integrating access into your everyday workflows, and almost more importantly, into your workplace culture. So like I mentioned earlier, captions and transcripts can improve the production process and your distribution strategy. They can be helpful to video producers or content creators.

We’re going to go ahead and dive into some examples of how transcripts can help you. Luke, if you just want to hit the next slide– perfect. So like I said, they can be helpful to video producers and content creators, specifically– next slide– perfect.

Transcripts in post-production can be very, very helpful to help video editors work more efficiently and more quickly. You can search through the text instead of scrubbing through the footage if there’s a certain sound bite or some minute of audio that you really know you want to grab but you can’t find it through the 45 minutes of footage that you managed to record.

Wistia also shared with me that this is called a paper edit, where you use a physical, printed version of a transcript to highlight or strikethrough or edit your text, so that way, you know what you’re going back and looking for in the actual footage.

But in addition to your website and the videos that you publish, we really want to encourage you to make your content accessible across all channels, including social media. Whatever marketing medium you use, try to make sure your video is accessible.

So during production, this could look like the video blocking or the shot framing, your production design and the aspect ratio. Try to consider the effect that that will have on captions in the editing process. Captions typically display in the lower center third of the screen, so do your best to avoid putting really important content there if you think that captions might be placed over top of that.

And on the screen, you can see some examples of how you can incorporate accessibility into your production process. So this includes live captions on team kick-off calls or planning sessions, transcripts in post-production, like I just mentioned, and captioning all of your videos that go public. These are just some ways that accessibility can be helpful not only to your audience, but to your internal team, to your editors. And with that, for a final wrap-up, a little bit more about how you can integrate accessibility into your workplace culture. I’m going to hand it back to Luke.

LUKE POMMERSHEIM: Thanks. Yeah, so as Kelly mentioned, accessibility services like transcripts and captions can be beneficial throughout the entire production process, all the way to distribution. You can extend your reach, increase your SEO, make your team communication and editing more efficient, and ultimately, make decisions about accessibility earlier in the production process, long before you’re actually publishing that final product. All of these great benefits can apply not only to just your viewers, but your team, as well. It’s important to think about the accessibility needs and benefits of your organization and not just your end viewers.

So you can lead the change at your own workplace to create a culture of accessibility, both internally and externally, for every video that is shared or produced. I know we at Wistia have come a long way on this, as well. We make a lot of external videos. But we’ve also realized how important it is for all of our internal communications to be accessible, as well. So you can really set the standards and lead by example for your audience and your employees or your coworkers.

You need to set the bar high as you’re living out those brand values that we expressed, not just for your viewers, but internally, as well. All of your video content, at a minimum, should have high-quality closed captioning. And this goes for every public-facing video on your website and social media channel, but also every internal training video, company update, weekly all-hands. With this in mind, you can implement best practices across the board for anyone creating video content.

Getting captioning for all asynchronous video and live captioning for your synchronous events is really important. It just makes sure that at the very least, all of the spoken content is accessible to everyone at all times.

So if we want to review some of the key points here and action items that we’ve covered, I think the first one would be to just start making your video content accessible. And you can easily start that by getting captions and getting an accessible video player that complements your brand. It’s a great first step, and a huge step forward if you’re not already doing it.

From there, you can kind of explore more ways to make your videos even more accessible throughout your organization through things like subtitles or audio descriptions, and you can set the standard in your own organization by requiring captions for all of your content.

EVANNA PAYEN: All right. Thank you for all of that amazing information. We have a ton of questions in the Q&A Section. Continue to drop in those questions if you have more that come up, but we’re going to get right into it and start asking some of those questions. OK, so let’s see. First up, what is the difference between captions and subtitles?

KELLY MAHONEY: I would love to jump in on this one. So the biggest difference between captions and subtitles is, subtitles mostly are just a translation of any spoken dialogue in a piece of content. So like I said earlier, captions will include those non-speech elements, like sound effects or speaker identifications, because that’s required for them to be an accommodation to an audience that may be deaf or hard of hearing, whereas subtitles, it’s strictly the dialogue in a different language.

EVANNA PAYEN: Awesome. That’s really helpful. OK, let’s see, the next question’s about auto captions generated on our YouTube videos. Is it necessary to update the auto-generated captions with the proper SRT file? That might be a Luke question.

LUKE POMMERSHEIM: I think it’s always a best practice to review your automated captions. That being said, if you feel that they’re adequate enough, I don’t think there’s any other action you need to take. However, you may need to go back and edit those caption files. And both 3Play Media and Wistia have in-app editors, if you see something incorrect with your auto-generated captioning. I know that acronyms and company names are sometimes a little bit tricky for AI to get correct the first time around.

EVANNA PAYEN: Awesome. And then the next question we have here. Do you have research or perspective on kinetic text– in parentheses, motion-based text– and its accessibility versus subtitles? We wrestled back and forth about the design and energetic benefit of dynamic text on screen/kinetic text versus the consistency of subtitles, and which is more readable and accessible, or if it’s about equal.

KELLY MAHONEY: That’s a really great question. It’s quite technical. So Luke, please, if you know more about the technical side of things, please jump in. But I just wanted to say that the Described and Captioned Media Program has something called a captioning key, and that’s a pretty great set of standards, another set of guidelines that specifically address the design of closed captions more specifically than maybe another document would. So Luke, again, if you have any more technical knowledge to lend, I’ll pass that one off.

LUKE POMMERSHEIM: I unfortunately don’t have a big opinion on kinetic captioning. I’m not too familiar with the technology. I will say, for regular captioning, there’s a format similar to SRT called WebVTT, and that is a very common captioning format, maybe the most ubiquitous captioning format on the web. And you can put captioning directions inside of WebVTT, so you can maybe bold certain words, or put certain words in different colors, or even give it notes to say actually, there’s some on-screen text here. I need this line of captioning to be at the top of the video and not at the bottom.
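To make that concrete, here’s a small made-up WebVTT snippet showing inline styling and a cue setting that moves a caption toward the top of the frame. As Luke notes, support for these settings varies by player:

```
WEBVTT

00:00:01.000 --> 00:00:04.000 line:1 align:center
<b>LUKE:</b> This cue is positioned near the top of the video,

00:00:04.500 --> 00:00:07.000
and this one falls back to the default spot at the bottom.
```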

Unfortunately, Wistia’s player does not adhere to some of that metadata at this time, but WebVTT is maybe a good middle ground for you if you do need some customization of your captioning files. And it’s something I would like to see us implement a little bit more in depth in the future.
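For reference, the cue settings Luke mentions look roughly like this in a WebVTT file. This is a minimal, made-up sketch: the timings and text are illustrative, and a color class like .yellow only takes effect if the embedding page defines matching ::cue CSS.

```
WEBVTT

1
00:00:01.000 --> 00:00:04.000 line:0
<b>This cue is positioned at the top of the video</b>
so it won’t cover on-screen text at the bottom.

2
00:00:04.000 --> 00:00:07.000
<c.yellow>LUKE:</c> This cue uses a styling class and sits at the default bottom position.
```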

EVANNA PAYEN: Awesome. Thank you for that. And then the next question, can you please explain in more detail about how the audio description is created? Is that something Wistia creates? Does the customer need to assist with creating the audio description?

LUKE POMMERSHEIM: Yeah. So I can answer maybe part of this, and maybe Kelly can take the second half. We at Wistia don’t create any audio descriptions. That is a third-party file that you would upload to Wistia, and we attach it to that media. Then in the Wistia player, when a viewer starts playing, there is an icon in the control bar that says audio descriptions.

You can open that menu and choose between several different audio descriptions or none at all. That basically replaces the audio track you’re listening to with an audio description track, and it happens inline.

Now, in order to actually meet WCAG 2.1 AA compliance, my understanding is that as long as you have a link to that audio description, you can meet the guidelines. So if you have a link to a separate video, a whole different video on a whole different page, you can still meet that guideline.

However, of course, it is a much nicer convenience to have that option directly built into the video player itself. And unfortunately, we don’t yet have an ordering process with 3Play Media built into our app, but you can purchase audio descriptions to be created by 3Play Media. And maybe Kelly can speak a little bit to that.

KELLY MAHONEY: Yeah, absolutely. We have a team of audio description experts who know exactly what needs to be described and where exactly it would fit within the content that they’re given. This is maybe jumping ahead a little bit, but I do also see another question asking something related to that. If there’s a voiceover that doesn’t have very many breaks in the audio, how would you include audio description?

This is what extended audio description addresses. Extended audio description will pause the content of the video to include an audio description and then resume the video’s content, whereas regular audio description, like we said, just fits in where it can. So it depends on your use case and what type of content you have, which would be best for you. But that’s exactly why we worry about that, so you don’t have to.

EVANNA PAYEN: Awesome. Thank you for that answer. And the next question we have here– which services allow non-English speakers to translate dialogue into a language other than English?

KELLY MAHONEY: Yeah, I would once again shout out subtitles. Like I said, they’re just translated dialogue. But even further than that, if you need to provide a captioning accommodation for your video, subtitles alone are not going to be sufficient. You can also get closed captions in another language that will include those non-speech elements in whatever language you are translating to. I think it’s just, once again, a matter of use case and what type of content you’re dealing with.

EVANNA PAYEN: All right, thank you. And I want to give another plug to anyone who’s joining from LinkedIn Live. Feel free to ask questions in the comments there, as well, and we’ll tackle them here.

So the next question we have. Does the Wistia player support attributes such as italics and colors, and is it possible for users to change font size while viewing on different devices, for example?

LUKE POMMERSHEIM: Yeah. Currently, we do not allow those for captions. We do have an embed option, which you can find in our support docs, for changing the scale of the font size in your captioning. But currently, we don’t support the positioning or color, or some of the other WebVTT customizations that exist right now.

EVANNA PAYEN: OK. And then this question might be for Kelly. It asks, does every business in the US have to be ADA-compliant? Is there a rule that says only businesses with 15 or more W-2 employees need to be ADA-compliant, or is that a UK rule?

KELLY MAHONEY: Yes. So I’m going to make a blanket statement and say yes. If your organization provides any sort of service or product or location that is intended to be accessible to the public, then yes, you should be compliant with the ADA. Digital accessibility legislation is still in the works, but that’s why we point to things like the Web Content Accessibility Guidelines, since those are referenced in another very important legal document relating to accessibility in the digital world.

I do not think this applies to the United Kingdom; it’s just the United States. But there are also accessibility laws in the United Kingdom that are very similar to the United States’ ADA.

EVANNA PAYEN: All right, thank you. And the next question. What’s the best way to handle already produced videos that have text baked in but no audio aside from a music track?

KELLY MAHONEY: This may come down to the editing process. If the text is already baked in and there’s no audio aside from the music track, you can typically just denote periodically that there is music playing, if it’s going to be playing the entire time. There are specific requirements for how long captions should stay on the screen and where they should be placed when there are things like this, text already on the screen.

So you may have to adjust their placement to ensure that nothing that’s already in the video is going to be obscured. But Luke, I don’t know if that’s something that comes up during the editing process or in specific requirements for video processing.

LUKE POMMERSHEIM: Yeah. Off the top of my head, I share similar feelings. You’ll need to come up with some sort of transcript or captioning file, either before the video is made or even after, but it wouldn’t be automated. The goal is just to get that text into a text-based form somehow, not burned directly into the video, unfortunately.

EVANNA PAYEN: OK, awesome. And then the next question is, I currently remove “um,” “uh,” “yeah,” “you know,” et cetera, from the YouTube captions. Do you see any problems with that, since it doesn’t technically match the speaker but it makes the text much easier to read?

LUKE POMMERSHEIM: I’ll let Kelly speak to this as well, but I don’t think it’s a problem. In fact, there are a lot of tools out there nowadays that will remove those for you automatically. And I’m doing it right now as I speak. I think it actually just makes things more clear. But I’m not sure in terms of the legalities.

KELLY MAHONEY: Yeah. Once again, it depends on what sort of content you have or what sort of content you’re presenting. This is where the DCMP captioning key that I mentioned a little while ago goes into detail about what you do and don’t need to caption. And there’s actually a distinction between verbatim captions and clean read captions.

So captions on streaming platforms like Netflix, they will normally include those sorts of filler words, but that’s because it’s part of the scripted dialogue. The characters are supposed to be hesitating about something. But if you’re uploading content like a lecture or a webinar like this, it’s best to cut those words out so that the viewer can read the captions cleanly without those filler words. And I also do the “ums” and “uhs,” too. I always catch myself during presentations like this.

EVANNA PAYEN: OK, awesome. That’s really helpful. We have a bunch more questions, but I think we only have time for a few more. This next question asks– OK, it says, this may have already been covered. What is the law around accessible videos and content for employees related to internal information and training? For example, if there is a virtual meeting, do there need to be closed captions or some other way to convey spoken language or visual content?

KELLY MAHONEY: Yeah, I would say that internal video content is a little bit different, of course, than public-facing content. So while it may not necessarily be legally required in the same way, like I said earlier, it can still be really beneficial in the content production process. It can become derivative content, or it can be used by the team that’s actually editing and working on whatever videos or trainings you might be using.

I would say, though, that absolutely, if someone requests an accommodation, if they indicate that they may need captions or audio description or whatever other accessibility features, then of course it’s best to provide those. But accessibility can only help internally, so of course we’re always going to encourage that things include captions and transcripts, and the more the merrier.

EVANNA PAYEN: Awesome, thank you. This question is also related to captions. So when talking about captions on live content, are you considered to be in compliance if you use the auto-generated captions through Zoom and Facebook, even if they aren’t 100% correct?

KELLY MAHONEY: This is a great question. No, automatic captions are normally not considered enough to be legally compliant. As sort of a fun fact– again, I don’t know if that’s fun to anyone except myself– auto captions will actually commit more substitution errors than human editors, but humans will commit more omission errors, which means they’ll skip words that they miss or don’t understand.

But substitution errors can actually be more problematic for accuracy, because even if the computer doesn’t understand what’s being said, it’ll switch the word out. So one example of this that I heard was JetBlue. In some transcription that they had, it was written as Jeff Blue. So as you can see, that would be a little bit misleading in terms of your brand identity and things like that. So automatic captions are normally not sufficient, and they could get you into some funky misunderstandings.
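Kelly’s substitution vs. omission distinction can be made concrete with a word-level alignment. Here’s a rough sketch in Python using standard edit-distance backtracking; it’s a toy illustration, not any vendor’s actual caption-scoring tool, and the example sentences are made up.

```python
def classify_errors(reference, hypothesis):
    """Return (substitutions, omissions, insertions) between two transcripts."""
    ref, hyp = reference.split(), hypothesis.split()
    m, n = len(ref), len(hyp)
    # dp[i][j] = minimum edit cost aligning ref[:i] with hyp[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if ref[i - 1] == hyp[j - 1]:
                dp[i][j] = dp[i - 1][j - 1]
            else:
                dp[i][j] = 1 + min(dp[i - 1][j - 1],  # substitution
                                   dp[i - 1][j],      # omission (word dropped)
                                   dp[i][j - 1])      # insertion (word added)
    # Walk back through the table to count each error type.
    subs = omits = ins = 0
    i, j = m, n
    while i > 0 or j > 0:
        if i > 0 and j > 0 and ref[i - 1] == hyp[j - 1]:
            i, j = i - 1, j - 1
        elif i > 0 and j > 0 and dp[i][j] == dp[i - 1][j - 1] + 1:
            subs += 1
            i, j = i - 1, j - 1
        elif i > 0 and dp[i][j] == dp[i - 1][j] + 1:
            omits += 1
            i -= 1
        else:
            ins += 1
            j -= 1
    return subs, omits, ins

# The "JetBlue" mishearing surfaces as a substitution (plus one inserted word):
print(classify_errors("thanks for flying jetblue", "thanks for flying jeff blue"))  # → (1, 0, 1)
# A human editor skipping a word surfaces as an omission:
print(classify_errors("hello everyone and welcome", "hello and welcome"))  # → (0, 1, 0)
```

A substitution leaves a wrong word in place of a right one, which is why it can be more misleading to a reader than a simple gap.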

EVANNA PAYEN: OK, awesome. That’s really good to know. All right, and our next question. Let’s see, this one has to do with social media. So it asks, can I upload audio description files to all social media platforms, or does it only work with dedicated players like Wistia?

LUKE POMMERSHEIM: There may be some, but most social media platforms that I’m aware of do not allow alternate files to be uploaded at all. And that’s not just audio descriptions; many don’t even allow captioning files to be uploaded. There might be some niche social media platforms that do, but I don’t know of any.

EVANNA PAYEN: All right, thank you for that. And then another question we have here. How do you identify background information in the video and decide whether or not it’s informative enough to be included in the captions or descriptions? What’s the process for making modifications if the originals provided are generated by Wistia?

KELLY MAHONEY: Yeah, that’s a great question. Luke, I don’t know if you have more insight about the Wistia side of things, but at least from our side, this is why we definitely encourage professionally created captions. Some sort of human involvement is always going to improve accuracy.

Computers and machine learning are not as good as we are at filtering out background noise and figuring out what is or isn’t important. So a lot of the time, if you are using automatic solutions, that sort of background noise can and will interfere with what’s being transcribed. But if you have a human working on it, you have a little bit of a better ability to figure out what’s pertinent.

EVANNA PAYEN: Awesome. OK. Thank you for that. And we actually have a comment here from Natasha. She says, to reply to the question on captions and audio descriptions at companies internally, she says, in Europe, yes, it’s mandatory, because you never know who may be blind or deaf in the audience. So thank you, Natasha. That is very helpful.

I think we have time for one more question. So the last question is going to be– all right, what is the best way to implement an accessibility policy at work?

KELLY MAHONEY: Well, I would love to kick this one off. I would say first and foremost, bake it in from the start, like I said earlier. We always say that it’s easier to start your projects being accessible than to try and make them accessible later. So think about it from the pre-production process, building in these accessible features so that every person, no matter how they’re interacting with your content, can have the same sort of experience and get the same information out of what you’re producing for them.

EVANNA PAYEN: Yeah. And I think we have a little more time. I’m not sure if, Luke, you have a perspective on that, or if you want another question to round things out.

LUKE POMMERSHEIM: Yeah, let’s do another question.

EVANNA PAYEN: All right. So last question of the day. OK. What kind of videos need audio descriptions?

LUKE POMMERSHEIM: Yeah, and maybe Kelly can help with this one, too, but my understanding is that audio descriptions are really necessary for content that has a lot going on visually that needs to be explained to the user. If you think of something like a TED Talk or C-SPAN, with two individuals just talking to each other, that doesn’t really call for audio description, because the visual information in the video is not all that pertinent to what the content is really about. Versus a car chase down the streets of Manhattan, which is very visually expressive content that will need to be explained.

And I’m sure there is a fine line somewhere, where it’s questionable whether the answer is yes or no. But generally, my gauge is just, when you watch it: could I be washing my dishes and still know everything that’s happening right now, or not?

KELLY MAHONEY: Yeah, absolutely. I’ll echo that. At 3Play, we also just published a blog post– this is a little bit of a plug– on audio description quality. Those standards are still in the works, so they aren’t as clear-cut as the standards for captioning are now. But like we’ve said, it sort of depends on the content that you’re presenting to people.

One of my favorite examples of audio description was a clip we used to show in 3Play webinars from the movie Frozen. It was the section of the movie where Olaf the snowman loses his nose, but you wouldn’t really be able to know that just based on the dialogue, because all you hear is him running around, huffing and puffing, looking for his nose.

So audio description really helps describe: OK, Olaf is doing this. He’s running across the lake. He slipped and fell. That way, like I said, every user, including those who may be blind or low vision, is still aware of the situational context and what’s going on, even if they’re not able to visually see it.

EVANNA PAYEN: Yeah, that’s a great example. And that is the last question of the day. So that wraps up our webinar. I want to thank everyone for joining. Look out for the recording in the next day or two. And thank you so much to our hosts, Kelly and Luke. I appreciate you guys doing this webinar with us. And we will see you all next month. Thank you so much. Bye.

LUKE POMMERSHEIM: Thank you.

KELLY MAHONEY: Thanks for joining, everyone. Bye.

[MUSIC PLAYING]