Accessibility Comparison of Major Video Platforms [TRANSCRIPT]
ELISA EDELBERG: Thanks for joining this webinar entitled, “Accessibility Comparison of Major Video Players.” I’m Elisa Edelberg from 3Play Media, and I’ll be moderating today. I’m joined by Gian Wild, CEO and founder of AccessibilityOz. And with that, I’ll hand it off to Gian, who has a wonderful presentation prepared for you.
GIAN WILD: Good afternoon, everyone. If you want to access the presentation, go to a11yoz.com/player17, the number, 17. That’s a11yoz.com/player17. And so I’m going to start by showing you our team in Australia. We work on the beach. No, we don’t work on the beach, but everyone thinks that we work on the beach, because we’re in Australia.
We have a similar-sized team in the US. And basically, it’s a policy to actively hire people with disabilities at AccessibilityOz. And one of the things I wanted to point out by showing you our team photo is that you wouldn’t necessarily know that any of us have a disability. So often, disabilities are hidden in plain sight. And that’s something to remember when it comes to dealing with accessibility.
So we have a whole variety of different people with different disabilities, including dyslexia, moderate vision impairment, severe vision impairment, epilepsy, migraines, physical impairments, fibromyalgia, multiple sclerosis, Crohn’s disease, PTSD, and Asperger’s. And it’s also important to know that it’s not just about vision impairments. A lot of people think that it’s really just about screen reader accessibility, and that’s definitely not the case – although, of course, we will be talking a lot about screen reader accessibility today.
So in terms of our services, we do everything to do with web accessibility, so I won’t go through this. But if you need some assistance, please reach out to us. Or if you have any questions about accessibility, do let us know.
We have four products– OzPlayer, which we’ll talk about very briefly today in comparison to the other players on the market– OzART, our automated accessibility testing tool, OzWiki, and Accessibility Voices. So Accessibility Voices is our website that contains blog posts by people with disabilities about their technical accessibility experiences. So we are focusing on education and navigation, so definitely have a look at a11yvoices.com. And we’re always looking for bloggers, as well. And of course, we pay our bloggers.
So a little bit about video. So firstly, 1/3 of all online activity is actually watching video, and over half of all video content is viewed on mobile. Over 500 million people are watching video on Facebook every day, and that amounts to about 100 million hours. And more than 500 million hours of video are watched on YouTube every day, so it is something that we really need to make sure is accessible.
85% of Facebook videos are watched without sound. And I certainly find that that’s the case with me. I’m often checking Facebook and things like that when I’m in a taxi, or on a plane waiting at an airport. And I don’t watch Facebook with sound ever. So if a video doesn’t have captions, then I’m not going to watch that video. And we see this in terms of the completion of a video. So if you have one video with captions and one video without captions, the completion rate for the video with captions is twice as high as for the video without captions.
We also find that videos with transcripts earn 16% more revenue than videos without transcripts. And 80% of people that use captions are not actually deaf or hard of hearing. We call this the Broadchurch effect. And it was something that the BBC came up with. And basically, the accents in Broadchurch– it’s a TV show– were so strong and so difficult to understand that a lot of people watched it with the captions on. So there’s a whole lot of reasons that people without disabilities would turn on captions, so it’s certainly a good idea to include them where possible.
The common accessibility issues for video include things like inadequate keyboard access, insufficient control and operation, incorrect implementation, inaccessible content, so things like having low color contrast for items on the screen, or flickering content that might trigger an epileptic attack, the inability to gain a full and complete understanding of all information contained in the video– and that’s through the lack of things like transcripts, captions, and audio descriptions.
So if you want to learn a bit more about accessible video, there’s some articles on our website under, About – Articles and Interviews. There’s 8 steps to create accessible video, Video Accessibility Principles, Accessible Videos Are Your Friend. We also have a video and accessibility webinar under, About – Presentations. And we have fact sheets on the accessibility principles, impact on users, and a testing checklist.
So how do you actually create an accessible video experience? So these are the seven steps, and then we’ll move on to video player accessibility. So the first is to make the content in the video itself accessible. Have an accessible video player. Never auto-play the content, so never have the video automatically play when loading a page. No flashing content, because that can trigger epileptic attacks and migraines– and I’ve actually had five migraines triggered by flickering content. And accessible transcripts, captions, and audio descriptions.
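As a rough illustration of a few of those steps in markup– this is a hedged sketch, and the file names and labels are placeholders, not anything from the presentation:

```html
<!-- Sketch: controls visible, no autoplay, captions available,
     and a transcript linked right next to the player -->
<video controls preload="metadata">
  <source src="intro.mp4" type="video/mp4">
  <track kind="captions" src="intro-captions.vtt" srclang="en" label="English">
</video>
<!-- An easy-to-find transcript, directly below the player -->
<p><a href="intro-transcript.html">Read the transcript</a></p>
```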
So in terms of the video player testing– we’ve tested 37 players in all. And there were a handful of players that we didn’t test. They included the ABC iview and the BBC Player, because they were both geo-locked. Then there were a handful we couldn’t find examples for, which was Echo360, Adobe Webinar, Blackboard, and Canvas, though it looks like Canvas uses MediaElement.js, so we tested that separately. And then there are three video players that we’ve found now obsolete– let me turn that thing off– and they include [INAUDIBLE], and the Vision Australia Foundation Video Player.
So these were the video players that we tested– AblePlayer, Acorn, Adobe, AFB, Amazon, AMI Player, Brightcove, Facebook, JWPlayer, Kaltura, MediaElement, MediaSite, Ooyala, OzPlayer, Panopto, PayPal, Plyr, RAMP, Video.js, Vidyard, Vimeo, Viostream, Wistia, Yahoo, YouTube, and the YouTube embedded player. So we started off with just testing on Windows 10 Google Chrome, and the first thing that we tested for were what we called showstoppers.
Now, if a video had one of these issues, then we didn’t continue testing the player. And the reason why is because these are deemed noninterference clauses in WCAG 2.0. So there are four noninterference clauses. And if a video player has one of these errors, then it’s actually seriously inaccessible, so much so that it will stop accessibility for a wide range of people.
So we decided that we would basically exclude any players that had these kinds of issues. One was that audio should not play automatically, unless the user is made aware this is happening, or a pause or a stop button is provided. We tested that the video didn’t contain a keyboard trap. And we also tested that the full screen video didn’t contain a reverse keyboard trap. Now, what that means is that when you went to full screen on the video, if the keyboard was trapped on the page underneath, then you couldn’t actually escape from that full screen video. And you could only close the browser and start again.
So if a video player actually had any of those issues, then they didn’t continue with testing. And this actually excluded quite a lot of video players. And I really hope that, in the coming years, that these things will be addressed. It excluded Acorn, Adobe, AFB, Amazon, AMI Player, Brightcove, Facebook, MediaElement, Ooyala, PayPal, RAMP, Video.js, Vidyard, Vimeo, Viostream, Wistia, Yahoo, and YouTube. So the only ones who were actually left were AblePlayer, JWPlayer, Kaltura, MediaSite, OzPlayer, Panopto, Plyr, and the YouTube embed.
So then we did some further testing. And basically, if a player failed one of these requirements, they got the score of 1. And we added up all those scores, and that’s how we came up with a percentage accessible.
So the first thing we tested were controls. Could the video volume be changed independent of system volume? And this is really important for people that have hearing issues. We needed to make sure that color alone was not used to convey information in the player itself and that the color contrast was sufficient.
We also tested whether the video was keyboard accessible. If the video player was keyboard accessible, that it had the correct keyboard focus order, so the keyboard items received focus in a relatively intuitive order, and that there was a clear keyboard focus indicator– a highly visible keyboard focus indicator, which is actually a AA requirement. Then we tested requirements for accommodation– so where the video supports a transcript, the video transcript should be easy to find. We tested whether the video supports captions, whether the captions are easy to find, whether the caption color contrast is sufficient, which is AA. And where the video supports multilingual captions, we needed to make sure that the correct LANG attribute was used on those captions or transcripts.
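For that multilingual-captions test, the language is declared per caption track. A minimal sketch with placeholder file names– in HTML5, the `<track>` element carries the language in its `srclang` attribute:

```html
<video controls>
  <source src="demo.mp4" type="video/mp4">
  <!-- srclang declares each caption track's language so assistive
       technology can render and pronounce the text correctly -->
  <track kind="captions" src="demo-en.vtt" srclang="en" label="English">
  <track kind="captions" src="demo-fr.vtt" srclang="fr" label="Français">
</video>
```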
And then the last one– last but not least– was whether the video supports audio descriptions. And audio descriptions are just as important to people with vision impairments as captions are to people with hearing impairments. So it is really important that video supports audio descriptions. And unfortunately, at the moment, only AblePlayer and OzPlayer support audio descriptions.
So at the end of the testing, from lowest to highest, MediaSite scored 57%. Panopto scored 64%. The YouTube embed player, JWPlayer, and Plyr scored 71%. AblePlayer scored 79%. Kaltura scored 86%, and OzPlayer scored 100%.
Now, what I find really interesting about this is that we’ve been doing this testing– this is the third year that we’ve done this testing, and Kaltura actually was at 33% last year. So JWPlayer, which is now 71%, was our lowest ranking video player last year at 16%. So really, you have to sort of look at the video players and see how consistent they are in maintaining accessibility. So it’ll be interesting to see what the players do next year.
So in terms of issues for those players– for MediaSite, there was low color contrast, low or no keyboard accessibility, no keyboard focus indicator, inaccurate keyboard focus order, and no audio descriptions. Panopto’s player used color alone. It had low color contrast. It didn’t have a highly visible keyboard focus indicator. The transcript was not easy to access, and there were no audio descriptions.
For the YouTube embed at 71%– low color contrast, low caption color contrast, no highly visible keyboard focus indicator, and no audio descriptions. Plyr, at 71% as well, also used color alone for player controls, had low color contrast, no highly visible keyboard focus indicator, and no audio descriptions.
JWPlayer used color alone, also had low color contrast, no highly visible keyboard focus indicator, and no audio descriptions. AblePlayer used color alone, had low color contrast, and its transcript was not easy to access. Kaltura had low color contrast and didn’t support audio descriptions.
And OzPlayer, thankfully, scored 100%. But I do want to say that we found some errors in the testing in OzPlayer, and then we worked through and made those fixes. So it is really important to test, and test, and test these video players, because things can change with the change of an operating system, or a browser, or the release of a new iOS system, or something like that. So you do need to really continually test these players.
So there were only eight that made it through the first round. And what we did next was screen reader testing. And it is very important. This is one of our screen reader testers. So he’s vision impaired, and he might even be on the call. But he says, “Video is a very important source of information and is a remarkable part of culture now. Blind individuals must not be excluded from access to data provided in video content. You can’t see, but you can understand.” So it is really important that these things are accessible to screen readers.
So we tested with Windows 10 with JAWS on IE, Firefox, and Chrome. We didn’t test with JAWS on Edge, because it wasn’t supported. We tested with Windows 10 NVDA on IE, Firefox, Chrome, and Edge. And we tested on iOS Safari with VoiceOver and Android Chrome with TalkBack.
So the showstopper was if the video couldn’t be played by the screen reader user, then it was excluded from further testing. And that excluded Panopto. So the ones that remained were AblePlayer, JWPlayer, Kaltura, MediaSite, OzPlayer, Plyr and the YouTube embed. So the screen reader tests were player’s controls are labeled, player’s controls are easy to find, button status, i.e., on or off, is announced correctly, volume level is announced while changing, the current time of the video is announced when requested, fast forwarding or rewinding is operable, the title of the video is easy to find, captions are read automatically, and where a transcript is provided, the transcript is easy to find.
So a little bit about our screen reader testing. One of the things to take into account when doing tests with screen readers is that there are actually two different ways of using a screen reader– there’s the virtual browser versus the keyboard navigation. And virtual browsing is the default system. And there’s definitely a lot of screen reader users who don’t know that they can move into keyboard navigation, so you need to make sure that the player is accessible in that default system, but also, in the keyboard navigation.
One of the things that the screen reader users found very difficult was that in order to move to the next button, you press the B key, and that takes you to the next button. And that’s really confusing if there are lots and lots of videos on a page because, of course, you might have seven Play buttons, and seven Pause buttons, et cetera, et cetera. So it really is important that you only display one video at a time.
The other thing was accurate labels. So we did come across some players whose Captions button was just called Captions. So you press the button, and it says, captions. You press the button again, and it says, captions, when really, the button needs to say, turn on captions, or turn off captions.
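A minimal sketch of a caption toggle that announces its state– this is an assumption about one reasonable fix, not the markup of any player that was tested:

```html
<!-- aria-pressed exposes the on/off state, so a screen reader
     announces something like "Captions, toggle button, not pressed"
     rather than just "captions". A script would flip aria-pressed
     to "true" when the user turns captions on. -->
<button type="button" aria-pressed="false">Captions</button>
```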
And a little about captions for people who are vision impaired. A lot of people don’t realize that captions are important for people with vision impairments. And captions are important for people with vision impairments, just like they’re important for people without vision impairments– so subtitles, if you’re watching a video in a different language, accents, like the Broadchurch effect that we talked about, the ability to increase the sound, slow down the captions, copy and paste. If someone mentions a URL or a long number or something, then the ability to select those captions and access that information is really important.
The other thing that our screen reader testers came across was too much ARIA. So too much ARIA really did kind of mess things up. And it’s really important when building anything, not just a video player, but anything, that you make the underlying system accessible– you use all the HTML elements in an accessible way, and then you add ARIA. You don’t rely on ARIA to make your system accessible.
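A small illustration of that principle– the native element is accessible on its own, while the ARIA rebuild has to reimplement everything the browser would otherwise provide:

```html
<!-- Native HTML: focusable, keyboard-operable, and announced as a
     button by screen readers, with no ARIA needed at all -->
<button type="button">Play</button>

<!-- The div version needs a role, a tabindex, and scripted
     Enter/Space handling just to approximate what <button>
     gives for free -->
<div role="button" tabindex="0">Play</div>
```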
So at the end of this screen reader testing, from lowest to highest, we had JWPlayer at 59%, MediaSite and YouTube embed at 69%, Kaltura at 73%, Plyr at 77%, AblePlayer at 79%, and OzPlayer at 98%. And I know that Terrill Thompson, who has the AblePlayer system, spent a lot of time making sure this stuff is accessible to screen readers. And I think that’s something that needs to be looked at.
So in terms of the players themselves– JWPlayer, which scored 59%, the player’s controls were not easy to find. The button status and volume level were not announced by the screen reader. Captions were not read automatically by the screen reader, and fast forwarding and rewinding were not available on mobile. MediaSite, which rated 69%– the player’s controls are not easy to find. The button status is not announced. Captions are not read automatically, and fast forwarding and rewinding are not available on mobile.
YouTube embed– the player’s controls are not easy to find. Captions are not read automatically. Caption text is not easy to find, and caption status was not announced on Android, specifically. Kaltura at 73%– the player’s controls were not easy to find. The volume level was not announced. Captions are not read automatically. And fast forwarding and rewinding are not available on Android. Plyr, which got 77%– the player’s controls were not easy to find. The button status and volume level were not announced, and captions are not read automatically.
AblePlayer– the player’s controls are not easy to find. The button status was not read correctly, and we couldn’t operate, or our screen reader users couldn’t operate captions with JAWS or NVDA on the mobile. On OzPlayer– fast forwarding and rewinding was not available on Android Chrome, so we’re not perfect either. And we are trying to find a way to fix this, but it’s not as easy as you would hope. So hopefully, we can address that in the next month or two.
So then there are only seven players. So now we tested with mobile. We tested with a Google Pixel 1, Android 8.0 Chrome, and an iPhone 7 Plus, iOS 10 Safari. And the showstoppers were couldn’t play the video with the mobile, can’t pause the video– so it’s absolutely essential that you be able to pause a video once you’ve played it– and also, whether the video crashed the browser, which did happen in one instance.
So that excluded Kaltura, MediaSite, and YouTube embed. And we were left with four players– AblePlayer, OzPlayer, JWPlayer, and Plyr. From lowest to highest, JWPlayer scored 75%. AblePlayer scored 79%. Plyr scored 86%, and OzPlayer scored 89%.
The mobile tests were, can you change the video volume independent of system volume– though we didn’t penalize video players that just played in the standard iOS video player– whether color alone was used to convey information, whether color contrast was sufficient, and where the video supports a transcript, whether the transcript was easy to find. We also tested whether the video supports captions, whether the captions are easy to access, whether the caption color contrast is sufficient, whether, if the video supports multilingual captions, the correct LANG attribute is used, and whether the video supports audio descriptions.
So JWPlayer– you couldn’t change the volume independent of the system volume. They used color alone. And they had low color contrast, and they didn’t support audio descriptions. AblePlayer– volume couldn’t be changed independent of system volume, used color alone, low color contrast, and the captions sometimes appeared twice. Plyr, at 86%, used color alone, and there were no audio descriptions. And OzPlayer, at 89%, the video volume couldn’t be changed independent of system volume, and there were no audio descriptions.
So that brings us down to four. So then we tested mobile with a keyboard. So we see this more and more, that keyboards are used with mobile devices. And it is really important that you can actually access everything via a keyboard on a mobile device. So the showstoppers were you couldn’t play the video with the keyboard, you couldn’t pause the video with the keyboard, or it crashed the browser. So that actually excluded two players– JWPlayer and Plyr– which left us with AblePlayer and OzPlayer. So in terms of what we tested, we just tested that controls are available to the keyboard. And both AblePlayer and OzPlayer scored 100%.
So the final results were AblePlayer got 79% for the initial desktop testing, 79% for screen reader testing, 79% for mobile testing, and 100% for mobile keyboard testing. OzPlayer got 100% for desktop testing, 98% for screen reader testing, 89% for mobile testing, and 100% for mobile keyboard testing. So the overall results were 84% for AblePlayer, and 97% for OzPlayer.
So in reality, if you want to use an accessible video player, you really only have two choices. And that is AblePlayer and OzPlayer, because every single other player, you either couldn’t play the video, you couldn’t pause the video, or the video contained one of the noninterference clause errors of WCAG 2.0, which are seen as critical errors in WCAG 2.0.
So if you want to access AblePlayer or OzPlayer– AblePlayer, you can go to ableplayer.github.io/ableplayer. OzPlayer, you can go to www.accessibilityoz.com/ozplayer. And there’s also a code generator on that page, which is www.accessibilityoz.com/ozplayer/ozplayer-code-generator. And you can just put in the location of all your videos, where they’ll be– if they’re on YouTube or wherever– and it will output some HTML code that you can copy and paste.
So just a few things– it’s still not good enough. I mean, we aren’t at 100%. And really big video players, even things like YouTube, are still making pretty standard accessibility mistakes. It’s really something that needs to be addressed.
I mean, I do want to say that we’ve done this testing three times now, and things have improved. So when we first did testing, there were standard keyboard traps in a whole bunch of different players. Things are improving, but I don’t think that they’re improving quick enough. And one of the problems is that iOS doesn’t support the audio descriptions through the HTML5 audio requirement, so that makes it really difficult to provide audio descriptions on mobile. But that doesn’t mean that you shouldn’t have video players supporting audio descriptions across the board on desktops. So I think that there’s definitely a lot of things that can be improved.
And this was a statement by our screen reader user, who said, “To enjoy online materials, you need to have three browsers, three screen readers, and a smartphone with accessibility features, and extreme patience, and still, satisfaction is not guaranteed.” So if you’re not using AblePlayer or OzPlayer, you really need to think about what you’re going to be doing in terms of risk mitigation, because there will be people that won’t be able to access your video content.
So what’s next for us? We’re going to test on Mac. We said that last year, too, but we will test on Mac next year. And we are considering testing on other browsers. We did start off by testing on a variety of browsers, but we found that the results were so similar it didn’t really make sense.
And I do just want to say that even though we did exclude those browsers because of those showstoppers, we did continue doing some testing on those video players. So the ones that were removed right at the beginning, we did actually continue testing. We just decided that we shouldn’t share those results, because they’re critical issues, and they really need to be addressed before anything else is looked at.
So the other thing is, did we miss any players? I mean, as I said, we had no examples for Echo360, Adobe Webinar, Blackboard, or Canvas. And as far as we’re aware, the [? eStatus ?] player, which ranked very highly last year at 76%, which is third, and the Flowplayer, and the Vision Australia Player, they’re obsolete as far as we could tell. So are we wrong about that? Are there players that we missed? Do you think there are examples of the players that have audio descriptions that we weren’t aware of? Let us know, and we’ll make sure they go into the next round of testing.
And we’re happy to share our results. We do have a white paper coming out in the next two weeks. And I’m actually running this presentation again for Accessing Higher Ground in Denver, towards the middle of November. So if you are in Denver, please do come along. And as I said, we’re happy to share these results, and we’ll be putting them online shortly.
So as I said, if you want access to this presentation, please go to a11yoz.com/player17, the number 17. And now I might hand over to Elisa, and we can talk about questions.
ELISA EDELBERG: Great. Thank you so much, Gian. Everyone, we’re getting ready to begin Q&A, so let’s get started. The first person is asking, do Australia and the United States have the same accessibility requirements?
GIAN WILD: Yes– only recently, with the Section 508 refresh, which refers to WCAG 2.0. Whereas, in Australia, WCAG 2.0– the Web Content Accessibility Guidelines– has actually been our standard since 2010.
So this was all tested according to WCAG 2.0, which is the standard that when the Department of Justice sends out letters, or you get sued by the National Federation of the Blind, they all refer to WCAG 2.0. So everything here, even though I didn’t talk about how they tied back to certain WCAG 2.0 criteria, everything that we tested here was WCAG 2.0, which is something that the US needs to meet, just like Australia. And we did test US video players, not necessarily Australian ones.
ELISA EDELBERG: Great. Thank you. The next person is asking, could you address compatibility with speech recognition software as part of your testing?
GIAN WILD: Oh, that’s an excellent idea. Yes, I will make sure that’s something that we do. And we’ll see if we can get something happening before Accessing Higher Ground. Does the person have anything specific, like a particular video player or a problem that they had? I mean, it might even just be easier just to send us an email, and we’ll make sure that we include it definitely next year, and hopefully earlier than that.
ELISA EDELBERG: Yeah, great. That would be wonderful. Someone else is asking, would it make sense to reverse the order of testing and start with mobile accessibility, if that’s where most video is being watched these days? There may be some video players that were good on mobile but were eliminated from the testing because they failed the desktop.
GIAN WILD: That’s interesting. Yeah. As I said, we did continue to test. What I might do is see how that changes the results. So as I said, we did testing on all the players and everything, we just excluded them from the presentation. But what I might do is, yeah, I’ll look at the mobile testing results and then see what gets excluded then and see how that changes things.
Yeah. So please– we’ll put that in the white paper as well. And please reach out to us. Send us an email– firstname.lastname@example.org. And yeah, we’ll let you know how that changes the results.
ELISA EDELBERG: Thank you. Someone else is asking, can you explain more about iOS not supporting audio description through HTML5? Is there any further information on this?
GIAN WILD: So OK, I’m not technical, so I’m not sure I can describe this. But basically there are HTML5 video elements that support things like a second audio description track. So basically, you have your video track, you have your audio track, you have your VTT, your text transcripts, you have your captions, and they all work very well with HTML5 on desktops.
The problem is that iOS doesn’t support this second audio track. So the proper way to do audio descriptions is to have your video playing, and then a second audio track sitting on top of it to speak audio descriptions where necessary. But iOS doesn’t support that second audio description track. That’s why OzPlayer doesn’t have audio descriptions on mobile. And one of the reasons that Able Player does have audio descriptions on mobile is because they’ve actually used a workaround where they’ve created an entirely different video that has the audio descriptions burnt in.
So that’s sort of the proper way of audio descriptions would be to turn them on and off, like you would with captions. So you could be halfway through a video, turn on the audio descriptions, turn them off five seconds later, et cetera, et cetera, like you do with captions. But that’s not supported with iOS. So the way Able Player got around it was to create an entire video that had the audio descriptions as part of the video itself. Which means if you were halfway through the video and you wanted to turn audio descriptions on, then it would actually start the video again.
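The mechanism Gian describes maps onto the HTML5 description track– a hedged sketch with placeholder file names; as she notes, iOS doesn’t honor a separate audio description track, which is why AblePlayer falls back to a second video with the descriptions burnt in:

```html
<video controls>
  <source src="lecture.mp4" type="video/mp4">
  <track kind="captions" src="lecture-captions.vtt" srclang="en" label="English">
  <!-- A timed text track of audio description cues; a player can
       voice these alongside the main audio on platforms that allow
       it, and toggle them on and off like captions -->
  <track kind="descriptions" src="lecture-descriptions.vtt" srclang="en">
</video>
```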
So hopefully, that answers your question. But if it doesn’t, as I said, do reach out to us and I’ll get someone technical to answer your question.
ELISA EDELBERG: Thanks. You mentioned not over using ARIA, or was it using it incorrectly? Could you say more about that?
GIAN WILD: So the ARIA usage– I’m not sure, actually. I might have to just look at the results. So if you want to ask me another question while I’m looking at the results, I can– yeah, I can pick one.
ELISA EDELBERG: Sure. Can you tell us a bit more about the origin of the OzPlayer and any future updates?
GIAN WILD: So we started OzPlayer in 2012 because we did a whole bunch of audits, because that’s what we do, and we’d always find that the video players that people were using were inaccessible. And so our clients would come to us and they’d say, well, what should we use instead? And we said, well, there isn’t anything accessible. And this was back when a whole bunch of video players were seriously inaccessible– they had keyboard traps and things like that.
So we decided to build OzPlayer. And we released it in 2013. And it took some time for it to, I suppose, take off. But one of the things I suppose that we– and one of the reasons why we do this testing every year is because one of the things we found out very quickly is that we really needed to test it every month because a new browser would come out, a new operating system, et cetera, et cetera, like a Google Pixel phone or something like that. And all of a sudden, it would break what we had built. And so that’s what we found, that the existing players, if they weren’t really on top of testing, then something that was perfectly accessible today might not be anywhere near accessible tomorrow. It might even be a keyboard trap or something like that tomorrow.
So that’s where it’s sort of come from. And it was also one of the first players to support audio descriptions. I mean, there’s only two now that support them, OzPlayer and Able Player. But yeah, that’s one of the reasons we built it as well, because it’s a AA requirement under WCAG 2.0 to provide audio descriptions of your videos. But if your video player doesn’t support audio descriptions, how can you provide that to users? So that was one of the reasons we built it as well, because we were writing audits saying, you need to provide audio descriptions for these videos. And people were like, well, but our player doesn’t support it. And there is no player that can support it. So yeah, that’s where it came from.
ELISA EDELBERG: Great. Thank you. Someone else is asking, what are some of the benefits of an accessible player, in addition to accessibility, that might be helpful to know to be used for gaining buy-in or in sales?
GIAN WILD: So things that have captions are searchable via Google search and things like that. So that’s one reason why you’d want to have a player that supports captions, supports captions correctly. Having captions that support multiple languages means that, once again, you’ll have those captions being searched by Google search in multiple languages. And you’ll rank higher in the search engine results.
Transcripts also are searchable. So that’s another reason why you would want to have transcripts. And transcripts aren’t absolutely required by WCAG 2.0; they are, however, very helpful in terms of accessibility compliance. And I think, also, the more accessible something is, the more likely people are going to use it, and therefore maybe tell people about it and share it on Twitter, or Facebook, or things like that.
So yeah, I think probably the best thing to use as an example is the Google search, the ability to search those things. And if you have a transcript– this is how we deal with transcripts. Our transcript is the captions and the text version of the audio description added together.
So if you’ve got a whole lot of visual information about how you’re fixing a coffee machine or something like that, you have to put the pump in here, and this there, et cetera, et cetera, that’s basically captured in this black box on the video that can’t be searched. But as soon as you put the audio descriptions and the captions together as a transcript, all of that can be searched. And so people who are searching for such and such coffee cup will be able to find it through your transcript. So yeah, I think that’s probably the best reason why you’d want to use an accessible video player for a reason that isn’t accessibility.
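The transcript-building approach Gian describes, interleaving the caption cues with the text of the audio description cues by start time, can be sketched roughly like this. This is a hypothetical illustration, not OzPlayer's actual implementation; the `(start_seconds, text)` cue format and the `[Description]` prefix are assumptions for the example.

```python
def build_transcript(captions, descriptions):
    """Merge timed caption cues and audio-description cues into one
    searchable plain-text transcript, ordered by start time.

    Each cue is a (start_seconds, text) tuple. Description cues are
    prefixed so readers can tell described visuals from dialogue.
    """
    tagged = [(start, text) for start, text in captions]
    tagged += [(start, "[Description] " + text) for start, text in descriptions]
    # Stable sort: on a timestamp tie, captions (listed first) come first.
    tagged.sort(key=lambda cue: cue[0])
    return "\n".join(text for _, text in tagged)


captions = [
    (0.0, "First, prime the machine."),
    (6.5, "Then lock it into place."),
]
descriptions = [
    (3.0, "She slides the pump into the slot on the left."),
]
print(build_transcript(captions, descriptions))
```

Because the described visual detail ("the slot on the left") now exists as plain text, a search engine or an in-page find can match it, which is exactly the search benefit described above.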
I have a short answer for the ARIA question. We found one player where the labels got repeated over and over again, two or three times each. And then we found another player where, if they hadn’t had ARIA, each label would have been read correctly. But because they put the ARIA labels incorrectly over a number of controls, the controls couldn’t really be used, because the labels could only be read in keyboard navigation mode. And instead of pressing B and then hearing “Play button,” you hear that it’s a video player region, which is an ARIA term. So if they hadn’t had ARIA at that point, it would have just read “Play button,” as opposed to “Video player region,” which doesn’t really mean anything to anyone.
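The first failure mode Gian mentions, a visible label duplicated by an identical aria-label so that some screen readers announce the control twice, can be detected mechanically. Here is a minimal sketch using only Python's standard library; the checker and its heuristic are illustrative, not part of AccessibilityOz's actual tooling.

```python
from html.parser import HTMLParser


class RedundantAriaLabels(HTMLParser):
    """Flag elements whose aria-label exactly duplicates their visible
    text, which makes some screen readers announce the control twice."""

    def __init__(self):
        super().__init__()
        self.issues = []   # (tag, aria-label) pairs that duplicate visible text
        self._open = []    # stack of (tag, aria_label, text_parts)

    def handle_starttag(self, tag, attrs):
        self._open.append((tag, dict(attrs).get("aria-label"), []))

    def handle_data(self, data):
        # Visible text contributes to every open ancestor element.
        for _, _, parts in self._open:
            parts.append(data)

    def handle_endtag(self, tag):
        while self._open:
            open_tag, label, parts = self._open.pop()
            if open_tag != tag:
                continue  # tolerate unclosed inner tags
            text = "".join(parts).strip()
            if label is not None and text and label.strip().lower() == text.lower():
                self.issues.append((tag, label))
            break


def find_redundant_labels(html):
    checker = RedundantAriaLabels()
    checker.feed(html)
    return checker.issues


print(find_redundant_labels('<button aria-label="Play">Play</button>'))
```

The usual fix for a hit from a checker like this is simply to drop the redundant aria-label and let the visible text serve as the accessible name; aria-label is only needed when the control has no usable visible text, such as an icon-only play button.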
ELISA EDELBERG: Great. Thank you. Someone else is asking, do you know what player Netflix uses? Their player supports audio descriptions, which can be turned on and off just like captions.
GIAN WILD: I would say they have their own player. And we did actually think about testing, I suppose, streaming services like that, so Netflix, and Amazon Prime, and Hulu, and all that. It would be interesting for those people that are on the chat, if you think that that’s something that we should include, yeah, we definitely could look at including that next year or maybe earlier.
ELISA EDELBERG: Great. Thank you. Someone else is asking, what are the biggest barriers to creating accessible video players? Is it technology? Cost? Lack of knowledge about the need?
GIAN WILD: So it’s not something that is easy. Certainly, I know when we first built OzPlayer, it was very difficult to create a Flash fallback that wasn’t a keyboard trap. We still do have a Flash fallback, in addition to the HTML5 video, but in reality, that’s not something that’s so important anymore. It’s not necessarily that it’s hard; it’s that it’s something that has to be done continuously.
So we did this testing over the last few weeks. And as we found issues with OzPlayer, we kind of cheated in a way, and we sent them off to our developers to see if OzPlayer could be fixed. And there are some things that we still haven’t figured out a way to fix accessibly, so we’re not quite sure. Android and TalkBack especially are not really friendly towards accessibility, and they’ll behave incredibly differently to iOS and VoiceOver. And you’re in a situation where you’ve got one player that has to support this whole range of different operating systems, and browsers, and devices. And sometimes, it can be really, really hard.
So in reality, even with two or three weeks of our team going back and forth and talking to our testers and talking to the developers and all that kind of stuff, we still couldn’t make it 100%. I mean we will. I hope we will in the next couple of weeks, in the next month. But then a new operating system will be released, and we’ll have to start all over again.
So it can be difficult. I don’t think that it’s impossible. Certainly, Terrill Thompson, who has Able Player, has done it himself. It’s just something that requires, I suppose, constant vigilance. And that’s something that we don’t necessarily see from the video player manufacturers: I think they build something, they release it, and they don’t look at it again for a year. And in that year, the accessibility of that video player can vary significantly as things change and are introduced.
So as I said, I don’t think it’s impossible; I think it’s just something that you have to be doing constantly. I mean, we do test OzPlayer. We spend two days a month testing OzPlayer and making fixes based on the testing, because things change that rapidly. So it’s a bit like social media, which I talk about as well: if social media was accessible today, you can’t guarantee that it will be accessible tomorrow. And the same is true of video players as well.
And I think that that’s something that maybe the video player manufacturers haven’t really thought about or haven’t incorporated into their systems, that level of testing, that level of fixing. And I’m sure it would be very hard, if you’ve got hundreds of thousands or millions of clients, to send updates that regularly. But I think at the moment, that’s kind of where we’re at until HTML5 is better supported.
ELISA EDELBERG: Great. Thank you, so much. Someone else is asking, did you test a plain HTML5 video element? How does that compare to the other players on browsers that support it?
GIAN WILD: We decided not to test that. It was not my decision, but basically everything is kind of– HTML5 is kind of the basis of what these players are relying on. So yeah, we did not test that.
ELISA EDELBERG: Thank you. Someone else is asking, what involvement from blind or low vision individuals did you have in testing or coming up with the testing criteria?
GIAN WILD: So our policy at AccessibilityOz is that, when we’re testing screen readers, we only ever test with vision-impaired people who are reliant on screen readers. So everyone who tested was vision impaired and entirely reliant on their screen reader.
Now, we use someone who is incredibly skilled at using their screen reader. And that’s something else you need to think about when testing, because you need to be sure that the problem is that the video player is inaccessible, not that the screen reader user doesn’t have the skills to figure out how to use it. So use someone who is incredibly skilled. And he worked with us to come up with the tests for the screen reader testing.
So there was a bit of back and forth. I mean, there were even things like, we didn’t get the screen reader users to test whether video players had audio descriptions, because we knew that only two of them did. But yeah, the tests were decided mostly by them, with some input from us, just to tie it back to WCAG 2.
ELISA EDELBERG: Thank you. Someone else is asking, in the beginning, you mentioned a lot of cognitive disabilities. What testing did you do specific to these?
GIAN WILD: So that was a lot about making sure the controls were easy to find, that the captions were easy to find, the transcript was easy to find, that you could turn those things on and off. I do think there’s the capacity to add tests for people with cognitive disabilities, so even things like slowing down the captions, or changing the color contrast of the captions, increasing the text size of the captions.
The reason why we didn’t actually include that this year, though, is because most of those things are AAA. And I think it’s important that we get video players up to AA compliance and then start talking about AAA. Not that those things aren’t absolutely essential for those users, but I didn’t want a response to these results being, oh, but half of that’s AAA, so it doesn’t really count. Certainly, with the advent of WCAG 2.1, there’ll be some additional tests around that.
But there are a lot of things, like color contrast, and not relying on color alone, and things like that, that do help people with cognitive disabilities. And we certainly added tests this year that weren’t around last year. So last year, we didn’t test with a keyboard on mobile. So I think next year, once WCAG 2.1 has hopefully been endorsed, we can add some more cognitive tests.
ELISA EDELBERG: Great. Thank you, so much. We have time for one more question. Someone else is asking, do you think the need for more accessible players is known and at the forefront? And where do you see this going in the next few years as video continues to play a huge role in our culture?
GIAN WILD: That’s a great question because, as I said, we’ve been doing this testing for three years, and the video players have improved. And they’ve often addressed things that we have identified, which is great. But we still only have two players that support audio descriptions. And audio descriptions are absolutely essential for people who are vision impaired.
And if you ever follow Netflix, they released Daredevil, which is about a blind superhero, without audio descriptions. And there was a big uproar. But in reality, there’s this dissonance between caption requirements and audio description requirements. You see that on TV as well: everything on TV in the US needs to be captioned, whereas I think they’ve just increased the audio description requirement to 50 hours per quarter for the four major broadcast channels.
So there is this real difference between how people treat captions and audio descriptions. And I do think that that’s something that really needs to be addressed. And really, I don’t understand why that hasn’t been the case. I don’t understand why it’s not as important to provide audio descriptions. We’re in year three of testing, and there are still only two video players that support audio descriptions.
So I’m not sure if that answers your question, but yeah, I suppose that’s where– you know, I would assume that video players would begin to support audio descriptions. But then I would have thought when I started the testing three years ago that they would have supported them by now. So I think probably that’s the big thing that people need to do. And as I said, I really don’t know why it hasn’t happened yet.
ELISA EDELBERG: Great. And we actually have a couple more minutes. And we got another question in. Someone else is asking, how did you come up with the criteria for testing?
GIAN WILD: So at AccessibilityOz, we test via category. So basically, we went through all the WCAG 2 techniques, and we pulled them out and we said, well, this belongs to the images category. This is testing images. This belongs to the tables category. This belongs to the video category.
So we have 635 areas grouped by different categories. There might be 120 in forms, and maybe 85 in video. And of those 85 in video, we removed all the ones that were reliant on what the owner of the video did, like whether the captions were accurate, whether the content in the video was accessible, and things like that, and we retained just the video player tests. So that’s where they initially came from, and that’s why each of them ties back to a WCAG 2 technique.
We also looked at the– my brain has stopped working– the media player guidelines from the W3C, which we looked at when we were building OzPlayer as well. And then what we did is we went to our screen reader users and said, these are the things we think you should test. Are there other things that you should test as well? And that’s where things like captions came in. When we first started testing turning captions on, having captions accessible to a screen reader user, I personally didn’t think that was necessary. And I remember, when it came back from the screen reader users saying this is what we want to test, I was like, but you’re vision impaired. You don’t need to use captions. And so I’ve learned a lot about why people with vision impairments use captions.
So yeah. They came from WCAG 2. And then the screen reader test was specifically worked out with the screen reader users.
ELISA EDELBERG: Great. Thanks, everyone, for joining. And thank you, Gian, for a great presentation.