
What the Pandemic Has Taught Us About How to Make Every Meeting More Accessible: Online or Hybrid Learning that Leaves No One Behind [TRANSCRIPT]

JACLYN LEDUC: So hello, everyone. Thank you for joining this webinar today, entitled “What the Pandemic Has Taught Us About How to Make Meetings and Classrooms More Accessible– Online or Hybrid Learning That Leaves No One Behind.” I’m Jaclyn Leduc from 3Play Media. I’ll be moderating today. And I’m joined today by David Berman, who is a top expert in the fulfillment of digital accessibility requirements for web-based applications, virtual meetings, and documents, with 30 years of experience.

David has been a senior consultant in applying e-accessibility standards for higher ed, government and private sector clients. He is vice chair of IAAP, the global body for accessibility professionals, and is on the ISO Committee for Accessible PDF and Accessible Plain Language. David is also the chair of Carleton University’s Carleton Access Network. So welcome, David.

A few more things. We have about an hour for this presentation. We’ll leave about 5 to 10 minutes at the end for Q&A. This presentation is being recorded, and you’ll receive an email with the link to view the recording and the slide deck tomorrow. And with that, I will hand it right over to David, who has a wonderful presentation. David, welcome.

DAVID BERMAN: Thank you, Jaclyn. I’m so glad to be back to do another 3Play webinar. We do one about once a year. And I was looking over everyone who’s joined, and I see some familiar faces from my world, from 3Play world, from past webinars. So I am delighted that we get to share some knowledge with you all again today.

Now, today’s Zoom webinar, I just want to go through the ground rules, because whether you’re on Windows or Mac OS or on your smartphone, your Zoom web interface should, by default, be presenting my camera and my slides, and as well, it should be broadcasting my voice. So if any of that’s not working as desired, please ask for help.

And the way you ask for help, we prefer you use the chat for help. So if you chat in the window and you choose All Panelists, you’ll get our team, which is Jaclyn, Kelly, Casey, who will help you out.

Now, if you’ve got questions about accessibility for me, we prefer you put those in the Q&A window. It’s a little different than Zoom meetings. Zoom webinar has a Q&A window. And if you have a follow-up question, start a new question.

Now, if you ever want to get attention, just like in a real room, in a hybrid room, raise your hand. And if you have something to say or share, then interrupt any time you think appropriate, just like you would in a real room. And the way you do that is you use the Raise Hand button. So the Raise Hand button is labeled Raise Hand.

To lower your hand, you select that same button again; it will be labeled Lower Hand. On Windows, the shortcut is Alt-Y, and on Mac OS, it’s Option-Y. And if you have the absolute latest Zoom, you can actually physically raise your hand, and Zoom will figure it out based on your video. But only if you installed this week’s Zoom.

Now, regarding audio, please mute your mic when you’re not speaking. And you can mute it any way you’d like. However, in Zoom, you can use the hotkey in Windows of Alt-A. Or on Mac OS, a Shift-Command-A will also toggle your audio on and off. Now, if you instead prefer to use the audio bridge, then you can also dial into the meeting, in the audio, rather than using voice via computer. And if you do, we’ll make sure that you can speak, in case you wish to do all your chatting or questioning via voice, if that’s your preferred modality.

And finally, video. If you’d like to be able to pin more than one person, for example maybe you brought your own interpreter, you’d like to be able to pin me as well as someone else that you want to be looking at, then just ask Help to ask us to allow multi-pin for you, and we’ll do that, as well.

Now, we’re at 3Play Media, so of course we have some of the best professional live captioning in the world. And so of course, professional live captioning from 3Play Media is enabled for this event. It behaves just like AI captions, but it’s better, because a professional is doing it. Now, if you want the closed captions, you must start them within the Zoom interface. So you select the Live Transcript button, then choose Show Subtitle and/or watch the transcript, if you wish. That’s it.

So hey, I’m so excited to identify some of the silver linings that have been the result of the pandemic that we’ve all been enduring. Of course, the pandemic has been so hard for so many people. And yet, there have been some silver linings, both for the classroom and for meetings. Well, we work with a lot of higher ed clients. And schools we work with that were making noises about moving their courses online– well, suddenly they got it done. And they got it done fast. And they got it done faster than anyone expected.

So that was a benefit. Because although it’s been inconvenient for many learners, and many learners prefer an in-person experience, in fact many students benefit greatly from being able to choose when or at what pace they learn, or being able to watch a recording, pause it, and watch it again and again and again– as well as all sorts of customizations and personalizations that become possible when you’re joining through a computer.

And so the flip side was for meetings– whether we’re talking about a business meeting, an organizational meeting, or telehealth sessions, where people had to see the doctor but couldn’t get to the doctor. So the whole health industry flipped open telehealth. And of course, we love that. Lots of people love that.

So meetings benefited as well from being forced online. Everyone potentially benefits from having the ability to join from anywhere, to set up a meeting on short notice, to realize, oh, I’m not going to get there in time in person, I’ll join from my smartphone. And recordings– there are so many reasons why.

In fact, my father– my father graduated from U of T with his doctorate in chemistry. I know we have folk from U of T here, so shout out to the University of Toronto, my dad’s alma mater. My dad was one of the most brilliant people you’d ever meet. And if there had been a pandemic during the ’40s or the ’50s or the ’60s or the ’70s or the ’80s, there would have been no way to communicate in this way.

Now, because of my father’s Alzheimer’s, which changed his life dramatically over the last seven years, just before the pandemic began we had to move him into a long-term care facility. And when the pandemic came, no one was allowed in. Which meant my sister and I, living just miles from the long-term care facility, or my brother in California– I’m in Canada, and you couldn’t fly. You couldn’t come in. You couldn’t cross the border. It meant we couldn’t visit him.

Except we were blessed with Zoom meetings. And we were allowed an allotment of two Zoom meetings a week. And so we could connect with him in that way. And from that, I started to learn something about how we can connect better with people challenged with a disability. And I’m going to get into that a little later. So it’s kind of a blessing that the internet was here for us for this pandemic– because during the last global pandemic, we didn’t have anything so ubiquitous.

Now, if you’ve been to my sessions in the past with previous webinars for 3Play, you’ll know I harp about six reasons why we should care about accessibility. And the sixth reason is the one we’re going to dwell on today. And that sixth reason is something we call the accessibility dividend, which has a special twist. Because during the pandemic, the accessibility dividend– which usually we’re talking about how embracing accessibility in an organization can actually drive down costs, include more people, just do the just thing– but during the pandemic, the accessibility dividend actually was crucial to make sure meetings could actually work, to make sure that kids could keep learning.

The arc of the entire future of kids around the planet has been protected by the availability of distance learning. And people who traditionally could not participate fully in school or the workforce suddenly had a more level playing field. Because people had often been disadvantaged simply because others didn’t get it– they didn’t understand what it’s like to have to participate remotely. Now everyone had to participate remotely, often. And so this was a breakthrough.

So it really raises the question: now that we know working and learning productively from a distance can be done, I have to ask why we would stop. Because people are talking about the return to normal. I hear about the return to normal, the return to normal. Well, for many, the return to normal actually risks being a step backwards– a step back to a loss of equity that’s been temporarily gained.

In fact, I’m going to just take a quick poll. And so I’m using the polling feature. And so for most of you, you’ll see a poll launch shortly. But if you find you can’t use the poll feature due to its weaknesses, you’re welcome to also post your answer in the chat. So I’m launching the poll now.

And I’m asking you this question. Post-pandemic, do you want to go back to in-person meetings or classrooms only? I’ll just give you a little time to answer. I see the polls are open. They’re thinking about that.

Wow. Thanks for your participation. Almost 70% of people have also– we’re up to 80, 80% almost. I’ll give it a few more moments. I see some answers also in the chat. Appreciate that. Aristella, Barbara, Carol.

All right, so what did we get? I’m going to close the poll now. Let’s see what the result was. I’m sharing the result, and– wow. 84% said no, and 16% said yes. So 16% of us do want to go back to in-person meetings or classrooms only, and the vast majority see the benefit in this accessibility dividend. So thank you. I’m going to close those results now.

So there are three cases I’m going to share with you today. And the first is the story– we’re going to talk about systemic stuff. We’re also going to talk about individual stuff. So starting with the systemic, I’ve got three case studies on the accessibility dividend as it’s actually played out during the pandemic. And the first one is from a friend of ours, Fred Dixon from Carleton University. And I know we got folks from Carleton here. Shout out to you guys.

So Fred– Fred created a product almost 10 years ago called BigBlueButton. It was an alternative, open-source approach to classrooms– a different type of classroom for online learning, at a time when not a lot of online learning was going on in higher ed or elementary schools or middle schools. And so he created this thing called BigBlueButton and a company called Blindside. It was born of Carleton. And now it’s known worldwide.

And Fred came to us years ago and said, hey, what can we do to make our products accessible? And we’ve been working with Fred for years, doing what we often do here at David Berman Communications. We audit websites, identify all the breaks, coach the dev teams, make sure that we can bring it up to international standards.

So when the pandemic got started, BigBlueButton was already WCAG 2.1 AA conformant, which happens to be the standard in Europe. Well, he gets a call from a German regional government. They want to know if BigBlueButton conforms to the European standard, EN 301 549.

And Fred was able to say, absolutely. In fact, here’s my VPAT to prove it. And they spun up 4,000 servers to keep half the elementary school children in Germany still in school. I think Blindside grew about maybe 100 times over the pandemic. And just in the first 12 months, they hosted 250 million students on their platform. And it became so popular that now, when you install Moodle across a whole organization, BigBlueButton is one of the things you can automatically add. So that’s transformative in terms of an organization that was already doing good, and just doing great stuff for so many different people.

Now, consider the idea of having a strategy that’s campus-wide or organization-wide– whether we’re talking about a business, a Fortune 500, a school, a government. We’ve got lots of examples, even before the pandemic, of how we found ways that organizations can save hundreds of thousands of dollars by injecting inclusive-by-default thinking into their work.

And as we always talk about, we always are saying, when we design for the extremes and we do it well, everyone can benefit. Well, the pandemic has taught us even way more ways to get this done.

So here’s my second case study, if you will. I’m not sure if any of you recognize this guy. But if you’re in accessibility, Gregg Vanderheiden has changed your life. Because Gregg– recently, Gregg called me up and said, can you help with the Morphic Advisory Council? He’s got a whole new product coming out. It’s called Morphic. And I’m going to tell you about Morphic shortly.

But first I want to give you the context. If we go back over 20 years, there’s a far younger Gregg, and Wendy, who invented WCAG. The first release of the WCAG 1.0 guidelines came out of a lot of remarkable people, but these were the two leaders. And so back then, Gregg was one of the leaders who brought you this.

And so here we are, more than 20 years later, when all of us– or most of us– are required to conform to regulations that point to the one great international digital accessibility standard, WCAG.

And so what’s Gregg doing lately? Well, he’s doing all kinds of things. But one great thing he’s doing is working on a product called Morphic, which has been especially valuable during the pandemic. The idea is this: consider these personas. Imagine a student who is living with disability and has certain assistive technologies.

Now, whether you’re on campus or at home, imagine how frustrating it must be if, let’s say, you’re a regular user of the JAWS screen reader, and when you go to the school library or to another computer in your house, all your settings aren’t there. So typically you go to a school library, or the learning center at an NGO, and maybe there’s a whole bank of computers.

And you’ll say, well, wait a moment, I’m the screen reader user. And the librarian will say, oh, yeah, yeah, yeah, we’ve got one computer set up for that, I think. You’ve got to go use that one. Oh, it’s in use. OK. Well– and even if you can get to it, is it set up right?

So what Morphic does is it creates a personalization wallet that you travel with. And so imagine all the computers in the library instead already have Morphic deployed, and it potentially has the ability to turn on all the top assistive technologies or the top bookmarks or whatever, and you have a Morphic account. And your Morphic account knows your favorite bookmarks, that you’re a ZoomText user, that you like your color schemes different in Windows. And that’s what Morphic is.

So maybe you’re that student, or you’re an IT person, or you need a special accommodation when you’re taking tests, or you have different needs at different times of day. You could, for example, program Morphic so that– let’s say you take your meds every day from 3:00 to 4:00, and your eyes work differently at that time of day. There are so many situations where someone has different needs at different times of day. So you could even have Morphic set up that way.

And if you’re the technician who has to take care of those hundreds of computers in a large organization, this is a blessing, because you have a standard set up, rather than someone saying, hey, can you get JAWS working in the library? Or you’re a professor.

And so the idea of Morphic, then, is that there’s a free version and there’s a paid version. And if you’re interested in trying this out in your organization, email me. And I will make sure that you get a guest invitation to try out the full Morphic. If you ever contact me, it would be [email protected]. So that’s what’s going on there.

Now Gregg has me thinking about Morphic. But there’s a question from Travis Morgan, who’s asking: what about students who face social isolation at home due to language barriers, but only get language development at school? That’s a serious problem in the Deaf community.

Yes, Travis, I completely agree with you. And there’s no question that some people thrive best in an in-person situation for a variety of reasons, and others thrive better in a computer-mediated situation. And what we want to do is get the best of both worlds. So we’re seeking a goal where we can have a hybrid learning environment where everyone presents and joins in the way that fits their needs best. So thanks for bringing that up.

And indeed, because we brought up WCAG– WCAG may intimidate you, because WCAG 2.0 AA, which is what many of us have to conform to, has 38 success criteria. And WCAG 2.1 AA has 50 success criteria. But the good news is that to make a meeting or a classroom accessible, whether it’s online or hybrid, there are only eight of those success criteria you really have to focus upon.

And most of them are packed into one part of WCAG: section 1.2. So I’m showing a list of all 50 success criteria, and I’ve highlighted eight of them so we can focus on the ones that are under 1.2. Now, if we were going deep, we’d formally step through each one. We have half-day and full-day courses where we go deep, deep, deep on how to create accessible multimedia, live meetings, et cetera. I’m just going to talk about a few aspects, though, that have been highlighted during the pandemic.

And I especially want to dwell on the idea of cinematic engagement. Because you know how I was telling you about my dad in the long-term care facility? On those Zoom meetings, we found that sometimes he would engage a lot. He was always delighted to see us. But sometimes it was like he was just watching a movie of us. So he was glad to see us, but he wasn’t necessarily interacting.

And so I started wondering, why is this? Why is it different? And so I started speaking to people I knew in the cinematic world about how you actually create better engagement. And what we did was add better cameras, better lighting, a better microphone. We tried to come up with a variety of ways of creating engagement. We even brought in puppets– we took a course on how to work with puppets.

And the thing that came out of this is we realized that my father was reacting– he was engaging more when we had all of these technical things working. And this really inspired us. It was heart-lifting when we had that breakthrough.

And so the result, though, in our organization was every time we did an online event, we started asking ourselves three questions. The first question was, why is it we can watch television or movies for hours and hours and hours and not even get up when we ought to go to the bathroom, but we can’t stay undistracted for a one hour Zoom webinar? Why is that?

And the second question we asked ourselves– because we’re inclusive designers, but we don’t know much about video– was: if we talk to people who are professionals in cinema, what could they teach us about all those technical matters? And the third question, perhaps the most important one: what can we learn from people with lived experience who already must learn or participate remotely?

And we learned some amazing things, and we gained some great habits. As we did this, we developed an approach of looking at everything going on around us and wondering: why is that person memorable? Why is that person clear? And so we started looking at lighting and video, and how we gesture, and what our backgrounds are like, and how we make eye contact.

And so I’m going to share with you some of the things we learned. Because there’s ways of applying any or all of this to all of your events. So for example, I’m showing you a screen capture from the annual general meeting last year of the Society of Graphic Designers of Canada. And the reason I’m showing it is because for the first time ever, the meeting had to be online. It was the same with many professional organizations I’m a part of.

Now, years ago, when I used to be involved in the leadership, I would go to these meetings in person, and they were wonderful. But it had been years since I’d been to one because, well, I wasn’t ready to hop on a flight to Halifax or New York or Vancouver or Boston to be there in person.

But when they said it was going to be online– well, we are seeing better attendance at annual general meetings than at any time in history. Because I just had to say, oh, I just have to show up on Saturday at noon? I’m going to do that.

So the next level to that was this: we discovered that when meetings are online and you’ve got everyone lined up in the Hollywood Squares arrangement, of course there are downsides to it. But for some folk, it’s actually easier to engage. So for example, one member of our team is Deaf, and ASL is her mother tongue– and we’re not all as fluent as we could be in ASL. She demonstrated to us that when people are lined up all together on a screen, it’s a lot easier to engage.

So for example, imagine you’re Deaf and you’re at a boardroom table, or you’re in a classroom. Not only do you have to figure out who’s speaking now– let’s say you’re planning on lipreading– you’re looking around just to find out who’s speaking. And by the time you’ve found them, you’ve probably missed something you wanted to lipread. And maybe they’re in front of a bright window, so you can’t see them so well.

So we figured out that by making sure our backgrounds are good and our video is excellent– look how un-blurry this video is– the better we did, the more someone who is lipreading could follow. And surprise, surprise, everyone– all sighted people– lipreads. You may not realize it, but you lipread all the time if you’re sighted.

So being able to line everyone up in front of you and see who’s presenting– this is a breakthrough. So we actually started using this approach to meetings even when we didn’t need it to fully include someone who can’t hear in a typical way. So that was a powerful lesson.

And while we’re talking about sign language, technology has built up as well– built up so that it helps whether we’re in the classroom or participating remotely. I’m showing an assistive technology here called SignGlasses. And SignGlasses are basically a heads-up display for interpreters.

So imagine I’m sitting in a lecture hall. And the sign language interpreter may be off to the side. Or maybe I don’t even have line of sight. But whether I’m remote on my laptop in the room, or whether I’m wearing the glasses, the feed of the interpreter, whether it’s a sign language interpreter or a different type of interpreter, is available to me in a heads-up display, which allows me to easily see. So this is like a lesson learned on Zoom, or Teams, or Google Meet, and yet it applies to all of these technologies. It allows us to have a place where someone can personalize their experience and make sure they’re getting all the communication channels they want potentially at once.

And someone’s asking in the Q&A, why the eight criteria I’ve highlighted only apply to live meetings like Zoom and not online learning more generally, like a virtual classroom outside of Zoom. This is a very good point. And we will– I’m going to show you examples that apply, as well. So thank you for bringing that up.

Now here’s another aspect. Human beings have been looking each other in the eye for 7,000 generations. That’s about how many generations of humans there have been– about 7,000, maybe 7,500. And yet so often when we’re in meetings, because the camera’s not embedded in the middle of our laptop screen, when we’re talking, we’re not looking people in the eye.

But I’m looking you in the eye. And check this out, if you can see me. I’m looking right in the camera lens right now. I’m looking into a teleprompter. Now if I just look away even three degrees, that is, I’m still looking at the teleprompter, I’m looking at the top left corner. See how different it is? It’s like I’m no longer looking at you. And as well, I can’t necessarily be looking– I want to see what’s in your face if your video is on, as well.

So we went through some trouble to create a setup where– and the setup I’m showing in this slide, I’m showing a picture of a setup at someone’s desk where there’s a teleprompter and a computer. And the teleprompter has mounted in it a video monitor so that the Zoom meeting itself is actually on the video monitor, and therefore is in the mirror.

And so when the person, like me right now, is looking into the teleprompter, I’m also looking right at the person who’s talking back at me. And so it allows us to improve that engagement. It mitigates the weakness of not being in person together. And it also allows me to put my– this is what I’m doing– I’ve got my slides up there, as well. So I’ve got my PowerPoint, and then semi-translucent through it, I’ve got the video of you guys. And I’m looking right into this Sony camera.

So this has been a game changer for us, for all of our meetings. And we originally thought we’d do it just for– well, it started with my father. And then we started thinking, when we do courses, we should do this. But now we use this every day, for every meeting.

Now, that’s an expensive rig. And I’m not suggesting every employer wants to go get that for everyone. But have you seen this? This is called the PlexiCam. And it costs, like, $75. You take your Logitech C922 or something– a high-end, high-def camera– and you simply put it on this plastic rack, which slides back and forth across your monitor. And so you can have your Zoom meeting on your big monitor and be looking right at the people.

Now, it’s a bit in the way. It’s not as good as a teleprompter. But it’s 80% there. So I think if you haven’t tried this, I recommend you consider this. Because if we really want to deliver value, this is one of the ways we do it.

Here’s another thing we discovered. This is where assistive technologies actually become part of what we do. So I’m showing a picture on the screen of a device called IPEVO that comes out of Taiwan. And it’s an assistive technology that allows people who see differently to be able to reinterpret images. And I’m going to do a demonstration of that right now.

Because before the pandemic, I’d never seen a product like this that cost, let’s say, less than $800. And the IPEVO is about $150. So I’m now showing the IPEVO on the screen. And I’m also showing– I’m very proud of this. I know we got CAST people with us today. So CAST and I, we are launching this book we developed about a day in the life of a kid with ADHD. And this is the Icelandic version.

However, the idea is, let’s say you’re a reader and you need zoom magnification of a certain page of text. Well, what I can do with this IPEVO– and I’m reaching over to it now– is zoom in to change how tight the view is. But as well, I can even transform the view. So I can transform it so that we go white on black, or I can change the filters to other colors that may suit how I see differently than other people.

So this is a technology we’re very familiar with. But what we discovered in the midst of the pandemic was that– I’m just going to reset this– that we could also effectively use it as a way to show other things. So for example, if we’re in the middle of a course and I wanted to show you this braille tactile display, well it’s one thing to be in the front of a lecture hall. But whether my audience is elsewhere in the same room or a half a world away, I can explain to you how this dynamic braille display works. And then I can also show it to you here.

So we found not only was this better for education and better for sharing knowledge, but it also mixes things up a little. It allows us to do different things. And so I’m even wondering, has anyone with us today– what other assistive technologies have you used in the pandemic for the very first time, or maybe in a new way for the very first time and discovered what’s possible?

And I’m seeing there are chat questions about how we’re wiring this. If you want to know the recipe of how we wired up the Zoom with the teleprompter and all that, or you want more information about IPEVO, feel free to email me and we’ll send you our recipes.

Now, another thing that has been a huge benefit of the pandemic is that recording– we’re recording everything. And there was a time that– I’m showing a picture of someone using a remote control to control a PVR.

But meetings being recorded, if you remember back three years, this was not ubiquitous. We now kind of expect that if we need a recording of a meeting, we can get it, or even a transcript of a meeting. But that’s not something that used to be always available to us. And so I don’t think we’re ever going back. Because, of course, the ability to pause and resume is powerful.

Let’s say whether you have an attention deficit difference, or you’ve got to take your meds, or you just want to step away in the middle of a meeting and come back and know what you missed, recording is brilliant.

Now, kind of hand in hand with that: we can record the audio and we can record the video. But of course, everyone loves captions. Everyone loves transcripts. And it’s not just humans who like captions and transcripts– search engines love them. So for example, you may have noticed in the last six months that when you do a search on Google for a question, you may get an answer that’s deep in a video, even though you weren’t searching for video. Google thought, oh, the best answer I’ve got is 2 and 1/2 minutes into this video.

I’m showing a screen capture right now, actually. I was looking up how to fix an old rotary phone. And it surprised me by bringing me to a certain time point in a video. Because the captions and the transcripts that are linked to the video tell search engines what’s there. And so if you’ve got a massive amount of content on your university server– because you’ve got a university-wide Google Meet or Microsoft Teams license– you can search and find the knowledge. And that’s amazing.

Now human transcription, of course, is the best way to have it done. And yet the quality of AI captioning has gone up so much over the last eight years, but especially during the pandemic. Because people are interested. And even the experts at an organization like 3Play are potentially using the AI captioning as a first draft, and then things can get cleaned up later.

It’s the same with language translation. There was a time when language translators would say, oh, AI translation, that’s evil. But now most translators will actually use an assisted hybrid approach to get the very best– to be able to react quickly, but also to deliver the high quality that we need.

And so captions have been great for business, whether you’re on Zoom or Teams or Google Meet, or even TikTok. I’m sure I’m not the only person who– it’s late, you don’t want to wake anyone else, so although my hearing is excellent, I’ve got the TikTok captions on. Because I want my 26-minute allotment of TikTok every evening, and I don’t want to bother anyone else.

And if you think captions are good for business, just ask GoToWebinar. I’m calling this out because GoToWebinar doesn’t support captions, or at least doesn’t support user-provided captions. And they’re still absent on this; I checked just last week. I went to the help page and it says, no, GoToWebinar doesn’t yet support captions.

And I’m thinking when’s the last time anyone invited you to a GoToWebinar meeting? For me, it’s been about 2 and 1/2 years. We used to use it as our go-to. And then Zoom came along, and poof.

Now, there are a lot of different captioning file formats. You may find captioning mysterious, but an organization like 3Play can help you find what you need. And there’s one thing I want to encourage you to do, something you can take away and use starting tomorrow: whether or not your presentation, your lecture, or your meeting is being captured by AI or professionals, sign language interpreters and captioners can’t keep up unless we make sure that, when we’re presenting ideas, we’re presenting them at a speed that is optimal for our audience.

And so for example, I’m showing you optimum word rates here for three groups. For lower to middle school levels, we’re going for about 130 words a minute. For adults, around 160 words a minute. And for theatrical content, that is, movies and entertainment, we allow up to 235 words per minute. And the reason I’m bringing this up is that as presenters, or as stewards of many presenters, it’s important to keep in mind that if we want great captions and great audio description and great transcripts, we need to share at a speed that works.
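To make that pacing advice concrete, here is a minimal sketch in Python of how one might check a talk against these targets. The function name and the targets dictionary are my own illustration; only the three rates come from the slide.

```python
def words_per_minute(transcript: str, duration_seconds: float) -> float:
    """Estimate speaking pace from a transcript and its duration."""
    word_count = len(transcript.split())
    return word_count * 60 / duration_seconds

# The three target rates cited in the talk (words per minute).
TARGETS = {"school": 130, "adults": 160, "theatrical": 235}

# 400 words spoken over 2.5 minutes lands right at the adult target.
pace = words_per_minute("word " * 400, 150)
print(round(pace))  # 160
```

Running a draft script or transcript through a check like this before rehearsal is an easy way to see whether a captioner could keep up.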

And Gary’s asking, do these rates apply to workplace professionals? And I’d say, yeah, Gary, absolutely. In fact, you could argue these rates make sense in any face-to-face business meeting, too. So whether we’re in a meeting or person to person in the classroom, we should aim for these rates.

Now Ruth is asking, and hey, Ruth, what about when people are working in their second or their third or their fourth language? And that’s really important, too. I know I tend to pack in a lot of ideas. But when we’re doing an event, let’s say in English or in French, where we know that many audience members’ first language, their mother tongue, is neither of the languages we present in, then we will intentionally present information in plainer language, as well as at a pace that is more appropriate.

Now, a shout-out to our favorite captioning and transcription service, 3Play. They have many levels of service, and I’ll leave it to them to share those with you. But the last aspect of this I’m going to dwell on is audio description, because that’s the other side of it.

For someone who cannot see what’s being presented, someone who’s non-visual, when there is meaningful information in the slides or in the presentation, it’s important we acknowledge that. And 3Play has a great audio description service. In fact, they lead the world in aspects of hybrid audio description.

The one thing I’d like you to take away today is the idea of integrated described video. One way to drive down your costs, by not having to hire 3Play to do your audio description, is to eliminate the need for audio description. When you have control over the product, for example when you’re the person presenting in a meeting, be cognizant that every time you present something visual that is crucial to the learning, and is not already replicated in, let’s say, the text version of your slides, you should get into the habit of describing it as you go.

And so for example, right now on this screen, I’m showing a screen capture from a toolkit developed by a group called AMI, with funding from the Government of Canada, for their manual on integrated described video. The idea is that they teach you how to create video, whether it’s cinematic or simply a meeting, in a way that the voice actors are presenting all of the ideas that are crucial.

Now, you can’t always pull it off. But very often, you can pull it off. And if you’re interested in that, I encourage you to go to ami.ca/idv and have a read of their integrated described video manual.

Now the last of three case studies I’m going to explain to you is something quite next-level. Barbara is asking, is it OK to share the URL for the presentation with others? I think that’s a question for 3Play, and I know the answer is absolutely.

So there’s this remarkable person, Narine Hall. She contacted us about a year ago. She’s at Champlain College, which is affiliated with the University of Vermont. So much of accessibility has been invented in higher ed, and here’s another case.

Because Narine had a vision to have a platform which was more appropriate, especially for higher ed, than what Zoom or Teams or other platforms could allow. She invented the idea of a proximity-aware platform. And she asked us, what can we do to make it the most accessible thing out there?

Now, just to give you a sense, if you’ve never heard of InSpace, imagine a classroom where, as you join, you appear in a location in the room, whether it’s a formal classroom with a stage or just an open space. Where you are in the room affects how loud other people are, or whether you see their typing.

So I’m showing a screen capture here with a background of six people in a room. Depending on how far apart they are from each other, the volume of each person actually goes up and down. And if you move into a breakout room, you won’t hear the people in the other rooms, but you’ll hear people arrive and leave. It’s a fascinating approach.
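As a toy model of that proximity-audio idea, here is a hedged sketch: full volume when two participants stand close together, fading to silence beyond some radius. All the names, radii, and the linear falloff are illustrative assumptions of mine, not InSpace’s actual implementation.

```python
import math

def proximity_volume(speaker, listener,
                     full_volume_radius=2.0, max_radius=12.0):
    """Toy proximity audio: 1.0 (full volume) when the speaker is
    within full_volume_radius of the listener, fading linearly to
    0.0 (silence) at max_radius. Positions are (x, y) coordinates."""
    distance = math.hypot(speaker[0] - listener[0],
                          speaker[1] - listener[1])
    if distance <= full_volume_radius:
        return 1.0
    if distance >= max_radius:
        return 0.0
    return 1 - (distance - full_volume_radius) / (max_radius - full_volume_radius)

print(proximity_volume((0, 0), (1, 1)))   # 1.0 -- standing together
print(proximity_volume((0, 0), (0, 20)))  # 0.0 -- across the room
```

The same distance value could drive the non-visual features described below, such as a screen reader announcing when someone comes close.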

And everyone who plays with InSpace falls in love with it almost immediately. Now, it does have some limitations; there are ways it’s not as strong as other platforms. But it does things we’ve never seen anywhere else. So we started this journey where, for about six months, we were figuring out how to build accessibility into InSpace. Because it’s really interesting: someone who can see can see, let’s say, the bubble that is a different person in the room. But what about a person who can’t see?

And so we worked on features where the screen reader would announce when someone came close to you, or would announce, for someone who’s non-visual, how close they are to a wall. Or if you go into a breakout room, it would tell you what’s going on in that breakout room and who’s there.

We also added features where you could tether an assistant to you. So for example, let’s say either formally or informally, someone has agreed to lead you around because it’s your first day in this class. Or you’re Deaf and you brought a sign language interpreter with you, or there’s only one sign language interpreter assigned to the room and you’re the only person who’s going to make use of their help.

And so you tether that person to you. And as you move around the room, they travel with you. Anyone needing assistance of any nature can ask for help from someone else. They’d say, sure, I’d be glad to help you get to room number three. Anyway, it goes quite deep.

And they just let us go wild. They asked us, how can we go places no one’s ever gone before in accessibility? And they embraced it. And we’re very proud of this continuously evolving thing. So, InSpace.

So the benefit of InSpace, then, is that you have much more the feeling of an actual room than the simple Hollywood Squares of a Zoom room. And Gary’s asking, is InSpace itself accessible to people with dexterity impairments, such as those of us who use Dragon for speech recognition control of the laptop and software?

And Gary, I’m proud to say, yes. I think you’d find it very powerful, because all of the controls have the proper labels, so assistive technology like Dragon is going to find them and respond to them and allow you to navigate the room very effectively. And Danielle is asking, what is the benefit of InSpace? Danielle, if you’d like to email me, I can send you a whole series of slides and documentation about it.

But overall, the idea is we’re trying to create a more human interaction, one that has the intimacy of a real room. We did this with one of our own internal meetings in InSpace. Let’s say I just wanted to chat with someone in the hallway about something. So we leave the room and chat in the hallway a bit, and if anyone else comes close enough to hear us, we know it. And then we figure, oh yeah, we’ll back-channel, and we go back into the room. All of that is possible in InSpace, whether you’re running a conference or just one room.

It’s also a good segue to thinking, in general, about the future. Where does this all take us? Because people being together online in a way where computers can track what we’re doing can actually be kind of creepy. But think of it from an engagement perspective.

So for example, we have a client, Explorance. And I know we have Explorance users in the audience today, too. They’re a leader in evaluation and engagement software for higher ed. So imagine how one could take that to the next level if you have the data to know, hmm, this proportion of our audience is using this type of technology, this group prefers this kind of view, this person prefers that. All the different analytics that become possible.

One thing InSpace was playing with, for example, was, can we analyze the chat to discover how happy people are, how engaged they are, how sad they are? What’s the mood of the room? And I mean, I’m getting the mood of the room just by glancing over at the chat over here.

But imagine if there was a meter showing me happy and sad proportions of the people in the UK. The few people with us today from the UK are not happy; people in America are very happy, something like that. Well, imagine what becomes possible.

And you say, of course, there are privacy issues all around that. But it makes me wonder, as we keep adding all of these fantastic features, some of them kind of frivolous-seeming and some of them really next-level. For example, in Zoom, you may have noticed this week that one can raise their hand and it will recognize that gesture on video. Or we can transform our experience.

Maybe that’s useful because someone has social anxiety and they’re not comfortable being seen on video, but they do want to express themselves: they still want to raise their eyebrows when something’s amazing, or shake their head, or say yes, yes, yes, I totally agree. There’s all kinds of fascinating innovation that’s taken place during the pandemic, and some of it we don’t know where it’s going.

But what we do know is that experiences are becoming personalized, and people have so many more choices. And when we give people choices, great things happen. We are always talking about how, when we design for the extremes and we do it well, everyone benefits. So when we design for the most extreme use cases, it’s not a surprise that people come up with applications no one imagined. And a lot of that has come right out of the online experience.

Now, we promised to leave the last 5 or 10 minutes of this webinar for Q&A, and I’ll stick around as long as you like. But I do know you may want to learn more, so I want to share one more resource with you. And if someone could put this right in the chat, that would be great. I’m showing a picture of an article that’s on our website at WCAG2.com. That’s W-C-A-G, digit 2, dot com, slash accessible-online-meetings. Accessible, hyphen, online, hyphen, meetings.

And what we did is we took all the knowledge we had about how to make an online meeting as welcoming as possible. We had experts from CAST, who invented UDL, from the Canadian Association of the Deaf, from Autism Speaks, and from other stakeholders to make sure that the advice we were giving is right up to date.

So basically it’s an article that walks you through, whether it’s an online meeting or a hybrid meeting, every step of getting comfortable with how to welcome everyone: people of all different abilities, people with disabilities, people with substantial differences, people who self-identify in different ways. Ways that you can feel comfortable welcoming everyone, not just in the physical classroom but in the online space as well. So I encourage you to explore that, too.

Essentially, as I said earlier, there’s been about 7,500 generations of humanity. And we live in the first generation where it’s truly possible to include everyone. IT has liberated more people in the last 30 years than all the wars in the history of humanity. And we get to be a part of that liberation.

So you know, I think back to my father. Of course it was unfortunate that his last home was a long-term care facility. But it was a blessing that we happened to have that technology to bring to the fore, so everyone could continue to communicate in creative ways.

So thank you for having the interest in being part of this journey that we’re all on together to find the silver linings. Beyond the pandemic, we are transformed in how we meet, how we learn, how we do telehealth. It’s all there. So that’s basically what I had to share, Jaclyn. I can throw it back to you for Q&A for as long as we like.

KELLY MAHONEY: All right, so I am actually going to take over this last portion for Jaclyn. She had a few technical difficulties. But David, thank you very, very much. You have been fabulous at answering questions live throughout this session.

So if anyone else in the audience has any last minute questions that you didn’t get the chance to ask, please feel free to go ahead and do that. We do have one sitting here that I’m not sure whether you got the chance to address yet.

DAVID BERMAN: Oh yes, Jocelyn.

KELLY MAHONEY: Jocelyn asked, how does an attendant assist someone with a visual impairment using a platform like InSpace? Is there a sort of chat function? Or how might they communicate using something like that?

DAVID BERMAN: Yeah, that’s a great question, Jocelyn. I’ll be happy to unpack that for you. And maybe what I’ll do is I’ll just go back to the slide. I’m going to go back to the slide where I was showing some of the screen captures from InSpace, because that’ll help give context to what I’m describing.

So first of all, say someone is non-visual. For example, they’re using an assistive technology like a screen reader, and they’re also going to use keyboard-only navigation, because a mouse or other pointing device is not relevant for them. Once they’re in InSpace, all of the controls are keyboard accessible, which is not remarkable; many good applications today give us that.

But for example, let’s say you want to move from breakout room number one to breakout room number three. Well, a mouse user may be able to just drag themselves in half a second from one room to the other. So our challenge was, how do we make that experience viable for someone who’s not going to use a mouse? We put in three different techniques.

One was we simply made it possible to use a keyboard-only approach to move around. So you would step, step, step, step, step over to the right, and step, step, step down. It would take you longer, but you could get where you wanted to go. And that was OK, but it wasn’t that great, because you don’t want to miss out on anything.

So then what we added was a feature where you could basically beam yourself to any room. There was an alternative approach where you could navigate the interface and say, move me to breakout room one, or move me to where my friend Jack is right now. I don’t even know where Jack is, but take me to Jack. Because on the participant list, you could say, take me to Jack.

And then the third way was that you could ask for help. So you could say, hey, Jack, could you help me out? Could you be my assistant? And then if Jack accepts the idea of being an assistant, then Jack’s kind of on your shoulder in the interface. And now Jack can simply lead you to where you both want to go.

In the same way, in the physical world, a person who’s blind could ask for assistance crossing a busy intersection. You ask, could I help? And they say, no, I’ve got this, I’m fine. Or they say, oh no, I’d love some help, yes, please. So you say, OK, take my shoulder, or take my elbow, and off you go.

It’s the exact same thing in InSpace. So if someone would like to have assistance, then they can have someone assist them, whether it’s through navigation or it’s other aspects of the interface. Or if they want to be independent, they’re free to do that, as well.

They’re also free to self-identify as living with a disability, or whatever. Or they can decide that’s personal to them and they just do what they do. So this is the next-level stuff we got into. And we tested it with users of differing abilities and behaviors. I’ve got quotes like, wow, that’s the best meeting space I’ve ever been in. So, yeah. Very good.

And Ruth Sermon is asking, if I may vocalize her question–

KELLY MAHONEY: Go right ahead.

DAVID BERMAN: All right. How can we convince website developers that the current fad of gray text on a gray background creates major challenges for a person who needs higher contrast to be able to read it? Oh, I love this. So Ruth’s basically asking, how can we convince people who think it’s really stylish to have gray text on a gray background how problematic that can be for people who either have low vision or who see color differently, with a color deficit or color blindness.

Well, one way we can do it, Ruth, is to educate. Another way is to give them tools that allow them to simulate what it’s like to see the screen with a variety of different color deficits, so they can realize it for themselves. Because graphic designers tend to be people who have a really good grasp of color, and so they themselves are not their own target audience.

So we give them tools and we let them experience for themselves what the interface looks like for someone with particular color differences. And the other approach, if that doesn’t work, is to point out that for some regulatory reason, they need to conform with, let’s say, WCAG 2.0 AA. And WCAG 2.0 AA is going to tell them that their text must meet a contrast ratio of, for example, at least 4.5 to 1. And we give them tools that allow them to measure those contrast ratios.
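Those contrast checks follow a published formula, so a tool can compute them directly. Here is a short sketch of the WCAG 2.x relative luminance and contrast ratio math in Python; the function names are mine, but the constants come from the WCAG definition.

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance of an sRGB color with 0-255 channels."""
    def linearize(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors, per the WCAG 2.x formula."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Black on white is the maximum possible contrast, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
# Mid-gray text on a light gray background falls short of the 4.5:1 AA bar.
print(contrast_ratio((119, 119, 119), (204, 204, 204)) >= 4.5)  # False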

So there’s a whole toolkit of ways to help transform the understanding, so that every member of the development team of a product or a meeting, or a presenter, can get it right. Let’s say someone’s creating their slides for the very next 3Play webinar: they would know that their slides need to be produced in a way that all visual users will be able to experience them.

KELLY MAHONEY: Neat. So with that, it seems that we’ve reached the end of our time today. This was very, very informative. Thank you, David.