
Innovating for Inclusion [TRANSCRIPT]

ELISA LEWIS: So thank you, everyone, for joining us today. Today’s session is “Innovating for Inclusion.” My name is Elisa Lewis. I’m the Senior Brand Marketing Manager here at 3Play Media. I’m a white female in my 30s with long brown hair. I’m wearing a gray-and-white striped shirt.

Today, we’re really excited to be talking with Sam Sepah, the Lead Accessibility Research Product Manager at Google. And interpreting and voicing for Sam is Churyl. So thank you for joining us. And welcome, Sam. We’re really glad to have you with us today.

SAM SEPAH THROUGH INTERPRETER: Thank you. Thank you so much for having me here today. It’s really an honor and pleasure to be with you all.

ELISA LEWIS: So to get the conversation started, I want to really start at the beginning and learn a bit more and share with our audience your career journey. Could you start with a brief overview of your career journey thus far, and how you’ve landed in the role as Google’s Lead Accessibility Research Product Manager?

SAM SEPAH THROUGH INTERPRETER: Absolutely. Sure. So I believe that everyone has their own journey, and mine is unique in its own ways. I never saw myself as working at Google one day. [LAUGHS] But I think the skills that I developed did prepare me for a job here.

So I was studying human resources management at Rochester Institute of Technology, and I was focused on, basically, making the best workplaces possible for people, whether they be disabled, deaf, or part of the general population. I saw my parents growing up, and people in our community, going through different experiences at work.

And I felt like we spend so much of our time at work. Most of our waking time is spent with our colleagues and in that work environment. And so that time, I think, and always have thought, should be full of fun, respect, and all those things.

So HR was a passion of mine. I worked in that field and recruiting, diversity, task force management, organizational development, executive coaching, all sorts of different things in HR, for many years. It was a very rich experience.

I was able to work in different high tech companies like IBM, GE, the US government as well, and mostly working with people in the STEM field– research scientists, engineers, chemists, economists, mathematicians, statisticians. And Google, one day, said, we would love you to work with us. You have an HR background, and you’ve worked with all these people who are executives and leaders in the STEM fields. And that’s how I ended up at Google.

ELISA LEWIS: Wonderful. Thank you so much for sharing that and sharing that with our audience. It’s helpful to have that background. So I’m curious what advice you might have for organizations that are looking to attract and support individuals with disabilities into the workplace. What have you seen be successful, and maybe even what hasn’t been as successful?

SAM SEPAH THROUGH INTERPRETER: Yeah, absolutely. I work with different organizations outside of Google. I serve as a volunteer and as a board member. I work with different businesses, large and small, and I hear this question almost every day. How do you attract these candidates who have disabilities?

And my answer always is, first, to give them a chance. Believe me. Every job that I have had, when I look back on my career, starting with my internship, I was not the perfect candidate. My first HR internship– I think I met 60% to 70% of the criteria, in terms of minimum qualifications.

But I was hired anyway, because somebody gave me a chance. They saw that I had the potential to grow, and I did. And that’s how I got here today. So somebody gave me a chance. And so that’s always the first step.

If you have a business need for XYZ, you are also building the future generation with the candidates that you choose. So give a diverse group of people a chance. And I’m not saying to hire people who aren’t qualified, but if they are almost qualified, give them a chance. It is a risk, but I believe that sometimes the higher the risk, the higher the reward.

So once I got to my internships [INAUDIBLE] jobs, it was [INAUDIBLE] mentorship and things like that. And now, I mean, I’ve been able to work globally and do things beyond my wildest dreams. I believe that the best candidates are passive candidates, whether they’re disabled or not, which means that somebody reached out to them because of the potential that they have for a particular role. So again, I mean, there are people who actively seek jobs in the background, but also, I believe that organizations can really improve their own workforce by following those steps.

ELISA LEWIS: That’s really great advice, and there are two pieces that really stood out to me. I think it’s a very common sort of theme that we hear from individuals who are very successful, that at some point earlier in their career, someone did take a chance on them. And I think it’s a good reminder that everyone starts somewhere, and everyone is new at some point.

But I also really love that you pointed out how important it is for an organization or a business to be seeking who they want and who they think is worth taking a chance on and who’s going to bring that expertise and passion to learn and grow within their organization. So thank you so much for that. Throughout your career, are there–

SAM SEPAH THROUGH INTERPRETER: If I may add one more comment, the impact on me was, after I got my master’s degree at RIT, I was invited to interview at different corporations, and then hired at IBM as an HR business partner. And I was actually surprised that I was hired for that role. I was expecting a junior role, and it was more advanced than that.

And I asked someone at IBM if they had made a mistake. And they said, no, we hired you for this role on purpose. I was like, I’m just out of school. I just got my master’s degree.

And they’re like, no, we think you’re ready for this role. We want to give you a chance, and make us proud. [LAUGHS] And so the pressure was on. But it was such a good pressure on me.

It wasn’t like just giving me a job where I was qualified, but making me stretch, out of the gate, at IBM. And that journey was so important to my career, because it made me realize that it is a gamble for a company. I mean, it was a gamble for my boss and his reputation, but it did make me work that much harder to make sure that I met his goals for me. So yeah, be confident and give the benefit of the doubt. I just wanted to add those comments.

ELISA LEWIS: Absolutely. Throughout your career, have there been any specific projects or initiatives that really stand out to you as transformative moments in the field of accessibility or inclusive design?

SAM SEPAH THROUGH INTERPRETER: For the last few years, my career has switched to the more technical and research domains from HR. And it’s been a very interesting space, very captivating. I’ve grown more in my career, and I feel like I’m able to give back to the community in bigger ways.

I was doing this work as a volunteer, because I wanted to give back to the community. But now, working with the technical part of inclusivity and accessibility, in terms of how we can make our everyday lives better and how we can be equal players in the field, in society– as a deaf person, I don’t always have doors open to me, but now, working with engineers and research scientists and all these brilliant people here to develop technology to even the playing field is really, really wonderful.

And they really believe– I mean, their hearts are in the right place, the right mindset. And that is that everybody should have equal access to the world’s information, which has really impressed me. I worked with some other teams– hearing, understanding, and other sciences as well, using haptic devices.

I’ve worked with other higher-educational institutions who are experts in the field of sound– people who don’t really– have not lived the deaf and hard-of-hearing experience, but have close relationships to those people– or sometimes don’t. But again, their heart was in the right place. They had the right mindset.

And so yeah, these projects have really changed the trajectory of my career and have been transformative to society. So it’s really how we partner with other people. And that’s been a big learning moment for me, and that mutual respect, and these people that I’ve worked with have become lifelong friends. It’s been a really beautiful experience to build incredible products and develop these incredible relationships. It’s really changed the way I see the world today. I always tell people, don’t burn any bridges.

ELISA LEWIS: Absolutely. I’m really glad that you brought up your role in research and technology and how you see and have seen technology progressing. I think we’re all familiar with how technology is constantly evolving. Even just how frequently we’re doing these virtual webinars and presentations is a change from just a few years ago.

How are emerging technologies continuing to impact accessibility? And what are the potential challenges and opportunities that technology presents in the accessibility space?

SAM SEPAH THROUGH INTERPRETER: Fantastic question. So when I had a fireside chat recently with the leadership at Meta and Microsoft, the three companies were represented on a panel at the National Association of the Deaf’s biennial conference. At that panel, we were talking about the future of accessible technology from a tech standpoint. Not from a deaf person’s standpoint, but more from the tech industry– just, what is the key to keep us innovating in this space and to keep us integrating deaf people into society?

And the three of us were key players in this space, and we all agreed on one thing: captions need to be everywhere. That really gives a broad number of people access to information, whether they be deaf or disabled or anything in between. For example, my father– he is not fluent in English. We are an immigrant family.

And he loves to read the captions. He has so much more access to information and entertainment with the captions. He just happens to have a deaf son who also needs the captions to be able to understand what’s going on, and the two of us watching something together is just a wonderful experience.

So on that panel, we were saying that we agree that a lot of platforms don’t have captions, and we need to make sure that they are integrated everywhere, and that they become a universal standard– universal design. You can look up universal design, if you like. But basically, it’s design that works for everyone, or at least as many people as possible.

For example, curb cuts– that’s another example of universal design. In the city of San Jose, in the ’70s, they didn’t want people with wheelchairs to be stuck when they were trying to cross the street, so they came up with this invention of making curb cuts. But then, of course, it benefited all sorts of other people– people with bicycles, mothers with babies in strollers, people with physical impairments or otherwise. And now, you see them almost everywhere, not only in this country, but globally. There are curb cuts everywhere– so another example of universal design.

So I think that we should see captions the same way. It was made for an accessibility purpose for a specific population, but it benefits so many people. And you know, I’ll give you another example of where we need accessibility.

Sometimes when you buy a new product, there will be a video with auditory instructions. But as a deaf person, I don’t know what they’re saying. I don’t know what to do to get started, and it’s not accessible. So again, captions are the future of accessibility.

Not only are they for people who speak English, but for people who need translations. So basically, you need captions in other languages. That technology is emerging now. Universal design is the future of accessibility, where things are made beneficial for all and accessible for all.

ELISA LEWIS: I completely agree. I love that you shared the example of captions really enhancing viewing pleasure and enjoyment and engagement. And certainly, we’ve seen and heard of caption use being on the rise. Love the other examples that you also shared of universal design. I think those are great, that people can really relate to and have experienced every day, and maybe didn’t know that they started from an accessibility place.

I think I also want to talk about AI and the role that plays in accessibility and technology. So of course, AI, Artificial Intelligence, has been a very hot topic this year. And in the context of AI-driven accessibility, I know one of the things that you’re focused on is the ethical considerations. So can you share what some of those ethical considerations– you know, what should be taken into account here, and particularly when it comes to ensuring quality and equity for users with disabilities?

SAM SEPAH THROUGH INTERPRETER: Sure. Wonderful point. I think it is vital that as an industry, in general, we know what the users are expecting from us and what the use cases are. The more that people speak out and let us know what their needs are, the more we can shift our priorities and strategic plans to meet those needs.

And then, in terms of ethics– it’s really about the privacy of data. Different companies are responsible for making sure that there is a code of ethics that is followed– not just by the legal team, but by other teams. Most companies have this. But yeah, basically, there needs to be some type of understanding of what the market needs.

And then, in terms of ethics, that is often shaped by people’s feedback. If a user says that their privacy is incredibly important, which is how most people feel– I mean, we’re using technology that’s entering people’s homes and their intimate lives. But it’s become really murky for the accessibility community, because we need to sacrifice some privacy to be able to get accessibility, and that’s just how it is.

And I’ll give you an example– speech-to-text. You’re recording people’s speech. If I’m talking with my mother or someone is talking about a sensitive conversation, that’s all being recorded and captured. But that information needs to be inputted so that access can be had if you’re using certain technologies.

So there’s not always a safeguard in terms of ethics. Some platforms are really great that way. For example, with a phone, they’ll make sure that the information stays on device instead of being uploaded to the cloud.

But for example, using myself, I need to sacrifice privacy for accessibility sometimes. And it’s not an easy thing to do. Most disabled people just say it’s worth it, because either they have the access or they have their privacy, but not both, often, which is a painful choice.

Most people don’t have to make those hard choices, but people who are disabled often do have to make that hard choice. So again, in the future, we need to have more conversations, more feedback from the community, more conversations with industry, discussing what the boundaries should be and how the disabled community, especially those in the accessibility and technology space, should not have to sacrifice too much privacy and have such a hard trade-off.

But it’s not something that we can really know today, in terms of how it will work out in the future. But we should strive to make it as right as possible.

ELISA LEWIS: Absolutely. I completely agree there. I’m curious, are there industry-wide efforts emerging to address some of these ethical challenges tied to AI and accessibility? And if so, how can these be promoted to guide responsible development?

SAM SEPAH THROUGH INTERPRETER: Right. That’s the million-dollar question, isn’t it? There are different non-profit organizations and working groups, like a place called Safe AI. That’s a good organization. They are focusing on pulling together different populations and leaders and users to gather information on what is ethical when it comes to AI.

These organizations, and specifically this one that I just mentioned, are great. They have an ad hoc group, which is made up of deaf and hard-of-hearing people and others with other disabilities, where they are able to give their input.

And so this information is used to help organizations decide on the best practices when it comes to AI and when it comes to ethics and safety. So those groups are trying to get the word out. And I think that they need to have an equal seat at the table, including minors. They need to have a voice at the table, too, so that we all shape what policy will be in the future.

These new groups are emerging and working on these thorny topics. We’re looking at how we can make– basically, develop guidance on this. We’re still in the early stages, but yeah, we’re starting to work on these things.

ELISA LEWIS: Thank you. And someone just shared in the chat as well that the EU is also trying to institute a lot of these AI guidelines and appropriate privacy laws. And a couple of people are mentioning the balance that you mentioned around accessibility and privacy, and how those two play into each other and kind of have to be balanced is really interesting. So thank you for sharing those, and thanks to those of you who are chiming in.

I want to talk a little bit more about AI and accessibility and how we, again, balance this. So as we balance early adoption of AI and accessibility, ensuring consistent quality at scale can be challenging. How can companies approach this balance to ensure technology’s reliability over longer and more complex scenarios?

SAM SEPAH THROUGH INTERPRETER: Yeah. Well, as you know, most journeys are long. They’re not something– things don’t happen overnight. And so with a long journey, you learn lessons every step of the way.

Often, smaller companies try to build things very aggressively, and they have very aggressive timelines because they’re trying to deliver things to market. And often what they’re focusing on is making the investors happy, making other people who have contributed funds and resources happy. And then, when a product starts to scale, there are other people who start to have issues, barriers, roadblocks. And I think, as technology grows, people will realize that we need to have more accountability come in.

With a smaller company, accountability is not at the forefront of their minds because it doesn’t affect their brand as much. But the bigger a company grows, the more accountability really matters. When we were a young company, and then became a multi-million-dollar company, we had to think about that a lot more.

For example, there was an accessibility startup company that wanted to build a camera that would guide blind people so that they were able to navigate a city independently. They thought that this was a perfect solution– that it was scalable to all blind people, that they would be able to navigate cities and other places independently with this camera system giving them voice instructions on where to go. And they thought that with the camera, the AI, and all the applications, it would be easy to scale in terms of resources.

So they grew in the US and wanted to spread over to Europe. And then they had issues with data privacy. Because in Europe, people were saying you could capture a lot of information with that camera. You’re interfering in people’s lives– even other people on the street.

And so technology is getting cheaper. It is getting better. And people are more willing to use it despite any trade-offs. But the question is, how do you know that technology is not only working for one person, but is something that all of society can adopt and accept?

For example, for deaf people, for speech-to-text, the information is being recorded. And for a lot of people, they feel like that’s uncomfortable to them. And deaf people could say, if a hearing person can hear things, then a deaf person should have the right to be able to record that same information.

And that’s a question. Like I said, everything is getting easier, cheaper, more scalable. But we have to think about, in terms of technology, what society can tolerate, and again, those trade-offs with privacy and access. So if I want to join in some activities, what are the appropriate levels of these things? Those haven’t been defined yet.

So again, that’s a conversation that needs to happen, feedback that needs to be collected. When mistakes are made by organizations, we need to share those lessons at conferences and in white papers and learn from each other. So again, we’re still in the early stages. We can’t really know what the answers are until we go through it, trial and error, and then maybe forgiveness will be required before we really develop these new norms.

ELISA LEWIS: Yeah. I think it’s interesting to see these questions come up, now that we are dealing with technology that didn’t previously exist, and using it in ways that we never have before. I know somebody posted in the chat, too, that they will not use Fury or any of the smart speakers because it feels like they’re always listening, which is definitely true.

I love this idea of sharing knowledge to learn and progress faster. And I often think about accessibility as a journey. And as technology continues to evolve, how can the accessibility community stay ahead of the curve and anticipate future challenges and opportunities? And also, how do we make sure that this community is involved in those technologies and decisions?

SAM SEPAH THROUGH INTERPRETER: [SIGHS] It’s very interesting that you ask that question. Because if you look at what’s happened historically, in terms of accessibility, often accessibility is ahead of the game, in terms of technology. We’re often, in this space, the first to arrive at a solution.

I don’t want to get too much into the weeds, but I’ll just talk on that briefly. As you know, Alexander Graham Bell invented the telephone, and it’s because he had a wife who was deaf. And it was an accidental invention, but it became something that society– became mainstream, basically. But it was really an invention for accessibility, out of the gate.

Another example is Robert Weitbrecht. He was a deaf physicist and scientist who worked with another deaf person who was a dentist. He came up with the teletypewriter, called a TTY, because he couldn’t hear or use the phone by voice. So he used acoustic signals to be able to communicate with this TTY device.

Again, it was a device used for accessibility. It was something where you put a phone handset on a teletypewriter– TTY– and it would make acoustic signals that ended up being the basis for the technology behind modems.

Another example. A lot of deaf people didn’t want to use telephones or typewriters and wanted to use their native languages– sign languages– to be able to communicate. And this has been happening since the ’80s. And so they came up with webcams. And this was when the internet was just a new, emerging technology. But they have been using that technology for many years, and now FaceTime and web video conferencing– Google Meet– have become mainstream as well.

So deaf people are often like, this isn’t new to us. Yes, it’s gotten better, easier, more efficient, less expensive. But we are often early adopters and the first in this space, as you can see with these examples.

So I think this will continue on into the future, whether it’s somebody inventing something to make things more accessible to them, or something like that. We are always ahead of the game when it comes to research in this space. Often, startup companies will pull in people who are disabled, because we are invested in making sure that these new inventions work for us.

ELISA LEWIS: I think that’s great. I love those examples. And on that note of being ahead of the game, I want to hear a little bit more about what you’re working on. You’re working on a pretty unique and ambitious project around sign language recognition AI technology, more specifically using fingerspelling. Can you share with us what that is all about and kind of where it stands?

SAM SEPAH THROUGH INTERPRETER: Yes, absolutely. Yeah, so exciting, this work that we’re doing. But before I explain that, if I can just step back a bit, if you look at Google’s mission, it’s to organize the world’s information and make it universally accessible and useful. I think that’s such a beautiful mission and a great goal to strive for.

Information should be accessible to everyone. And now we’re in the internet age, the digital age. As you know, there’s a lot of speech-to-text technology, voice assistance available, and this is revolutionizing people’s lives. I have a friend who made a comment that was interesting to me.

Her mother wasn’t able to use the smart TV remote, so she pressed the voice assistant, and asked it to find her a football game, a specific one, and it did. And I didn’t even know that that was possible. Not only can you use speech-to-text to ask your smart home system to turn your lights off, but you can even do things like find a football game.

So it’s amazing technology, where you don’t even have to touch a screen anymore. You can just use your voice to have all sorts of magical things happen. But for people who are speech-impaired, deaf, or hard of hearing, these technologies are not available to them. And this includes millions of people who do not have access to the same function in these technologies and these products and devices.

So that’s a big barrier. A lot of people are experiencing wonderful, magical things that we cannot. So Google wants to even the playing field, build a more equitable user experience, and one way we’re thinking of doing this is by building AI models that can recognize hand shapes and sign language. So they look at space and movement of hands and expressions and translate that to language.

And it’s the same idea as if somebody speaks. We’re doing that when somebody’s language is signs. We have a proof of concept. We know that it works.

And we publicized that we are working on this through a Kaggle competition. We recently had one, and it was the world’s most successful competition to date. And not only for Google, but it was open publicly to the world.

And we saw whether it can help with all sorts of different disabilities, et cetera. And it’s open-sourced so that all sorts of people can use assistive technology to solve complex problems. We have made sure that we share this data publicly so that people can build better detection with fingerspelling, so that people can fingerspell to a camera and have that translated.

People have asked why we don’t just have isolated words instead of fingerspelling, but the reason we give is that sometimes we need to be able to convey a person’s name or street address or a certain word that there’s no sign for. And for deaf people who do use fingerspelling instead of text, it’s actually at least two times faster than texting on a phone. I mean, people who speak can input that information verbally quite quickly. And so now we’ll have a more equitable experience, being able to do that with fingerspelling and not be so slowed down in our productivity.

So this is another way that Google is making our products more equitable in terms of user experience, and giving more access to the world’s information. This is such important work that we’re doing and contributing to society and this space.

ELISA LEWIS: We have a couple questions around this coming in from attendees that I want to share. And I think this is a really interesting topic, the sort of relationship of ASL and AI. Someone is asking, how do you feel about including ASL in more platforms for individuals that are deaf or hard of hearing that struggle with English, or when ASL provides them clearer information?

SAM SEPAH THROUGH INTERPRETER: Yeah. I think it’s more important than most people know. It’s very important. Often, people say that they don’t see many deaf people around or even signers. They don’t see that in their lives, but it’s a larger population than most people think.

I have two statistics to illuminate that. 466 million people in this world have some kind of hearing loss, from moderate to severe, depending on what age they are, et cetera. So 466 million, and that number will continue to grow to 700 million by the year 2050. So this is a growing population.

If it were a country, it would be larger than the population of this country. It would actually be the third-largest country, behind India and China. That’s how large this population is.

So that’s one thing. Another is that more than 70 million people in the world currently use sign language as their primary mode of communication, and this is a huge untapped market. So if you make things more accessible to these people, these people will find more jobs, have more satisfaction with using products, have better access to health care, and even access to loved ones and other relationships. And once one door opens, many more can open. And businesses can connect more with their customers, give them better user experiences, and society can improve overall.

Also, some people say that deaf people can just read or write. Why don’t they just do that? And that is true for some deaf and hard-of-hearing people. But in some countries, deaf children are not educated this way.

For example, in India, they have a long way to go. Currently, the adult population there– 90% of them are illiterate. They are not taught to read and write, so they heavily rely on sign language, in terms of having access to information. It’s because they have a different educational system there than they have here in this country. We have more resources. So it’s so important, from a market standpoint, too, to make sign language available in products.

ELISA LEWIS: Absolutely. I think it’s interesting that you pointed out that people don’t see sign language as often, the way we now interact with captions. And it’ll be exciting and interesting to see if, hopefully, in the future, it gets to that point where it’s just another method of communication that everyone can find some benefit from as well.

SAM SEPAH THROUGH INTERPRETER: Absolutely. Yeah, I think that we should have sign language and captions both available. Because many young children who are deaf and hard of hearing want to learn sign language, and they don’t know– maybe they see a sign, and then they have captions there, and they can learn the English word. Or sorry, they do know sign language, but they want to learn the English word.

So having both languages is really beneficial that way. And using my father as an example, English is not his first language, but with captions, he’s learning it quite a bit. So it really is beneficial to everyone.

ELISA LEWIS: Absolutely. Another attendee question– and I’m curious to hear your answer on this– is, will the sign language AI be able to replace ASL interpreters?

SAM SEPAH THROUGH INTERPRETER: Right. My quick answer is no, not in the immediate future. It’s interesting. I was invited to present recently at the Registry of Interpreters for the Deaf, which is an interpreting organization. I was invited to present to them, and they had the same question– is AI going to take our jobs?

And the point of my presentation was to say that there are a lot of issues that need to be clarified still, like how to make sure that assistive technology is reliable and accurate. We’re not there yet. And then safeguards in terms of anonymity and confidentiality– we’re talking about medical information. There’s HIPAA that we have to think about.

Both on the consumer and the medical provider side, we haven’t gotten there yet. There are a lot of issues that have to be solved. Reliability, ethics, all these things need to be solved. So in terms of very general use, like ordering a cup of coffee, that could work sooner, but not replacing interpreters.

And I have actually asked two AI chatbots, Bard and ChatGPT, whether they think interpreters will be replaced by AI, and both replied, no, not in the immediate future. And I asked why, and they said, because we can’t read nonverbal cues. If there’s some type of emotion on the face, or the voice tone is showing certain emotion or giving information, they cannot capture that, so we’re not there yet.

ELISA LEWIS: Yeah, a couple people in the comments also mentioned, it’s kind of a similar question around ASR or Automated Speech Recognition and whether that will replace human captioners. And you know, I think it’s the same answer– not yet, not in the immediate future.

And to your point, you know, ASL is such an expressive language, and these automated technologies, whether it be ASL or captioning, can’t pick up on some of the nuance and context that a human can. So it’s a great– they’re great tools and certainly help us along the way. But I definitely agree that the human component is not going away anytime soon.

SAM SEPAH THROUGH INTERPRETER: Right. And just to add to that, it’s not just emotions, but it’s also, for example, some cultures have certain behaviors and different things that are interpreted differently. Like in Japanese culture, if you nod your head a certain way, it just means “I hear you,” whereas in a Western country, it means “yes.” So different nuances there that technology is not caught up on yet. So a head nod does not always translate to one thing. There’s still a lot of work to do.

ELISA LEWIS: Definitely. I think we could talk about this alone for a long time, but I do want to be mindful of our time together. So it’s really amazing how Google is prioritizing building equitable experiences. I know this is part of their core mission, like you shared with us, and prioritizing this research is really a great model of how a large global company can and should prioritize accessibility.

And I think it’s also important to acknowledge the influence and the impact that these companies have in our society and with consumers around the world. What do you feel is the company’s social responsibility when it comes to promoting accessibility and fostering inclusion worldwide?

SAM SEPAH THROUGH INTERPRETER: A lot of smaller companies, in the accessibility space mostly, will ask us what Google is doing, what other companies are doing in this space. And I often respond, saying, don’t feel intimidated or threatened, because we really have the best intentions. And these smaller businesses– whether they be small or middle-sized, all these different players– should pursue their goals and, basically, partner with us or work in the same space.

The more captions, the more translations, the more AI accessibility technology, the better. We want to all collaborate and work together. And that way, we can really raise the bar and the standard in the industry together. So yes, we do have our resources, and a lot of small companies have theirs, and they’re relieved to hear that this is our perspective.

And we say that they should be inspired to build AI for our society, because we want us and them to enrich everyone’s lives. That’s the goal at the end of the day. So we want to think about how we can partner with other companies, share resources and data, share best practices and lessons learned.

We even share technology mutually, or even better, take the time to help each other on community projects, which really does well for social good. So it’s really, in the future, about how we work together to make life more accessible for all.

ELISA LEWIS: Yeah, those are great points, and really great, sort of tangible examples and points. Similarly, how do you think companies can make accessibility a part of their fundamental principles? Obviously, this is something Google has done really well. But how can other companies ensure that accessibility is really prioritized from the early stages of development and across all parts of the organization?

SAM SEPAH THROUGH INTERPRETER: Yeah, that’s a great question and a great point. I think people definitely publish about this and research about this, but it really goes back to what the company’s values are and what the leadership lives, breathes, and honors, and what they make in terms of policies.

There are some beautifully well-written missions, but if the people in the companies don’t actually live and breathe that, or if it’s not prioritized, then it doesn’t really matter. And so I think what really works at the end of the day is having the people behind accessibility– people on the engineering team, people on the marketing team, financial people– all believing in this goal and having the good intention of making their companies accessible, and also making sure that it’s delivered in their products and technologies and services as well.

And that’s not a nice-to-have, but it’s a must-have. Because our users– if things aren’t accessible to them, that’s a deal breaker. That mindset is really powerful. It’s not about resources or capital at the end of the day. It’s really going to be about people.

ELISA LEWIS: Thank you. So I know there’s so much more we could continue to talk about, but I want to be mindful of our time, and we are getting close to the hour. So I’d like to know, from you, Sam, as we wrap up today, what are your aspirations for the future of technology and accessibility? Are there breakthroughs or changes that you would like to see in the coming years, and what are those?

SAM SEPAH THROUGH INTERPRETER: Sure. [SIGHS] I mean, there are a lot of possibilities. And one thing I’m personally really hoping for is for society at large to accept that accessibility and that technology is just a part of life, whether people are disabled or not.

Often, people say, oh, that accessibility technology is for them. But I would prefer– I want society to think that it’s for us, just like captions. People originally thought that captions were for people who can’t hear, but it’s for everyone. It’s for us. And in terms of a breakthrough, I would like a breakthrough in terms of the stigma and how people view these populations, so that family members can talk to each other.

I want assistive accessibility technology to be available so that family members can talk to each other– not just be like the cool next thing, bells and whistles, et cetera, even though it is very cool and very exciting. But I really want these different tools to be accepted by society and to really make the world a better place, so that we all live together, and so that ultimately, we respect human rights. That’s really the whole point.

ELISA LEWIS: Absolutely. What a great takeaway to end our conversation on. Thank you again, Sam, for this wonderful discussion. And thank you to everyone for joining, for asking great questions, and for being a part of today’s discussion.

SAM SEPAH THROUGH INTERPRETER: Thank you again so much, you and your team, for making this very fantastic session happen. Thank you.