Katie Grant
“If I want more stories to exist, I have to make them.”
Toj Mora came to that realization as a young student, when he discovered just how few films were made specifically for deaf audiences.
At the Texas School for the Deaf, Mora, who has been deaf since birth, took classes in video technology. “Within one semester, I had exhausted the school’s small collection of deaf films, and I remember thinking to myself, ‘Is that all?’” he said.
Mora’s passion for Deaf stories, combined with his love of solving puzzles and a bit of ADHD, made editing a natural career fit. “It’s the ultimate video game: putting the puzzle pieces together with the best material you have. You have to find a way to make it work. That creative process has always been a joy for me, challenges and all.”
Given the intricate connection between picture and sound in filmmaking, editing might not seem like an obvious career for someone who can’t hear. But Mora and others are expanding what’s possible in the field.
James Cude, ACE, an Emmy-nominated editor and Guild member, lives with sensorineural hearing loss and faces his own set of challenges. He considers ASL his second language. “I had to straddle two worlds,” he said. “I have to function like a hearing editor because so much of my work deals with sound. People have misconceptions about what deafness is and are afraid to ask. Don’t assume I can or can’t do something. Just ask, and I’ll tell you.”
Cude says waveform displays have been a game-changer for deaf editors. “I learned how to read waveforms on the timeline. You can edit audio by looking at the waveform and identifying certain sounds. I would often clip S’s and T’s because I couldn’t hear them, but you learn what they look like visually and how to edit around them.”
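Cude’s trick of spotting sibilants by their visual signature has a simple signal-processing analogue: “S” and “T” sounds are dominated by high-frequency energy, which shows up on a waveform as a dense, rapidly oscillating band. A rough sketch of that idea, using synthetic audio and illustrative numbers only (none of this is from the article):

```python
import math

def zero_crossing_rate(samples, window=256):
    """Fraction of sign changes per window; sibilant sounds ('s', 't'
    bursts) show up as windows with unusually high crossing rates."""
    rates = []
    for start in range(0, len(samples) - window, window):
        chunk = samples[start:start + window]
        crossings = sum(
            1 for a, b in zip(chunk, chunk[1:]) if (a < 0) != (b < 0)
        )
        rates.append(crossings / window)
    return rates

# Synthetic demo: a low-frequency "voiced" tone followed by a
# high-frequency tone standing in for an "S" sound.
rate = 8000
voiced = [math.sin(2 * math.pi * 120 * t / rate) for t in range(4096)]
sibilant = [math.sin(2 * math.pi * 3500 * t / rate) for t in range(4096)]
rates = zero_crossing_rate(voiced + sibilant)

# Windows in the sibilant half have a much higher crossing rate.
print(max(rates[:8]) < min(rates[-8:]))  # True
```

The same contrast is what a trained eye reads directly off the timeline's waveform display.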
“You learn to rely on other forms of communication,” he added. “It’s not just audio. You become more discerning about what you’re seeing and what you’re looking for. They call it ‘deaf eyes.’ You can pick things up visually very quickly. I work mainly in documentaries, so I also work a lot with transcripts. I can read transcripts and cut scenes very quickly. I’m surprised more hearing editors don’t do this. All the major editing software now has transcription built in, so there’s almost no reason not to anymore.”
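The transcript-driven editing Cude describes boils down to mapping words to timecodes. A minimal sketch, assuming the kind of word-level timing data that NLE transcription tools typically expose (the transcript contents and function name here are hypothetical):

```python
# Hypothetical word-level transcript: (word, start_sec, end_sec).
TRANSCRIPT = [
    ("we", 12.0, 12.2), ("started", 12.2, 12.7), ("filming", 12.7, 13.1),
    ("in", 13.1, 13.2), ("austin", 13.2, 13.8), ("last", 14.5, 14.8),
    ("spring", 14.8, 15.3),
]

def find_phrase(transcript, phrase):
    """Return (in_point, out_point) in seconds for the first occurrence
    of a phrase, so a selection can be marked without playing audio."""
    words = phrase.lower().split()
    texts = [w for w, _, _ in transcript]
    for i in range(len(texts) - len(words) + 1):
        if texts[i:i + len(words)] == words:
            return transcript[i][1], transcript[i + len(words) - 1][2]
    return None

print(find_phrase(TRANSCRIPT, "filming in Austin"))  # (12.7, 13.8)
```

Marking in and out points from text like this is exactly why reading a transcript can be faster than scrubbing audio.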
Often, hearing filmmakers frame Deaf stories using a shooting style sometimes called the “hearing gaze.” Close-ups of deaf characters’ faces and slow-motion shots of their hands are prime examples of the phenomenon, a play on the “male gaze,” the term film theorist Laura Mulvey coined in 1975 to describe problematic depictions of women.
The hearing gaze fetishizes or objectifies the elements of ASL and deafness that most appeal to hearing audiences. Poor framing is also part of how the hearing gaze comes into play. Signs are often cut off by shot framing during production; in Mora’s experience, hearing editors and directors often opt for close shots even when wider alternatives are available, because they prefer to see the actors’ faces. The signing becomes an afterthought.
“To edit ASL properly, you need a deaf editor,” Mora said. “Not a hearing person who has known ASL since high school. Not an interpreter. Not a language consultant. You need a deaf editor. There are a lot of nuances to editing sign language.”
Sound work on Deaf scenes is another example of the hearing gaze. “Deaf scenes can involve a lot of breathing and physical contact, which creates sound,” Mora said. “Sound design also often ‘replaces’ dialogue with audio. Editors and directors can get lost in the distractions of the hearing gaze rather than focusing on the subject. Deaf people are the subject, not the sounds they make.”
Mora believes most of the ASL issues that land in his lap originate in pre-production, such as not budgeting for interpreters in advance, or in something missed during production, such as not shooting open gate. In those situations, he’s grateful he can step in for damage control.
Another hurdle is the physical logistics of a deaf editor standing behind a hearing editor, looking over their shoulder at the screen while they drive the Avid. In that setup, relaying feedback from American Sign Language into spoken English isn’t exactly easy or fast.
When Mora noticed a change was needed, he would sign to the interpreter, “Wait, go back,” and the interpreter would tell the hearing editor, “Okay, go back.” By the time the message got through, playback was well past the intended spot, and valuable time was lost. Occasionally, Mora would ask if he could take the controls himself to show the problem quickly. Some editors would hand over the chair; he said he always respected it when they didn’t.
Mora’s respect for Deaf filmmaking began with one of his mentors, deaf filmmaker, editor, and TEDx speaker Wayne Betts Jr. (“Gallaudet: The Movie,” “Glee,” “Deaf Family”). He introduced Mora to the concept of a “deaf lens,” which can be described as an attempt to conceptualize how deaf people perceive the world and how that perception can be depicted visually through shooting and editing. Betts proposes an approach to Deaf/ASL footage that lets viewers understand how Deaf people experience the world. It had a profound impact on Mora, expanding his thinking about what an editor is and how an editor thinks.
“In my opinion, sign language is the hardest language to edit,” Mora said. “There’s no ADR for sign language. There’s no fixing it in post. And if there were, I’d be surprised if there was a budget for it.”
On the technical side, Mora recommends a few things to make editing ASL easier. The first is to always shoot “open gate” during production. Open gate, or full-sensor recording, captures the entire image without cropping to a specific aspect ratio; the aspect-ratio crop is applied at a later stage. This is crucial when cutting ASL, because it ensures all of the signing can stay in the shot when necessary, while also allowing other reframing changes to be made easily.
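The reframing headroom that open gate preserves can be illustrated with simple crop arithmetic. A minimal sketch, with hypothetical sensor and delivery numbers (not from the article):

```python
def reframe(sensor_w, sensor_h, target_ar, center_x):
    """Compute a delivery crop inside an open-gate frame.

    The full sensor image is kept in the recording; the aspect-ratio
    crop is chosen in post, so it can slide horizontally to keep a
    signer's hands inside the delivered frame.
    """
    crop_h = sensor_h
    crop_w = round(crop_h * target_ar)
    if crop_w > sensor_w:  # crop wider than sensor: reduce height instead
        crop_w = sensor_w
        crop_h = round(crop_w / target_ar)
    # Center the crop on the signer, clamped to the sensor edges.
    left = min(max(center_x - crop_w // 2, 0), sensor_w - crop_w)
    return left, crop_w, crop_h

# Roughly 4:3 open-gate sensor; a square 1:1 delivery crop slid toward
# a signer framed right of center.
print(reframe(4448, 3096, 1.0, 2600))  # (1052, 3096, 3096)
```

Because the recording keeps the whole sensor, the `left` offset can be chosen per shot in the edit, rather than being locked in on set.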
“Don’t get me wrong,” Mora said. “Deaf filmmakers still use close-ups. We use all kinds of framing and lenses. The only difference is we know when to use a close-up and when not to.”
Captions are another important element of production that is often overlooked. Mora recommends working with captions before, during, and after the edit, so you can see which parts of the footage will be covered when the show airs. Temporary captions can also be added to the video village monitor during production, to see which parts of the shot might be obscured, or whether any colors in the shot might blend into the captions and render them illegible.
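Checking whether a caption block covers something important in the frame is, at its simplest, a box-overlap test. A toy sketch with hypothetical frame coordinates (the numbers are illustrative, not from the article):

```python
def overlaps(a, b):
    """Axis-aligned overlap test between two (x, y, w, h) boxes,
    e.g. a caption block and a signer's position in the shot."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

# Hypothetical 1920x1080 frame: a lower-third caption vs. a signer
# framed low in the shot (the signer's box extends down to y = 1000).
caption = (460, 920, 1000, 120)
signer = (300, 400, 600, 600)
print(overlaps(caption, signer))  # True
```

A temp-caption overlay on the video village monitor is doing this same check by eye, in real time, before the framing is locked.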
In a perfect world, dailies would be captioned too. “I’d love for the team to send them to a captioning company and have them manage it,” Mora suggested. “Preferably a deaf-owned captioning company, like Caption Anywhere.”
Mora treats caption files like audio files, grouping them with their clips so they always move together. He acknowledges that Avid sometimes falls short in managing caption files, but sees that as all the more reason to push innovation in the field. “Adobe is killing it. Avid needs to wake up and catch up.”
On that point, Matt Feury, senior director of video and post-production market solutions at Avid, said: “Avid is actually very focused on captioning and speech-to-text workflows this year. In the latest version of Media Composer, we introduced a transcription tool, an AI-enhanced extension of our PhraseFind technology that displays automated transcriptions of source clips, including speaker identification and timecode. The transcript is linked to the source window, so it not only stays in sync with clip playback but can also be selected as text to mark clips for editing. Beyond the transcription tool, we’re doing some other exciting things for captioning that will debut at the IBC trade show in a few weeks.”
In addition to captions, another important aspect of editing Deaf films is ASL continuity. It’s a term Mora coined years ago, but the concept is the same as any other kind of continuity. Editors who don’t know ASL well won’t know exactly how, what, or when to cut, or what is acceptable, distracting, or missing. Cutting from one shot to another in continuous ASL is difficult to judge without cultural fluency. This covers everything from the speed of a sign to its position, from the expressions that accompany signs to body position and eye line. Even whether a person is mid-breath while signing can have an impact.
Mora recently lost a deaf colleague, Michael Epstein, an IATSE Art Directors Guild member. He recalled their last conversation about promoting equal representation in Hollywood: “He told me there are a lot of deaf people who want to work in the industry and have the passion and skills to do the work, where ‘authentic representation’ isn’t even a factor: the prop department, transpo, grips, crafty, MUA, and more. Deaf people deserve the same opportunities as hearing people, and we shouldn’t forget that that’s the real goal: equality.”
Mora has been editing for more than a decade and notes that Deaf filmmakers and editors have handled audio in their workflows from day one. “As an editor or assistant editor, you don’t really need to hear the sound, because everything is represented visually anyway,” he said. “Like any other editor, a Deaf editor just needs to know the workflow or best practices of a given team. The industry forgets that the task of editing doesn’t actually require hearing. Film and sound go hand in hand, of course, but they’re still two different hands shaping the same story.”
While Mora is proud to be the Guild’s first culturally Deaf editor, his personal goal is to write, direct, and produce his own films and television shows: to own his projects, have complete creative control from start to finish, and tell Deaf stories the way they should be told. He would, of course, stay hands-on in the edit. “I think that’s the only way we’re going to get truly authentic Deaf movies. If we let hearing people make all the creative and budgetary decisions, we’ll get a hearing movie. Honestly, I think Hollywood would be shocked at the work we could put out.”
Katie Grant is a freelance writer based in Los Angeles.