A little project I'm working on called "changing global Deaf culture"

So, I’ve been spending a bit of time recently on a new project which is a hugely interesting one. It basically ties together all my interests.

I was contacted by the Japan Deaf Evangel Mission (ViBi), who said “Hey, we hear you’re a missionary in Japan and you have an interest in sign language and you can program computers. We need you.”

They’re translating the Bible into JSL—while almost all JSL users can read standard Japanese, a Bible in sign language would really serve to promote JSL as a real, living heart language for Deaf Japanese. ViBi has translated about 30% so far. So now they have literally hundreds of hours of video Bible; how do you get people to actually use it? Carrying around shedloads of DVDs is not really practical, and even if you did, getting to the bit of the video you actually want to see is a pain in the neck. But these days most people have a smartphone, tablet or the like. So they wanted someone to write a little Bible player application where you can dial in which chapter and verse you want to see, and it goes and finds the relevant bit of the video and plays it.

I went down to Tokyo to talk to them about this, which was an experience in itself. I met with five people from ViBi, with one guy, Mark, acting as sign language interpreter for me. Halfway through the first day he said “I’m going to stop translating for you now. These guys are very good at making themselves understood. You need to get into this.” So with a lot of drawing on bits of paper and my limited sign language, we planned out the app.

Obvious first question: Why does JSL need its own player app? Surely you can borrow code from another sign language Bible application? Turns out nobody else has translated the Bible into any sign language yet. Wow. ASL is slowly getting there, but ViBi is leading the field at the moment.

Now they already have a bunch of XML files created using a bit of software called ELAN, which annotate the positions where chapters and verses start, so that’s more than half the battle, but I still wasn’t relishing the idea of messing around with cross-platform video-handling code for mobile devices. And then someone said “We heard there’s this thing called HTML5, would that help?” Oh of course there is. “Right, everybody shut up!”, which was a bit of a stupid thing to say in a room full of deaf people, and off I went to a desk to hack, and came back an hour later with a quick prototype.

And of course then the other wishlist items came out. Can we make it repeat? Yeah, sure, no problem. Can we have a slider to change the speed? Oh, I dunno, Google, Google, hack, hack, vid.playbackRate=$("#slider").val(), yep, looks like we can.

Can people add their own study notes, that then later get shown alongside the Bible text? Sure, we have the ELAN format which annotates video files, so we can add our own annotation elements to the downloaded XML file; I’ll just let people type in… oh, hang on, here’s a thought. These devices all have cameras. They can all do video capture. Why don’t we make it so that they enter their notes in sign language? It’s a first-class language, let’s treat it like one. Cut the cord. Let’s dispense with Japanese altogether, use icons with no words in the UI, and have a completely immersive JSL environment.

At this point the ViBi director started signing rather excitedly. I gave up and asked someone. “What’s he saying?” “He says, if you do this, it will change Deaf culture.” Well, hey, that’s all the reason I need. So that’s what I’m working on at the moment.

Now one of the guys in the room was the Asia Pacific sign language co-ordinator for a global language development/translation agency, and he says, well, if we can do that, then we have a tool I can give to sign language development fieldworkers; these are people who are working with Deaf communities around the world to help them to develop and/or standardize their own sign languages, primarily through translation. So with an app like this, fieldworkers can go out into communities with provisional translations to field-test them, get people to record their own corrections and responses, then take it back and review the comments later to produce an improved version. Of course they have a slightly bigger set of requirements for what is essentially a sign language fieldwork laboratory rather than just a simple video player, but what a great thing to be involved in. (Also they got me shiny test equipment to play with.)

I love my job.