Google I/O Sandbox Case Study: MobileASL

Uploaded by GoogleDevelopers on 24.06.2011

♪ [upbeat music plays]
Jessica Tran: MobileASL is an experimental application
that provides real-time sign language communication on cell phones.
So this allows deaf people to communicate in real-time using their mobile devices.
And what makes us unique is that we're able to transmit video
at very low bandwidth using the 3G cellular network.
Jaehong Chon: Well, about five years ago,
we started the project on different operating systems,
but eventually we switched to Android
because more people use Android devices than any other existing platform.
And the Android API and development environment
are much easier to work with than other operating systems'.
So I can easily access the front-facing camera,
and I can easily use the APIs to display the video
and to send the data over the network.
Jessica Tran: Since we're only sending video,
we're able to toss out the audio.
So that already removes a lot of the data.
And we're doing a lot of different video compression techniques to send the video.
For instance, we're sending video
at 30 kilobits per second and 15 frames per second.
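To put those figures in perspective, a quick back-of-the-envelope calculation (the numbers come from the interview; the arithmetic is just illustration) shows how tight the per-frame bit budget is:

```python
# Per-frame bit budget implied by the rates quoted above:
# 30 kbit/s total, spread over 15 frames per second.
BITRATE_BPS = 30_000   # 30 kilobits per second
FPS = 15               # frames per second

bits_per_frame = BITRATE_BPS // FPS    # 2000 bits per frame
bytes_per_frame = bits_per_frame // 8  # 250 bytes per frame

print(bits_per_frame, bytes_per_frame)  # 2000 250
```

At roughly 250 bytes per frame, every bit matters, which is why the region-of-interest encoding described next is so important.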
And we also do region of interest encoding.
So we identify the skin pixels,
such as the hands and the face,
and when we transmit the video,
more bits are devoted to those regions than to the background.
So, the hands and the face are more clear,
but the background is sacrificed.
Which is fine,
because it's really important for people to see what's being conveyed
through the hands and the face.
Jaehong Chon: Basically, there are a lot of applications for the same purpose,
like Skype and other such programs.
But none of them provide the same accessibility through the cellular network.
So our goal is to provide the same capability anywhere, anytime.