Android: Moving accessibility forward

For the last 8 years we at Code Factory have been making software that helps the blind and the visually impaired access their mobile phones. We’ve created this software for several different platforms. Last year we decided it was time to start doing something for the Android platform, due to its growing popularity and variety of devices.

From our past experience, developing a screen reader for a new platform required a lot of work, hacks, and investigation. Almost none of the previous platforms we supported implemented any sort of Accessibility API that we could use. Android, we thought, would be no exception to this rule. We were very wrong.

Starting with version 1.6, the Android operating system comes with a built-in Accessibility API that makes our application a lot easier to develop. All you do is create a service that extends the AccessibilityService class, declare it in your manifest, and voilà! The system will start sending events, such as button presses, list navigation, and focus changes, to your service. You then convert this information to speech using a Text-to-Speech engine, and you have a screen reader.
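As a minimal sketch of that flow, the service below extends AccessibilityService and speaks incoming events through the Text-to-Speech engine (also introduced in Android 1.6). The class name SpeakingService is hypothetical, and the service would also need to be declared in AndroidManifest.xml with an intent filter for android.accessibilityservice.AccessibilityService; the event-to-speech logic here is illustrative, not our production code.

```java
import android.accessibilityservice.AccessibilityService;
import android.speech.tts.TextToSpeech;
import android.view.accessibility.AccessibilityEvent;

// Hypothetical minimal screen-reader service: the system binds to it and
// delivers UI events, which we forward to a Text-to-Speech engine.
public class SpeakingService extends AccessibilityService
        implements TextToSpeech.OnInitListener {

    private TextToSpeech tts;

    @Override
    protected void onServiceConnected() {
        // Initialize the TTS engine once the system connects to the service.
        tts = new TextToSpeech(this, this);
    }

    @Override
    public void onInit(int status) {
        // TTS engine is ready (or failed to initialize) at this point.
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Button presses, focus changes, list navigation, etc. arrive here.
        CharSequence text = !event.getText().isEmpty()
                ? event.getText().get(0)
                : event.getContentDescription();
        if (tts != null && text != null) {
            tts.speak(text.toString(), TextToSpeech.QUEUE_FLUSH, null);
        }
    }

    @Override
    public void onInterrupt() {
        // The system asks us to stop feedback, e.g. when speech is stale.
        if (tts != null) tts.stop();
    }
}
```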

The Accessibility API is not yet as complete as what you can find on a desktop PC, but it’s good enough to provide the users with basic user interface navigation, and we have no doubt that, as the Android platform evolves, so will the built-in Accessibility API.

We also wanted our application to go beyond a screen reader and provide an intuitive, easy-to-use UI that allowed the blind and visually impaired access to most of the phone’s functionality, such as messaging, web browsing, contact management, and so on.

We were pleased to see that we could do this on Android. The existing set of UI controls, such as buttons and lists, can be subclassed in order to provide custom functionality, such as speaking the text of the control. This made it possible for us to keep the user interface of our application consistent with Android, while at the same time providing the speech feedback that our users require.
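For instance, a standard Button can be subclassed to announce its label whenever it gains focus. SpeakingButton and the speak() helper below are hypothetical names we use for illustration; in a real application speak() would call into a shared TextToSpeech instance like the one in the service above.

```java
import android.content.Context;
import android.graphics.Rect;
import android.util.AttributeSet;
import android.widget.Button;

// Hypothetical subclass of a standard control: behaves exactly like a
// Button, but speaks its text when it receives focus.
public class SpeakingButton extends Button {

    public SpeakingButton(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    protected void onFocusChanged(boolean gainFocus, int direction,
                                  Rect previouslyFocusedRect) {
        super.onFocusChanged(gainFocus, direction, previouslyFocusedRect);
        if (gainFocus) {
            speak(getText().toString());
        }
    }

    private void speak(String text) {
        // Illustrative stub: route the text to the app's shared TTS engine.
    }
}
```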

By intercepting touch events within our application and using the gesture detectors that Android provides to developers, we were also able to make the touch screen accessible to our users, so they can use gestures like swipes to move through list items, or double-taps to activate items.
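Android's GestureDetector class does most of the work here. The listener below is a sketch of the idea, with the navigation calls left as hypothetical placeholder comments: a horizontal fling moves through a list, and a double-tap activates the current item.

```java
import android.view.GestureDetector;
import android.view.MotionEvent;

// Sketch of a gesture listener that maps swipes and double-taps to
// list navigation; the item-navigation calls are illustrative stubs.
public class AccessibleGestureListener
        extends GestureDetector.SimpleOnGestureListener {

    @Override
    public boolean onFling(MotionEvent e1, MotionEvent e2,
                           float velocityX, float velocityY) {
        if (velocityX > 0) {
            // Swipe right: move to the previous list item (hypothetical call).
            // moveToPreviousItem();
        } else {
            // Swipe left: move to the next list item (hypothetical call).
            // moveToNextItem();
        }
        return true;
    }

    @Override
    public boolean onDoubleTap(MotionEvent e) {
        // Double-tap activates the currently selected item (hypothetical call).
        // activateCurrentItem();
        return true;
    }
}

// In an Activity, you would create
//   GestureDetector detector = new GestureDetector(this, new AccessibleGestureListener());
// and forward touch events to it from onTouchEvent(MotionEvent event)
// by returning detector.onTouchEvent(event).
```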

We really like how much we can accomplish with Android with so little code. Want to let a blind person create an SMS or email using voice? Simply use the SpeechRecognizer class. Want blind users who are walking on the street to know their exact location? Just use the LocationManager and Geocoder classes to give their exact street name and number.
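The location case really is that short. The helper below is a hypothetical sketch, assuming the app holds the location permissions: it asks LocationManager for the last known fix and uses Geocoder to reverse-geocode it into a street address that can then be spoken aloud.

```java
import android.content.Context;
import android.location.Address;
import android.location.Geocoder;
import android.location.Location;
import android.location.LocationManager;
import java.io.IOException;
import java.util.List;
import java.util.Locale;

// Hypothetical helper: turn the device's last known position into a
// human-readable street address suitable for speech output.
public class WhereAmI {

    public static String describe(Context context) throws IOException {
        LocationManager lm = (LocationManager)
                context.getSystemService(Context.LOCATION_SERVICE);
        Location loc = lm.getLastKnownLocation(LocationManager.GPS_PROVIDER);
        if (loc == null) {
            return "Location unavailable";
        }
        // Geocoder converts latitude/longitude into postal addresses.
        Geocoder geocoder = new Geocoder(context, Locale.getDefault());
        List<Address> results = geocoder.getFromLocation(
                loc.getLatitude(), loc.getLongitude(), 1);
        if (results == null || results.isEmpty()) {
            return "Address unknown";
        }
        // The first address line typically holds the street name and number.
        return results.get(0).getAddressLine(0);
    }
}
```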

Android lets us do a lot in a very efficient way. It wraps a whole bunch of cool technology into well-defined classes and interfaces. And if at any given time you need to know how something works behind the scenes, you just take a look at the source code, which is freely available to everyone.

A new look for Google Translate for Android

When we launched the first version of Google Translate for Android in January 2010, we were excited about the year ahead. For the first time, we were bringing the capabilities supported on Google Translate—like machine translation, romanization of non-Roman scripts and spoken translations—to the Android platform. We also offered voice input to let you speak the word or phrase you wanted to translate instead of typing it in, and SMS translation so you could translate SMS messages sent to you in foreign languages.

Today, we’re refreshing Translate for Android with several updates to make the app easier to interact with. Among other improvements, we’ve created better dropdown boxes to help select the languages you want to translate from and into, an improved input box, and cleaner icons and layout.

We also want to let you in on an experimental feature that’s still in its earliest stages—Conversation Mode. This is a new interface within Google Translate that’s optimized to allow you to communicate fluidly with a nearby person in another language. You may have seen an early demo a few months ago, and today you can try it yourself on your Android device.

Currently, you can only use Conversation Mode when translating between English and Spanish. In Conversation Mode, simply press the microphone for your language and start speaking. Google Translate will translate your speech and read the translation out loud. Your conversation partner can then respond in their language, and you’ll hear the translation spoken back to you. Because this technology is still in alpha, factors like regional accents, background noise or rapid speech may make it difficult to understand what you’re saying. Even with these caveats, we’re excited about the future promise of this technology to help people connect across languages.

As Android devices have spread across the globe, we’ve seen Translate for Android used all over. The majority of our usage now comes from outside the United States, and we’ve seen daily usage from more than 150 countries, from Malaysia to Mexico to Mozambique. It’s really rewarding for us to see how this new platform is helping us break down language barriers the world over.

Translate supports 53 languages, from Afrikaans to Yiddish, and voice input for 15 languages. You can download the application, available for devices running Android 2.1 and above, by searching for “Google Translate” in Android Market or by scanning the QR Code below.