Adventures in NFC Land

One of the most exciting aspects of Google’s Android platform is the rapid release pace, so support for cutting-edge technologies can be included in the platform early on. As engineers, we’re excited about such features, because we like to tinker and test the limits of new technologies. But the real challenge is using these new capabilities to integrate easy-to-use features for end-users, so that everyone can be excited. The introduction of the Near-Field Communication (NFC) API in Android 2.3, accompanying the release of an NFC hardware feature in Google’s Nexus S, gave us the opportunity to do just that.

NFC is meant to send and receive small amounts of data. This data can be read from passive (non-powered) devices (e.g. credit cards or interactive posters), or active devices (e.g. payment kiosks). NFC can also be used to communicate between two NFC-equipped devices thanks to a protocol introduced by Google which defines a way for two active devices to exchange NFC messages that follow the NFC Data Exchange Format (NDEF) specification. This protocol (NDEF Push Protocol (NPP)) is implemented in Android 2.3.3 and beyond.

The NPP is very simple: it allows us to send an NDEF message to another device, which will then process the message as if it had been read from a passive tag. Thus, to understand how to use NPP, we need only to understand the NDEF message. An NDEF message is a collection of NDEF records. An NDEF record consists of a short header describing the record’s data payload, followed by the payload itself.

This new NFC/NPP capability enables the exchange of a few hundred bytes of information in a well-defined format between devices, using proximity as a method for instigation and authentication. In other words, we can now share a small amount of data between two devices without going through the normal steps of pairing or association that are required by more conventional (and higher-bandwidth) avenues like Bluetooth or WiFi—the fact that the two phones are in close proximity is enough evidence to convince the NFC software that the devices are eligible to receive information from each other.

The NFC API for Android handles all of the details of receiving and parsing NFC messages. It then decides what to do with the message by investigating the intent filters registered for applications on the device. In some cases the messages may have characteristics that can lead to finer-grained dispatching. In the case of an NDEF message, you can register for NFC messages at varying levels of detail: from as vague as any NDEF message to as detailed as an NDEF message containing a URI matching a given pattern. A pleasant result of this approach is that the operating system can have “catch-all” applications for messages that have unknown details.

Let’s consider a possible application of the NFC technology in the context of the doubleTwist app. If you’re listening to a song and you want to share the details of the song with your friend, how can NFC help with this? Well, we can use NFC to bundle up a small bit of information about the song in a well-defined format, which the other phone can then parse and use to display information about the song. We will pass the metadata in a single NDEF Record containing a JSON object.

To ensure that our message is routed to doubleTwist on the receiving phone, we rely on Android’s ability to dispatch messages based on the data type of the payload, signaled by setting the type name field to TNF_MIME_MEDIA. This indicates that the type field of the NDEF record should be a MIME type as specified by RFC 2046. To take advantage of this capability, we create our own custom MIME type to use in the type field. Since the NDEF message must be composed of bytes, we’ll also need to convert the strings into byte arrays. The code for all of this looks like this:

[php]//Assume that musicService is an interface to our music
//playback service
JSONObject songMetadata = new JSONObject();
String title = musicService.getCurrentTrackTitle();
String artist = musicService.getCurrentTrackArtist();
String album = musicService.getCurrentTrackAlbum();
songMetadata.put("title", title);
songMetadata.put("artist", artist);
songMetadata.put("album", album);

String mimeType = "application/x-doubletwist-taptoshare";
byte[] mimeBytes = mimeType.getBytes(Charset.forName("UTF-8"));
String data = songMetadata.toString();
byte[] dataBytes = data.getBytes(Charset.forName("UTF-8"));
byte[] id = new byte[0]; //We don't use the id field
NdefRecord r = new NdefRecord(
        NdefRecord.TNF_MIME_MEDIA, mimeBytes, id, dataBytes);
NdefMessage m = new NdefMessage(new NdefRecord[]{r});[/php]
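
The message still has to be handed to the NFC stack so that it is actually pushed when another phone comes into range. Here is a minimal sketch of how that can be wired up on Android 2.3.3 (API level 10) with the foreground NDEF push calls; assume this lives in the sharing activity and that m, the message we just built, is kept as a field:

[php]//Minimal sketch (Android 2.3.3 / API 10): push "m" over NPP while our
//activity is in the foreground, and stop pushing when it is paused.
@Override
protected void onResume() {
    super.onResume();
    NfcAdapter adapter = NfcAdapter.getDefaultAdapter(this);
    if (adapter != null) {
        adapter.enableForegroundNdefPush(this, m); //m is the NdefMessage built above
    }
}

@Override
protected void onPause() {
    super.onPause();
    NfcAdapter adapter = NfcAdapter.getDefaultAdapter(this);
    if (adapter != null) {
        adapter.disableForegroundNdefPush(this);
    }
}[/php]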

On the receiving side, our application registers an intent filter to define the activity that will handle NFC messages matching this MIME type. So, when another phone running doubleTwist receives our message, doubleTwist will have priority in processing it. To do this, we add an intent filter to the AndroidManifest.xml entry for the activity that should launch to handle this NDEF message:

[php]<activity android:name=".SongInfoShare">
    <intent-filter>
        <action android:name="android.nfc.action.NDEF_DISCOVERED" />
        <data android:mimeType="application/x-doubletwist-taptoshare" />
        <category android:name="android.intent.category.DEFAULT" />
    </intent-filter>
</activity>[/php]

The SongInfoShare activity will be started whenever the NFC radio receives our special message from another NFC-capable phone. In the onCreate method for this activity, we can check the action of the intent that started the activity to verify that it’s the NDEF_DISCOVERED action, and then use the normal intent helper methods to get the NFC message data from the Intent. Here’s what the code will look like for our info-sharing example:

[php]public class SongInfoShare extends Activity {

    private static final String TAG = "SongInfoShare";

    @Override
    protected void onCreate(Bundle b) {
        super.onCreate(b);
        Intent intent = getIntent();
        if (NfcAdapter.ACTION_NDEF_DISCOVERED.equals(intent.getAction())) {
            Parcelable[] msgs =
                    intent.getParcelableArrayExtra(NfcAdapter.EXTRA_NDEF_MESSAGES);
            if (msgs != null && msgs.length > 0) {
                NdefMessage m = (NdefMessage) msgs[0];
                NdefRecord[] r = m.getRecords();
                JSONObject sharedInfo = null;
                if (r != null && r.length > 0) {
                    try {
                        String payload = new String(r[0].getPayload(),
                                Charset.forName("UTF-8"));
                        sharedInfo = new JSONObject(payload);
                    } catch (JSONException e) {
                        Log.d(TAG, "Couldn't get JSON: ", e);
                    }
                }

                //At this point, we can use the contents of
                //sharedInfo to set up the contents of the activity.
            }
        }
    }
}[/php]

So, now we’re able to easily pass along song information to a friend by simply bringing the NFC radios close together. No cutting, pasting, tapping, or searching for options in menus!

So, what happens if our friend isn’t running doubleTwist? Well, the stock Android NFC handler app can help us. It will display any text or link records contained in the NDEF message it receives. So, for example, we can also share a pre-formatted link that will do a Google search for the song information. To achieve this, we can modify the end of our first code example as follows:

[php]NdefRecord r1 = new NdefRecord(
        NdefRecord.TNF_MIME_MEDIA, mimeBytes, id, dataBytes);
String query = URLEncoder.encode("\"" + artist + "\" \"" + title + "\"", "UTF-8");
String searchLink = "http://www.google.com/search?q=" + query;
byte[] searchBytes = searchLink.getBytes(Charset.forName("UTF-8"));
NdefRecord r2 = new NdefRecord(
        NdefRecord.TNF_ABSOLUTE_URI, searchBytes, id, new byte[0]); //URI goes in the type field
NdefMessage m = new NdefMessage(new NdefRecord[]{r1, r2});[/php]

Now, in the case where the receiving user doesn’t have doubleTwist installed, they will at least be presented with a link that they can click on to begin a Google search for more information about the song.

The first version of Google’s NDEF Push Protocol API has allowed us to bring some novel enhancements to our app using one of the newest available handset technologies. We are looking forward to further development of device-to-device NFC APIs that will allow even richer inter-device communication using just a tap!

The Benetech app

Benetech is a different kind of technology company, where we measure our success not on ROI, but on Return to Humanity. Benetech is a non-profit organization that builds software solutions to address large-scale, global social needs in literacy, human rights and the environment. Many of our software solutions are delivered via an open source model.

One of our latest literacy projects has been to develop an Android-based e-reader for people with print disabilities. Print disabilities, such as blindness, paralysis, or dyslexia, effectively prevent a person from reading traditional print books. Many of those people qualify for free or inexpensive access to books thanks to an exemption in U.S. copyright law called the Chafee Exemption. Bookshare, one of Benetech’s largest solutions, provides over 100,000 e-books in the accessible DAISY format (similar to ePub) to over 130,000 Chafee-qualified people in the U.S. Once downloaded from Bookshare, DAISY books can be consumed using Assistive Technology (AT), which employs Text-to-Speech (TTS) technology, electronic refreshable braille, or large fonts for low-vision users. Bookshare was originally built 10 years ago on a PHP architecture, was later migrated to a Java/Hibernate/MyBatis stack, and we recently moved the content repository to S3.

Bookshare has a public REST-based API, which enables AT developers to integrate their applications directly with the service. Through the API, an AT application can let a user search for books, browse by category or by recently added titles, and download a book packaged as either a DAISY file or a BRF file, the format commonly used by electronic refreshable Braille displays such as HumanWare’s BrailleNote. Being able to download directly from the AT application simplifies the potentially frustrating experience of having to manually transfer books from a PC to the AT software or device. The API supports anonymous use, which provides access to freely available books and open educational resources that have no copyright restrictions. Only qualified Bookshare members can access the copyrighted books and periodicals. To learn more about the Bookshare API and obtain a developer key, visit http://developer.bookshare.org/.
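
To give a feel for what an integration looks like, here is a rough sketch of an anonymous title search over HTTP. The resource path and query parameter below are placeholders rather than the documented API, and exception handling is omitted for brevity; see http://developer.bookshare.org/ for the real endpoints and for how to supply your developer key:

[php]//Rough sketch of an anonymous Bookshare API search in Java.
//NOTE: the path and query parameter are placeholders; consult
//http://developer.bookshare.org/ for the documented endpoints.
String apiKey = "YOUR_DEVELOPER_KEY"; //placeholder
String query = URLEncoder.encode("pride and prejudice", "UTF-8");
URL url = new URL("https://api.bookshare.org/book/search/" + query
        + "?api_key=" + apiKey); //placeholder path

HttpURLConnection conn = (HttpURLConnection) url.openConnection();
BufferedReader in = new BufferedReader(
        new InputStreamReader(conn.getInputStream(), "UTF-8"));
StringBuilder response = new StringBuilder();
String line;
while ((line = in.readLine()) != null) {
    response.append(line);
}
in.close();
conn.disconnect();
//response now holds the result list, ready to be parsed and shown
//to the user (or fed into a download request for a chosen title).[/php]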

Recently Benetech challenged a group of volunteers to build a free, open source, mobile and accessible e-reader that leverages the Bookshare API. The volunteers chose to extend FBReaderJ, a popular open source e-reader for Android that leverages Android’s TTS API (android.speech.tts.TextToSpeech). The project is a work in progress, but so far the volunteers have added DAISY format support and Bookshare API integration. They are now working on improving the accessibility of the application and are evaluating different user interaction experiences to make it easy for print-disabled users to access books. To learn more about Android accessibility, check out http://eyes-free.googlecode.com/, and to follow or contribute to the project, visit http://github.com/amahule/fbreaderj.
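
As a rough illustration of the TTS side (a sketch, not code from FBReaderJ itself), reading a passage aloud with android.speech.tts.TextToSpeech looks roughly like this:

[php]//A minimal sketch (not from FBReaderJ) of reading text aloud with
//Android's built-in TTS engine.
public class ReadAloudActivity extends Activity
        implements TextToSpeech.OnInitListener {

    private TextToSpeech tts;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        tts = new TextToSpeech(this, this); //initialization is asynchronous
    }

    @Override
    public void onInit(int status) {
        if (status == TextToSpeech.SUCCESS) {
            tts.setLanguage(Locale.US);
            tts.speak("It is a truth universally acknowledged...",
                    TextToSpeech.QUEUE_FLUSH, null);
        }
    }

    @Override
    protected void onDestroy() {
        tts.shutdown(); //release the TTS engine
        super.onDestroy();
    }
}[/php]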

Ultimately, we believe this Android e-reader could also benefit people who don’t qualify under Chafee but who have other disabilities, such as Attention Deficit Hyperactivity Disorder (ADHD), that make it hard for them to enjoy printed or even traditional e-books. Furthermore, TTS technology and the Google Translate API may help us use inexpensive Android devices to distribute valuable knowledge locked up in print to illiterate populations in developing countries. Access to knowledge for these populations will be critical to the success of emerging democracies.

We welcome ideas you may have about our app or Benetech in general. We particularly welcome anyone interested in contributing product development skills to our Android e-reader project or any other open source projects Benetech is working on regarding literacy, human rights or the environment.

Android: Moving accessibility forward

For the last 8 years we at Code Factory have been making software that helps the blind and the visually impaired access their mobile phones. We’ve created this software for several different platforms. Last year we decided it was time to start doing something for the Android platform, due to its growing popularity and variety of devices.

From our past experience, developing a screen reader for a new platform required a lot of work, hacks, and investigation. Almost none of the previous platforms we supported implemented any sort of Accessibility API that we could use. Android, we thought, would be no exception to this rule. We were very wrong.

Starting at version 1.6, the Android operating system comes with a built-in Accessibility API that makes our application a lot easier to develop. All you do is create a service that extends the AccessibilityService class, declare it in your manifest, and voilà! The system will start sending events, such as button presses, list navigation, focus changes, etc. to your service. You then convert this information to voice using a Text-to-Speech engine, and you have a screen reader.
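
A bare-bones sketch of such a service might look like this (speak() stands in for whatever TTS plumbing the real screen reader uses):

[php]//A bare-bones sketch of an accessibility service; speak() is a stand-in
//for the real TTS plumbing.
public class TalkingService extends AccessibilityService {

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        //The system calls this for focus changes, clicks, selections, etc.
        List<CharSequence> text = event.getText();
        if (text != null && !text.isEmpty()) {
            speak(text.get(0).toString()); //read the event's text aloud
        }
    }

    @Override
    public void onInterrupt() {
        //Stop any speech in progress when the system interrupts feedback.
    }

    private void speak(String s) {
        //Hand the string to a TextToSpeech instance (omitted for brevity).
    }
}[/php]

The service also has to be declared in AndroidManifest.xml with an intent filter for the android.accessibilityservice.AccessibilityService action so the system can bind to it.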

The Accessibility API is not yet as complete as what you can find on a desktop PC, but it’s good enough to provide the users with basic user interface navigation, and we have no doubt that, as the Android platform evolves, so will the built-in Accessibility API.

We also wanted our application to go beyond a screen reader and provide an intuitive, easy-to-use UI that allowed the blind and visually impaired access to most of the phone’s functionality, such as messaging, web browsing, contact management, and so on.

We were pleased to see that we could do this on Android. The existing set of UI controls, such as buttons and lists, can be overridden in order to provide custom functionality, such as speaking the text of the control. This made it possible for us to keep the user interface of our application consistent with Android, while at the same time providing the speech feedback that our users require.
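
For example, one way to do this (a sketch, not Code Factory’s actual implementation) is to subclass a standard control and speak its label when it gains focus:

[php]//A sketch (not Code Factory's actual code) of a standard control
//overridden to speak its label when it receives focus.
public class SpeakingButton extends Button {

    public SpeakingButton(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    @Override
    protected void onFocusChanged(boolean gainFocus, int direction,
            Rect previouslyFocusedRect) {
        super.onFocusChanged(gainFocus, direction, previouslyFocusedRect);
        if (gainFocus) {
            speak(getText().toString()); //announce the button's label
        }
    }

    private void speak(String s) {
        //Forward to the app's shared TextToSpeech instance (omitted).
    }
}[/php]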

By intercepting touch events within our application and using the gesture detectors that Android provides to developers, we were also able to make the touch screen accessible to our users, so they can use gestures like swipes to move through list items, or double-taps to activate them.
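
As an illustration (again a sketch rather than the shipping code), Android’s GestureDetector makes it straightforward to map flings and double-taps onto navigation actions inside an activity; the moveToNextItem()/moveToPreviousItem()/activateCurrentItem() helpers below are hypothetical:

[php]//A sketch of mapping touch gestures onto navigation actions with
//Android's GestureDetector; the navigation helpers are hypothetical.
GestureDetector detector = new GestureDetector(this,
        new GestureDetector.SimpleOnGestureListener() {
    @Override
    public boolean onFling(MotionEvent e1, MotionEvent e2,
            float velocityX, float velocityY) {
        if (velocityX < 0) {
            moveToNextItem();     //swipe left: next list item
        } else {
            moveToPreviousItem(); //swipe right: previous list item
        }
        return true;
    }

    @Override
    public boolean onDoubleTap(MotionEvent e) {
        activateCurrentItem();    //double-tap: activate the focused item
        return true;
    }
});

@Override
public boolean onTouchEvent(MotionEvent event) {
    //Route all touch events through the detector.
    return detector.onTouchEvent(event);
}[/php]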

We really like how much we can accomplish with Android with so little code. Want to let a blind person create an SMS or email using voice? Simply use the SpeechRecognizer class. Want blind users who are walking on the street to know their exact location? Just use the LocationManager and Geocoder classes to give their exact street name and number.
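
For instance, turning the last known location fix into a spoken street address takes only a few lines (a sketch; the speak() helper and the location permission are assumed, and error handling is pared down):

[php]//A sketch of announcing the user's current street: get the last known
//fix from LocationManager and reverse-geocode it with Geocoder.
LocationManager lm =
        (LocationManager) getSystemService(Context.LOCATION_SERVICE);
Location loc = lm.getLastKnownLocation(LocationManager.GPS_PROVIDER);
if (loc != null) {
    Geocoder geocoder = new Geocoder(this, Locale.getDefault());
    try {
        List<Address> results = geocoder.getFromLocation(
                loc.getLatitude(), loc.getLongitude(), 1);
        if (!results.isEmpty()) {
            Address a = results.get(0);
            //e.g. "1600 Amphitheatre Parkway"
            String street = a.getSubThoroughfare() + " " + a.getThoroughfare();
            speak("You are near " + street); //speak() hands off to TTS
        }
    } catch (IOException e) {
        speak("Location lookup failed");
    }
}[/php]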

Android lets us do a lot in a very efficient way. It wraps a whole bunch of cool technology into well-defined classes and interfaces. And if at any given time you need to know how something works behind the scenes, you just take a look at the source code, which is freely available to everyone.