This post is the first of two giving a brief introduction to creating multimodal interactions in Android applications. I'll cover some of the SDK features available to you as an Android developer for creating richer interactions in your apps. The example code is quite concise, because I assume you have at least a basic knowledge of Android development. Feel free to leave any comments suggesting how I can better explain these concepts, or to let me know if I've made any mistakes or omissions.
What is "multimodal" interaction?
Multimodal interaction, put simply, is interaction involving more than one modality (e.g. multiple senses). For example, an application may provide a combination of visual and haptic (touch) feedback. This type of interaction design provides a number of benefits: for example, it allows people with sensory impairments to interact using other senses, and it allows interaction in contexts where one sense is otherwise occupied.
One of the most ubiquitous examples of a multimodal interaction is the way in which mobile phones combine visual, audible and haptic feedback to inform users of a new text, phone call, etc. This combination of modalities is particularly useful when your phone is, say, in your pocket. Obviously you can't see the phone, but you will probably feel the phone vibrate or hear your ringtone as new notifications appear.
Haptic feedback in Android

Most handheld Android devices contain a small rotation motor, which allows for simple haptic feedback. Although vibration motors are not common in tablets (largely due to size constraints), all modern Android phones have tactile feedback available. You can control the phone's vibrator through the Vibrator class. Note that in order to use this, your manifest must request the android.permission.VIBRATE permission.
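The permission declaration goes inside the <manifest> element of your AndroidManifest.xml:

<uses-permission android:name="android.permission.VIBRATE" />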
/* Request the device's vibrator service. Remember to check
 * for null return value, in case this isn't available. */
Vibrator vibrator = (Vibrator) getSystemService(Context.VIBRATOR_SERVICE);

/* Two ways to control the vibrator:
 * 1. Turn on for a specific time
 * 2. Provide a vibration pattern */

/* 1. Vibrate for 200ms */
vibrator.vibrate(200);

/* 2. Vibrate for 200ms, pause for 100ms, vibrate for 300ms. */
long[] pattern = new long[] {0, 200, 100, 300};

/* Perform this pattern once only (repeat := -1). */
vibrator.vibrate(pattern, -1);

/* Vibrate for 200ms, followed by indefinite repeat of
 * 100ms pause followed by 300ms vibrate. Setting
 * repeat := 2 tells the vibrator to repeat at offset
 * 2 into the vibration pattern. */
vibrator.vibrate(pattern, 2);
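One thing worth noting: a repeating pattern like the last call above continues until it is explicitly stopped, so you'll want to call cancel() at some point:

/* Stop any ongoing vibration, e.g. once the user has
 * acknowledged the notification. */
vibrator.cancel();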
Touchscreen gestures
Using touchscreen gestures to interact with applications can be fun, efficient and useful when users may be unable to select a particular action on the screen. For example, it can be difficult to select a button on-screen while running or walking. A touch gesture, however, is a lot easier to perform and requires less precision from the user. The disadvantage of touch gestures is that, if not used sparingly, there may be too many for the user to remember!
Creating a set of gestures for your application is simple: create a gesture library on an Android Virtual Device using the Gesture Builder application (available on the AVD by default) and add a GestureOverlayView to your activity layout. In your activity, you just have to load the gesture library from your resources and implement an OnGesturePerformedListener.
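For reference, a minimal layout entry might look something like this (the ID here is my own choice, but it matches the R.id.gestureOverlay lookup in the activity code below); the overlay can fill the screen or wrap your existing views:

<android.gesture.GestureOverlayView
    android:id="@+id/gestureOverlay"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />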
private GestureLibrary mLibrary;

public void onCreate(Bundle savedInstanceState) {
    ...
    /* 1. Load gesture library from the res/raw/gestures file */
    mLibrary = GestureLibraries.fromRawResource(this, R.raw.gestures);
    if (!mLibrary.load()) {
        /* Error: unable to load from resources! */
        ...
    }

    /* 2. Find reference to the gesture overlay view */
    GestureOverlayView gov = (GestureOverlayView) findViewById(R.id.gestureOverlay);

    /* 3. Register callback for gesture input */
    gov.addOnGesturePerformedListener(this);
}
The callback method for gesture performance receives a Gesture as an argument. This can be used to obtain a list of predictions: which gestures in your library that Android thought the gesture was. With these predictions, you can use the prediction score (or contextual information) to determine which gesture the user was most likely to have performed. I find it useful to define a threshold for gesture acceptance, so that you can reject erroneous or inaccurate gestures. The best way to choose this threshold value is through trial and error: see what works for you and your gestures.
private static final double ACCEPTANCE_THRESHOLD = 10.0;

public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
    /* 1. Get list of gesture predictions */
    ArrayList<Prediction> predictions = mLibrary.recognize(gesture);

    if (predictions.size() > 0) {
        /* 2. Find highest scoring prediction */
        Prediction bestPrediction = predictions.get(0);
        for (int i = 1; i < predictions.size(); i++) {
            Prediction p = predictions.get(i);
            if (p.score > bestPrediction.score)
                bestPrediction = p;
        }

        /* 3. Decide if we'll accept this gesture */
        if (bestPrediction.score > ACCEPTANCE_THRESHOLD)
            gestureAccepted(bestPrediction.name);
    }
}

private void gestureAccepted(String gestureName) {
    /* Respond appropriately to the gesture name */
    ...
}