Wednesday, 9 May 2012

Looks like I'm graduating

For the past seven weeks I've been studying intensely, preparing for my end of year exams. I've sat six so far and have three remaining. After some good news last week, I now know that these will be the last exams I ever sit as an undergraduate. Having been unconditionally accepted to start as a PhD student this autumn, I'm now graduating early with a bachelor's degree, rather than the master's degree I had always intended to aim for. It's a big step, but I'm excited about it. I enjoy research and now I'll get to do even more; there won't even be pesky lectures and coursework assignments getting in the way of it!

After my last exam a week on Friday I'll have three weeks of holiday before I start work for the summer. I'm going to make the most of it and try to relax; something I've hardly had the chance to do because of exams and work on my dissertation. It's tough (despite what Allie says...) but it always pays off in the end. In that time I'll also be writing a paper on my final year project. After an unsuccessful CHI submission, I'm really hoping that it'll get into ITS. Getting a publication out of my project work would be a rewarding bonus to what has been an intense, but hopefully successful, project.

Monday, 2 April 2012

Amazon Hackathon

This weekend was the Amazon Hackathon in Glasgow, the first of its kind in the UK. The idea was simple: get 50 students to show up in teams, give them 20 hours to innovate and create, and provide them with $50 of AWS credit and a stack of pizzas to power their ideas. What followed was anything but simple: a 20-hour frenzy of brainstorming and prototyping to create something we would eventually present to our peers and the "Amazonians". Some amazing ideas were demonstrated, with one team netting $1,000 in AWS credit to help kickstart their idea of a gifting-based social network.

I entered as part of Team Giraffa Cakes, a team of four from the University of Glasgow. Disappointingly, three of our team were the only 4th year students from Glasgow to enter. Our idea was to crowd-source information to help inform product comparisons. Users would be able to search for two products on Amazon and our web service would then gather relevant content from sources such as Twitter, Blogger and YouTube to help inform their decision. We implemented a system which did just that, performing sentiment analysis on the tweets and blog posts to give an overview of how people feel about those products. At a glance, users would be able to tell if opinion about each product was generally negative, neutral or positive. Information from the web service was made available to both an Android app, which I implemented, and a website frantically thrown together in record time by James.

Our Android app, showing the product overview.

The experience was an interesting one and was lots of fun, despite the intense desire for sleep that kicked in around 4am. To stay sane as the night went on, we found ourselves increasingly taking breaks just to get away from the computer. Cold pizza and instant coffee from a kettle of questionable hygiene proved to be a welcome respite, and every hour or so we went outside for a short break. As appreciated as the fresh air was, it was probably the darkness which was most welcome; a break for our eyes. I think the location of the event may have contributed to everyone's drowsiness. The lab it was held in is notoriously hot and stuffy; in retrospect, our team probably would've been more comfortable downstairs in our own, gloriously air-conditioned lab. Although then we'd have had to walk further for cold pizza.

Overall it was a blast. We had a lot of fun, created something we were all proud of and the icing on the cake was that our fellow hackers voted us the People's Choice. It was great to see so many ideas brought to completion in a single night - it's not uncommon to see university coursework that doesn't work after several weeks of work! Thanks for all the pizza and fresh fruit, Amazon... same time next year?

Thursday, 1 March 2012

Deadlines, workshops and twisted ankles

I've not written much here lately; it's been a hectic few weeks with university coursework deadlines. Thankfully most of them have now passed and I can get back to focussing on my project. My dissertation for my honours project is due three weeks tomorrow. I'm not too worried about it, having started in September last year! I've been doing most of the writing before actually doing the work (an approach I very much prefer for research), as I always feel more confident going forward with a planned and well-reasoned methodology.

January was a really good month for running, and February started really positively. Alas, for the last two weeks I've been unable to run because of a dodgy ankle. I'm avoiding running with it until it's recovered a bit more - I don't want to risk damaging it further and waiting even longer for it to heal.

A few weeks ago we ran the Glasgow CompSoc Android workshop (which I first mentioned in November last year). It was a fun experience. I've never really stood up and taught something to a group of people before, so that was quite exciting. I'd quite like to do some similar lectures/workshops next year for CompSoc.

My goals for March are to get my project out of the way, and hopefully pick up running again once my ankle recovers some more. Looking forward to wrapping up my honours project. It's been fun, and I'm sure there's a lot I'll be able to write about in a paper afterwards, but it'll be nice to take a break from everything. I love doing research, it's just a shame that lectures and deadlines have to interfere with it so much as an undergraduate!

Monday, 13 February 2012

Multimodal Android Development Part 1

This post is the first of two giving a brief introduction to creating multimodal interactions in Android applications. I'll briefly cover some of the SDK features available to you as an Android developer which you can use to create richer interactions in your apps. Example code will be quite concise because I assume you have at least a basic knowledge of Android development. Feel free to leave any comments suggesting how I can better explain these concepts, or to let me know if I've made any mistakes or omissions.

What is "multimodal" interaction?


Multimodal interaction, put simply, is interaction involving more than one modality, or channel of communication, such as sight, hearing and touch. For example, an application may provide a combination of visual and haptic (touch) feedback. These types of interaction design provide a number of benefits, for example allowing those with a sensory impairment to interact using other senses, or allowing interaction in contexts where one sense may be otherwise occupied.

One of the most ubiquitous examples of a multimodal interaction is the way in which mobile phones combine visual, audible and haptic feedback to inform users of a new text, phone call, etc. This combination of modalities is particularly useful when your phone is, say, in your pocket. Obviously you can't see the phone, but you will probably feel the phone vibrate or hear your ringtone as new notifications appear.
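As a rough sketch of what this looks like in code (assuming API level 11+, code running inside an Activity, and a hypothetical drawable resource called ic_message), a notification can request the system's default sound, vibration and LED behaviour all at once:

/* A minimal sketch: a notification combining visual, audible and
 * haptic feedback. R.drawable.ic_message is a hypothetical icon. */
Notification notification = new Notification.Builder(this)
    .setSmallIcon(R.drawable.ic_message)       /* visual */
    .setContentTitle("New message")
    .setContentText("You have one unread message.")
    .setDefaults(Notification.DEFAULT_SOUND    /* audible */
               | Notification.DEFAULT_VIBRATE  /* haptic */
               | Notification.DEFAULT_LIGHTS)  /* visual (LED) */
    .getNotification();  /* build() replaces this from API 16 */

NotificationManager manager =
    (NotificationManager) getSystemService(Context.NOTIFICATION_SERVICE);
manager.notify(1, notification);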

Haptic feedback in Android


Most handheld Android devices have some sort of vibration motor in them, allowing simple haptic feedback. Although not common in tablets (largely due to size constraints), virtually all modern Android phones have tactile feedback available. You can control the phone's vibrator through the Vibrator class. Note that in order to use this, your manifest must request the android.permission.VIBRATE permission.

/* Request the device's vibrator service. Remember to check
 * for null return value, in case this isn't available. */
Vibrator vibrator = (Vibrator) getSystemService(Context.VIBRATOR_SERVICE);

/* Two ways to control the vibrator:
 *  1. Turn on for a specific time
 *  2. Provide a vibration pattern */

/* 1. Vibrate for 200ms */
vibrator.vibrate(200);

/* 2. Vibrate for 200ms, pause for 100ms, vibrate for 300ms. */
long[] pattern = new long[] {0, 200, 100, 300};

/* Perform this pattern once only (repeat := -1). */
vibrator.vibrate(pattern, -1);

/* Vibrate for 200ms, followed by indefinite repeat of
 * 100ms pause followed by 300ms vibrate. Setting
 * repeat := 2 tells the vibrator to repeat at offset
 * 2 into the vibration pattern. */
vibrator.vibrate(pattern, 2);
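
Two things worth checking in practice: hasVibrator() (available from API level 11 onwards) tells you whether the device can vibrate at all, and cancel() stops a repeating pattern like the one above. Continuing from the snippet above, a quick sketch:

/* Check for vibration support before relying on it (hasVibrator()
 * was added in API level 11). */
if (vibrator == null || !vibrator.hasVibrator()) {
  /* No haptic feedback available; fall back to another modality. */
  return;
}

/* A repeating pattern continues until explicitly cancelled,
 * e.g. once the user has acknowledged the alert. */
vibrator.cancel();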

Touchscreen gestures


Using touchscreen gestures to interact with applications can be fun, efficient and useful when users may be unable to select a particular action on the screen. For example, it can be difficult to select a button on-screen while running or walking. A touch gesture, however, is a lot easier to perform and requires less precision from the user. The disadvantage with touch gestures is that, if not used sparingly, there may be too many for the user to remember!

Creating a set of gestures for your application is simple: create a gesture library on an Android Virtual Device using the Gesture Builder application (available on the AVD by default) and add a GestureOverlayView to your activity layout. In your activity, you just have to load the gesture library from your resources and implement an OnGesturePerformedListener.

private GestureLibrary mLibrary;

public void onCreate(Bundle savedInstanceState) {
  ...
  /* 1. Load gesture library from the res/raw/gestures file */
  mLibrary = GestureLibraries.fromRawResource(this, R.raw.gestures);

  if (!mLibrary.load()) {
    /* Error: unable to load from resources! */
    ...
  }

  /* 2. Find reference to the gesture overlay view */
  GestureOverlayView gov = (GestureOverlayView) findViewById(R.id.gestureOverlay);

  /* 3. Register callback for gesture input */
  gov.addOnGesturePerformedListener(this);
}

The callback method for gesture performance receives a Gesture as an argument. This can be used to obtain a list of predictions: the gestures in your library which Android thinks the input might match. With these predictions, you can use the prediction score (or contextual information) to determine which gesture the user was most likely to have performed. I find it useful to define a threshold for gesture acceptance, so that you can reject erroneous or inaccurate gestures. The best way to choose this threshold value is through trial and error: see what works for you and your gestures.

private static final double ACCEPTANCE_THRESHOLD = 10.0;

public void onGesturePerformed(GestureOverlayView overlay, Gesture gesture) {
  /* 1. Get list of gesture predictions */
  ArrayList<Prediction> predictions = mLibrary.recognize(gesture);

  if (predictions.size() > 0) {
    /* 2. Find highest scoring prediction */
    Prediction bestPrediction = predictions.get(0);

    for (int i = 1; i < predictions.size(); i++) {
      Prediction p = predictions.get(i);
      if (p.score > bestPrediction.score)
        bestPrediction = p;
    }

    /* 3. Decide if we'll accept this gesture */
    if (bestPrediction.score > ACCEPTANCE_THRESHOLD)
      gestureAccepted(bestPrediction.name);
  }
}

private void gestureAccepted(String gestureName) {
  /* Respond appropriately to the gesture name */
  ...
}
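
As a purely hypothetical example of how gestureAccepted might respond (the gesture names below are made up, and the original leaves this method open), you could confirm the gesture with a short burst of haptic feedback before dispatching on its name:

/* Hypothetical handler: the gesture names "next" and "previous"
 * are illustrative only. */
private void gestureAccepted(String gestureName) {
  /* Confirm recognition with a short vibration. */
  Vibrator vibrator = (Vibrator) getSystemService(Context.VIBRATOR_SERVICE);
  if (vibrator != null)
    vibrator.vibrate(50);

  if ("next".equals(gestureName)) {
    /* e.g. move to the next item */
  } else if ("previous".equals(gestureName)) {
    /* e.g. move to the previous item */
  }
}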