Wednesday, 23 November 2011

Running Visualisations

A heatmap of every run in November so far

Tonight I've been playing around with a Python script which generates heatmaps from GPX files. The image above is a composite of my last 10 runs in November, built from GPX traces recorded on my phone. It's fairly easy to see where I run most often.

What would be really cool is a way to visualise speed at a given point. The image above uses colour intensity to encode frequency of location: the more intense the colour, the more often I've run there. It would be interesting to use colour to encode movement speed instead. For example, you could calculate the average speed across each run being visualised, use different colours to represent above and below average, and vary the saturation to show how far above or below the average you were at that point.
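
Something along these lines could work for the mapping. This is only a sketch, written in Haskell rather than the Python the script uses, and the HSV colour representation and the normalisation by the largest deviation are assumptions of mine rather than anything already implemented:

-- Rough sketch of the colour-encoding idea: red when faster than the
-- average speed, blue when slower, with saturation scaled by how far
-- from the average the point is.
type HSV = (Double, Double, Double)   -- (hue in degrees, saturation, value)

speedColour :: Double   -- average speed across the visualised runs
            -> Double   -- largest deviation from the average, used to normalise
            -> Double   -- speed at this point
            -> HSV
speedColour avg maxDev v = (hue, sat, 1.0)
  where
    hue = if v >= avg then 0 else 240      -- red above average, blue below
    sat = min 1 (abs (v - avg) / maxDev)   -- more saturated = further from average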

Although I've seen a few existing methods of visualising speed (typically line charts), I've yet to see one which shows the relationship between speed and location. Endomondo and similar websites approach the issue by showing a map and a separate chart for speed, and moving your mouse over either highlights the corresponding location on the map or speed on the chart. This exploratory method doesn't really give a good overview of the information.

This has the makings of a potential side project...

Tuesday, 15 November 2011

Android workshop and Surface

It feels unusually warm for November, which has made the past week quite pleasant for running. I've gotten 4 runs in over the last week, and I'm hoping to keep up at least 3 runs a week until the end of the semester. Dr Cutts' talk last Wednesday on computing science education has caused a bit of introspection on how I use my time, and has made me realise that I don't always spend it wisely. I'm a workaholic: I get lots of work done, and I consider myself to be quite well organised. But maybe I could achieve similar things in a lot less time. Lately I've been focusing more on coursework, really trying to get as much done as early as possible. I've never had to pull an all-nighter working towards a deadline before, and I certainly don't plan to start any time soon.

I'm excited for the start of Week 12 because, aside from the obvious reason of having no more deadlines, I'm likely going to be putting on an Android development workshop in the School of Computing Science along with another classmate. It'll be cool to give something back like that, and hopefully attendance will be pretty decent. I'd certainly hope so, given that the Mobile Software Engineering degree has over 5 times as many students this year as last. The more Android projects I work on, the more I notice patterns emerging and opportunities to re-use code. Things have gotten to the point now where any Android project I do is about 50% re-used code. I think James and I have a decent amount of experience to offer and can help teach other developers how to address problems that we've already encountered.

Week 12 is also the start of a fortnight dedicated to project work. My project at the moment is in quite a good state, I reckon. As far as implementation is concerned, I'm well ahead of schedule and the main technical concerns have been addressed. I don't know if I've mentioned it before, but I'm working with Microsoft Surface this year, and one of the technical challenges I'm approaching is how to display information when the Surface has stuff on top of it. I find this an interesting problem because it's only natural that a tabletop computer has to remain usable at the same time as being used as a table.

So far I've been iteratively developing a prototype which displays a shape in the largest unoccluded space, and I've just started to animate this shape as it moves around the tabletop in response to objects being placed on or removed from the Surface. There's some really cool stuff going on to make this work, and we (my project supervisor and I) are probably going to submit a work-in-progress paper to CHI2012 about our research so far. It's a novel area of research, and it'd be the highlight of my academic "career" if that paper gets accepted.

Here's a terrible quality video I took earlier showing a prototype in action.


Wednesday, 26 October 2011

Left-recursion in Parsec

Lately I've been using the Parsec library for Haskell to write a parser and interpreter for a university assignment. Right-recursive grammars are trivial to parse with combinator parsers; tail recursion and backtracking make this simple. However, implementing a left-recursive grammar will often result in an infinite loop, as is the case in Parsec when using the basic parsers.

However, Parsec does provide a way to deal with left recursion. Unsatisfied with the lack of good tutorials when I googled for advice, I decided to write one myself. Hopefully it helps someone. If I can make this better or easier to understand, please let me know!

Left-recursive parsing can be achieved in Parsec using chainl1.

chainl1 :: Stream s m t => ParsecT s u m a -> ParsecT s u m (a -> a -> a) -> ParsecT s u m a
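
In plain terms, chainl1 p op parses one or more occurrences of p separated by op, and folds the results together left-associatively. As a quick illustration (my example, not from the Parsec documentation), here's a tiny parser which sums a chain of single digits like "1+2+3":

-- Parses "1+2+3" as ((1 + 2) + 3) and returns 6.
-- Assumes the usual Parsec imports used throughout this post.
sumDigits :: Parser Int
sumDigits = chainl1 digitValue plusOp
  where
    digitValue = do d <- digit
                    return (read [d])
    plusOp     = do _ <- char '+'
                    return (+)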

As an example of how to use chainl1, I'll demonstrate its use in parsing basic integer addition and subtraction expressions.

First we'll need an abstract syntax tree to represent an integer expression. This can represent a single integer constant, or an addition / subtraction operation which involves two integer expressions.

data IntExp = Const Int
            | Add IntExp IntExp
            | Sub IntExp IntExp 

If addition and subtraction were to be right-associative, we'd parse the left operand as a constant, and attempt to parse the right operand as another integer expression. Upon failing, we'd backtrack and instead attempt to parse an integer constant. Reversing this approach to make the expressions left-associative would cause infinite recursion; we'd attempt to parse the left operand as an integer expression, which attempts to parse the left operand as an integer expression, which tries to... you get the point.
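
To make the problem concrete, here's roughly what that naive left-recursive version would look like (my illustration, not code from the assignment; parseIntExpNaive is a hypothetical name). It calls itself before consuming any input, so it never terminates:

-- Don't do this: parseIntExpNaive recurses into itself before consuming
-- a single character, so the parser loops forever.
parseIntExpNaive :: Parser IntExp
parseIntExpNaive = try parseAdd <|> parseConstant
  where
    parseAdd = do left  <- parseIntExpNaive   -- immediate left recursion
                  _     <- char '+'
                  right <- parseConstant
                  return (Add left right)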

Instead we use chainl1 with two parsers: one to parse an integer constant, and another which parses a symbol and determines whether the expression is an addition or a subtraction.

parseIntExp :: Parser IntExp
parseIntExp =
  chainl1 parseConstant parseOperation

parseOperation :: Parser (IntExp -> IntExp -> IntExp)
parseOperation =
  do spaces
     symbol <- char '+' <|> char '-'
     spaces
     case symbol of
       '+' -> return Add
       '-' -> return Sub

parseConstant :: Parser IntExp
parseConstant =
  do xs <- many1 digit
     return $ Const (read xs :: Int)

Here, parseOperation returns either the Add or Sub constructor of IntExp. Using GHCi, you can confirm that the type of Add is:

Add :: IntExp -> IntExp -> IntExp

So, we have one parser for constants and another which reads an operator symbol and determines what kind of operation the expression is. In parseIntExp, chainl1 is the glue which brings these together, and it's what allows left-associative parsing without infinite recursion.
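
For intuition, here's an informal trace (mine, not output from the code) of how chainl1 builds up the tree for "2 + 3 - 4":

-- parseConstant                       ==> Const 2
-- parseOperation, then parseConstant  ==> Add (Const 2) (Const 3)
-- parseOperation, then parseConstant  ==> Sub (Add (Const 2) (Const 3)) (Const 4)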

A complete code sample is available here. The abstract syntax tree has been made an instance of the Show typeclass so that it prints in a more readable format, which makes it easy to see that the parse is indeed left-associative.
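
For reference, the Show instance and the run helper used in the example below might look something like this (a sketch of mine; the linked sample may differ in the details):

import Text.Parsec
import Text.Parsec.String (Parser)

-- Pretty-print the tree with explicit parentheses so the associativity
-- of the parse is visible in the output.
instance Show IntExp where
  show (Const n) = show n
  show (Add l r) = "(" ++ show l ++ "+" ++ show r ++ ")"
  show (Sub l r) = "(" ++ show l ++ "-" ++ show r ++ ")"

-- Run a parser against a string and print either the error or the result.
run :: Show a => Parser a -> String -> IO ()
run p input =
  case parse p "" input of
    Left err     -> print err
    Right result -> print result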

ghci>  run parseIntExp "2 + 3 - 4"
((2+3)-4)

Thursday, 13 October 2011

Another reason I like C#

A lot of people seem to be finding their way here because of my post about finding the maximal area submatrix in a binary matrix in C#. I found this post from a couple of weeks ago which shows the same idea in Haskell. It's interesting to see a similar idea presented in a different language, especially one as awesome as Haskell.

After a couple of hours of writing C# earlier, I found a feature of the language that I really like: indexers. They allow array-style indexing of user-defined classes, which is particularly useful for data structures. I really like that something which has typically just been syntactic sugar for array access is available to developers. Why? Because we're too lazy to write foo.get(x, y) when foo[x, y] is also available.

Wednesday, 21 September 2011

Learning to play harmonica

My new harmonica.
As it turns out, lectures don't start until Monday, so (excluding a short workshop tomorrow afternoon) I don't really start back at university this week. That's given me an extra few days to recover and get over this illness. My cough is almost gone, thankfully.

I bought a harmonica a week ago and I love the wee thing. I seem to have picked it up quite quickly and can knock out a few riffs and jam along to some backing tracks, but my embouchure could be a lot better. Practice makes perfect, though. Right now I'm doing my family a favour and practising while they're out of the house. Apparently my grandmother was a good harmonica player, although I don't imagine she jammed along to blues music. Maybe I'm wrong...

I suppose I'm the only musical person in two generations of my family. My grandmother came from a very musical background (I think her mother was a music teacher) and my grandfather was quite musical too, but it seems to have skipped a generation and no-one else plays anything. I've spent a lot of my life playing music, having played bass and guitar since very early in high school. Since university I've also started to play ukulele and harmonica.

Last week I was unable to do any exercise, but as I'm feeling a bit better now, I've started to run again. I've run three times since Monday and I'm slowly getting back into it. You'd be surprised at what effect even a week of illness can have on fitness. I know I'm not going to be breaking any personal records at the moment (actually that's a lie, I ran my fastest ever mile on Monday) but right now I'm just trying to limit the damage.