Categories
Design MIX11

User behaviour and interaction tracking with IOGraphica

IOGraphica map of a 3 hour project

During Sara Summers‘ talk Get Real! Sketch, Prototype, and Capture Great Ideas with Expression Blend and SketchFlow at MIX11, the question came up of recording user interaction and mouse behaviour on a prototype, a real application or a website. In the talk Sara touched on how you can use SketchFlow to record user interaction with active elements such as buttons, but the question concerned live tracking of all mouse movements — i.e. where the user moves the mouse to, where she leaves it to rest and generally how she navigates the site. That made me realize I never got around to writing an article on how I do this with the free art tool IOGraphica, so I guess it’s time I do just that.

What does your mouse tell you about your tools?

When you build applications or websites you (should) spend a lot of time thinking about how to make the user interface as intuitive as possible. That means placing navigation and key elements in the most obvious places and making them easy for the user to interact with. But there’s an additional layer to this process, and it is one that is usually ignored unless you are working for a large company: actual live user behaviour.

Let me give you an example: Say you design a website with the main menu on the top left and a call to action button on the middle right. This configuration would likely lead to the user first moving her mouse up to the top left and then down to the middle right. Since most web users start a browsing session with the cursor in the address bar, you have created a situation in which they make a left-leaning, V-shaped movement to interact with the site. This is inefficient, but you are not likely to ever know, because all you see are the actual actions (clicks) on the website. To discover this created behaviour you need to actually watch the mouse move in real time or record it.
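To put a rough number on that inefficiency, here is a minimal sketch. The screen size and all coordinates are made up for illustration, not measurements from any real site; the idea is just to compare total cursor travel for the V-shaped layout against one where the menu and the call to action sit near each other.

```python
import math

def path_length(points):
    """Total Euclidean distance along a sequence of (x, y) points."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

# Hypothetical 1280x800 layout: the cursor starts at the address bar
# (top left), travels to the top-left menu, then sweeps down to the
# call to action on the middle right.
v_shaped = [(160, 40), (100, 120), (1180, 400)]

# Same tasks, but with the call to action placed near the menu.
grouped = [(160, 40), (100, 120), (200, 300)]

print(round(path_length(v_shaped)))  # the long V-shaped sweep
print(round(path_length(grouped)))   # a much shorter path
```

The exact pixels don’t matter; what matters is that grouping the interactive elements cuts the travel distance to a fraction of the V-shaped sweep.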

Recording what matters

If you are working for a huge company with a UX department you likely have the fancy, advanced and expensive tools to do proper real-time interaction and behaviour tracking. But chances are you’re not, and those tools are both well beyond your reach and well outside the scope of your work process. Typically this is solved either by doing live interaction testing, where you literally sit and watch someone else work with the site, Big Brother over-the-shoulder style or on a separate monitor, or by recording the session with a tool like Camtasia for later review. Both of these processes give you some idea of what’s going on, but they are neither time-effective nor conclusive. To be frank, watching people surf the web and interact with applications is mind-numbingly boring, and it’s exceptionally hard to glean anything from it.

More importantly though, the real-time watching procedure doesn’t actually provide what you need which is a map, taken over time, showcasing overall trends in behaviour. In other words, it’s really just a solid waste of time. This is where IOGraphica comes in.

Using an art tool to make valuable data

My experimentation with IOGraphica, a tool created by Anatoly Zenkov and Andrey Shipilov, actually started out of pure curiosity. I had seen an article on the tool hailing it as a way to create cool art from your day-to-day work routine. IOGraphica is a simple Java app that runs on your computer and records all mouse movements, pauses and clicks for however long you want. You simply download it, boot it up, click the play button and work normally. It has some very basic settings to turn mouse-pause recording on and off and to add the desktop as a background, but beyond that it’s as simple as can be. If you take a break you can hit the pause button, and when you’re done you simply stop it and save the image.

What you get from IOGraphica is a visually stunning image of your mouse behaviour over the recording time period. With the mouse pauses turned on it looks a bit like a highly organized random doodle with ink spots all over the place.
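If you want to poke at this kind of data yourself, here is a rough sketch of the idea behind those ink spots. The trace format and the `detect_pauses` helper are my own invention for illustration, not part of IOGraphica: given timestamped cursor samples, a pause is simply a stretch where the cursor barely moves for a while.

```python
def detect_pauses(trace, min_rest=1.0, max_drift=5.0):
    """Find spots where the cursor rested: runs of (seconds, x, y)
    samples that stay within max_drift pixels of the run's first
    sample for at least min_rest seconds."""
    pauses = []
    start = 0
    for i in range(1, len(trace) + 1):
        still_resting = (
            i < len(trace)
            and abs(trace[i][1] - trace[start][1]) <= max_drift
            and abs(trace[i][2] - trace[start][2]) <= max_drift
        )
        if not still_resting:
            duration = trace[i - 1][0] - trace[start][0]
            if duration >= min_rest:
                pauses.append((trace[start][1], trace[start][2], duration))
            start = i
    return pauses

# A made-up trace: two rests separated by a fast traverse.
trace = [
    (0.0, 100, 100), (0.5, 101, 100), (2.5, 101, 101),  # rest near (100, 100)
    (3.0, 400, 300), (3.2, 600, 500),                    # fast traverse
    (3.5, 800, 600), (6.0, 801, 601),                    # rest near (800, 600)
]
for x, y, secs in detect_pauses(trace):
    print(f"rested {secs:.1f}s near ({x}, {y})")
```

On the IOGraphica map, the movement samples become the doodle lines and the detected rests become the ink spots, with longer rests drawn as bigger spots.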

The description of the app as a tool for making art is accurate — the output has an artistic flair and you could definitely print and frame it, but there is a much more important use for it: IOGraphica gives you a clear and often quite surprising picture of how people actually behave on a site or application. And because it is recorded over an extended period of time you uncover strong trends and get a very clear idea of whether your tools, layouts and designs are organized in a way that makes sense to your users.

The image above shows my interaction with Photoshop over a 3 hour period while I was working on a redesign of a site. I recorded it to test IOGraphica, but once I saw what came out of it I realized I’d gotten way more than I expected. Looking at the map I saw that my workspace layout was quite inefficient. The largest clustering was in the middle, where the actual design work took place, but there was also a lot of traffic going to the top left and bottom right corners. This is where I had my tools and layers respectively. On closer inspection I also found that I often traversed the entire diagonal of the screen from tools to layers or vice versa — the longest distance possible on the screen.

Two things became obvious: first, I am far worse at using keyboard shortcuts than I thought. Second, the default position of the tools is inefficient. For quicker work and fewer mouse movements I should place the tools and layers directly next to one another. Now imagine what the same process will say about your apps and sites.

IOGraphica user interaction recording scenarios

Incorporating the IOGraphica tool into your user behaviour tracking (and incorporating user behaviour tracking in your overall design process) is a simple process, and it’s one you can do either locally or remotely. It requires the Java Runtime Environment (JRE), which is about a 5 minute download and install. IOGraphica itself is a standalone application consisting of a single executable file and requires no installation. There are versions for Windows, Mac and Linux.

Once you have the application, simply start it, choose whether you want to record pauses or not and click the Play button. If the user needs a break or has to leave the application you can pause the recording temporarily. When the testing is over, simply stop the recording and save the file. You can then choose to save it just as a behaviour map or as a behaviour map with the desktop as a background. Or you can do both. Since the application doesn’t record the desktop itself, you simply place the application or website you are testing in the foreground and save to get the map superimposed on it.

The key to making this work is to ensure that the user only uses the application or website being tested and that the recording is done over an extended period of time. Longer time means a more statistically relevant result.

Because of the nature of IOGraphica you don’t actually have to be present during the testing, and you don’t have to do it on your own computer. You can simply send IOGraphica or its link to your tester along with the application or site to be tested and some basic instructions, and have them do the actual test on their own time. Because it runs in the background and is dead simple, you’ll end up with valuable data from multiple sources and you can do some pretty advanced testing without ever leaving your office.

Cool, eh? I think so.



MIX11 Day 2: Phone Innovation, Standards and Kinect for Everyone

A Kinect for everyone

A tag for tracking: #ms_mix11_svwe

Keynotes at conferences can be hit and miss at times. You have to know your audience, both what they want and how they want it served, your timing has to be exactly right and there has to be a strong balance between content and humour. All of these points were achieved, if not to perfection then at least to a level over and above what I’ve seen before today. And that’s not just because every attendee at MIX11 this year gets to walk away with a Kinect and the promise of tools to build Kinect applications for the PC in the near future.

The first 2/3 of the day 2 keynote at MIX11 was, not surprisingly, focussed on the future of Windows Phone 7 and Silverlight (not necessarily connected). Microsoft’s reboot in the phone market has really started to take off and because of this there is now a larger push than ever to roll out new features and capabilities for developers to build on and users to use. I’m not a phone developer so this is not technically interesting to me, but as a phone user I can say that what is being introduced in the next Windows Phone 7 update, code named “Mango” for some unknown reason, will help developers create better user experiences and more interesting applications for the phone and will provide the end user with a smoother and more intuitive experience. Both of which are great.

When it comes to Silverlight and the future Silverlight 5 release there are also great things on the way that will result in more immersive user experiences and capabilities.
But for me it was the last third of the keynote that stood out. It was dedicated to Kinect on Windows. Kinect is, or rather was, a tool created to facilitate a more immersive, controller-free gaming experience, but it didn’t take long for developers to realize that the true potential of the weird little bar lay not in gaming but in interaction with data in general. From my own experience, Chris and I have been talking about what Kinect would do for PhotoPivot.com for some time, but we’re not the only ones. And when the devices hit the street, developers immediately started exploring ways of opening them up to develop new PC-based applications. In response to this Microsoft is now set to release developer tools so that everyone can build Kinect-based applications for Windows, which means the home computer. This is a game changer.

On stage there were demos of a Kinect-controlled reclining chair, a helmet-mounted Kinect used to help blind people navigate and a universal telescope. But that’s just scratching the surface. The Kinect not only makes your entire body an input device but can also take voice commands, meaning that with the right application in the background it can do away with any other user interface. I can’t quite put into words how exciting this is. And to kick off the innovation, every attendee at the conference got their very own Kinect to take home. Pretty cool. So expect an avalanche of crazy new user experiences using the Kinect once the developer tools are released in May.

Personally what stood out on this second day of MIX11 was the focus on open standards and web standards in general. Yes, there were tons of sessions on Windows Phone 7, Azure, .NET and other Microsoft-centric topics, but there were also a large variety of sessions on general topics like the new canvas tag in HTML5, the UX lightning series and a talk about the Web Standards Sherpa site by the Web Standards Project.

Of these I’d say the Web Standards Sherpa talk is a must-see for anyone working in the web world. Web standards are what binds us together and it’s more important than ever to keep them in focus.

Check out my continuously updated photostream from MIX11 on Flickr.


MIX11 – Day One Recap

HTML5 matters

Every MIX conference has an overarching theme, it seems. In 2008 – my first year – the theme was the Dev-Igner, the developer-designer hybrid, and how that person was going to find a place in the web development universe. This was spurred on by the launch of Silverlight and all the tools associated with it. 2010 (I didn’t go in 2009) was all about Mobile and the launch of Windows Phone 7 later that year. This year the focus is HTML5, illustrated by the picture above. It may seem surprising, it may be confusing, but it actually makes a lot of sense.

For anyone not working within or closely associated with the Microsoft universe, the shift to a focus on web standards, open source and interoperability may seem sudden and erratic. But it has actually been going on for a long time. It’s just that with the release of IE9 just a few weeks ago, Microsoft suddenly went from being the reason why web standards and cutting edge web technologies couldn’t be implemented to a company leading the way and in many respects leaps and bounds ahead of the competition. So with that in mind, the focus on HTML5 isn’t so strange after all.

The first day of MIX in many ways felt like a formal staking out of a new path for Microsoft. Yes, there was the ever present celebration of the .NET framework and all the technologies associated with it, but there was also an inescapable and refreshing focus on open standards, forward thinking and interoperability. If there was any doubt, the keynote and the sessions that followed made it pretty clear that Microsoft is now fully backing open source, be it based on .NET, JavaScript, PHP or whatever other language you swear by. And that’s a good thing. No. That’s a great thing. It means we are moving forward and can start pushing the limits rather than working on making everything comply with old and broken applications. And more importantly, it means we open source programmers have well and truly been let in from the cold and accepted as equals.

The sessions I attended on the first day were on infographics, HTML5 standards and JavaScript – all non-platform specific, all cutting edge, all promising. And even the Ask the Experts session carried with it a vibe of moving forward together to make amazing things happen on the web.

Not to sound like a crazy cheerleader or anything, but the future looks bright. Or to put it in my own humble terms: Microsoft has seen the light and is accepting what we have known all along: web standards and open source are where the future lies.

I’m really looking forward to the Keynote tomorrow where there are rumours they will be announcing some cool stuff that will make our startup PhotoPivot.com even more revolutionary.

Oh yeah, if you haven’t done so yet, go to PhotoPivot.com, check out the app and sign up for the beta. That was just my little plug.

Check out my continuously updated photostream from MIX11 on Flickr.