Sunday, October 30, 2011

Measuring channel shift


I have been pondering, somewhat aimlessly, cross-channel measurement, driven in the main by the need for channel shift in certain sectors. I think the obvious topic of channel shift in commerce is covered pretty well, so this is more about the harder area of customer service.

To get a minor rant off my chest, I am not a believer in using the term channel shift the way it is used in public sector circles, i.e. as an action you do. It has too many connotations for me of trying to shift people whether they like it or not. It also assumes that channel shift is unidirectional (i.e. towards online), and personally I am not convinced this is always the case (even though, being practical, it mostly is and the desire is that it should be). I much prefer channel shift to be treated as a result, i.e. something that happens as a consequence of channel optimisation. For me, if you optimise channels, customers will choose the channel that suits them best at that moment in time for that specific issue or task. Channel shift will result from optimisation but, importantly, customers will be happiest. Anyway, enough of the rant and back to the point.

Measuring channel shift for non-commerce tasks is tricky, IMHO. It is pretty straightforward to detect a massive drop in calls and look to see if there is an equivalent trend in another channel such as online. But measuring channel shift well enough to guide and adjust channel optimisation efforts gets quite complex. I think the theory is pretty easy but there are practical challenges. So in this post I am going to take the easy route and start with an attempt at the theory.

Measure, Measure, Measure


The main challenge for me would be measuring each channel in the same way so that comparisons can be drawn. In most organisations with multi-channel customer service including a contact centre, the methods of measurement may be very sophisticated, but each channel has its own methodology and metrics. As they say, "if you can't measure it you can't manage it", so it goes without saying that cross-channel measurement requires a cross-channel methodology. To try to keep this post short I am using a few assumptions. The first and main assumption is that the contact centre has sophisticated measurement of all interactions, which not only includes things like types of query, activity measures, results etc. but also captures the "intent" of a query and can measure first-time resolution. I'll also assume that there are some decent voice-of-the-customer post-contact surveys happening through the contact centre. Lastly, I'll assume that the channel requiring optimisation is the website, so if we are going to measure cross-channel activity we had better work on ensuring we have some common metrics between the contact centre and the website. If we assume the contact centre measures are fit for purpose, then there will be some learnings we can take from them for the website.
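
To make the idea of common metrics slightly more concrete, here is a minimal sketch of what a shared, channel-neutral record of an interaction might look like. The task names, channels and fields are my own inventions for illustration; the only point is that the contact centre and the website log against the same vocabulary.

```typescript
// A channel-neutral record of a single customer interaction.
// Task names, channels and fields are illustrative, not a real council's taxonomy.
type Channel = "phone" | "email" | "face-to-face" | "web";

interface TaskInteraction {
  task: string;                  // e.g. "library-card-application" -- same label in every channel
  channel: Channel;
  intent: string;                // what the customer was trying to achieve
  completed: boolean;            // did they finish the task in this contact?
  firstTimeResolution: boolean;  // resolved without a repeat contact?
  timestamp: Date;
}

// The same task recorded in two different channels, using the same vocabulary.
const examples: TaskInteraction[] = [
  { task: "library-card-application", channel: "phone", intent: "get a library card",
    completed: true, firstTimeResolution: true, timestamp: new Date("2011-10-28") },
  { task: "library-card-application", channel: "web", intent: "get a library card",
    completed: false, firstTimeResolution: false, timestamp: new Date("2011-10-29") },
];
```

Once both channels report in this shape, comparing them stops being an exercise in translation.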

Web Analytics


This train of thought started for me when observing how organisations measure their websites; my view would be that, in the main, not too well. I think this stems from the common opinion that page views and visits are all that matter (don't get me wrong, they are important, but....). In ecommerce these statistics matter because they relate directly to conversion rates, i.e. the more visits I get the more sales I make. In reality there is much more information we need. If we stay with ecommerce for a moment, the key measure is how many people successfully purchase. In successful ecommerce organisations the website owners have every step of the journey through the website, from initial visit to purchase, mapped out in web analytics so that reports can highlight areas which could be optimised. This microscopic approach focuses on one thing, a "task": the task of buying. So for customer services we could take this approach and apply it to more tasks (see Gerry McGovern on the subject of tasks, very worthwhile reading).
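
To show the kind of report this step-by-step mapping makes possible, here is a rough sketch, with made-up step names and visit counts, of calculating the drop-off between each step of a purchase; this is exactly the sort of view that highlights where to optimise.

```typescript
// Illustrative purchase funnel: the step names and visit counts are invented.
const purchaseFunnel = [
  { step: "product page", visits: 10000 },
  { step: "add to basket", visits: 2500 },
  { step: "checkout details", visits: 1200 },
  { step: "payment", visits: 950 },
  { step: "confirmation", visits: 900 },
];

// Report the drop-off between consecutive steps to highlight where to optimise.
for (let i = 1; i < purchaseFunnel.length; i++) {
  const prev = purchaseFunnel[i - 1];
  const curr = purchaseFunnel[i];
  const dropOff = ((prev.visits - curr.visits) / prev.visits) * 100;
  console.log(`${prev.step} -> ${curr.step}: ${dropOff.toFixed(1)}% drop-off`);
}

const first = purchaseFunnel[0];
const last = purchaseFunnel[purchaseFunnel.length - 1];
console.log(`Overall conversion: ${((last.visits / first.visits) * 100).toFixed(1)}%`);
```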

Ten years or so ago the conversation about websites was all about customer journeys, but I personally think that Gerry McGovern's focus on tasks is absolutely the right approach. In organisations like local government the number of tasks a citizen/customer might try to achieve with their council can be high, and some of these tasks are absolutely vital for that person's well-being. Councils therefore engage with their customers face to face, by phone, by email and online. Many of these tasks are complex and the sheer volume of them is mind-boggling. If a council wishes to achieve channel shift through channel optimisation, the challenge may look insurmountable. If, however, the challenge is broken down into its constituent parts, i.e. tasks, then each can be prioritised and addressed in the appropriate order. It is perhaps then true that channel optimisation is actually a job of "task optimisation".

BTW, the topic of web analytics is always dealt with very well by boagworld. It is worth digging through the archives for the items on Google Analytics, especially any podcasts featuring Matt Curry of Wiltshire Farm Foods (e.g. 193. Get more from Google Analytics). What I like about the boagworld approach is that they are always reminding us of the common-sense stuff, especially the last podcast series where they take us through their approach to redesigning their own website. The problem I feel with many websites is that the project will start well, but measurement becomes an afterthought.

A lot of energy is (hopefully) spent in the early phase of any website project on setting objectives for the site, breaking these down into detailed, measurable goals and doing some usability work on tasks. There should therefore be plenty to measure, and those measures will be key to showing whether or not the site is successful. But are people measuring?
  • Are people going back to these early definitions and making sure they are being measured? 
  • Has the website been built in a way that it can be measured easily? Some common challenges are:-
    • Making sure URL structures are revealing and meaningful (not just search engine friendly)
    • Forms which post to themselves, i.e. the URL does not change when the user clicks submit. Not a bad practice, but these have to be measured slightly differently
    • Smart Ajax functions which may present three or four stages of a task without the page refreshing. These would appear as a single page view unless each stage is specifically measured (see the sketch after this list)
    • If using external services (e.g. shopping cart services or Wufoo forms), are these tagged so you can measure them?
  • Do the "tasks" on the website relate to tasks in the offline world of the contact centre?
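
As a taste of how those last few challenges might be handled, here is a hedged sketch assuming the site uses the classic asynchronous Google Analytics (_gaq) snippet of the time; the form selector, virtual URLs and function names are all invented for illustration.

```typescript
// Assumes the classic asynchronous Google Analytics (_gaq) snippet that was
// current at the time; the selectors and virtual URLs are invented for illustration.
declare var _gaq: any[];

// A form that posts to itself: fire a virtual pageview on submit so the
// submission shows up as its own step even though the URL never changes.
const form = document.querySelector("form#report-a-problem");
form?.addEventListener("submit", () => {
  _gaq.push(["_trackPageview", "/virtual/report-a-problem/submitted"]);
});

// An Ajax wizard with several stages on one page: record each stage as a
// virtual pageview so the journey isn't collapsed into a single page view.
function onWizardStageShown(stage: number): void {
  _gaq.push(["_trackPageview", `/virtual/library-card-application/stage-${stage}`]);
}

// Externally hosted services (shopping carts, Wufoo forms, etc.): if they can't
// be tagged with your own account, at least record the hand-off as an event.
function onHandOffToExternalService(serviceName: string): void {
  _gaq.push(["_trackEvent", "external-service", "hand-off", serviceName]);
}
```
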
My first main point, therefore, is that a review of the web analytics on the website is absolutely vital. We don't have to boil the ocean on this; all we really need to establish is the following:
For transactional tasks e.g. those with forms....
  • Can we measure the entire task (every step) from start to finish?
  • Can we measure repeat visits to the same feature (which could indicate usability issues if the form transaction only requires infrequent use)?
  • Can we measure what the visitor does if they don't complete the task?
  • Is the measure using the same terms as other channels such as the contact centre, so we can compare?
  • Can we measure how easily people find the feature?
For informational items......
  • Can we measure visitors' paths through the website to key pieces of information?
  • Can we measure the viewing of key information using the same terms as the other channels e.g. the contact centre?
  • For this "destination" content, are we able to measure what else the visitor does after reading it (and whether they do read it)?
General
  • Where do visitors come from, i.e. did they click through from another site? (if it's a search engine, what was the search term, as that gives us intent)
  • Where do visitors go if they click a link to an external site? This is important if signposting to another site is considered a success measure.
  • If visitors use the website search, what do they search for, and do they get to what they need first time, or are they constantly going back to the search page and trying something else?
Devices
  • Can we measure more than just which devices are used? Can we measure what they were used for, i.e. devices by task? (see the sketch after this list)
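
On the outbound links and devices-by-task questions, here is a small sketch of how they might be wired up, again assuming the classic _gaq style of Google Analytics tagging; the event categories, custom variable slot and task labels are my own illustrative choices, not a standard.

```typescript
// Again assuming the classic _gaq API; the category names, task labels and
// custom variable slot are illustrative choices, not a standard.
declare var _gaq: any[];

// Record clicks on outbound links as events, so signposting to another site
// can be counted as a success rather than disappearing from the data.
function trackOutboundClick(destinationUrl: string): void {
  _gaq.push(["_trackEvent", "outbound", "click", destinationUrl]);
}

// Stamp every page of a task with a page-level custom variable naming the task,
// so reports (including device reports) can be segmented by task.
function tagPageWithTask(taskName: string): void {
  _gaq.push(["_setCustomVar", 1, "task", taskName, 3]); // slot 1, page-level scope
}
```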

I expect the answer to a lot of these questions will be no, but that is not a disaster. If we take the approach of optimising task by task, it is a simple matter to ensure that the measures are put in place as part of the optimisation project. This means that adjustments can be made to the specific task-related areas of the website to ensure there is sufficient measurement and, IMHO, these should be done before any optimisation is attempted, so we can measure the "as is" state and then compare it with the "to be" state.

So, to try to summarise the web analytics portion of this ramble: it is important that we are able to measure everything on our websites, but it is also important that we start using task-related metadata which matches the terms used elsewhere in the business.
Example
Let's take something simple such as a library card application. I discovered (when picking this example for this blog) that my council library has an ebook service which works on iPads. I registered fine, but this is when the comedy of errors started (and it is a classic case of unnecessary channel escalation).

I selected a book and tried to check out, which required me to log in. I copied and pasted my temporary borrower number (e.g. UNREG12345) and entered my PIN, only to be rejected. I tried again, this time entering only 12345, but to no avail. The only options available to me at this stage were to use the contact us form or phone. I duly clicked on contact us and filled in the form, which rejected my message because the borrower number could only contain digits (I had entered UNREG12345); I tried 12345 and received a message that the number had to be seven digits. Basically, I can't log my non-urgent problem via the form, which is a pain (and I doubt the web analytics will be able to show this).

All in all, not the best experience. From a measurement perspective there were encouraging signs when looking through the page code. The main website is nicely tagged, using navigation tracks which show how deep the visitor is in the website. The problems are to be found outside the main website. The registration function was on a separate domain and the library catalogue on another; although they were tracked, they didn't have the same detailed approach to metadata as the main site, which would make task-related measurement difficult. The forms also desperately need custom tracking code to be able to tell the path a user took; for example, in the registration form the success page URL was the same as the terms and conditions page, which isn't too clever as it will not be possible to measure completion of the process.

As a result of this I will now call for help, and a record will be made in a CRM system. But can the county council in question tie the two events together? If they implemented task-based measurement then these events could be tracked together, not necessarily as one contiguous event, but they could at least be trended together on the same chart.
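
As a sketch of what "trended together" might look like, assuming (and it is an assumption) that both the web analytics and the CRM can export monthly counts for the same task name, something as simple as this joins them into one channel-shift view:

```typescript
// A rough sketch of trending two channels together for one task; the monthly
// counts, and the assumption that both systems can export them, are mine.
interface MonthlyCount { month: string; count: number; }

const webTaskCompletions: MonthlyCount[] = [
  { month: "2011-08", count: 120 },
  { month: "2011-09", count: 180 },
  { month: "2011-10", count: 260 },
];

const crmContactsForTask: MonthlyCount[] = [
  { month: "2011-08", count: 340 },
  { month: "2011-09", count: 300 },
  { month: "2011-10", count: 230 },
];

// Join the two series by month; plotted on one chart, this is the channel-shift view.
const combined = webTaskCompletions.map((web) => {
  const crm = crmContactsForTask.find((c) => c.month === web.month);
  return { month: web.month, web: web.count, contactCentre: crm ? crm.count : 0 };
});

combined.forEach((row) =>
  console.log(`${row.month}: web ${row.web}, contact centre ${row.contactCentre}`)
);
```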


Summary

The main points of this rather long piece are probably these:
  • Start thinking about the website as a place where visitors can achieve tasks
  • Ensure that your web analytics is measuring the site properly
  • Add task related measurement to the website
  • Use the same words in the task measurement as are used elsewhere to make it easy
  • Approach this incrementally, i.e. instead of trying to apply task-based measurement across the whole website, apply it as you optimise a specific task.
  • Create a task dashboard which combines all the channels so you can see the channel shift.
I suppose now I had better work out how best to tag a website for tasks.

Tuesday, October 18, 2011

Keeping up with social media on the iPad


It's taken a while to work out, but I think I have finally stumbled on a set of tools that work for me for keeping in touch.

I suffer from magpie-ism, i.e. I have this burning desire to keep every tweet and blog post, and if I haven't checked for a while I find it really difficult to ignore what has passed me by and focus on the new stuff. I sit and look at Google Reader, desperately trying to mark 2,000 blog posts as read when I haven't read them, paralysed by the thought that I might miss something.

Part of this for me is the right tools, so for now my list is as follows:-

  • Feedler pro - hooked into my Google Reader
  • Instapaper - I mark stuff for reading and more often than not never go back to it
  • Hootsuite - I have tried a few but this just seems to work for me
  • Apple's new twitter app - not sure yet

If anyone stumbles on this tumbleweed zone, any suggestions via comments would be gratefully received.