Rachel Potts (@citipotts) held a 3-hour pre-conference workshop at TCUK, showing us how to use web analytics (such as Google Analytics) to monitor and improve documentation. We went through three use cases. With web analytics, you can:
- Answer specific questions, for example, how often a certain page was viewed. This is the simple, obvious use case where you just look at one or a few numbers without having to do any data triangulation.
- Measure strategic progress. This means analyzing available data top-down to find out whether documentation serves an intended purpose. For example, if exception-handling pages in the online help show solid numbers for page views and time on page, while customer support calls for the corresponding issues decrease, your strategy to offer self-service error recovery in online help is probably succeeding.
- Measure content use. This means analyzing data bottom-up to find out what new insights can be teased out of the available numbers. Take an online shop, for example. If you relate search terms to the products that were ultimately bought in the same visit, you can start to build a glossary of products that your customers relate to one another. This is not the well-known "Customers who bought that have also bought this…". Instead, you might find that people looking for "loafers" often buy "sneakers". Or people looking for "equity" in the online help often wind up at the "stocks" page sooner or later.
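The search-term-to-purchase idea above can be sketched in a few lines. This is a minimal, hypothetical example: the visit data structure is invented for illustration, not a real analytics export format, and a real analysis would pull session data from your analytics tool instead.

```python
from collections import Counter, defaultdict

# Hypothetical per-visit data: the on-site search terms a visitor used,
# paired with the products bought in that same visit.
visits = [
    {"searches": ["loafers"], "purchased": ["sneakers"]},
    {"searches": ["loafers", "belt"], "purchased": ["sneakers"]},
    {"searches": ["equity"], "purchased": ["stocks guide"]},
]

# Count how often each search term co-occurs with each purchased product
# within the same visit.
related = defaultdict(Counter)
for visit in visits:
    for term in visit["searches"]:
        for product in visit["purchased"]:
            related[term][product] += 1

# The strongest co-occurrences suggest glossary entries,
# e.g. "loafers" -> "sneakers".
for term, products in related.items():
    print(term, "->", products.most_common(1)[0][0])
```

With real session data, you would likely also filter out rare co-occurrences before trusting them as glossary candidates.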
For the specific improvement of documentation, Rachel had prepared several exercises. The one I found most helpful had us develop an action to achieve a business goal with documentation content. This action consisted not only of a formula, a target and a frequency of what to measure in web analytics, but also of the underlying purpose and the underlying assumptions.
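The structure of such an action might look like the sketch below. All names, numbers, and the particular formula are hypothetical illustrations of the exercise, not Rachel's actual material; the point is that purpose and assumptions sit right next to the formula, target, and frequency.

```python
# A hypothetical "action" as developed in the workshop exercise: a metric
# definition that records not only formula, target, and measurement
# frequency, but also the purpose and the underlying assumptions.
action = {
    "purpose": "Offer self-service error recovery in the online help",
    "assumptions": [
        "Visitors on an error-recovery page were trying to fix that error",
        "Fewer support calls for an issue mean self-service worked",
    ],
    "formula": "error_page_views / (error_page_views + support_calls)",
    "target": 0.75,        # hypothetical: 3 of 4 incidents solved via help
    "frequency": "monthly",
}

def self_service_rate(error_page_views, support_calls):
    """Share of error incidents handled via the online help (hypothetical)."""
    total = error_page_views + support_calls
    return error_page_views / total if total else 0.0

rate = self_service_rate(error_page_views=300, support_calls=100)
print(f"self-service rate: {rate:.2f}, target: {action['target']}")
```

Writing the assumptions down next to the formula makes it harder to forget them later when interpreting the numbers, which is exactly the warning in the next paragraph.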
Rachel was very explicit on that last point: She warned us to always keep our assumptions in mind, both when interpreting the data and when defining what counts as a successful outcome!
– I had never thought through how web analytics could be applied to documentation, so this was an insightful workshop for me. I believe all three use cases can successfully complement user analysis and surveys. However, having documented web analytics software in a previous job, I am skeptical of any absolute numbers that come out of such analysis. Instead, I trust them more as indicators of relative change or progress.
How have you used (or would you use) web analytics to monitor and improve documentation? Feel free to leave a comment.