1st day of sessions at TCUK 13

On its first day, TCUK13 offered a very diverse range of sessions. My selection of presentations – and hallway conversations – focused on cognitive science, the future of tech comm, the business side of our industry, as well as managing tech comm, this year’s specialist stream.

Sarah O’Keefe on “Fame, glory and… tech comm”

Sarah’s opening keynote urged us to unleash our inner pirate and “go for the booty” of corporate resources and attention – in other words: to follow the money. We tech comm’ers need to understand the objectives and KPIs of C-level executives, develop a content strategy that supports these objectives and then profit (before marketing or other departments do, as Ellis Pratt later pointed out in his rant).

This way we can create effective tech comm which meets both business needs and user needs – as opposed to artisanal tech comm, which fails business goals, or cheap and merely adequate tech comm, which fails users.

My session on semiotics and mental models

My own presentation, “Addicted to meaning: Mental models for technical communicators”, was attended by approximately 50 people and, I thought, quite well received.

It’s essentially a brisk walk through a couple of cognitive concepts that underlie much of tech comm. After considering what meaning actually is and why we technical communicators should even care, I looked to semiotics to explain how meaning works in communication – and why it still sometimes fails in tech comm. The second concept is mental models, which can explain how and why we create meaning – and how we can create meaningful documentation.

Adrian Morse on “The challenges of remote management”

Adrian drew on his experience of both working at home and managing technical communicators who work at home to explain many of the challenges of managing writers remotely. His tips applied to most teleworking scenarios, from occasional home office days to full-time teleworking by some or all of the team members.

Remote working and managing require thought-through policies and a good, reliable setup that starts with the appropriate hardware and network services and extends all the way to regulating PC administration, backup policies, etc., and complying with corresponding laws and EU regulations.

Adrian emphasized how important communication is whenever anyone teleworks: You need to agree on mutual expectations in terms of hard objectives and performance, but also in terms of softer factors such as response times and availability for mail and phone contact. Just like working face-to-face, teleworking requires regular meetings, both 1-on-1 and of the team as a whole. Also make sure you have good ideas and policies for when and how you allow people to enter teleworking scenarios and when and how they will end them again!

Ray Gallon on “The Quantum Funnel”

Ray’s talk dovetailed with my own: His reference to creating scripts which explain how we behave in a restaurant was very close to my own example of how mental models determine our approaches to and perceived options in restaurants.

His premise is that today’s practice of learning is much more scattered and autonomous than it used to be, when learning was more controlled and directed. Such learning leaves more, and more crucial, gaps than before. To make sure that people (and users of tech comm specifically) can successfully fill their knowledge gaps, learning becomes more important than knowing.

One such approach is “connectivism”, which understands learning as the process of searching for and connecting concepts, ideas and fields. In this context, learning must not only answer the questions “what?”, “how?”, “where?” and “when?”, but also “how to be?” and “how to be with others?”. People in general, and tech comm audiences in particular, increasingly learn in self-directed and creative ways through social collaboration with others. The role of teachers shifts to that of facilitators, that of technical communicators to curators.

This will emphasize both social and cognitive skills in the future, when we learn by moving through these stages:

  1. Exploring and understanding
  2. Representing
  3. Planning and executing
  4. Monitoring and reflecting

Applied to tech comm, this means our model shifts from that of a gatekeeper of knowledge to that of a curator and storyteller, as we avail ourselves of different types of contextual information, some of which are outside of our control:

  1. Internal documentation, for example via progressive disclosure.
  2. External information, such as Wikipedia.
  3. Interactive information, such as MOOCs and commenting functions.

– Feel free to leave comments about any of the sessions, whether you have attended them or not. I will try to answer them as well as I can.

Top 6 tech comm trends for 2013

Flexibility in several dimensions is my tech comm mega-trend of the year, after mashing up the top 6 trends presented by Sarah O’Keefe and Bill Swallow in Scriptorium’s annual tech comm trends session. Head on over to Scriptorium’s site to watch a recording of the webcast and to read Sarah’s take on the trends she presented.

1. Velocity

Velocity is Sarah’s first trend, which simply means that we tech comm’ers are expected to create, deliver and update content faster than before. Gone are the days – or months – when localized documentation could be several weeks late.

If we are serious about this, we need to revamp our documentation processes. I agree with Sarah: A recent restructuring of documentation processes has sped up my throughput and made my estimates more predictable. So it has improved my productivity as a whole.

It’s also made me more flexible because smaller task packages mean less time until I finish a deliverable. It used to take me 4 to 6 weeks to update a user manual, so interrupting this task for something more urgent was expensive because it further delayed the manual and also clogged up my pipeline. Now I take about 2 weeks for the same task, which gives me greater flexibility in sequencing my tasks: I still have a chance to finish the manual by the end of the month, even if we decide that I first spend a week on something else. I could not have achieved this flexibility without revamped processes to analyse and specify documentation and without a topic-based approach.

2. PDF is here to stay

Bill doesn’t see PDF going away any time soon, if only because it’s durable, controllable, reliable and backward compatible with that other durable format called print-on-paper. Google Docs might be a potential competitor, but Bill doesn’t see it making great advances on PDF in 2013.

As comments from the audience showed, there seems to be a lot of passion about the issue, and some people can’t wait to finally lay the last PDF to rest. As a tech writer in semi-regulated industries, I know that I’ll be creating PDFs for my users for a looong time. It might not be a trend per se, but I agree with Bill that we haven’t seen the end of PDFs just yet.

3. Mobile requirements change technical communication

Mobile will be the big game-changer for tech comm this year, predicts Sarah. Requirements for mobile documentation mean that PDF will be one format of many – and maybe not the primary one in many cases. Other essential deliverable formats include HTML5 (for an online audience) and apps (for native or offline use).

The limited real estate of mobile devices requires more flexibility in how we structure and present documentation. Progressive disclosure can help us integrate essential user assistance in labels or pop-ups. Beyond that, we need a strategy for what to disclose where, and how to create a seamless and consistent user experience.

4. Mobile drives change

Similar to Sarah’s trend, Bill underscored the influence of mobile documentation. He emphasized the need for concise, no-frills content. Rather than jump on the progressive disclosure bandwagon, Bill presented an alternative scenario: An “executive” device with the main product hooks up with a second, mobile device which presents the corresponding documentation. (I didn’t quite understand this point; I think Bill mentioned a second screen embedded in the primary device, but I’m not sure.)

5. Localization requirements increase

Bill sees the scope of localization expand as the need for translation no longer stops with external documentation. Increasingly, internal documents also need translations because corporations need to keep international teams afloat and cannot afford to lose traction due to vague or misunderstood communication.

This is also the reason why, economic advantages notwithstanding, machine translation hasn’t taken off yet. But a hybrid process seems promising in some areas where machine translations become useful and reliable after human editing. Enter flexibility as our audience now might also include far-flung colleagues – and our tasks might include editing text that’s been translated by robots.

6. Rethink content delivery

As we face diverse requirements for working at different speeds in more formats and for more diverse audiences, we need to be flexible and rethink what we deliver and how. With demands like these, pages of static content are frequently not sufficient. Instead, users need more dynamic content and filters to customize the documentation to what they need at the moment.

Someone in the audience summed it up very well: “Think of content as a service, not a product.” To me that makes a lot of sense, because it emphasizes the recipient of that service and their situation over the static dead-on-arrival quality that comes with a tome of printed pages.

My summary

I think flexibility is a key ingredient in many of the trends Sarah and Bill discussed with the audience. The recent opportunity to reorganize how I create documentation has given me two kinds of confidence: I have a suitable process in place for now. And I can change processes and methods when I need to.

A secondary trend occurred to me as well: Thanks to Sarah’s spirited MC’ing, by which she included the majority of audience questions and comments, this webcast felt a lot more communal than previous ones I have attended. It was almost like a single-session, virtual mini-conference. And if industry leaders can bring us together outside of conference season, we can strengthen our networks and move our profession forwards – with just a little bit of flexibility.

Strategic Technical Communicator panel at tekom12

Marijana Prusina, Nicky Bleiel, Sarah O’Keefe and Dr. Tony Self pooled their experience in an interesting and versatile panel session about the more strategic aspects of our profession.

The Strategic Tech Comm panel: Marijana Prusina, Nicky Bleiel, Dr. Tony Self, and Sarah O’Keefe (photo thanks to Axel Regnet)

It was not so much a discussion as a fast-paced session of the experts sharing their thoughts on strategic issues and problems, so I’ll simply list some of the insights:

  • Domain knowledge of a certain industry (as opposed to general tech comm skills) can be a great asset to build a career on, but it’s not necessary to become an expert in any one domain.
  • To get a mandate or money from management, don’t argue in terms of quality, but rather in terms of cost: Show how improving documentation will either reduce cost or create additional revenue.
  • Freelancing can work well, but you will need some things which are less essential if you are employed:
    • Considerable project management skills – even if only for your own projects
    • A good network of satisfied customers, other people who know and like your work, and other freelancers with whom you can exchange tips and tricks – and maybe even projects, if they’re better equipped to take them on or when you are busy.
    • A snappy definition of your core services, skills and profile.
  • To improve the reputation of tech comm and exert influence in your company, try these strategies:
    • Volunteer, if you can afford to, whether in a professional tech comm association or a standards committee.
    • Underpromise and overdeliver on your deliverables – and meet the deadlines you agree to.
    • Write a book – but be aware that you’ll mainly do it for marketing and influence: It’s a lot of work, and it won’t make you rich.
    • Be the advocate of users, who are satisfied, more productive and less costly to your tech support thanks to good documentation.
  • Take all the training that makes sense to you and that you can get. Don’t forget about domain skills and software-related skills, for example, for API documentation. When training, keep in mind your résumé and what value you will add to your customers or your employer by adding a certain skill.

Turning tech comm into a biz asset by Sarah O’Keefe

Turning technical communications into a business asset, according to Sarah O’Keefe, is mainly about justifying cost – which is necessary, but also possible. Her session at tekom12 was part of the Content Strategy stream, which was presented, as last year, by Scott Abel.

How expensive is your documentation – really?

Much progress in a tech comm department gets stymied when we, the tech writers, say: “Ah, that’d be great – but they’ll never pay for it!” What that really means is: “‘They’ don’t see the value (or the urgency).” So to prove the value of tech comm, we need to show how we can either save money (by reducing effort) or generate additional revenue (by producing value that exceeds our cost).

Sarah points out several ways to do this:

  • Show how tech comm can address legal or regulatory issues. Avoiding lawsuits is a great way to save your employer’s money!
  • Control the real cost of tech comm, because “cheap can be very expensive”: Yes, you may get something akin to documentation from a secretary or an intern, but…
    • Is your documentation efficient to maintain?
    • Does it scale or allow publication in other formats?
    • Does it actually satisfy your customers and support your brand – or does it stab your corporate value statement in the back?

Cost containment strategies

Sarah mentioned several strategies to control documentation cost.

The first bunch has to do with efficient content development:

  • Reuse as much content as possible: Write once, use many times, either in different places of the same format or in different output formats.
  • Automate formatting: Manually handcrafted formatting of deliverables can be a huge cost factor. It’s not uncommon for tech writers to spend 20% of their time (and hence a sizable chunk of money) on formatting output. Automate this by relegating formatting either to templates or CSS.
  • Localization scales content efficiencies: The more inefficiencies you have in your original documentation processes, the more inefficient localizing or translating your content will be. This applies to content reuse, content variants and formatting. (A short sketch of the reuse and formatting points follows this list.)
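
To make the reuse and formatting points concrete, here’s a minimal Python sketch of the write-once idea. It’s a toy pipeline with made-up content, not any particular help authoring tool’s API: the topic is stored once as pure content, and each output format comes from a template, so formatting effort doesn’t multiply with every new deliverable.

```python
from string import Template

# One topic, written once: pure content, no formatting baked in.
topic = {
    "title": "Connecting the device",
    "steps": [
        "Plug the cable into the USB port.",
        "Wait until the status LED turns green.",
    ],
}

# Formatting lives in templates (and, for HTML, in CSS), not in the content.
html_tmpl = Template('<section class="task"><h2>$title</h2><ol>$steps</ol></section>')
text_tmpl = Template("$title\n\n$steps")

def to_html(t):
    steps = "".join(f"<li>{s}</li>" for s in t["steps"])
    return html_tmpl.substitute(title=t["title"], steps=steps)

def to_text(t):
    steps = "\n".join(f"{i}. {s}" for i, s in enumerate(t["steps"], 1))
    return text_tmpl.substitute(title=t["title"], steps=steps)

print(to_html(topic))  # online help output, styled via CSS
print(to_text(topic))  # print-oriented output, same content
```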

Then there’s cost reduction outside of the tech comm team, for example, in tech support:

  • Consider whether your documentation is good enough to deflect the maximum possible number of support calls. Anything that users cannot find in the documentation, whether it’s missing or unfindable, drives up costs for your tech support staff.
  • Ensure your tech support staff has access to your documentation in formats they can work with efficiently. Downloading and then opening a document of 10 or 20 MB is not only clumsy in its own right, it’s also unlikely to present the required information in the most efficient way…
  • Ensure your documentation content is actually useful to tech support staff: It must not only be accurate, but also up-to-date. Consider the nightmare in terms of costs and maintenance if tech support spun off their own documentation to augment the “official documentation”. Instead, invite them to contribute to the documentation you create.

Make documentation more strategic

Then there are a few strategies to make documentation more strategic, or rather, more strategically valuable:

  • Ensure your documentation is not only searchable (so it’s captured by publicly accessible search engines), but also findable (so people know where and how to get to it) and discoverable (so people link to it, from blogs or forums or Twitter or the like).
  • Align tech comm to larger business goals: Find a corporate goal, preferably one that is tied to revenue to be made or cost to be avoided, and show: If the tech comm team did this, it could contribute approximately that much money (in savings or additional revenue) to that larger corporate goal.

Conclusion

Sarah’s talk was geared towards the strategic angle of tech comm and succeeded in making valuable points very clearly. Whether you can actually apply her advice in your situation may depend on how much managers with budget control feel the pain that improving tech comm would address.

Summing up Scriptorium’s tech comm experts webcast

In Scriptorium’s “Ask the experts” webcast on 17 April 2012, Sarah O’Keefe, Nicky Bleiel and Tony Self reflected on frequently asked questions and trends. Here’s a timed play-by-play synopsis, so you can access the bits in the recording that interest you.

I try to provide teasers, not spoilers, so scoot right over to Scriptorium’s blog and check out the meaty answers for yourself!

FAQs

The panel starts with the questions they hear most often, from the underlying architecture via the tools to the deliverables.

5:46 – What is the best help/XML/CMS tool to use?

Tony tackles this first question. While there is no clear, short answer, Tony sums up some criteria which can guide you to a system that works for you.

9:20 & 14:20 – What should we be delivering?

Nicky hears this one frequently – and often the underlying sentiment is: “So many outputs, so little time.” Again, there’s no simple answer that suits everyone, but Nicky outlines how to make an appropriate decision. And once you throw single sourcing into the mix, you’ve likely got a scenario that works for you. (They come back to this question after the next one; that’s why there are two timestamps!)

11:54 – Should we implement DITA/XML?

Sarah has several answers for this question: Showing an ROI is tricky, but most compelling – and it is more likely in some scenarios than others. The strategic aspect of making your content future-proof helps, but it may not be sufficient for your business case.

Hot topics

Next up are some more or less controversial questions:

18:50 – Should we use a wiki for documentation?

The experts chime in with several perspectives that can help you make a decision. Tony thinks it can be a valid format for many use cases; Sarah cautions about the very different processes a wiki brings.

25:10 – Content strategy: Is it a fad or fab?

The experts’ answers focus on what content strategy actually means for technical communications, from “we’ve basically been doing it for years” to embracing it as part of our profession’s maturing process to new job roles and titles.

30:30 – The tech comm career: Is it awful or awesome?

This one is interesting: The experts see the glass as both half empty AND half full. The consensus is that the role of technical communicators is changing, and fast, so there are challenges and opportunities for those who adapt.

Audience participation time

For the last round, the webinar audience submits some questions for scrutiny:

37:50 – Can tech comm be complex when products get ever simpler?

The panel is not ready to dumb down documentation at all costs. Complexity may be warranted depending on the products and audience expectations.

42:15 – Is agile good or bad for tech comm?

Agile can be good for tech comm, when it’s implemented well. Tony also points out that agile may give technical communicators a stronger role in the development process.

45:55 – Can product specs be turned into docs with a single edit?

It’s hard to tell without knowing the details. But probably not.

How to disrupt tech comm in your organization?

If you need to “disrupt” your tech comm content, I believe it’s more beneficial to integrate content across the organization than just to get tech comm to become more business-oriented or more like marketing.

The idea comes out of a worthy new collaborative project Sarah O’Keefe launched last week, Content Strategy 101: Transform Technical Content into a Business Asset. (This blog post is based on a couple of comments I’ve left on the site.)

Tech comm goes to business school

A recurring discussion, not only on this blog but also elsewhere, is that tech comm needs to be more business-like to be justifiable in the future. Proponents of this view definitely have a point, if only because tech comm is often seen as a cost center and finds it hard to claim a return on investment.

I think, however, that this view is detrimental to all involved parties:

  • Tech comm risks abandoning its benefits to users and its quality standards in an attempt to be “more like marketing”.
  • Managers may risk permanent damage to the documentation of their product without solving the bigger problem.

Breaking down all silos

The bigger problem often is that most content production is inefficient – because it occurs in parallel silos. Many companies have gotten good at making their core business more efficient. But they often neglect secondary production of content which remains inefficient and fragmented.

I’ve seen several companies where marketing, technical communications and training (to name just three areas) waste time and money. Due to inefficient, silo’ed processes, tools and objectives, they create similar, overlapping content:

  • Marketing and tech comm create and maintain separate content to explain the benefits of a product.
  • Tech comm and training write separate instruction procedures for manuals and training materials.

Once companies wake up to these redundancies, all content-producing units will face pressure to streamline content and make it easier to produce and reuse. This will revolutionize corporate content production and publishing.

Quo vadis, technical communicators?

I think this issue raises two questions for technical communicators.

The strategic question is:

Which kind of content disruption is more beneficial for the organisation and for customers: Folding tech comm into marketing or integrating all content with a corporate content strategy?

The answer depends on several issues.

The tactical question is:

What’s the role of technical communicators in this content disruption: Are they the movers or the movees? Are they shaping the strategy or following suit?

The answer again depends on several issues:

  • What is your personality, clout and position in the organization?
  • Which team has the most mature content and processes to be a candidate to lead any kind of strategic change in content?

I think tech comm can lead a content strategy, especially if and when the tech comm team knows more about content than marketing or training or other content producers.

So what’s it like to present a tech comm webinar?

Presenting a webinar isn’t much different from other “public” presentations, but the format has a few quirky effects and demands of its own.

On 29 February, I had the chance to present my first webinar. As with many first-time experiences, the newness of it all felt a little weird, there were some glitches, but altogether, it went alright. I think. I hope. Because I have had virtually no feedback.

Missing feedback

And that is already the most important difference from other presentations: You have next to no idea how you’re coming across. I never knew before how vital even subtle cues are for presentations in front of a live audience. Does the audience follow along, or do I need to be faster? Or slower? Frowns can signal that a point or a joke didn’t get across. General “antsiness” means I can pick up the pace a bit. Attentive smiles or chuckles indicate that I’m connecting. A webinar offers none of that.

The best I’ve seen other, better webinar presenters do is to ask at the beginning whether attendees can hear the audio and can see the slides changing. But after that, as a presenter, you’re on your own. It feels like talking into a tin-can telephone – without knowing whether the string is still taut.

Tin can phone (image from http://www.wikihow.com)

Fortunately, my webinar heroine Sarah O’Keefe had alerted me to this lack of immediate feedback. So I could identify it – but that didn’t make coping with it any less bewildering.

I think I forged ahead too fast and with too much urgency in the beginning, as if constantly groping for attention. Then I reminded myself to take a long, deep breath between my major sections.

The curse of convenient isolation

I think it’s also worth keeping in mind that even a live webinar catches everyone in a different time, place and context. What makes webinars so easy and convenient to attend, turns out to be a bit of a curse. I was presenting at 7 p.m. in Germany from my kitchen. Attendees in the US caught the webinar in the late morning or around noon, at the (home) office, I’m guessing.

This means you have less of a common context on which to build a dramatic arc or a feeling of community. In this regard, a webinar feels rather like broadcasting live television.

By contrast, some of the best live presentations I’ve witnessed gathered all attendees together, took them on a transformative trip and dropped them off at a different mental place. These were communal experiences which imparted knowledge, changed perspectives and roused a group to action. I don’t think I’ve ever had that feeling in a webinar. And after my own experience, I don’t think it can be done unless participants know each other better and have some way to interact with each other.

My webinar about getting ahead as a lone writer relies mainly on information sharing, but both times when I presented it at conferences, I was delighted to know that some attendees walked away with a feeling of “I’m not alone; I can do something about this, because others could, too.” Whether my webinar was successful along these lines, I don’t know.

Your turn

If you’ve considered presenting a webinar or have done so already, I’d love to hear your expectations and experiences. Feel free to leave a comment.

Tech comm trends 2012, mashed up and commented

2012 is the year when tech comm’ers need to understand business processes and align documentation with new technologies, say tech comm pundits – and yours truly.

What I expect for 2012

Tech comm’ers need to understand business processes.

Okay, so this trend is not exactly new, but I expect it will gain traction this year. Scott Abel thinks so, too. Business processes are crucial for us tech writers in more ways than we might think. Ideally, we understand them in three domains:

  • In tech comm, we need to understand business processes to do our job efficiently, to improve how we work and to measure if (or prove when) we are understaffed.
  • In our employer’s business (or whoever has ordered the documentation we provide), we need to understand processes to contribute to the bottom line and to get out of the cost center corner.
  • In our customer’s business (or whoever uses the documentation we provide), we need to understand processes to ensure these customers or users are efficient and happy with both the product we describe and the documentation we create.

In a nutshell: We need to know business processes, so we know which are the right things to do, whether it’s moving our documentation to a CMS, aligning our deliverables with the corporate content strategy, or updating our personas. At the same time, we need to hang on to our tech comm skills, so we know how to do things right.

What others expect for 2012

Here are two trends predicted by Sarah O’Keefe and Connie Giordano that resonated with me. (And I recommend you follow the links to get the experts’ predictions first hand!)

Creating documentation moves to the cloud.

Documentation will follow other content production to the cloud, such as collaborative Google Docs, blogs, and wikis. With this trend, I’m wondering:

  • Compelling event? Will cloud-based tech comm creation take off now – or do we need a more compelling event than ubiquitous access and the (alleged) lower operational costs?
  • Whose market? Will conventional HAT vendors be the major players, so their customers can keep their sources and move them to the cloud – or will HAT vendors (and tech comm’ers’ sources) be disrupted by other providers?

Documentation design aligns with mobile UX.

Tri-pane web sites are too large for effective user assistance on mobile devices, which require new, condensed documentation designs. These will in turn feed back into other documentation formats. Here, I’m wondering:

  • Turf wars? Will tech comm’ers and UX designers engage in turf wars – or pool their skills and resources for better user assistance?
  • Innovation? Will the reduced real estate lead to genuinely new ways of presenting user assistance – or to a resurgence of minimalism?

What no one expects for 2012

The survival of the classical tech comm job profile

Virtually all tech comm predictions and trends for 2012 are driven by external forces of change: The cloud, mobile devices, or new social media habits which expect collaborative documentation and user-generated content.

At the same time, the trends and predictions I’ve seen show little initiative to define or advance technical communications as a profession around a set of skills and tools, methods and processes. The classical tech comm job profile (as described in the Occupational Outlook Handbook by the U.S. Bureau of Labor Statistics, for example), which is centered around deliverables and tools, formats and styles, seems to wane.

In many sectors, technical communications has instead become a function that contributes to corporate assets and the bottom line. Technical communicators provide it, as do content strategists, information architects or UX designers. And whoever pays them doesn’t necessarily care who does it – or even know the difference.

In a way, this is the other side of the coin of the trends above. Scott Abel points out:

The real value we provide is not our mastery of the style guide. Rather, it’s our ability to impact the customer experience in positive ways.

And Connie Giordano calls for the evolution of “integrated technical communications” to coordinate and integrate

all technical communication processes, tools, functions, and sources within an organization to convey information and knowledge relevant to optimizing the users’ product experience.

So I believe technical communications is here to stay – but we may have to look for new ways of selling what we do and deliver.

What do you expect for 2012?

Will you follow the trends above? Are there others in your future? Please join the discussion, leave a comment.

All aboard! Onwards to structured authoring!

Our team of technical writers is embarking on a journey towards structured authoring. With 10 writers, we’ll move from an unstructured Word-to-PDF/CHM environment to a structured Flare-to-WebHelp/PDF environment. Or I should say “semi-structured”: We do have an information model based on DITA, but we won’t actually be able to enforce it with Flare (which we knew before we chose the tool).

It’ll be an interesting cruise, to be sure! Four writers already apply topic-based authoring rather than the previous free-form documentation guided mainly by common sense. The others have had training, but no real opportunity to write topics continuously. We have drafted tighter new processes to plan, write, review and edit topics, replacing the previous loose processes of writing and reviewing, but they are not in place yet.

And then there’s the new tool, of course. Only one of us has worked with Flare before. Many of us are excited about getting Flare. Some really like what we’ve seen of it in several demos so far. Others just really loathe the current writing environment.

“Regarding the pain of others”

So as we’re about to embark, I’ve been looking out for others who’ve taken the trip before. Scriptorium’s State of Structure webcast has been very helpful: Its results from a survey of 200 tech communicators help to position us in relation to others who are currently implementing structured authoring or considering it. It also collects some mistakes respondents have made. I’ll just be quoting a few select points, but the whole webcast by Sarah O’Keefe is totally worth checking out, so thanks to Scriptorium for making it available!

Reasons

The top reasons why survey respondents move (or consider moving) to structured authoring made us nod emphatically: Reuse, consistency and cost savings are also at the top of our wish list. Looking ahead, it’s promising that the majority of respondents achieve these goals.

We’ll also take other goals that respondents achieved, whether it’s to automate processes or to reduce content (oh, yes, please, we’re not even exactly sure how much redundant, almost identical content we have). So far, we’re confident we’re not only doing the right thing, but doing it for the right reasons, too!

Efforts

Savings have their own price, of course. Sarah’s survey confirms several cost points we’ve already identified in the project.

  • Converting legacy content is a biggie for us, simply because we currently have a lot of stuff.
  • Redefining output layout will take time, but will be worth it given what Flare is capable of doing with CSS in both web and print outputs.
  • Integrating a new system with its writing and publishing processes into our product and workflow systems will also take some time.

Mistakes

Mistakes have been made by others before us, and we’ll have plenty of chances to make our very own mistakes. If we’re lucky, we can avoid repeating the mistakes of others:

  • Planning and project management cause problems, maybe because most companies lack the experience of major documentation overhaul projects. Sarah specifically mentioned the lack of understanding of the project scope and of the need for testing. So we’ll look through our project plan again and ensure that the estimates are plausible.
  • Converting legacy contents can also get you into trouble, especially when you convert something that’s less than structured. It doesn’t help if you reserve too little time to do it or get inexperienced people to do it, whether it’s off-shore labor or student helpers. That’s sound advice: GIGO (“garbage in, garbage out”) can certainly endanger the expected benefits. A new tool can help us be more efficient, but we still have to learn and apply structured writing in topics. So this confirms one of my two tech comm dogmas: Don’t get a new tool to fix your processes!

Setting out

What do you think? Is our crew well-equipped, given a tried and proven method, well-defined processes, a new tool and the words of warnings above? If you have additional advice, please leave a message.

Improve documentation with quality metrics

Quality metrics for technical communication are difficult, but necessary and effective.

They are difficult because you need to define quality standards and then measure compliance with them. They are necessary because they reflect the value added for customers (which quantitative metrics usually don’t). And they are effective because they are the only way to improve your documentation in a structured way in the long run.

Define quality standards

First, define what high quality documentation means to you. A good start is the book Developing Quality Technical Information: A Handbook for Writers and Editors, from which I take these generic quality characteristics for documentation topics:

  • Is the topic task-oriented?
    Does it primarily reflect the user’s work environment and processes, and not primarily the product or its interface?
  • Is the topic up-to-date?
    Does it reflect the current version of the product or an older version?
  • Is the topic clear and consistent?
    Does it comply with your documentation style guide? If you don’t have one, consider starting from Microsoft’s Manual of Style for Technical Publications.
  • Is the topic accurate and sufficient?*
    Does it correctly and sufficiently describe a concept or instruct the customer to execute a task or describe reference information?
  • Is the topic well organised and well structured?*
    Does it follow an information model, if you have one, and does it link to relevant related topics?

* Measuring the last two characteristics requires at least basic understanding of topic-based authoring.

The seal of quality

You may have additional quality characteristics or different ones, depending on your industry, your customers’ expectations, etc. As you draft your definition, remember that someone will have to monitor all those characteristics for every single topic or chapter!

So I suggest you keep your quality characteristics specific enough to be measured, but still general enough so they apply to virtually every piece of your documentation. Five is probably the maximum number you can reasonably monitor.

Measure quality

The best time to measure quality is during the review process. So include your quality characteristics with your guidelines for reviewers.

If you’re lucky enough to have several reviewers for your content, it’s usually sufficient to ask one of them to gauge quality. Choose the one who’s closest to your customers. For example, if you have a customer service rep and a developer review your topics, go with the former, who’s more familiar with users’ tasks and needs.

To actually measure the quality of an online help topic or a chapter or section in a manual, ask the reviewer to use a simple 3-point scale for each of your quality characteristics:

  • 0 = Quality characteristic or topic is missing.
  • 1 = Quality characteristic is sort of there, but can obviously be improved.
  • 2 = Quality characteristic is fairly well developed.

Now, such metrics sound awfully loose: Quality “is sort of there” or “fairly well developed”…? I suggest this for sheer pragmatic purposes: Unless you have a small number of very disciplined writers and reviewers, quality metrics are not an exact science.

The benefit of metrics is relative, not absolute. They help you to gauge the big picture and improvement over time. The point of such a loose 3-point scale is to keep it efficient and to avoid arguments and getting hung up on pseudo-exactitude.
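
If you track the ratings in a small script or spreadsheet, the mechanics are simple. Here’s a minimal Python sketch of the 3-point scale; the characteristic names follow the checklist above, while the ratings themselves are made up for illustration.

```python
# The five quality characteristics from the checklist above.
CHARACTERISTICS = [
    "task-oriented",
    "up-to-date",
    "clear and consistent",
    "accurate and sufficient",
    "well organised and well structured",
]

# One reviewer's ratings for one topic, on the loose 3-point scale:
# 0 = missing, 1 = sort of there, 2 = fairly well developed.
ratings = {
    "task-oriented": 2,
    "up-to-date": 1,
    "clear and consistent": 2,
    "accurate and sufficient": 1,
    "well organised and well structured": 0,
}

topic_score = sum(ratings[c] for c in CHARACTERISTICS)
print(f"Topic score: {topic_score} of {2 * len(CHARACTERISTICS)}")  # Topic score: 6 of 10
```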

Act on quality metrics

With your quality scores, you can determine

  • A score per help topic or user manual chapter
  • An average score per release or user manual
  • Progress per release or manual over time

Areas where scores lag behind or don’t improve over time give you a pretty clear idea about where you need to focus: You may simply need to revise a chapter. Or you may need to boost writer skills or add resources.
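
Continuing the sketch above, aggregating per-topic scores by release makes lagging areas visible at a glance; the release labels and numbers here are hypothetical.

```python
# Per-topic scores (each 0-10) grouped by release; hypothetical data.
releases = {
    "release 2011.2": [6, 7, 5, 8],
    "release 2012.1": [7, 8, 8, 9],
}

for release, topic_scores in releases.items():
    average = sum(topic_scores) / len(topic_scores)
    print(f"{release}: average topic score {average:.1f}")

# Output:
# release 2011.2: average topic score 6.5
# release 2012.1: average topic score 8.0
# A rising average indicates progress over time; a stalling one
# flags chapters that need revision, writer training or more resources.
```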

Remember that measuring quality during review leaves blind spots in areas where you neither write nor review. So consider doing a complete content inventory or quality assessment!

Learn more

There are several helpful resources out there:

  • The mother lode of documentation quality and metrics is the book Developing Quality Technical Information by Gretchen Hargis et al. with helpful appendixes, such as
    • Quality checklist
    • Who checks which quality characteristics?
    • Quality characteristics and elements
  • Five similar metrics, plus a cute duck, appear in Sarah O’Keefe’s blog post “Calculating document quality (QUACK)”.
  • Questionable vs. value-adding metrics are discussed in Donald LeVie’s article “Documentation Metrics: What Do You Really Want to Measure”, which appeared in STC’s Intercom magazine in December 2000.
  • A summary and checklist from Hargis’ book is Lori Fisher’s “Nine Quality Characteristics and a Process to Check for Them”**.
  • The quality metrics process is covered more thoroughly in “Quality Basics: What You Need to Know to Get Started”** by Jennifer Atkinson, et al.

** The last two articles are part of the STC Proceedings 2001 and used to be easily available via the EServer TC Library until the STC’s recent web site relaunch effectively eliminated access to years’ worth of resources. Watch this page to see if the STC decides to make them available again.

Your turn

What is your experience with quality metrics? Are they worth the extra effort over purely quantitative metrics (such as topics or pages produced per day)? Are they worth doing even though they ignore actual customer feedback and demands, of the kind customer service reps can register? Please leave a comment.