Describing or clarifying: How do we explain stuff?

So our company has an elevator pitch competition. The task is to explain in 30 seconds “What does our company do?”

The submissions indicate that “to explain” means different things to different people. To some, it means “to describe or summarise information”. To others, it means “to translate and clarify for others”.

The two meanings reach different audiences with different levels of success. Description and summary are best when you have a similar background and outlook as your audience. Translation and clarification are best to bridge differences between you and your audience. Such a difference could be much vs. little experience with a product or knowing how to set it up vs. having to use it every day. To bridge this difference, you have to put yourself into the users’ shoes and remember how it is not to know all the things you know now.

Technical communication skills, our experience in crafting content, and the way we can structure information let us excel at translating and clarifying. And the better we know our audience, the better we are at it.

Developers and testers who contribute to user documentation frequently deliver very good descriptions – and technical communicators can help them by translating and clarifying them further, if necessary.

We often hear that “everybody can write”. But what that really means is that many people can describe (and some don’t even do that very well), while few can clarify as well as a technical communicator.

If the distinction between describing and clarifying resonates with you, I’m sure you can think of more examples.

Structured topics, taxonomies & lightning at #STC14

More than 600 technical communicators met for the annual STC Summit in Phoenix, AZ, to demonstrate and expand the many ways in which they add value for users, clients, and employers. In a series of posts, I describe my personal Summit highlights and insights that resonated with me.

The journey to structured topics

I’ve framed my presentation “From Unstructured Documentation to Structured Topics” as a journey to a fjord precipice: daunting, but nothing you cannot achieve with some planning and a little bit of confidence.

Concluding slide for my presentation on structured topics

The summary outline during Q&A, photo by @dccd.

In this “project walk-through mini-workshop”, I outlined how we can combine core tech comm proficiency, such as topic-based authoring, with content strategy and project management skills to master the migration to structured topics. The applied skills and the resulting content architecture can be a solid foundation for a full-blown future corporate content strategy that highlights technical communicators and their skills.

The engaged Q&A afterwards showed that the ideas resonated with the 80+ attendees. Many technical communicators are comfortable and well qualified to expand their topic-writing skills into information architectures and content modelling.

The trip to taxonomy

In her session “How to Create and Use a Functional Taxonomy”, Mollye Barrett told of a similar challenge: She was originally brought in to create the documentation for a highly customised implementation of financial software. When it became apparent that not just the software needed documentation, but also the workflows and processes which it was supposed to support, she wound up creating a taxonomy!

As she laid out her case study, Mollye showed how technical communicators’ core skills of task analysis and task-oriented documentation qualify them to create a taxonomy of business functions that maps a software’s functions to specific user tasks.

The project essentially consisted of explicating the company’s multi-faceted tacit knowledge and connecting all the pieces:

  • Create a consistent terminology by defining the standard financial terms in use.
  • Describe and classify the various functions of the software.
  • Identify and describe the user tasks which need documentation.

Mollye studied disparate, unstructured legacy documents, examined the software, and worked with specialists from the business and IT sides. Her main driver was her persistence in eliminating ambiguity and her goal of defining clear terms – or put more simply: to create order out of chaos.

Lightning strikes twice

A popular staple of the STC Summit is the two lightning talk rounds, moderated with understated wit by Rhyne Armstrong.

Liz Herman drove forward the multi-skilled tech comm theme with multiple costume changes in her talk “Perfecting the Hat Trick, Why My Hair’s Messy”. She demonstrated how tech comm’ers don the hats, caps, and helmets of sailors, fire fighters, cowboys, football players, the Irish, something I’ve forgotten, and many more in just five minutes:

Liz Herman wearing different hats

Liz Herman dons different hats, photo by @dccd.

And Viqui Dill showed us how to use social media right in “Social Media is not the Devil”, her rousing karaoke performance to the tune of Charlie Daniels’ “The Devil Went Down to Georgia”:

Viqui Dill's karaoke lightning talk

Photo by @marciarjohnston.

Skill map, wicked ambiguity & influence at #STC14

More than 600 technical communicators met for the annual STC Summit in Phoenix, AZ, to demonstrate and expand the many ways in which they add value for users, clients, and employers. It almost sounds like a non-theme for a conference, but that was the impression I took away time and again – prompted, no doubt, by my personal selection of sessions!

This thrust is actually very much in line with the STC’s revamped mission statement (scroll down a bit) which includes these objectives:

  • [Support] technical communication professionals to succeed in today’s workforce and to grow into related career fields
  • Define and publicize the economic contribution of technical communication practices […]
  • Technical communication training fosters in practitioners habits […] that underpin their ability to successfully perform in many fields

I’ll describe my personal Summit highlights and insights that resonated with me. For the mother of all STC Summit blogging, visit Sarah Maddox’ blog with a summary post which links to posts about no fewer than 10 individual sessions!

Connecting across silos with diverse skills

A good illustration of how easily tech comm skills and tasks connect with and seep into other job roles is Red Gate Software’s Technical Communication Skills Map. It appeared at least twice at the STC Summit: in STC CEO Chris Lyons’ opening remarks and again in Ben Woelk’s lightning talk.

Tech Comm skill map from Red Gate Software

Depending on a tech comm’er’s talents and tasks, he can collaborate closely with – or develop into – a product manager or project manager, a UX specialist or tester.

Standing united against “wicked ambiguity”

Jonathon Colman, content strategist at Facebook, took a very high-level view of our profession’s challenges in his keynote address on Sunday evening. Technical communication that travels millions of miles on NASA’s Voyager or that must last thousands of years unites us against ambiguity – regardless of our different skills and various everyday tasks.

Jonathon Colman at TC14 keynote

Such ambiguity can become “wicked” in fields such as urban planning and climate change, because it makes the issue that needs describing hard to define and hard to fix with limited time and resources. The solutions, such as they exist, are expensive and hard to scope and to test. Still, we must at least attempt to describe a solution, for example, for nuclear waste: Tech comm must warn people to avoid any contact, in a message that remains recognizable and comprehensible for at least 10,000 years.

Jonathon ended on the hopeful note that tech comm’ers can acknowledge wicked ambiguity, unite against it, race towards it, embrace it – and try the best we can.

Wielding the informal power of influence

Skills are not always enough to connect us tech comm’ers successfully with other teams and departments. Sometimes, adverse objectives or incentives get in the way. Then we need to wield the informal power of influence. Kevin Lim from Google showed us how with witty, dry understatement – a poignant exercise in persuasion without resorting to rhetorical pyrotechnics.

Kevin Lim on Influence Strategies

Influence, Kevin explained, is the “dark matter of project management” that allows us to gain the cooperation of others. We can acquire this informal power by authentically engaging colleagues with our skills and practices. (The “authenticity” is important to distinguish influence from sheer manipulation.) To optimize our chances for successful influence, we need to align our engagement with the company culture, a corporate strategy, and the objectives of key people, especially project managers and our boss.

Put everything under a common goal and engage: “Don’t have lunch by yourself. Bad writer, bad!” – appropriate advice for the Summit, too!

– Watch this space for more STC14 coverage coming soon!

Preview my STC14 session about structured topics

If you are curious about moving from unstructured documentation to structured topics – or if you cannot decide whether my session at the STC Summit next week is for you – here are the slides; maybe you’ll find them helpful:

Moving to topics? Join me at STC Summit!

If you’re moving to topic-based authoring (or considering the move), join me next week at the STC Summit in Phoenix for my presentation “From Unstructured Documentation to Structured Topics”.

The format will be a “project walk-through mini-workshop” in a regular session slot of 45 minutes. That means you won’t get a detailed project plan or a silver bullet for a successful migration to topics. But you will get plenty of information about the methods, options, and risks involved. Most importantly, you will get a chance to improve your confidence – and hence your chances of success – in such an important project!

Here’s the abstract:

You’re sold on the benefits of structured content, but don’t know how to begin? This session shows you how to implement topic-based authoring by converting existing unstructured documentation into structured topics, even in regular office software such as Word.

The underlying process works for online help and user manuals, but also for other content, such as wiki articles, training materials, etc., as long as you know which deliverables you need to create and their approximate purpose.

There are several stages to the process:

  1. Identify topic type or types per content section, for example, concept, task, reference, or use case. Content which mixes topic types can be sorted out with a little care.
  2. Re-chunk your sections to turn them into stand-alone topics. You can delete redundant or obsolete information which does not belong in a topic. Or you can spin it off into a topic of its own or integrate it with another, more suitable topic. Special strategies help you to deal with topics that are too complex.
  3. Re-sequence your topics, so they flow nicely when users read not just one or two of them, but need to follow a complete process. If the topic sequence doesn’t flow nicely, you may need to add some auxiliary topics which orient readers and ensure a good flow.
  4. Rewrite headings to give users enough orientation when they read just one or two topics. Rephrase them so users can quickly dip in and out of your documentation.
  5. Add links between related topics to ensure that the structured topics work in various use cases, even if users refer to only a few topics.

This presentation emphasizes practical tasks; you will learn:

  • How and why to create a content model
  • How to identify topic types in existing content
  • How to re-chunk content into true topics
  • How to sequence your topics
  • How and why to write good headings for your topics
  • How to link related topics

We’ll meet on Monday, 19 May at 9:45 in 106 BC in the Phoenix Convention Center. Hope to see you there!

Second day at MadWorld 2014

A well-rounded program and excellent organisation at the second MadWorld user conference avoided many of the traps that can mar a sophomore effort and pointed the way to future growth. The second day again saw great informative sessions and networking around doc issues and careers, not to mention lunch and drinks under the San Diego spring sun… 🙂

At MadWorld 2014 welcome event

Advanced single sourcing of content in Flare uses a clever combination of snippets and conditions which are called, not surprisingly, “snippet conditions”. These allow you to maintain reusable chunks of content with slight variations – which variation is ultimately displayed is controlled in the topic that contains the snippet.

In his single sourcing presentation, Paul Pehrson had many more tricks up his sleeve:

  • To repeat a topic in the same TOC (with or without variation), put the topic content into a snippet and embed it in a unique, otherwise empty container topic – else your breadcrumb trail in online help goes haywire, because it cannot know which of the two occurrences it should refer to.
  • To reuse similar front matter across several PDFs, you can put the front cover, the copyright page, and the table of contents into a TOC of their own, import it as a nested TOC into each target and control the individual differences with variables.
  • To maintain individual project styles and corporate styles, create your project CSS and import the corporate CSS into it by using the @import CSS command.
  • To share condition sets or variable sets across two projects, store them as External Resources where you can keep them in sync with one another.

The lightning talk round was a colorful, fast-paced session. Passionate speakers addressed widely different topics, including

  • A MadCap version of Jeopardy by Pam Coca (“Who is our tech comm game show host?” 🙂 )
  • Several ways to avoid inline formatting with the help of Chuck Norris by Scott DeLoach
  • MadCap’s latest group of spokespersons: Dentists who recommend Flare for more smiles than any other help authoring tool by yours truly

The 6 topics wouldn’t necessarily have warranted a full session each, but they were fun and valuable to have in a compressed 5-minute format.

“Our addiction to meaning” gave attendees food for thought as I led them through a quick overview of semiotics and mental models which included Japanese restaurants and misheard song lyrics. In the Q&A session, we explored how we could apply such insights into cognition to offer users confidence in their tasks along with meaningful instructions.

Writing, editing and translating topics is frequently done by people who don’t have Flare and don’t need it either.

Mike Hamilton at MadWorld 2014

Mike Hamilton showed several scenarios to address special demands in authoring workflows:

  • For writers and editors using Contributor, you can customize exactly which Flare files, such as snippets, condition and variable sets, are available to them. You can also lock down certain text in Contributor templates to enforce structure and labels.
  • For reviewers using Contributor, you can now apply conditions and variables when creating the review package, for example, to keep internal data out of packages for external reviewers.
  • For translators of Flare topics, you can create customized export packages and choose to preserve the code of snippets and variables or to convert them into flat text.

For MadWorld 2015, my guess is that we might see fewer sessions on basic Flare techniques, which are already well covered in webinars and other information online. Instead, I’d recommend more sessions that show how MadCap supports tech comm workflows and business cases. These could cover, for example:

  • How to integrate documentation and training content with Flare and Mimic
  • How to build a business case around single-sourcing topics that can turn tech pubs into a profit center which leverages content as corporate assets
  • How to use MadCap’s products in corporate, large-scale projects, as Lynn Carrier showed this year.

Overall, MadWorld was a very instructive, very fun event! The user conference format affords a welcome focus which sometimes gets lost in industry-wide conferences that have to satisfy more disparate needs.

CEO Anthony Olivier spoke of welcoming us all to the “MadCap family”. I think that metaphor is stretching it a bit. For me, it feels more like a versatile community of dedicated, often enthusiastic users who get to hang out with one another and with, say, a band we like – and we get to spend some time back stage in the hospitality suite. 🙂

First day at MadWorld 2014

The applicable advice about MadCap products from speakers and staff, the profound discussions about tech comm in general, and attendees’ enthusiasm to share and learn from one another made MadWorld 2014 a perfect combination of a tool-centric user workshop and a “regular” tech comm conference.

The hip Hard Rock Hotel adds a flair of giddy excitement – after all, some of us 200 tech writers are a little too nerdy to feel comfortable when we’re treated like rock stars… 🙂

The welcome event started with a tongue-in-cheek video: MadCap’s signature cartoon figure Simon has become addicted to – the horror! – inline formatting. His brave MadCap colleagues stage an intervention to save him… I hope this video will soon come to a YouTube channel near you because it’s a lot of fun to watch!

At MadWorld 2014 welcome event

Product Evangelist Jennifer White introduced many of the MadCap key players in attendance, then MadCap CEO Anthony Olivier welcomed us and encouraged us to join the “MadCap family” to learn and network.

To turbocharge our authoring, Nita Beck showed us how we could bypass some of the pretty, but slow-to-use Flare ribbon features for faster alternatives:

Nita Beck's session at MadWorld 2014

  • Custom templates for topics and snippets can include much of the recommended structure and formatting. Filling in such a template saves you the time of manually recreating that structure and formatting in just about every topic.
  • Contributor can be used by tech communicators as a low-distraction alternative to Flare for initial drafts which keeps many of the more complicated features of Flare out of the way for the time being.
  • Keyboard shortcuts for many Flare features are faster than doing the same with the mouse.

Pattern recognition elicited a lively Q&A session from the capacity crowd in my presentation. We found that, whatever patterns we recognize, they generally depend on the context in which they retain meaning. From there, we branched out to discuss information architecture and user experience design and how they also rely on patterns. We also tried to tease out the pattern behind why my slides had broken the nice arrows and replaced them with the average sign in all places – but one!

Flare projects can support a scalable content architecture, as Lynn Carrier showed. She described her employer’s project of introducing single-sourcing with Flare to cope with 3000 pages of docs per writer per year. The keys to her successful project were:

  • Ownership to involve all writers and tap into each writer’s skills and interests to assure them they weren’t writing themselves out of a job.
  • Infrastructure to make sure they have the tools and processes in place to create the deliverables in the structure and quality their customers need.
  • Reuse to single-source content in the most efficient way: they carefully mapped out where and when to use snippets, conditions and variables. Conditions are applied heavily to folders, and topics are moved around in the folder structure so they are available for the products and versions where they are needed – and only there. They use few variables because, in mid-sentence, variables create problems during translation.
  • Publishing based on TOC templates and target templates ensures consistency in structure and easy maintenance.

Since we have very large projects as well, this was a very valuable session which gave us lots of ideas for how to use Flare’s reuse and template features in a corporate environment. Lynn will give another presentation on the second day to show how a wizard enables customers to compile exactly the documentation they need into a PDF. This is something we’ve long thought of doing, so seeing her solution will be a great inspiration for us!

Face-to-face support beats written documentation any time, which is why the “hospitality suite” is so great. It’s like walking into MadCap’s helpdesk – as if the smartest MadCap users were your colleagues in the next room.

A good balance between sessions and networking opportunities allowed us to trade the quirky but powerful solutions around Flare that users have come up with, to swap career stories, and to make new friends among a group of technical communicators as diverse and friendly as you could hope to find at any tech comm conference.

Chilling in the early evening at MadWorld 2014

Well done, MadCap, I’m psyched for day two!

Rate and improve tech comm with the Net Promoter Score

You can use the Net Promoter Score to rate and improve technical communication – but it works best on the scale of corporate content for which the score was designed. Here’s why and how.

The Net Promoter Score (NPS)

The Net Promoter Score measures customer loyalty and satisfaction with a company or offering. It boils down difficult issues with perceived quality to a simple question:

How likely are you to recommend our company/product/service to your friends and colleagues?

Usually, the answers are ranked on a scale from 1 (highly unlikely) to 10 (very likely). You distinguish three groups of respondents and measure the percentage in each:

  1    2    3    4    5    6   |   7    8   |   9    10
  Detractors                   |  Passives  |  Promoters

  • Detractors are people who replied with 6 or lower.
  • Passives are people who rated your offer as 7 or 8.
  • Promoters are people who answered 9 or 10.

The NPS is the percentage of promoters minus the percentage of detractors: If 20% of your customers are promoters who really like your offering (and answered 9 or 10) and 30% don’t think too highly of it (and answered 6 or lower), then your NPS is 20 - 30 = -10. Generally, an NPS above zero, indicating more promoters than detractors, is considered a good thing…
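
Purely for illustration, here is a minimal sketch of that arithmetic in Python (the function name and the sample ratings are my own invention, not part of any NPS tooling):

    def net_promoter_score(ratings):
        """Compute the NPS from a list of 1-10 survey answers."""
        total = len(ratings)
        promoters = sum(1 for r in ratings if r >= 9)   # answered 9 or 10
        detractors = sum(1 for r in ratings if r <= 6)  # answered 6 or lower
        # Passives (7 or 8) count toward the total, but not toward the score.
        return 100 * promoters / total - 100 * detractors / total

    # 20% promoters, 50% passives, 30% detractors -> NPS of -10
    ratings = [10] * 20 + [7] * 50 + [5] * 30
    print(net_promoter_score(ratings))  # -10.0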

NPS for tech comm?

So how can we apply that score to tech comm? Are customers loyal to a help system? Are they likely to recommend it to friends or colleagues? Probably not in isolation of the described product.

There don’t seem to be a lot of ideas “out there” that connect NPS to documentation, but one article by JoAnn Hackos does: Influencing the Bottom Line: Using Information Architecture to Effect Business Success.

The key to turning the NPS into a useful tool for documentation is to take the scope from the NPS, not from the documentation! Hackos shows how we can relate the NPS to corporate and product content as a whole. This includes tech comm, but also marketing and sales content. This is what drives the customer experience which the NPS reflects. And it takes improvements in the corporate-wide content and its information architecture to increase the NPS.

Hackos describes a company which found that content contributed to the low NPS:

… senior management became advocates for significantly improving content quality. That meant changing the relationship between the technical authors and the product developers, requiring that information architects establish close relationships with customer support and training, and redefining the type of content that would be delivered to customers in the future.

– Sometimes, tech comm can adapt management tools to its own purpose and scope, but with the NPS it seems most feasible to plug into the corporate use of the tool.

Does this make sense? Can tech comm benefit from NPS and improvement initiatives? Or is that a hare-brained idea, and we should really stick to key performance indicators suitable for tech comm?

Tech comm conferences too far, too costly?

European tech writers who don’t have the time or the money to attend a tech comm conference can still get a lot of knowledge and networking by attending the tekom Europe Roadshow – right in their backyard, comparatively speaking!

tekom Europe Roadshow logo

The tekom Europe Roadshow puts on one-day events which are easier to get to from many places around Europe and waaay cheaper than full conferences. They bring together professionals from the region for presentations, discussions and networking.

Find the roadshow event nearest to you:

  • Paris, France: Monday, September 8
  • Ghent, Belgium: Wednesday, September 10
  • Eindhoven, Netherlands: Friday, September 12
  • Copenhagen, Denmark: Tuesday, September 16
  • Warsaw, Poland: Thursday, September 18
  • Istanbul, Turkey: Monday, September 22
  • Bucharest, Romania: Wednesday, September 24
  • Vienna, Austria: Friday, September 26

This year’s topic is:

TechComm Workflow & Media Production
Integrate Intelligent Media Production in Your TechComm Workflow

But even if that is not your professional focus, it might still be worth going for the contacts.

For more information, visit the event web site.

(Full disclosure: I have spoken at an early precursor of the tekom roadshow and think it’s a brilliant format, but I’m not involved with this event series.)

Why a content spec saves you time and money

A content specification will save you trouble, time, and money, especially when you’re not the lone writer on a documentation project. It will ensure that you offer your users consistent and holistic documentation across a team of writers.

A content specification is a list of all topics to be created which ideally maps planned topics to requirements and/or designs to ensure comprehensive and complete documentation. It usually comes in a table with one row per topic, listing:

  • Topic heading and/or file name
  • Topic type (concept, task, reference, or whatever else you may use)
  • Topic owner
  • Writer (in case writers may be different from topic owners)
  • Reviewers (for example, subject-matter experts)
  • Date ready for review or for post-review editing (depending on your workflow)
  • Mapped deliverables (where the topic appears, for example, a certain user manual, the online help, etc.)
  • Time estimate (how long will it take to write the topic, optionally, including review)
  • Documentation task type, to help you estimate time:
    • Create new topic
    • Major rewrite of existing topic
    • Minor fix or addition to existing topic

Without it, you risk delivering a bunch of topics with gaps in some places and overlaps in others. You can still string them together, but no overview topic can convey a coherent content experience, if you didn’t plan for it and bake it into the topics and their structure.

So a content spec is a blueprint of your documentation project, just as you would create one before you start building a house – or design any kind of experience.
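
A spreadsheet is usually all you need for such a spec, but purely to illustrate the structure, here is a hypothetical sketch in Python (the column names follow the list above; the sample topics, owners, and deliverables are invented):

    from dataclasses import dataclass, field

    @dataclass
    class SpecRow:
        """One row of the content spec: a single planned topic."""
        heading: str
        topic_type: str                # concept, task, reference, ...
        owner: str
        reviewers: list = field(default_factory=list)
        deliverables: list = field(default_factory=list)  # user manual, online help, ...
        task_type: str = "new topic"   # or "major rewrite", "minor fix"
        estimate_hours: float = 0.0    # writing time, optionally including review

    spec = [
        SpecRow("What is the gizmo?", "concept", "Anna",
                deliverables=["User manual", "Online help"], estimate_hours=2),
        SpecRow("Setting up the gizmo", "task", "Ben", reviewers=["SME"],
                deliverables=["User manual"], task_type="major rewrite", estimate_hours=4),
    ]

    # Quick sanity checks like this expose gaps and overlaps early.
    print(sum(row.estimate_hours for row in spec), "hours planned")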

Yet content specs often elicit negative reactions…

“Oh, but we’ve managed without one so far…”

Many tech writers I know are very competent, and a few are lucky to boot. Considering all their projects with more than, say, 50 topics which didn’t use a content spec, I’d bet half of them are incoherent (“organically grown” is an oft-used euphemism).

The cost doesn’t stop at poor user experience. Such documentation is also more difficult and more expensive to maintain, especially if you have overlapping topics and don’t remember to update both of them…

“Bah, reality eats specs for lunch…”

To an extent, yes. But on the whole, reality is an orderly patron. In my experience, the final documentation reflects the approved content spec in up to 80% of the topics. An average of 10% of the topics get added during the writing, when concepts or prerequisite and auxiliary procedures turn out to be missing. Another 10% of the topics get reorganized because the initial content spec misunderstood something, or because content simply makes more sense somewhere else.

“Even if, we’ll fix it later…”

Yes, you can. But once again it’s very expensive. Remember that the list of topics is only one result of the content spec. Their structure is another. Finding that a structure by workflows is inferior to a structure by, say, instrument, requires not just re-ordering topics, but re-writing a lot of them.

You can avoid this by drawing up a complete content spec before you write a single topic and getting it signed off by the key stakeholders, so they know rather well what documentation they will get. The 20% deviations mentioned above are usually justifiable, if they conceivably improve the deliverables.

– Given that content specs are a big help in creating and maintaining efficient and effective user documentation, I strongly recommend using them. If you have any experience with or without content specs, I’d love to hear it.