How tekom crowdsources conference programming

The German tech comm association tekom has found an innovative way to involve its members in conference programming: It invited 15 tech comm practitioners to a workshop on 25 July to discuss the status quo of software documentation, its best practices, and its challenges. The collected issues and topics will feed into the “Software Documentation Forum”, a 1-day stream with 7 sessions at this year’s tekom conference in October. (I was lucky enough to participate, and I blog on my own account; tekom’s view and position may differ.)

The event was by invitation, which helped to ensure that a wide variety of technical communicators were present: from the software departments of industrial companies (whose employees make up the majority of tekom’s membership) to pure software companies, and from small companies with a lone writer to very large corporations with hundreds of technical communicators worldwide.

Common challenges

The workshop started with a short round of introductions, followed by a round of 10-minute presentations where participants addressed these issues:

  • Specific demands on software documentation in your company
  • Best practices that have proven successful
  • Current technological, organizational and economic challenges
  • The future of software documentation in your opinion
  • How do you improve the value of software documentation in your company
  • How can tekom help to improve the value of software documentation

As the presentations proceeded, common topics emerged. For example, several participants face the need to keep tech comm effective, efficient and consistent as processes become globalized. Some work in teams that are distributed across several countries. Others have writers far away from developers and engineers. Still others reported difficulties finding writers with sufficient language skills.

In my opinion, the variety of participants was an advantage. Our backgrounds are different enough to represent a well-rounded spectrum of issues and practices. Yet we are similar enough that our experiences resonate with at least some of the other participants. One participant even uses the software of another participant – and had a few comments about its documentation…

Joint programming

After lunch, we went into solution mode to see how tekom can support software technical communicators – and specifically which sessions at the tekom conference in October can address the issues and challenges we raised.

Here are some of the issues we collected that we would like to see addressed at tekom. Much of this is still pending and depends on whether tekom can find people to speak about or discuss the issues, so there might be all kinds of changes before the following hits the official conference program:

  • How tech comm processes change when you move from a waterfall process model to agile processes
  • Integrating tech comm with GUI design for a better user experience (a similar issue appeared as “progressive disclosure” in a talk at the STC Summit in Chicago)
  • When to apply DITA and when not to
  • Norms and legal regulations in software documentation
  • How globalization affects tech comm processes and quality

Crowdsourcing benefits

I think the workshop was a great success thanks to its good preparation and well-defined goals. And I think it will turn into a worthwhile series of sessions at the conference.

For me personally, it was very good to connect and network with German software tech comm professionals. I even felt some of the excitement and buzz that I get at actual conferences. So I think we have good reasons to look forward to tekom12. 🙂

Pattern recognition for tech comm as STC webinar

If you’ve missed my session on “Pattern Recognition for Technical Communicators” at the STC Summit in Chicago, you now have another chance: On Wednesday, 8 August at 1 pm EDT / 7 pm CEST, I’ll be presenting the session as a live webinar.

Session abstract

Pattern recognition is an essential mental strategy for acquiring and disseminating knowledge, though most of us are not aware of it. Applied consciously, pattern recognition helps technical communicators develop effective documentation more efficiently and helps readers orient themselves.

Learn what pattern recognition is and how it works, which pattern recognition strategies you may already be employing without even knowing it, and how you can employ those strategies to:

  • Make sense of new subject matter
  • Start to build new documentation
  • Design and structure documentation
  • Support users efficiently

What attendees said

Attendees at the STC Summit came away with these insights (according to session evaluations):

  • “Pattern recognition can help chunk topics, find reuse opportunities, & help your reader navigate.”
  • “The session confirmed what I believe is needed, in that users want to know what to expect in each chapter, or book, that that is done by applying pattern recognition. I just didn’t have a term for it before.”
  • “Classifying TOC as top down, search/index as bottom up, combining the two to find a balanced form of communication.”

Sign up

You can sign up for the webinar at the STC web site. See you on the 8th! (Well, not really – but we’ll be able to hear each other… 🙂 )

P.S. This will be my second webinar – and I’ll be sure to take the lessons and suggestions from my first one to heart!

DITA with confidence (DITA Best Practices book review)

I recommend DITA Best Practices: A Roadmap for Writing, Editing, and Architecting in DITA by Laura Bellamy, Michelle Carey, and Jenifer Schlotfeldt to anyone who looks for practical guidance with DITA or topic-based writing with a future DITA option. (This book review has appeared in the Summer 2012 issue of ISTC’s Communicator magazine on p. 57 in different format.)

Cover of the DITA Best Practices book

The DITA bookshelf has been growing slowly but surely. Thanks to the recent addition of the seminal DITA Best Practices, you can now find most information you need for a DITA implementation project in one book or another.

The paperback comes from IBM Press which has also given technical writers Developing Quality Technical Information by Gretchen Hargis, et al. If you know that recommended title, you will enjoy the same usefulness and clear layout in this new book.

Starting with topics

DITA Best Practices addresses the practical concerns of writers, editors and architects of DITA content in three well-structured parts. The first part on writing starts with a chapter on topic-based writing and task orientation as two methods underlying DITA. The authors give clear instructions and guidelines for both methods. A generous amount of tips, best practices and ‘watch out’ warnings adds the voice of the experienced practitioner, which helps to keep you on track and avoid beginner’s mistakes. The fictional ‘Exprezzoh 9000N’ coffeemaker is used consistently throughout the book to illustrate tasks and topics. Explanations of why and how the methods work give writers the motivation to apply the advice with confidence. The chapter ends with a concise wrap-up section of the big points and a checklist to ensure you apply these big points in your work.

I have outlined the first chapter in such detail, because its clear and competent combination of elements — instructions, tips and warnings, examples, motivation, wrap-up and checklists — make this book so useful throughout.

One chapter each is then dedicated to the topic types task, concept and reference. Each chapter describes the characteristics of and motivation for the topic type, followed by instructions and examples along the standard DITA topic structure. The task chapter, for example, proceeds from <title> via <shortdesc>, <context> and <prereq> to <steps>, etc. However, most guidelines, examples, tips and warnings apply to good topic-based writing practices in general.

A chapter dedicated to DITA’s ‘short description’ element with its multiple uses in topics, links and search results helps novices with the challenge of using this powerful element correctly.
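To make the standard topic structure more tangible, here is a minimal sketch of a DITA task topic along the element sequence the book discusses. The coffeemaker is the book’s own running example, but the element content and the topic id are my invention for illustration:

```xml
<task id="brewing_espresso">
  <title>Brewing an espresso</title>
  <shortdesc>Brew an espresso with the Exprezzoh 9000N
    at the touch of a button.</shortdesc>
  <taskbody>
    <prereq>The water tank is filled and the machine
      is switched on.</prereq>
    <context>The Exprezzoh 9000N grinds the beans and
      brews the espresso in one step.</context>
    <steps>
      <step><cmd>Insert the portafilter.</cmd></step>
      <step><cmd>Press the espresso button.</cmd>
        <stepresult>The machine dispenses the espresso
          into the cup.</stepresult></step>
    </steps>
  </taskbody>
</task>
```

Note how the <shortdesc> does double duty: it opens the topic, and it can also appear as link preview text and in search results, which is why the book devotes a chapter to getting it right.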

DITA’s architecture explained

The second part of the book builds on the first. After describing topics as DITA’s most essential building blocks, the book focuses on making topics work together by connecting them and by expanding their usability.

Two chapters show you how to connect topics into a coherent output, such as an online help system or a book. The first chapter on DITA maps explains how to create tables of contents, including bookmaps for print publications. The second chapter on links describes the four different ways in DITA to link topics to each other. In their reassuring style, the authors help you to distinguish them, so you understand when to use which link type and how to apply each correctly.
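A minimal DITA map that collects topics into a table of contents might look as follows. The file names and titles are hypothetical; nesting a <topicref> inside another creates the hierarchy that becomes your TOC:

```xml
<map>
  <title>Exprezzoh 9000N User Guide</title>
  <!-- A concept topic with two child topics nested beneath it -->
  <topicref href="getting_started.dita" type="concept">
    <topicref href="brewing_espresso.dita" type="task"/>
    <topicref href="button_reference.dita" type="reference"/>
  </topicref>
</map>
```

For a print publication, a bookmap adds book-specific structures such as front matter and chapters on top of this basic pattern.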

The next three chapters explain how to make topics work together by expanding their usability: You can use metadata to make your topics ‘smart’ by adding information such as index terms, addressed audience or described product or version. You can use conditional processing to customise output. And you can reuse content for more consistent output and reduced translation costs. A clear workflow helps you to determine which of your content you can reuse and how.
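The following sketch illustrates two of these mechanisms in DITA markup. The file name, ids and attribute values are my own illustration, not examples from the book:

```xml
<!-- Reuse: write a note once, then pull it in elsewhere by reference.
     The conref points to file.dita#topic-id/element-id. -->
<note id="descale_note">Descale the machine once a month.</note>
<note conref="maintenance.dita#maintenance/descale_note"/>

<!-- Conditional processing: this paragraph is included or filtered
     out at build time, depending on the audience you publish for. -->
<p audience="administrator">Set the water hardness
  in the service menu.</p>
```

Because the referenced note exists only once in the source, a wording change (or its translation) is paid for only once, which is where the reduced translation costs come from.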

Editing in DITA

The third part of the book deals with editing. One chapter outlines the steps and decisions of a project to convert your existing content to DITA. Useful worksheets help you to analyse your content and prepare it for conversion. The chapter on code review helps you to avoid or eliminate common problems that restrict the benefit of your DITA code. Based on their experience, the authors remind you to use DITA topic types and elements correctly, for example, to use the <steps> element in task topics instead of a more generic ordered list. The chapter on content editing applies best practices of editing to DITA topics and maps.

Useful and recommended

Since it came out, I have used this book more than any other technical writing book, except a style guide. Had it been published earlier, it would have saved me many an uncertain moment when I was designing and teaching our information model. I especially appreciate the clarity, the concision and the well-argued advice of do’s and don’ts. For all its benefits, be aware that the book covers neither the DITA Open Toolkit nor DITA specialisations!

DITA Best Practices lives up to its subtitle and provides essential instruction and advice to technical writers, editors and information architects. Project managers will find it equally helpful but should also consider Julio Vazquez’ Practical DITA which reflects a project structure better. Decision-making managers are probably better off with Ann Rockley’s DITA 101 which gives a shorter high-level overview.

Top 3 reasons to do a language edit

There are several good reasons to do a language edit and most have to do with the overall quality and usability of your documentation. That’s the lesson I learned while editing our upcoming Release Notes for language.

The language edit is the unloved stepchild of the tech writing process: It is frequently last in line and gets less attention than the actual writing and the content review. Yet it really impacts the quality of the documentation, precisely because it occurs so close to publication. Here are three benefits of a language edit.

Correct, clear and consistent language

The most obvious benefit is to ensure that the language in the document is correct, clear and consistent. This is why you do it in the first place. And it is especially important if the documentation is written by several authors. Even if you have a really good and detailed style guide, you’re bound to get different ways to interpret it – to put it benevolently… 🙂

Ensuring clear language requires a language editor who is well-versed in the subject – but not as steeped in it as the subject-matter experts (SMEs) who review content; they often have too much of an insider’s view and won’t spot potential misunderstandings. (Many SMEs are also uncomfortable with reviewing language, either because they lack the skills or because it distracts them from the review they should be doing.)

Ensuring consistent language requires an editor who reads across the complete document – unlike distributed authors and reviewers who work on a chapter each. Most chapters I encountered had their own distinct style, and the differences between chapters couldn’t have been greater if they had come from entirely different documents. This seems to be a stylistic issue at first, but it spills over into the other two reasons for a language edit, so follow me on to the next item, which is…

Consistent structure

Our standard operating procedure for writing Release Notes contains a standard structure which each entry should follow. Essentially, it looks like this:

  • Summary of enhanced or new feature
  • User benefit of feature
  • Prerequisites of feature
  • How to use feature
  • Results and next steps of feature

Depending on the scope of an enhancement, some of these structural elements are optional for a given entry. A new menu option may be self-explanatory enough that we don’t need to point out the user benefit, and there are no special prerequisites. The whole entry may be but two sentences long.

However, it is a matter of judgment how many details and how much customer guidance we want to provide. For a team of distributed authors, this is almost impossible to agree upon beforehand. A language editor can help to ensure consistent structure. This may require consulting with the author about missing structural elements, but if authors are quick to respond, it is well worth the effort.
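To make the judgment call concrete, here is a hypothetical minimal entry that follows the structure above but omits the optional benefit and prerequisites elements (the feature and menu path are invented for illustration):

```text
Export to CSV
You can now export report data to a CSV file.
To export a report, choose File > Export > CSV in the report view.
```

A language editor comparing entries against this skeleton can quickly spot which element is missing where – and whether the omission is a deliberate judgment call or an oversight.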

Raising documentation standards and value

While you are talking with authors about missing elements, this is a good chance to offer feedback about recurring issues. I’ve noticed that some authors have their “favorite” mistakes. One has missed a guideline in the style guide and keeps repeating the same small mistake. Another may have a larger misunderstanding, whether it’s about topic types, the use of personas, etc.

The language editor not only improves a document but, in the process, also performs a thorough audit of current documentation practices. So she or he can help to ensure adherence to documentation standards and raise the value of the documentation you publish. A language edit is often the last chance to ensure your final document feels and works as it should and lives up to your corporate and documentation standards. Omit it at your own peril.

For more about editing, see my earlier post “Editing for tech writers”.

Do you do language edits? What benefits do you get from them? Which shortcomings made you wish you had done one? Please leave a comment.