Scott Abel on Structured Content at TCUK12

Scott Abel delivered his keynote It’s All About Structure! Why Structured Content Is Increasingly Becoming A Necessity, Not An Option in his usual style: provocative, but relevant, fun and fast-paced (though he said he was going to take it slow). He even channeled George Carlin’s routine on Stuff: “These are ‘MY Documents’, those are YOUR documents. Though I can see you were trying to get to MY Documents…”

His style doesn’t translate well onto a web page, so I’ll restrict myself to his 9 reasons Why Structured Content Is Increasingly Becoming A Necessity:

  1. Structure formalizes content, so it can guide authors, who need to make fewer decisions when writing. It also guides readers, who can more easily find relevant information in the overall documentation structure or within a topic. And it guides computers, which can extract relevant information automatically and reliably.
  2. Structure enhances usability by creating patterns that are easy to recognize and easy to navigate with confidence.
  3. Structure enables automatic delivery and syndication of content, for example, via Twitter – and you’ll occasionally be surprised when and how other people syndicate your “stuff”.
  4. Structure supports single-sourcing which means you can efficiently publish content on several channels, whether it’s print or different online outputs, such as a web browser, an iPad or a smartphone.
  5. Structure can automate transactions, such as money transfers, whether they are embedded in other content or content items in their own right.
  6. Structure makes it easier to adapt content for localization and translation, because you can chunk content to re-use existing translations or to select parts that need to be not only translated but also localized to suit a local market.
  7. Structure allows you to select and present content dynamically. You can decide which content to offer on the fly and automatically, depending on user context, such as time and location.
  8. Structure allows you to move beyond persona-ized content. This is not a typo: Scott doesn’t really like personas. He thinks they are a poor approximation of someone who is not you, and one that is no longer necessary. With structured content (and enough information about your users) you can personalize your content to suit them better than personas ever could.
  9. Structure makes it much easier to filter and reuse content to suit particular variants, situations and users.

Top 4 steps from manuals to topics

A little bit of planning ensures you get clean, manageable topics from your conversion of user manuals.

While most help authoring tools support importing Word documents, there’s more to getting re-usable topics out of user manuals, as I’ve found out. I’ve spent the last few weeks converting 3 related Word manuals totaling 360 pages into 400 topics in Madcap Flare – though I believe that the process below applies to other tools as well.

The aim was to merge the contents from separate Word-to-PDF manuals with the online help topics into a single-sourcing repository from which we can create both online help and manuals.

My two key lessons of the conversion are:

  • Plan first, execute second – several hundred topics are too many for trial & error and picking up the pieces later.
  • Do each task as early as possible – some Word idiosyncrasies are hard to clean up after the conversion.

And here’s how I did it in 4 steps:


1. Start with plans

The whole conversion exercise benefited greatly from a couple of designs that I followed:

  • An information model
  • A folder structure for my topics

The information model defines the 4 topic types we have and what each type contains internally. It’s basically “DITA, without the boring parts” about which I blogged previously.

The folder structure divides my one Flare project into several sub-folders, so I don’t have 400 topics in one heap. Instead, I now have 13 sub-folders which divide up my topics by topic type (concept, task or reference) and even by task type (initial setup or daily workflow). That makes it easier to manage the topic files.

2. Prepare for the import

Once I had the basic structure to organize topics and their insides, I prepared my Word manuals, so I wouldn’t end up in a GIGO situation: Garbage In, Garbage Out.

First, I edited the documents into topics, so each section could become either a concept, task or reference topic – or an auxiliary topic which ensures that the chunks still flow nicely when you read them in the future manual output. I also ensured that section headings indicate topic contents and type:

  • Concept topics use noun phrases as headings
  • Task topics start with an imperative
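A convention like this can even be checked mechanically. Here’s a rough heuristic sketch in Python – the verb list and the topic type names are my own examples, not part of any particular information model, so adapt them to your content:

```python
# A rough heuristic to flag headings that don't match their topic type.
# The verb list is a small sample -- extend it with your product's vocabulary.
IMPERATIVE_VERBS = {"add", "create", "configure", "install", "set", "delete",
                    "import", "export", "define", "assign", "remove"}

def heading_matches_type(heading: str, topic_type: str) -> bool:
    first_word = heading.split()[0].lower()
    if topic_type == "task":
        return first_word in IMPERATIVE_VERBS      # e.g. "Configure the importer"
    if topic_type == "concept":
        return first_word not in IMPERATIVE_VERBS  # noun phrase expected
    return True  # reference and auxiliary topics: no convention enforced

print(heading_matches_type("Configure the importer", "task"))     # True
print(heading_matches_type("Configure the importer", "concept"))  # False
```

It will misfire on headings like “Import options” (a noun phrase starting with a word that doubles as a verb), so treat it as a reviewer’s aid, not a gatekeeper.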

Then, I cleaned up the documents. You can convert unstructured Word documents – with styles, modified styles and manual formatting – into topics just fine, but the result is unmanageable content and endless grief. So do your future self a favor and dissolve all modified styles and manual formatting before you import.

3. Import

With the documents thus prepared, I found Flare’s built-in Word import very good, consistent and reliable, as long as you throw well-structured Word documents at it. Only tables didn’t import well (or I couldn’t figure out how), so I re-styled them in Flare.

If you’re a stickler for clean topics, you can go ahead in Flare and clean out unnecessary remnants:

  • Remove Word's reference tags in cross-references by replacing *.htm#_Ref1234567" with *.htm"
  • Remove Word's Toc tags in Flare's table of contents by replacing *.htm#_Toc1234567" with *.htm"
  • Remove Word's Toc anchors in topics by deleting <a name="_Toc*"></a>
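With several hundred topics, doing these replacements by hand gets tedious, so they can be scripted. Here’s a minimal sketch, assuming your topics are .htm files under a Content folder (adjust the paths and patterns to your project, and run it on a copy first):

```python
import re
from pathlib import Path

# Word's bookmark residue: #_Ref1234567 / #_Toc1234567 fragments in links,
# plus the empty <a name="_Toc..."></a> anchors left inside topic bodies.
BOOKMARK_FRAGMENT = re.compile(r'(\.htm)#_(?:Ref|Toc)\d+(?=")')
TOC_ANCHOR = re.compile(r'<a\s+name="_Toc\d+">\s*</a>')

def clean_topic(text: str) -> str:
    text = BOOKMARK_FRAGMENT.sub(r'\1', text)  # *.htm#_Ref1234567" -> *.htm"
    return TOC_ANCHOR.sub('', text)            # delete empty Toc anchors

content_dir = Path('Content')
if content_dir.is_dir():
    for topic in content_dir.rglob('*.htm'):
        html = topic.read_text(encoding='utf-8')
        cleaned = clean_topic(html)
        if cleaned != html:
            topic.write_text(cleaned, encoding='utf-8')
```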

4. Add value to topics

At this point, I had a pile of 400 clean topics, but no added value from the conversion yet. That came from additional tasks:

  • Dividing up topic files into the folder structure, which makes hundreds of topic files manageable.
  • Assigning a topic type to topic files (Flare lets you do that for several files at once, so this was very fast), which makes the content intelligent, because topics “know” what they are.
  • Assigning in-topic elements (as div tags) to topic paragraphs according to the information model, which allows you to identify and reuse even parts of topics, for example, instruction sections or example sections.
  • Moving cross-references into relationship tables where feasible, which ensures that links are easier to manage and to keep up to date.
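To give an idea of the third point, here’s a sketch of how such in-topic elements can be checked once they’re in place. The element names (“context”, “instruction”, “result”) are hypothetical stand-ins for whatever your information model defines:

```python
from html.parser import HTMLParser

# Hypothetical in-topic elements a task topic must contain;
# substitute the names your own information model defines.
REQUIRED_TASK_SECTIONS = {"context", "instruction", "result"}

class DivClassCollector(HTMLParser):
    """Collects the class attributes of all <div> tags in a topic."""
    def __init__(self):
        super().__init__()
        self.classes = set()

    def handle_starttag(self, tag, attrs):
        if tag == "div":
            self.classes.update((dict(attrs).get("class") or "").split())

def missing_sections(topic_html: str) -> set:
    parser = DivClassCollector()
    parser.feed(topic_html)
    return REQUIRED_TASK_SECTIONS - parser.classes

topic = ('<div class="context"><p>Why you import...</p></div>'
         '<div class="instruction"><ol><li>Click Import.</li></ol></div>')
print(missing_sections(topic))  # → {'result'}
```

Run over all task topics, a check like this tells you at a glance which ones still lack a section the model requires.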

Your turn

Have you done a similar conversion? What were your experiences? Did you do it yourself or with an outside consultant? Feel free to leave a comment.

A. Ames & A. Riley on info experience models at STC12

Andrea Ames and Alyson Riley, both from IBM, presented a dense whirlwind tour on “Modelling Information Experiences” that combine four related models into a heavy-duty, corporate information architecture (IA).

While the proceedings don’t include a paper on this session, Andrea posted the slides, and the presenters published a related article (login required) “Helping Us Think: The Role of Abstract, Conceptual Models in Strategic Information Architecture” in the January 2012 issue of the STC’s intercom journal.

The session proceeded in six parts. First, Alyson explained IA models in general and how they work. Then Andrea described each of the four model types that make up an IA specifically.

IA models as science and art

Information architecture relates to science as its models draw on insights and theories of cognition. And its models relate to art as they aim to create a meaningful experience. Both aspects are important. Only if IA models manage to blend science and art can they touch the head and the heart.

The session focuses on IA models instead of theories (which are too vague and abstract) or implementations (which are too specific and limited). Thanks to the in-between position of IA models, we can use them to

  • Ask the right questions to arrive at a suitable, feasible IA
  • Tolerate the ambiguities of “real life”

Models present descriptive patterns, not prescriptive rules. They don’t say how stuff must be, but how it can be represented. They differ from real life, but real life is still recognizable in them.

That means you cannot simply implement a model on autopilot and get it right. Instead, you have to

  • Think how to implement the model
  • Vary the model for your users’ benefit
  • Listen to the right knowledgeable people when implementing

Models help you think

To arrive at your concrete IA, you take the model’s abstract patterns and apply your project-specific details to them, the who, what, where and when.

This task is less mechanical and less copy-and-paste than it sounds. It’s not so much a question of following rules and recipes, but of making abstract patterns come to life in a coherent, flexible whole. (If you’ve ever tried to create meaningful concept or task topics by following an information model, you know it’s more than just filling in a DITA template. You need to use your own judgment about what goes where to achieve and maximize user benefit.)

Since there are four related models, you need to think carefully about how each of these models should benefit your users. And the models help you think; they scale and adapt to:

  • How your business and its information requirements develop over time, how they grow and expand in desired directions
  • How your customers find, understand and apply the information they need

The four IA model types

The IA model that Andrea then explained consists of four related model types:

use model
content model + access model = information model

Each of these model types needs to be developed and validated separately.

The use model defines how users interact with information. It describes standard scenarios for optimal user experience within the product or system context. It outlines what information users need and why and how they use it. It includes use scenarios for the entire product life cycle and user personas that outline different types of users. Fortunately for us technical communicators, Andrea pointed out, describing all this is part of our core skill set.

The content model defines how information is structured effectively. This could be DITA (in the case of the company I work for, this is what we call our DITA-derived “information model”). It includes the granular information units and their internal structure, for example, task topics and their standard sequence of contained information. It also describes how these granular units are combined into deliverables.

The access model defines how users access the information efficiently. It makes the provided information usable, thanks to a navigation tree, a search function, a filtering function to increase the relevance of search results, an index, etc. Artefacts of this model type are wireframes, storyboards, a navigation tree and the like.

The information model defines how users get from one stage to the next. It uses the other three model types as input and fills in the gaps. It combines the content and access models which probably work fine during the installation and configuration processes, but may not yet carry a user persona from one stage to the next. As such, the information model is sort of the auxiliary topic that you sometimes need to insert between concept, task and reference topics to make a complete book out of them. At the same time, this model type is more detailed than the use model and closer to the other two types.

My takeaways

I was extremely grateful for this session and rank it among the top 3 I’ve seen at the summit – or any tech comm conference I’ve been to! Yes, it was fairly abstract and ran too long – my only complaint is that it left only 2 minutes for discussion at the end.

As abstract as much of the session was, it actually solved a couple of problems I couldn’t quite put into words. After designing and teaching our company’s DITA-derived information model (which is a “content model” in this session’s terms), I thought our model was applicable and useful, but it had two problems:

  • Our model lacked context. – Now I know that’s because we haven’t mapped out our use model in the same detail and failed to connect the two.
  • Our model was baked into a template for less experienced writers to fill in – with varying results. – Now I know that’s because the models are not supposed to provide templates, but require writers to use their own judgment and keep in mind the big picture to deliver effective information.

Another lesson I learned is that many structured information paradigms seem to include a less rigid element that sweeps up much of the miscellaneous remnants. In DITA, there’s the general topic which is used for “under-defined” auxiliary topics to give structure, especially to print deliverables such as manuals. In this IA model, there’s the information model which fills the gaps and ensures a more seamless user experience than the other three models can ensure.

– For an alternative take, see Sarah Maddox’ post about this session.

Top 5 reasons I look forward to the STC12 Summit

I’ll be going to my first STC Summit in a couple of weeks and I’m already really excited about it. Here are my top 5 reasons and motivations:

1. Learn about new trends

The obvious reason to attend a conference: Many of the 80 sessions cover new industry trends – or at least topics that are new to me. We’re currently implementing a new HAT which brings a lot of opportunities and some challenges, so I’m looking forward to:

2. Find inspiration and solutions

The sometimes unexpected benefit: At previous conferences, I frequently got ideas about improving a broken process or solving an irritating problem, even if that was not the main focus of a session. Such insights might come from an aside comment or something I see on a slide that inspires me to connect the dots. That’s why I’m looking forward to:

3. Present my own session

A highlight for me will be Pattern Recognition for Technical Communicators!

My STC Summit speaker button

I’ll be presenting on Wednesday morning at 8:30. I know that’ll be difficult after Tuesday’s banquet and whatever after-hours may transpire. But it’s actually a very good time!

  • A good time for you, because you can ease into the last day with an entertaining session that gives you a different, thought-provoking perspective on what you do anyway.
  • A good time for me, because I can get a feel for the conference on Monday and Tuesday and then get my session out of the way first thing on Wednesday. So I hope to see you there!

The conference program

After teasing you about several interesting sessions, here’s the complete conference program:

  • On a website, sortable by track, time, speaker or session code
  • In PDF, sorted by day and time, with session codes and titles only
  • In Excel 97-2003, sorted by day and time, with titles and main presenter

The first two are the official resources from the summit website, the spreadsheet is from me. All three are current as of May 6, but only the first one will be up to date in case of changes (an updated PDF may have a different link…). To be on the safe side, check the official summit website. – Now back to the reasons…

4. Meet old friends, make new friends

The pleasant side effect also called “networking”: As much as I enjoy social media as a virtual lifeline to stay in touch with the techcomm community, nothing beats meeting in person over a beer once or twice a year. So I’m looking forward to meeting speakers and delegates, tweeps and blog readers!

5. See Chicago

The tourist bit: I know Chicago a little bit from when I went to UW Madison in the 1990s. But I haven’t been in a while, and I’m especially looking forward to visiting the Art Institute and the new Modern Wing – or at least new to me. 🙂

6. Shop around for help authoring tools

Your bonus reason. The company I work for is not in the market right now for a new tool, but maybe you are. With more than 50 product and service providers exhibiting, you’ll have an excellent chance to see a lot of products up close and compare them directly. It’s a little like meeting friends: Nothing beats a first hands-on experience, and it’s a lot less daunting when you don’t have to install a trial version and click your way around. Vendor exhibitions at conferences were essential for us when we were choosing our tool.

7. Deep dish pizza

The gourmet reason. Thanks to Larry Kunz for the reminder, see his comment below. I was quite fond of Pizzeria Uno in my Madison days…

– If I forgot a reason to go to a conference, please share it below. If you’re attending the STC Summit, I hope to meet you in Chicago!

Summing up Scriptorium’s tech comm experts webcast

In Scriptorium’s “Ask the experts” webcast on 17 April 2012, Sarah O’Keefe, Nicky Bleiel and Tony Self reflected on frequently asked questions and trends. Here’s a timed play-by-play synopsis, so you can access the bits in the recording that interest you.

I try to provide teasers, not spoilers, so scoot right over to Scriptorium’s blog and check out the meaty answers for yourself!

FAQs

The panel starts with the questions they hear most often, from the underlying architecture via the tools to the deliverables.

5:46 – What is the best help/XML/CMS tool to use?

Tony tackles this first question. While there is no clear, short answer, Tony sums up some criteria which can guide your choice to selecting a system that works for you.

9:20 & 14:20 – What should we be delivering?

Nicky hears this one frequently – and often the underlying sentiment is: “So many outputs, so little time.” Again, there’s no simple answer that suits everyone, but Nicky outlines how to make an appropriate decision. And once you throw single sourcing in the mix, you’ve likely got a scenario that works for you. (They go back to this question after the next one, that’s why there are two timestamps!)

11:54 – Should we implement DITA/XML?

Sarah has several answers for this question: Showing an ROI is tricky, but most compelling – and it is more likely in some scenarios than others. The strategic aspect of making your content future-proof helps, but it may not be sufficient for your business case.

Hot topics

Next up are some more or less controversial questions:

18:50 – Should we use a wiki for documentation?

The experts chime in with several perspectives that can help you make a decision. Tony thinks a wiki can be a valid format for many use cases, while Sarah cautions about the very different processes a wiki brings.

25:10 – Content strategy: Is it a fad or fab?

The experts’ answers focus on what content strategy actually means for technical communications, from “we’ve basically been doing it for years” to embracing it as part of our profession’s maturing process to new job roles and titles.

30:30 – The tech comm career: Is it awful or awesome?

This one is interesting: The experts see the glass as both half empty AND half full. The consensus is that the role of technical communicators is changing, and fast, so there are challenges and opportunities for those who adapt.

Audience participation time

For the last round, the webinar audience submits some questions for scrutiny:

37:50 – Can tech comm be complex when products get ever simpler?

The panel is not ready to dumb down documentation at all costs. Complexity may be warranted depending on the products and audience expectations.

42:15 – Is agile good or bad for tech comm?

Agile can be good for tech comm, when it’s implemented well. Tony also points out that agile may give technical communicators a stronger role in the development process.

45:55 – Can product specs be turned into docs with a single edit?

It’s hard to tell without knowing the details. But probably not.

How to disrupt techcomm in your organization?

If you need to “disrupt” your tech comm content, I believe it’s more beneficial to integrate content across the organization than just to get tech comm to become more business-oriented or more like marketing.

The idea comes out of a worthy new collaborative project Sarah O’Keefe launched last week, Content Strategy 101: Transform Technical Content into a Business Asset. (This blog post is based on a couple of comments I’ve left on the site.)

Tech comm goes to business school

A recurring discussion is that tech comm needs to be more business-like to be justifiable in the future, not only on this blog but also elsewhere. Proponents of this view definitely have a point, if only because tech comm is often seen as a cost center and finds it hard to claim a return on investment.

I think, however, that this view is detrimental to all involved parties:

  • Tech comm risks abandoning its benefits to users and its quality standards in an attempt to be “more like marketing”.
  • Managers may risk permanent damage to the documentation of their product without solving the bigger problem.

Breaking down all silos

The bigger problem often is that most content production is inefficient – because it occurs in parallel silos. Many companies have gotten good at making their core business more efficient. But they often neglect secondary production of content which remains inefficient and fragmented.

I’ve seen several companies where marketing, technical communications and training (to name just three areas) waste time and money: due to inefficient, siloed processes, tools and objectives, they create similar, overlapping content:

  • Marketing and tech comm create and maintain separate content to explain the benefits of a product.
  • Tech comm and training write separate instruction procedures for manuals and training materials.

Once companies wake up to these redundancies, all content-producing units will face pressure to streamline content and make it easier to produce and reuse. This will revolutionize corporate content production and publishing.

Quo vadis, technical communicators?

I think this issue raises two questions for technical communicators.

The strategic question is:

Which kind of content disruption is more beneficial for the organisation and for customers: Folding tech comm into marketing or integrating all content with a corporate content strategy?

The answer depends on several issues, among them:

The tactical question is:

What’s the role of technical communicators in this content disruption: Are they the movers or the movees? Are they shaping the strategy or following suit?

The answer again depends on several issues:

  • What is your personality, clout and position in the organization?
  • Which team has the most mature content and processes to be a candidate to lead any kind of strategic change in content?

I think tech comm can lead a content strategy, especially if and when the tech comm team knows more about content than marketing or training or other content producers.

Beef up tech comm skills with free webinars

If one of your new year’s resolutions has been to improve your tech comm skills, here’s your chance. Industry experts offer several webinars in upcoming weeks to start you off. Many of them are free, so you really have no excuse! 🙂

Scriptorium

Scriptorium’s free webinars cover industry trends and technologies, such as:

  • Content strategy in technical communication
  • Trends in technical communication, 2012
  • HTML5 and its impact on technical communication

I’ve attended many Scriptorium webinars and have learned a lot from them. They are substantial and presented well. If you’ve missed one, you can catch up on the canned recordings.

Oh, and Sarah O’Keefe, who does most of them, has just taken #2 on MindTouch’s list of the Most Influential in #Techcomm and #ContentStrategy.

Comtech Services

JoAnn Hackos’ free webinars (announced on the DITA Writer blog) are dedicated to moving towards DITA in a 3-part series of “Crossing the Chasm with DITA”.

Hyper/Word

Neil Perlin’s free webinars are usually more tool-oriented, so they’re hands-on training sessions on topics such as:

  • MadCap Flare Mediums
  • Using Help Authoring Tools as CMSs
  • GUI Mobile App Authoring Tools
  • Creating Mobile Apps with Viziapps
  • Mobile documentation in Flare and RoboHelp

STC

STC’s webinars bring together the widest roster of industry experts, but they’re not free. They offer up to 3 webinars per week. Here are just the next six through the end of January:

  • Mental Model Diagrams: Supportive Content for Specific Folks
  • The Art of the Demo
  • Getting Yourself Into Print
  • Introduction to the Mobile Ecology
  • Designing Quick Reference Guides
  • Successful Strategies for Continuous Improvement

If you’re an STC member, sign up by January 31 and get $20 off each webinar.

MadCap

MadCap’s free webinars are strong on tools and processes. Currently they only have one on offer about migration to Flare. But you can always check out the recordings for free. The tool-agnostic ones are quite valuable, even if you don’t use MadCap’s products.

Adobe

Adobe’s free webinars also mix tool-specific training with general topics. You do need an “Adobe Account” to register. Coming in January are:

  • Key Trends in Software User Assistance: An Expert’s Perspective – Part 1
  • Top 10 key trends shaping the Technical Communication industry of tomorrow: An industry research
  • Why upgrade from older versions of RoboHelp (X5, 6, 7 or 8) to RoboHelp 9? What is the value proposition for your business?
  • How to optimally leverage a Content Management System as a Technical Communicator
  • What is the future of indexing for technical documentation?

If you know of additional tech comm webinars, feel free to leave a comment.

Top 3 tech comm lessons in 2011

2011 was an eventful year for me as a tech writer. Here are the three most important lessons I learned this year.

Content strategy can change tech comm in 2 ways

… and only one of them is up to us tech writers:

  • Tech comm departments can engage in content strategy bottom-up: connect with stakeholders in training, customer services, marketing, and product management to try to break down silos, reuse content and make content a corporate asset. One way to do this was the topic of Ray Gallon’s webinar “Content Strategy for Software Development”.
  • Corporations can engage in content strategy top-down and essentially change the way the organization works. The objective is essentially the same as above; the main difference is who’s driving it. While tech writers cannot do it without management support, managers may decide to relegate tech comm to being one of many stakeholders – which I think would be a pity. tekom’s Content Strategy day offered several sessions which discussed corporate content strategies.

The “big disconnect” is closing

The “big disconnect” is the difference in IT innovation between consumer IT and corporate IT. Geoffrey Moore coined the phrase in an AIIM white paper and presentation: “How can it be that I am so powerful as a consumer and so lame as an employee?” Earlier, I wrote about exploiting the big disconnect as a tech writer.

The reason it’s closing in tech comm is that consumer web sites have appropriated help systems and all their benefits, so the use cases and business models finally catch up with user demands and technologies, as Scott Abel pointed out in his tekom keynote address.

Content migration is about people first

In summer, our team of writers embarked on the journey towards structured authoring. One of the surprises to me as we proceeded was that metaphorically speaking, for every hour I spend moving content, I’m spending another hour moving minds.

Your turn

What did you learn about tech writing in 2011? Feel free to leave a comment.

Art vs. online: 2 dimensions of curating

Curating is a cool word – or trendy jargon – for what happens both in web technologies and in art museums, but the two are fundamentally different activities.

In this post, I want to add an alternative view to Rachel Potts who recently wrote about “When software UX met museum curation“. Where Rachel emphasises similarities, I’d like to focus on the differences, especially as they relate to art museums.

Your artefacts

One serious limitation and difference in curating at art museums, compared to anything in software and online, is that you need to care for original, unique works. If you mount a special exhibition, you need to procure them to begin with. And sometimes you cannot get them, no matter how much you want them in the show to present an artist or an era in history or to make your case.

  • Some works don’t travel because they’re fragile or because the insurance is too costly or because they’re centerpieces in the collection that owns them.
  • Some owners won’t lend works to you, because you cannot satisfy security requirements or because you’re too small a museum or because they don’t like your director.
  • Some works are simply lost.

Of course, you can always do with fewer or lesser works or, in the case of historic artefacts, with copies, but that invariably hurts the critical response and your attendance.

"Stalking Christina" - other people regarding my favorite painting

Your objectives

Another difference is that for many art museums “enabling users to learn” is one objective among many. And several other objectives are, unfortunately, at odds with it:

  • Some pieces are too sensitive to light or touch or movement to allow more than very few people to experience them.
  • Some museums need to please or placate donors (who may influence what’s shown and what not) and trustees (who may influence what gets paid for and what not).
  • Some museums don’t have the means: They lack the manpower to accommodate visitors more than a few hours per week. Or they don’t have the expertise to allow them to learn well.

Your audience

A third difference is that art museums that put on ambitious, critically well-regarded exhibitions often find that attendance is surprisingly low. The reason is simple and disappointing: Many people don’t want to be enabled to learn in art museums. They don’t want to learn new things, much less have their beliefs challenged. Instead, many people visit an art museum because of the way it makes them feel:

  • Many go to be in the presence of beauty or to be awed. Hence the success of any show whose title mentions a best-selling artist or any of the words “Impressionist”, “Gold” or “Gods” – even if the title is far-fetched and the show mediocre at best. “Dinosaurs” gets kids, and anything that flies or shoots gets their dads.
  • Some go to feel cool. Hence the success of after-work parties in modern art museums.

The words

Roger Hart once told me, it’s futile to try to stop linguistic change. And the web is a great change agent of language:

  • How many kids today know that women warriors (or a river) gave their name to an online store?
  • The German language has known about “email” for centuries (though we only spell it thus after a recent change in orthography); in English, it’s known as “enamel”.

But if language is to represent the real world, I advocate respecting the differences within one word, such as curating. Conflating two similar activities into the same word cheapens our experience of the stuff that surrounds us.

Tech comm meets content strategy, with Ray Gallon

Technical communications and content strategy have a lot to say to each other. Bloggers have frequently related the two disciplines. Tech comm conferences run streams on content strategy; for example, tekom11 dedicated a whole day to the topic.

Content strategy for software development

Ray Gallon at tekom. Photo by @umpff, used with permission.

Leave it to Scriptorium and their excellent webinars to shed some light on the situation. Recently, they invited Ray Gallon to present his “Content Strategy for Software Development“. (To learn more about Ray, read his recent interview over at the Firehead blog. Or you can check out the very similar presentation which Ray held at tekom.)

Ray’s presentation was very enlightening to me, because he applied content strategy to software development. I create documentation for software applications, so I can relate to creating content for them. In the following, I’ll mainly focus on the first half, but I recommend watching the entire webinar.

Software development as information-rich environment

For the sake of his argument, Ray set the stage by looking at (complex) software applications not as products or tools, but as information-rich interactive environments. Software could thus be an expert system that supports users in making an appropriate decision, e.g., a medical diagnosis.

The first question to content strategists is then: What does the user need from the software? There are several answers; the user may need to

  • Know something (relating to concept topics)
  • Do something (relating to task topics)
  • Explore or understand something
  • Integrate or combine the software with his other tasks and processes

Plan to help your users

Ray then presented several document types which help content strategists to plan how they can best support users in their tasks and decisions. Among them was the Content Requirements Worksheet:

Ray Gallon's Content Requirements Worksheet

Ray Gallon's Content Requirements Worksheet explained. Click to enlarge. (Pardon the mediocre quality; Prezi > WebEx recording > SlideShare is one compression too many...)

This document was a real eye-opener for me. It represents a holistic view of all user-facing content in a software application:

  • Informational, editorial content
  • Structural software content, such as user interface messages
  • User guidance, such as tool tips, help screens
  • User decision support to help users do the right things for the right reasons
  • Dynamic content, such as search results
  • Live interactive content as presented in forums and social networks

Content workers, unite!

The Content Requirements Worksheet can be very beneficial to future users of a software product. But it challenges content strategists to get a wide variety of requirements right. That is an opportunity for technical communicators, user experience designers and information architects to pool their skills and join forces.

As Ray said in his comment to my previous post:

Many tech comms do a bit of, or a lot of, content strategy in their work, and if an organization has a content strategy then everyone, tech comms included, needs to understand it and be on board.

So let’s transcend the silos of our systems, their manifold features and the artefacts we’re used to creating. Let’s start with good, thoughtful design from which our users benefit.

Your turn

Is this a good way to relate content strategy to technical communications? Or do you know better ways? Feel free to leave a comment.