Paul Perrotta on change management at tekom/tcworld

Content management/strategy and the business of tech comm were my two focus areas during the tekom/tcworld conference in Wiesbaden, Germany, last week, and I will summarise some of the sessions I attended in several blog posts.

(For a general overview of what tekom is like, refer back to “How German is tekom and tcworld?”. UK tech comm consultant Ellis Pratt and I have been commissioned to sum up this year’s event for an upcoming issue of ISTC’s Communicator magazine.)

Paul Perrotta on change management

Paul Perrotta from Juniper Networks offered two sessions on change management in tech comm. He reported on his unit’s journey from siloed, bickering, opaque groups to a more efficient Information Experience (iX) organization.

Part of the problem is that we in tech comm are often pretty bad at saying what we do and what value we provide to the company and to customers. Instead, “docs happen” frequently in a black box. If you measure how well-regarded each unit is by its budget increases, a black box is not a good place to be, because it won’t get you better funding. Executives don’t know (and don’t need to know) how tech comm works. But they need to know whether it’s successful and how it helps them be successful. And whether 8 dollars spent on it will increase their bottom line by 10.

So make tech comm more business-like and make managers’ worries your own: How can we increase customer satisfaction? How can we contribute to increasing market share? Address these challenges to show the value tech comm contributes and how you can help the business deflect some of the threats, such as:

  • Doing more (work) with less (resources).
  • Deferring costs to a less certain future.
  • Offshoring tech comm.

Here’s what you can do specifically:

  • Define a vision and mission for tech comm to clarify what they do – and what they don’t do. (See also “Why you need a tech comm mission statement”.)
  • Make improvements manageable by chunking them up into strategic initiatives.
  • Dissolve the documentation silos by architecting and governing all content as a whole.
  • Improve content to make it complete, searchable and findable.
  • Connect tech comm with marketing, sales and support to contribute to and benefit from the same content.
  • Rebrand tech comm as information experience to emphasize its contribution to the customers’ experience.
  • Focus on users and engage with them, for example, via user satisfaction surveys, feedback, social media.
  • Install an iX customer advisory board which meets regularly.
  • Seek out managers with the power and money to help you and map out your allies throughout the organization.
  • Make tech comm measurable and operationally efficient:
    • Link tech comm to development metrics where possible.
    • With proven competence, you can aim for 5% of R&D spend, which is an industry best practice in IT.
    • Ask how much of the product’s price tag the documentation is worth.
    • Show what (else) you could do with X more money.

Some of the results that Paul found:

  • Many customers are happy to offer feedback if they find they get heard, and tech comm improves as a result.
  • An ongoing discussion with users builds trust and customer loyalty.
  • Commonly governed content becomes more reliable and more easily findable for employees and customers alike.
  • Managers will support you because your success is their success – if you demonstrate competence and make it easy for them to help you.
  • If you map your projects to executives’ objectives, you can clarify what you can and cannot do with available resources.
  • Achievements require focus to reap their full benefits – and then advertising to make sure executives realize that you can work like a business…
  • To measure your achievements, tech comm quality metrics are not enough; you need customer engagement/experience metrics as well.
  • As a side effect, you will have to abandon an implicit ethos that treats tech comm as special, as an art that creates books.

2nd day of sessions at TCUK 13

The business and managing of tech comm was the predominant topic of my TCUK13 experience, as I reflect some more on the sessions I attended and the conversations I joined.

A. Westfold on collaborative authoring in DITA

Andrew presented a case study of McAfee over several years, from separate product teams and “artisanal” lone writers to a larger, unified team of writers collaborating in DITA. During this time, McAfee also grew by acquisitions, which meant that additional writers, methods and tools came on board. Here are the most essential stages of their journey:

  1. Improve several individual procedures for quick wins: Single sourcing reduced translation efforts. Automating the translation round-trip cut out costly manual layout efforts.
  2. Move to topic-based authoring: They chunked up content into topics and moved them into DITA to validate the topic structure. (It turned out that many task topics could not be converted automatically and essentially had to be rewritten in a valid structure.)
  3. Bring in a content management system to reap the full benefit from single sourcing and topic-based authoring. This helped to reduce the number of redundant topics and to make localization even more efficient.

While their journey is far from finished, McAfee has realized the following benefits so far:

  • Easier administration of topics than of larger content chunks before. It’s also easier to solicit reviews for smaller stand-alone chunks.
  • Faster, more consistent creation of deliverables for several product variants thanks to better use of standard templates.
  • Documentation processes align well with recently introduced agile development processes.
  • More efficient, streamlined workflow thanks to better integration between documentation and localization.

I really enjoyed Andrew’s presentation. It showed that projects to improve tech comm do work out, even if you don’t always see past the next stage, and you may have to adapt due to other changes in the company.

A. Warman on “Managing accessible mobile content”

Adrian Warman from IBM hooked up two important tech comm issues, accessibility and documentation for mobile, into a survey session.

Accessibility makes it easier for everyone to fit in, participate and contribute, irrespective of disabilities. In short, it ensures that a user’s disability does not mean a personal disadvantage. For tech comm, this means it is sufficient that the documentation is accessible in one format. For example, if your online help in HTML is accessible, it’s not necessary to make the same contents in PDF accessible as well – or vice versa, as the case may be. Adrian advised us to keep an eye on “EU mandate M 376”, which may soon make some level of accessibility mandatory for products traded within the EU.

Mobile (smartphones and tablets) for tech comm means not just a technology, but an expectation, a mindset. It’s more than simply fitting our output onto smaller screens. Its different dimensions of interactivity, such as progressive disclosure and user-generated content, challenge us tech writers to re-think how to best convey an idea. Which is the best taxonomy that supports both mobile devices and accessibility?

I don’t think there was a lot of new, revolutionary content here, but since I haven’t dealt much with either topic so far, it was a welcome introduction that was concise and well presented.

E. Smyda-Homa on useless assistance

Edward reported on his Twitter project @uselessassist, where he “Retweets to remind organizations of the frustration and negative emotions that result from poorly prepared assistance.” He presented many examples of poor user assistance. Some people complained about insufficient instructions, whether they had too few images or only images. Some found the instructions too long (“I know how to prepare toast!”) or too short or redundant. Some pointed out typos or bad translations.

This was a very entertaining session – and you can easily get the gist of it by simply looking up the account or following the Twitter feed. It’s anecdotal evidence in real time that users actually do read the manual – or at least try to.

While every tweet is heartfelt, I think not every one merits a change in the documentation – if only because some contradict each other. But I find Edward’s project very enlightening, and I nodded to myself in embarrassed recognition a couple of times…

– Feel free to leave comments about any of the sessions, whether you have attended them or not.

How to disrupt techcomm in your organization?

If you need to “disrupt” your tech comm content, I believe it’s more beneficial to integrate content across the organization than just to get tech comm to become more business-oriented or more like marketing.

The idea comes out of a worthy new collaborative project Sarah O’Keefe launched last week, Content Strategy 101: Transform Technical Content into a Business Asset. (This blog post is based on a couple of comments I’ve left on the site.)

Tech comm goes to business school

A recurring discussion is that tech comm needs to be more business-like to be justifiable in the future, not only on this blog but also elsewhere. Proponents of this view definitely have a point, if only because tech comm is often seen as a cost center and finds it hard to claim a return on investment.

I think, however, that this view is detrimental to all involved parties:

  • Tech comm risks abandoning its benefits to users and its quality standards in an attempt to be “more like marketing”.
  • Managers may risk permanent damage to the documentation of their product without solving the bigger problem.

Breaking down all silos

The bigger problem often is that most content production is inefficient – because it occurs in parallel silos. Many companies have gotten good at making their core business more efficient. But they often neglect secondary production of content which remains inefficient and fragmented.

I’ve seen several companies where marketing, technical communications and training (to name just three areas) waste time and money. Due to inefficient, siloed processes, tools and objectives, they create similar, overlapping content:

  • Marketing and tech comm create and maintain separate content to explain the benefits of a product.
  • Tech comm and training write separate instruction procedures for manuals and training materials.

Once companies wake up to these redundancies, all content-producing units will face pressure to streamline content and make it easier to produce and reuse. This will revolutionize corporate content production and publishing.

Quo vadis, technical communicators?

I think this issue raises two questions for technical communicators.

The strategic question is:

Which kind of content disruption is more beneficial for the organisation and for customers: Folding tech comm into marketing or integrating all content with a corporate content strategy?

The answer depends on several issues.

The tactical question is:

What’s the role of technical communicators in this content disruption: Are they the movers or the movees? Are they shaping the strategy or following suit?

The answer again depends on several issues:

  • What is your personality, clout and position in the organization?
  • Which team has the most mature content and processes to be a candidate to lead any kind of strategic change in content?

I think tech comm can lead a content strategy, especially if and when the tech comm team knows more about content than marketing or training or other content producers.

Half-way DITA: Why some is better than none

If DITA seems like a good idea, but you cannot make the case for it, you can move towards structured writing and make your documentation “future-proof” by meeting the standard half-way.

At the company I work for, we tech writers created manuals in parallel with, but separate from, our online help. Over time, this gave us a documentation set that was inconsistent in places and hard to maintain to boot. Topic-based authoring which reuses topics in print and online can fix that, of course.

First, a documentation standard

Deciding on the method is one thing, but we also wanted a consistent structure that made the documentation easier and clearer to use – and easier to maintain for us writers. That required a model that specifies which kinds of topics we want to offer, how these topics are structured inside and how they relate to one another.

As we looked towards a documentation standard, we had two options:

  • We could create a content inventory of our documentation, analyse and segment it to tease some structure from that.
  • Or we could rely on others having solved a similar problem before and see if we couldn’t use the wheel someone else had already invented.

Turns out the second option was quite feasible: The DITA 1.2 specification gives us about all the structure we need – and more. We left out the parts we didn’t need (for example, some of the more intricate metadata for printed books) and adopted a kind of DITA 1.2 “light” as our information model.

Second, the tools

Note that I haven’t mentioned any systems or tools so far! Even though it happened in parallel with rolling out topic-based authoring as our method and DITA light as our information model, the tool selection was driven mainly by our requirements for documentation workflows, structure, deliverables, and budget.

The tool that suited us best turned out to be MadCap Flare – even though it doesn’t create or validate DITA!

Using our information model in Flare, we believe we get most of the benefits of DITA – and a considerable improvement over our less-than-structured legacy content. And speaking to people at WritersUA 2011, it seems that we’re not the only ones to move from less-than-structured writing to XML and something “close to DITA”.

Technically, we’ve defined the DITA elements we need as divs in the Flare stylesheets, but otherwise use the straight Flare authoring-to-publishing workflow. Flare is agnostic to whether a topic complies with DITA, is somehow structured but not complying or totally unstructured.

The benefits of DITA, half-way

To us tech writers, the largest benefit of DITA, half-way, is that we can actually do it. We could not have pulled off DITA, the full monty, which would have required a much longer project, a much bigger migration effort and, hence, an uncertain ROI.

For new topics, we are committed to writing them structured, so they follow the information model. To migrate legacy topics, we’ll have to ensure they have an identifiable topic type and a suitable heading, but we can clean up their insides in a “soft fade”, moving them towards structure one by one. This gives us a quicker win than cleaning up literally thousands of topics before having anything to show in the new method, model or system.

So we will have been working in Flare and with our home-grown information model for a long while before all topics actually comply with the model. But then we will have a documentation set which we can feasibly move into real structure, whether we opt for DITA or some other XML-based standard, with or without a CMS.

This post is an elaboration of a comment thread on the “Why DITA?” guest post on Keith Soltys’ Core Dump 2.0 blog.

Tech comm meets content strategy, with Ray Gallon

Technical communications and content strategy have a lot to say to each other. Bloggers have frequently related the two disciplines. Tech comm conferences run streams on content strategy; for example, tekom11 dedicated a whole day to the topic.

Content strategy for software development

Ray Gallon at tekom. Photo by @umpff, used with permission.

Leave it to Scriptorium and their excellent webinars to shed some light on the situation. Recently, they invited Ray Gallon to present his “Content Strategy for Software Development”. (To learn more about Ray, read his recent interview over at the Firehead blog. Or you can check out the very similar presentation which Ray held at tekom.)

Ray’s presentation was very enlightening to me, because he applied content strategy to software development. I create documentation for software applications, so I can relate to creating content for them. In the following, I’ll mainly focus on the first half, but I recommend watching the entire webinar.

Software development as information-rich environment

For the sake of his argument, Ray set the stage by looking at (complex) software applications not as products or tools, but as information-rich interactive environments. Software could thus be an expert system that supports users in making an appropriate decision, e.g., a medical diagnosis.

The first question to content strategists is then: What does the user need from the software? There are several answers; the user may need to

  • Know something (relating to concept topics)
  • Do something (relating to task topics)
  • Explore or understand something
  • Integrate or combine the software with their other tasks and processes

Plan to help your users

Ray then presented several document types which help content strategists to plan how they can best support users in their tasks and decisions. Among them was the Content Requirements Worksheet:

Ray Gallon's Content Requirements Worksheet explained. Click to enlarge. (Pardon the mediocre quality; prezi > WebEx recording > SlideShare is one compression too many...)

This document was a real eye-opener for me. It represents a holistic view of all user-facing content in a software application:

  • Informational, editorial content
  • Structural software content, such as user interface messages
  • User guidance, such as tool tips, help screens
  • User decision support to help users do the right things for the right reasons
  • Dynamic content, such as search results
  • Live interactive content as presented in forums and social networks

Content workers, unite!

The Content Requirements Worksheet can be very beneficial to future users of a software product. But it challenges content strategists to get a wide variety of requirements right. That is an opportunity for technical communicators, user experience designers and information architects to pool their skills and join forces.

As Ray said in his comment to my previous post:

Many tech comms do a bit of, or a lot of, content strategy in their work, and if an organization has a content strategy then everyone, tech comms included, needs to understand it and be on board.

So let’s transcend the silos of our systems, their manifold features and the artefacts we’re used to creating. Let’s start with good, thoughtful design from which our users benefit.

Your turn

Is this a good way to relate content strategy to technical communications? Or do you know better ways? Feel free to leave a comment.

Structured content does not kill creativity

Structured content is cooler than you may think. As a model for technical communications, it suffers from several misconceptions which prevent you and your organization from getting the most out of it.

I’ll debunk a couple of misconceptions that I’ve encountered. Each one presents a learning opportunity where you can show a writer, a subject-matter expert or a manager how structured content is actually quite beneficial.

Myth #2 is:

Structured content kills creativity, right?

The argument is that structured content forces you into a corset of rules and reduces you to filling in the blanks. It means complying with a structural model which can get quite intricate. For a procedural topic, rules could include:

  • Start the topic with a heading; start the heading with a verb.
  • Start the text with an introductory phrase, sentence or paragraph, depending on how much context the procedure requires.
  • Write all procedural steps in a numbered list.
  • Etc.

Channeling creativity

I think the argument, taken at face value, misunderstands creativity. Creativity, whether in the arts or in more craft-like professions, is always an expression regulated by rules and confined by boundaries.

Think about poems. Ann Rockley (I think) once gave the example of a sonnet, a fine form of poetry which has been around with few changes for centuries. It is highly regulated in terms of number of lines, rhyme scheme, etc. Or think of a haiku: 3 lines of 5 + 7 + 5 syllables, that’s pretty strict. But I’ve never heard a poet claim that the rules kill his or her creativity.

So structured writing sets up more obvious rules than you may be used to. With them, it channels creativity to ensure that your writing is more reliable, more accountable and more useful to your readers.

Anybody can write?

Filling in the blanks of a structured writing template seems more mundane and banal than writing one paragraph flowing into the next. This can lead to the idea that structured writing might somehow be easy…

Structured writing is simply different from technical prose, I think – and ultimately just as demanding. In both scenarios, you can ask yourself: Have I put my best sentences into the topic? And in both scenarios you will meet people who think that anybody can write. But the marks of high-quality writing are pretty similar in either case: Is the writing clear, consistent, and correct?

Your benefits

For you as a writer, structured writing doesn’t so much limit or kill creativity, but it helps you to channel it: You can focus on putting the most useful, most concise documentation on screen or page in consistent structure. It frees you from having to worry about structure, content and layout at the same time: You can focus on content alone, while the structure is given, and the layout is applied separately.

For readers, structured writing increases their trust and confidence in the documentation. Whether you spell it out explicitly or leave them to discover it by themselves, structured writing ensures a level of consistency that is hard to achieve by other means.


If you’ve found this post helpful, if you disagree or if you know additional benefits of structured content, please leave a comment.

Structured content is not just a style guide

Structured content is cooler than you may think. As a model for technical communications, it suffers from several misconceptions which prevent you and your organization from getting the most out of it.

I’ll debunk some misconceptions that I’ve encountered. They aren’t exactly wrong, but they miss the big picture. So each one presents a learning opportunity where you can show a writer, a subject-matter expert or a manager how structured content is actually quite beneficial.

Myth #1 is:

Structured content means writing topics with a style guide, right?

No, but it’s a good start! Structured content plays very well with topic-based authoring and a style guide, but it goes beyond them. Way beyond.

To take a favorite example, imagine you’re writing a cookbook with several co-authors. Without topics and a style guide, every writer’s output looks different, though they’re probably all effective and recognizable as recipes.

A topic structure gives you some coherence in all recipes. They each might start with a short description with regional and culinary context. Then you have a list of ingredients. Then you have the preparation instructions, probably as a list of steps. You may include preparation time and difficulty.

A style guide adds more coherence, this time in layout and maybe sentence structure. You define what headings and sub-headings you use. You decide on metric or imperial measurements. You might regulate that instructions should use imperatives.

Surface structure

On the surface, you now have structured content: All recipes share the same structure and layout and similar writing style. You can mix and match topics. If you have 200 recipes like this, you could easily combine them into a dozen different cookbooks, some with a regional theme, one with desserts only, and a vegetarian one.

A problem arises when you show ingredients in grams and liters and want to convert them into ounces and pints. You could do that manually for 200 recipes. Or you could ask a developer to write a text manipulator that searches for “grams” and “g” and “kilogram” and “kg”, finds the number preceding it, and converts it – and then you hope that you catch everything and it comes out right.
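The brittleness of that search-and-replace approach is easy to sketch in a few lines of Python. This is a hypothetical illustration: the conversion factor is real, but the pattern and sample text are invented, and real recipe prose would defeat it in many more ways.

```python
import re

GRAMS_TO_OZ = 0.03527396  # ounces per gram

def convert_grams(text):
    """Find '<number> g' or '<number> grams' in prose and convert to ounces."""
    def to_oz(match):
        grams = float(match.group(1))
        return f"{grams * GRAMS_TO_OZ:.1f} oz"
    return re.sub(r"(\d+(?:\.\d+)?)\s*(?:grams|gram|g)\b", to_oz, text)

print(convert_grams("Add 200 g sugar and 100 grams butter."))
# → Add 7.1 oz sugar and 3.5 oz butter.

# But spelled-out amounts slip through untouched - plain text
# manipulation can never "know" what is an amount and what isn't:
print(convert_grams("half a kilogram of flour"))
# → half a kilogram of flour
```

This is exactly the gamble the paragraph above describes: the pattern works until a writer phrases an amount differently.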

“Smart” structured content

Structured content helps with such conversion, because each topic “knows itself” inside out: Each topic contains markers that identify it as a recipe. Inside these markers are more markers that set off the intro, the list of ingredients and the step-by-step instructions. Each ingredient on the list identifies an amount, a unit and the actual ingredient.

The benefit of such structured content is that it can be parsed automatically, reliably and efficiently to make it more useful:

  • You can convert measurements.
  • You can compile cookbooks automatically, if each recipe “knows”
    • its country, for example, France
    • its region, for example, Mediterranean
    • its type, for example, dessert
    • its preparation time, for example, 20 minutes
  • You can offer the same recipes online and allow users to search for
    • Mediterranean cuisine (because that’s what they feel like)
    • which is easy to prepare and takes up to 30 minutes and
    • uses aubergines and tomatoes (because they need to go).
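The recipe scenario above can be sketched with Python’s standard library. The markup is invented for illustration (it is not real DITA or any published schema); the point is that amounts, units, and facets are data, not prose to be guessed at.

```python
import xml.etree.ElementTree as ET

# A hypothetical recipe topic; element and attribute names are invented.
RECIPE = """
<recipe region="Mediterranean" type="main" prep-minutes="25">
  <title>Ratatouille</title>
  <ingredients>
    <ingredient amount="400" unit="g">aubergines</ingredient>
    <ingredient amount="300" unit="g">tomatoes</ingredient>
  </ingredients>
</recipe>
"""

def uses(recipe, name):
    """Check whether a parsed recipe lists a given ingredient."""
    return any(i.text == name for i in recipe.iter("ingredient"))

recipe = ET.fromstring(RECIPE)

# Reliable unit conversion: the amount is an attribute, not free text.
for i in recipe.iter("ingredient"):
    if i.get("unit") == "g":
        oz = float(i.get("amount")) * 0.03527396
        print(f"{oz:.1f} oz {i.text}")

# Faceted search: easy to prepare, uses aubergines and tomatoes?
quick = int(recipe.get("prep-minutes")) <= 30
print(quick and uses(recipe, "aubergines") and uses(recipe, "tomatoes"))
```

Because every amount and facet is explicitly marked up, the conversion and the search are trivial loops rather than fragile pattern matching.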

Your benefits

Your organization can reap similar benefits, provided it has enough content and enough scenarios where you need to reuse it efficiently. For example:

  • Benefits and motivation for a new product or module may already be contained in documentation (or even the development specification), so marketing can just reuse it.
  • Setup procedures in a user manual can be reused in tutorials or training materials.
  • Topics written for online help can be reused in manuals.

If you’ve found this post helpful, if you disagree or if you know additional benefits of structured content, please leave a comment.

And check back next week for myth #2: “Structured content kills creativity, right?”

Top 3 reasons to attend Congility 2011

Relevant topics, great speakers and a price that’s hard to beat (for at least one person) make Congility 2011 a great conference to attend.

Conferences are a great way to keep in touch with fellow tech communicators, content strategists, UX experts, e-learning pros and user assistance specialists – as I’ve pointed out before.

Congility 2011

One conference worth going to this year is Congility 2011. Here are just a few of the talks which will be good and worthwhile. (I’m fairly certain because I’ve heard the speakers on related topics before, and am convinced they know what they’re doing.)

Other talks I’m curious to hear are:

Bad news and good news

The bad news about Congility is that I won’t be able to go – they won’t hold up the production cycle for the writer to attend a conference, and the deadline looms. 😦

The good news is that I can give away:

  • one free conference ticket (valued at 495 GBP) and
  • a 20% discount for everyone else

Thanks go to the newly joined conference sponsors who make this new last-minute offer possible. Here are the details:

Congility 2011, May 24-26, just outside London, England, is for content professionals looking to advance their organisation’s goals with better content strategy, management and process. It is the only European platform bringing together such a diversity of content experts and learning opportunities under one roof. Learn from ‘The Mother of Content Management’, Ann Rockley, renowned content strategist Rahel Bailie, and case studies from eBay, Nokia, AMD, IBM, AGFA and more.

As part of an arrangement with this blog, you could attend completely free, by taking advantage of this unique discount code. The first person to use the code below will be given access to the conference (but not workshops) at no cost to them besides travel and expenses. Everyone else who uses the code will be entitled to the 20% discount*:

KAWCA11BLG20SD

* If you can’t go even at 20% discount, you can cancel your registration without commitment or penalty.

How to convince managers of topic-based authoring, part 2

To get managers behind a migration to topic-based authoring (TBA), focus on benefits and savings. This is the last post in a two-part series. Find the beginning and background in part 1.

I present the speaker notes and explanations instead of the actual slides which only contain the phrases in bold below.

Benefits and challenges for writers

Make documentation efficient. For technical writers, the structure within topics and across all topics makes writing topics more efficient because you spend less time stressing over what goes where and over layout.

Make documentation transparent. The structure of the topics collection as a whole makes content more transparent: It’s easier to spot a missing topic, if each setup procedure (how to set up stuff) is accompanied by an operating procedure (how to use what you’ve just set up) and by a concept topic (what is that stuff you’ll set up and operate). Thanks to their structure and smaller units, documentation efforts also become easier to estimate – though maybe more tedious to report on in their details.

Collaborate more easily. The structure also makes it easier and faster for writers to collaborate on writing, reviewing and editing each other’s topics, again, because it’s quickly obvious what belongs (or is still missing) where.

Assume new tasks and responsibilities. Challenges for writers are learning a whole new range of tasks and responsibilities, from “chunking” subjects into topics and making sure there is one (main) topic for each subject to interfacing nicely with the topics of colleagues to peer-editing other people’s topics. On the other hand, most writers no longer have to double as layouters and publishers, since that role is usually in the hands of a few people.

Migrating legacy content. Another challenge is, of course, to migrate all existing contents into topics. However, this is a one-time effort, while the benefits of clearly structured topics keep paying off.

Benefits and challenges for companies

Of course, the benefits and challenges for writers affect the company as a whole. But there are additional effects to the company owning topic-based documentation.

Leverage corporate content. Cleanly structured (and tagged) content in topics is much easier to leverage as part of a corporate content strategy. (Did I mention this was a presentation for managers? Hence the verb “to leverage”…) After all, there are other teams who may well hold stakes in some documentation topics or parts of them:

  • Product management or even Marketing may want to reuse parts of concept topics, such as use cases.
  • Training could reuse procedural topics.
  • Quickly searchable documentation can improve customer services – or any type of performance support your company may offer.

Make recruitment more efficient. Clearly structured, topic-based documentation will make it easier on a company to find and hire professional, qualified technical writers – and help new writers get up to speed faster.

Savings from topic-based authoring

Your mileage will vary, depending on your current deliverables, processes and tools. However, from the case studies I’ve seen around the web and at conferences, our numbers are not unusual. Savings are in hours for writers who apply topic-based authoring compared to their earlier efforts without TBA.

  • Writing Release Notes as usual – saving 0%
  • Writing Online Help, largely reusing Release Notes topics – saving 45-60%
  • Writing new User Manuals, by reusing some topics from Release Notes or Online Help – savings unknown
  • Updating existing User Manuals, by reusing Release Notes topics – saving 60-75%
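As a back-of-the-envelope check, the quoted percentages translate into hours like this. A minimal sketch – the baseline hours are invented examples; only the savings ranges come from the list above:

```python
# Estimate writer hours saved by topic-based authoring.
# Baseline hours are hypothetical; savings ranges are from the list above.
deliverables = {
    # name: (baseline_hours, min_saving, max_saving)
    "Online Help (reusing Release Notes topics)": (200, 0.45, 0.60),
    "Updating User Manuals (reusing Release Notes topics)": (120, 0.60, 0.75),
}

for name, (hours, lo, hi) in deliverables.items():
    saved_lo, saved_hi = hours * lo, hours * hi
    print(f"{name}: {saved_lo:.0f}-{saved_hi:.0f} hours saved "
          f"of {hours} baseline")
```

With a 200-hour Online Help baseline, 45–60% savings means 90–120 writer hours freed up per release – the kind of concrete number managers respond to.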

Complementary information

To read more about measuring efforts and costs, see my previous posts about:

About topic-based authoring, I recommend these two books:

Your turn

Would these arguments convince your managers to support you in moving to topic-based authoring? What other arguments might it take? Should such an initiative to restructure documentation come from writers or managers? Please leave a comment.

How you can exploit the “Big Disconnect”

Via consumers, web 2.0 and social media exert a disruptive influence on corporate IT: existing “systems of record” face challenges from new “systems of engagement”.

The thesis comes from Geoffrey Moore in an AIIM white paper and presentation, which I came across by following links in Sarah O’Keefe’s post “The technical communicator’s survival guide for 2011“. So once again, I find a great idea, summarize it and apply it, instead of thinking up something wildly original myself. (Maybe not the worst skill for a tech writer, come to think of it… 🙂 )

Out-of-sync IT developments

Moore’s premise builds on out-of-sync advances of corporate vs. consumer IT:

  • Corporate IT developments recently focused on optimizing and consolidating otherwise mature, database-backed “systems of record” which execute all kinds of transactions for finance, enterprise resource planning, customer relationship management, supply chain, etc.
  • Consumer IT, on the other hand, saw the snowballing improvements in access, bandwidth and mobile devices which have quickly pervaded ever more spheres of everyday culture.

“The Big Disconnect”

This imbalance leads to the pivotal insight of Moore’s analysis: As I read it, the disruptive influence on corporate IT occurs not through technologies or processes, but through people.

People are quick to adopt or reject or abandon new consumer IT tools and habits that cater to their needs. The same people feel hampered by corporate systems and workflows that seem unsuitable and obsolete. Moore calls it “The Big Disconnect”:

How can it be that
I am so powerful as a consumer
and so lame as an employee?

How consumer IT affects corporate IT

For the next 10 years, Moore expects that interactive, collaborative “systems of engagement” will influence and complement, though not necessarily replace, traditional “systems of record”:

  • Old systems are data-centric, while new systems focus on users.
  • Old systems have data security figured out, new systems make privacy of user data a key concern.
  • Old systems ensure efficiency, new systems provide effectiveness in relationships.

For a full comparison of the two kinds of systems, see Moore’s presentation “A ‘Future History’ of Content Management“, esp. slides 10-12 and 16.

But does it hold water?

Moore’s analysis has convinced me. I used to think that corporate and consumer IT markets differ because requirements and purchase decisions are made differently. But that difference cannot explain away the “Big Disconnect”, which I’ve seen time and again in myself and in colleagues. Personally, I know that this frustration is real and tangible.

Also, the development of wikis and their corporate adoption is a good case study of the principle that Moore describes. If you know of other examples, please leave a comment.

What does it mean for tech comm?

The “Big Disconnect” affects those of us in technical communications in corporate IT in several ways.

Tech writers write for disconnected corporate consumers. So we do well to integrate some of the features of “systems of engagement” that Moore describes:

  • Add useful tips & tricks to complement reference facts.
  • Provide discussion forums to complement authoritative documentation.
  • Ensure quick and easy access to accurate and complete documentation.

But technical communicators can go one better by helping to ease the drawbacks of engaging systems:

  • Offer easy, comprehensive searches through disparate formats and sources.
  • Moderate forums and user-generated contents carefully to maintain high content standards and usability.

Tech writers are disconnected corporate consumers. So we can push for the improvement of the products and processes we describe or use.

  • On consumers’ behalf, we can advocate for improved usability and for documentation that is more efficient to use.
  • On our own behalf, we can push to improve workflows that serve a system rather than us writers and our processes.
  • We can urge replacing help authoring systems that support only fragments of our documentation workflows with more efficient tools.

Our managers are most likely disconnected, too. So when we argue for any of the above disruptions, we can probably appeal to their own experience when we have to justify them. We’ll still need good metrics and ROI calculations, though… 🙂

To read further…

The “Big Disconnect” and its effects connect nicely with a couple of related ideas:

Your turn

Does the Big Disconnect make sense to you – or is it just the mundane in clever packaging? Do you think it’s relevant for technical communications? How else can we tech writers exploit it? Please leave a comment.