
Not long ago, I posted some notes about what not to expect from a technical review of your document. One premise is that tech reviews are inherently limited in what they can provide you. When I was pinging people for ideas on that post, more than one person said it would be equally (or more) useful to post something about how to nonetheless get the best technical review that you can. This is a topic of perennial interest; I believe that there's been a session on how to get good tech reviews at every tech-writing conference I've been to.

Herewith, then, some ideas. I'll start with general tips; next time, some strategies.

Tips

  • Spell out what you want from the reviewer. Don't just throw the doc at them and hope for the best. Ask about the specific things you want answered or want to draw their attention to. Colleague Brian left the following suggestions as a comment on the previous post:
    I have a little boilerplate message telling them I want to know two things: (1) Is what's there correct? Run the code, try the steps, etc. (2) Is anything not there that should be?
  • Prioritize. You will have more to review than they'll have time for; not everything can be reviewed. Pick out the material that's most critical. Include the pri-2 stuff if you want, but make sure your priorities are clear to the reviewers. (See also next point.)

  • Don't overwhelm. Choose a reasonable number of topics for review, not the entirety of a 1000-page book or doc set.

  • Be aware of the reviewers' schedules. Don't schedule a tech review at a time when your reviewers are in their own crunch mode (which might not coincide with yours).

  • Provide context. Show the reviewers how the topics under review fit into the larger documentation set and how they relate to each other. (Provide a table of contents, point to an online doc set, whatever.) Don't just give them a stack of docs in arbitrary order.

  • Make it easy to review. Use any means you can – technological or other – to make it easy for people to review your docs. Stated another way, don't make your process a burden to the reviewers. Don't make it hard to get the docs to review, don't require that reviewers use clunky tools, don't make it hard for reviewers to get their feedback to you. If they want to email comments, say yes. If they want to meet with you face-to-face, say yes, etc.[1]

  • Get the right reviewers, i.e., people who can provide you with concrete feedback. For example, don't send an enormous whitepaper for review to someone who only knows about a small segment of your product. Colleague Doug:
    The targeted approach has worked best for me overall as in: I will focus most of my energy on these two people who actually have a clue and actually respond to me while still including all of the others that will ignore me.
    By the same token, ...

  • Don't overlook possible reviewers. Great reviewers aren't limited to the direct stakeholders; people who are not necessarily on your radar – testers, other writers, even outside people who are under NDA – can be excellent reviewers.

  • Find several reviewers. Don't rely on a single reviewer. At the same time, beyond a certain core number (3? 6?), you'll only get incremental additional value at best. (And don't forget that you and other writers will probably overlap in whom you ask – don't overwhelm your backup reviewers.)

  • Understand what you can get from specific reviewers. As I noted in the last post, people will have different knowledge, interests, and strengths to bring to a review. Find folks to cover the range of what you need.

  • Be clear about your schedule. When do you want it (need it) back by?

    And of course, ...

  • Allow enough time! Paradoxically (or not), you also don't want to allow too much time.

  • If you're not getting reviews, send out highly targeted, very specific text in email. This is a trick suggested by Colleague Rick: "I like to pull out a pre-edited paragraph that's over the line to get their attention. You can't do this on every article, but you can do it 4-5 times a year on the most difficult documents. I use it only when they are swamped and the usual requests for TR are ignored."

    A particularly sneaky variation on this is to send out something that you know is wrong and say "This is ok, right?" From KC Lemson: "You'll be amazed at how quickly someone will take the time to correct you, particularly if the question was aimed at more than one person, since it's an opportunity for that person to prove their knowledge in front of others." Careful, tho. Rick: "You can't cry wolf too often."

  • Finally, be grateful and acknowledge the help. Make people glad they helped you and willing to help you again.

In general, put yourself in their spot: if you were asked to spend several hours reviewing someone else's work, how much time would you consider reasonable, how much work would you be willing to do, and what would make it easiest for you?

Next time, strategies for how to conduct a tech review.


[1] Of course, there's a difference between making it easy and indulging a reviewer's whims. (“Call me on the phone and read me the doc.” Nuh-uh.) I don't mean to say you shouldn't try to negotiate a mutually satisfactory approach.
