I'm Mike Pope. I live in the Seattle area. I've been a technical writer and editor for over 30 years. I'm interested in software, language, music, movies, books, motorcycles, travel, and ... well, lots of stuff.


  12:19 PM

In an effort to improve my sleep regimen, I was recently prescribed a CPAP machine. This device helps with obstructive sleep apnea, where your throat closes during sleep. The CPAP machine basically pushes air into your nose and/or mouth to keep things open.

The concept is relatively simple, but it involves technology. You wear a mask; the mask is connected via a hose to the device itself, which blows air and has sensors to adjust the pressure and temperature. There's a water reservoir so the machine can humidify the air it's blowing at you. There are air filters that need to be changed periodically. The mask, hose, and water reservoirs need to be washed regularly.

So how does the manufacturer (Philips) make and distribute a machine that's this complex but is intended for a wide variety of people? I count four ways, and I'm wondering about a fifth.

First, before you can take home your CPAP machine, you get a 20-minute training session slash demo. The trainer walks you through how to assemble and use the machine, and they give you the schedule and some tips for cleaning.

I read once that people retain 10% of what they hear during a presentation. The exact number (10%) isn't that important; the idea is just that people don't retain everything you tell them.[1] And I think about all the people who use a CPAP machine. Is a 20-minute demo going to be enough to train this wide range of people? My guess is no.

So second, the machine comes with a manual—two, in fact, a quickstart and a detailed manual. These reiterate a lot of what you learn in the presentation, so you have at least the possibility of hearing it all twice. The quickstart has a lot of pictures. (The manual, a few line drawings.) For me, it was an interesting case of reading documentation for something that I felt I kind of already knew but that I needed some refreshing on.

A third way that Philips tries to help their users is through design. For example, when things connect, they go together in only one way. There's only one way to plug in the machine. The hose goes into the machine only one way. The filters and water reservoir go into the machine only one way; you can't close it if they're not right. The nosepiece in the mask has clear markings right on it for how to put it together. These are all versions of self-documenting features: you literally do not need to read the manual to understand how to put together the components.

As another example, there are very few controls on the machine. For basic use, you need to press only a single button to turn the machine on and the same button again to turn it off. There's a second button if you want to adjust the "ramp"—that is, to set the air pressure to increase gradually when you start up the machine.[2]

And there is a fourth way in which Philips can help people. The machine phones home to report a bunch of statistics about the user's sleep. (That part of the machine required no setup at all, which was great.) This feature has several purposes (some of them a little uncomfortable to contemplate), but at least if the distributor is getting odd reports or no reports from the patient, they know something went wrong. Perhaps they contact you if that's the case.

I imagine that with these efforts on Philips's part, most people can manage to put on their mask and get the machine running. But how well does Philips help users with ongoing maintenance? It's easy to forget to fill the reservoir. Similarly, the training emphasized that you should wash the hose every week. Is everyone really going to do that? I mean, we're all supposed to floss every day, but how many people really do that?

So I wonder. Does the machine stop and display a message if the water reservoir is empty? Does it tell you when it's time to change an air filter? Does it (somehow) figure out that it's time to wash the hose? I don't know, and I'm reluctant to get into a situation where the machine has to tell me these things. But given the many ways in which the machine, once running, can go wrong, I hope that the manufacturer has taken steps to try to keep it going.

All in all, giving out a complex piece of technology to people and expecting them to all use it right is a hard problem. It's clear that the folks at Philips have thought a lot about this and come up with different ways to try to handle it. Still, I am curious how many people fail when trying to use the machine—they never figure out how to use it, they use it wrong, or they don't maintain it and the machine itself fails. As someone with a professional interest in communicating complex concepts, I find this to be an interesting challenge.

[1] Someone on my team at work has a variant on this idea: you need to hear something 7 times before you learn it. ^

[2] There is also a dial that you can use to make a bunch of other settings, and the dial is multi-modal: turn left for one mode, turn right for another mode, and push for a third mode. Arg. I hate multi-modal controls. But at least this one is optional after you've done the initial setup. ^


  12:16 PM

Over the weekend, I bought a recliner at Costco, which my wife laughingly suggested was my admission that I'm an Old Guy. But before I could, you know, recline, I needed to assemble the chair and figure out how to work it. In our modern era, reclining chairs are electronic, which means there are 5 different controls, which in turn means that there is an instruction manual.

The manual has instructions for how to perform the one required assembly step, although tbh I had figured out how to do that without the instructions. There are also pictures of how to plug in the two (!) electronic connections, though again, these were self-evident and had also been designed so they could be plugged in only one way.

The useful part of the instructions was the diagram that showed what the buttons on the control panel do. Ironically, this illustration is very small for, you know, Old Guys. This is it to scale as best I can render it (2 inches wide):

A curious part of the manual is that whoever created it decided to cast the instructions as a set of numbered procedures. Here's step 2, which is one of the more forced applications of a numbered procedure step that I've seen.

Where is step 1, you ask? I'm saving the best part for last. Step 1 concerns a feature of the recliner that I had not previously thought needed instructing:

("While seated in the recliner, enjoy the rock feature which allows you to gently rock backward and forward.")

There's a lot of fun stuff to unpack here:

  • It's step 1.
  • Don't do this while standing next to the recliner.
  • Imperative "enjoy."
  • "The rock feature."
  • Which "allows" you to rock.
  • Gently.
  • Backward and forward, as if there might be other ways to rock in a recliner.

Moments after I read this out loud to my wife, it seemed clear that "enjoy the rock feature" has a good chance of entering our familect, as it's called: we now have a stock answer to the question "What are you doing right now?" "I'm enjoying the rock feature."

But to end on a serious note: numbered procedures ("how-to" documentation) are useful if the reader needs to follow a sequence of steps to achieve a task. Rocking a chair is not a task that requires numbered steps. Understanding what the controls on a control panel do needs reference documentation, not how-to documentation. There are many challenges in technical writing, and one of them is choosing the appropriate style of documentation for what the reader needs.

More dubious guidance: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10



  08:15 PM

I spend a lot of time on social media—Facebook, Twitter, and Instagram. There's value there; I "know" many people only through these media, and I much appreciate what I've learned from them. For example, I know hardly any linguists IRL, but I follow many on Twitter, and it's great.

But even I can tell that I overdo it. It's a time suck, and it's an easy way to procrastinate when I need to be doing, you know, work. ("While this documentation is building, I'll just check Twitter quick-like.")

More insidiously, too much social media starts making me cranky. And when I get cranky, I do unfortunate things, like respond in pissy ways to innocent posts by other, nicer people. Or, gah, I give in to the temptation to respond to morons and their idiotic political opinions, a no-win situation if ever there was one.

So I decided recently to implement what I'm calling Asocial Sundays. Between midnight on Saturday night and midnight on Sunday night, I don't visit any social media sites, period.

This is a new experiment, but I can see some benefits already. Not having the option of social media redirects my attention to more productive things. If I'm sitting at my desk and finish some task—paying bills, say—I don't just mindlessly switch to FB or Twitter to see what's up. Instead, I might actually get up from my desk and wander into the rest of the apartment.

It also has been a way to unplug from a source of stress. We all know that it is distinctly not conducive to good sleep to doomscroll Twitter before bed. Politics and COVID are inescapable on social media, and both are not only inherently stressful, they're sources of endless arguments, outrage, scolding, shaming, uninformed opinions, and on and on. It's nice to take a break from that.

I'm far from ready to withdraw altogether from social media, the way some of my friends have. (A social media detox or social media fast, as it's sometimes called.) I've occasionally considered unplugging permanently from Facebook because their ad models are scary and because Zuckerberg is an incorrigible weasel. But as I say, I still get a lot from my social interactions.

If the experiment goes well, I might at least expand my lights-out policy for social media. I suspect that the more I do it, the easier it will be to use social media in a healthier way.



  10:42 AM

One of the effects of this year's protests has been a heightened consciousness about language and how it affects or reflects certain thinking. For example, there have been discussions in the editorial community about capitalizing the word Black "in a racial, ethnic or cultural sense."

In the world of IT, we've been discussing the implications of certain terms for a while. The Microsoft style guide has suggested for over a decade that writers avoid the terms whitelist and blacklist in order to avoid a connotation that white==good and black==bad. (I wrote about this a while back.)

Our own style guide has a section on inclusive language, and it suggests finding alternatives to a range of language that, when you look at it consciously, can have negative connotations or the possibility of offense. In addition to disrecommending the word blacklist, we tell our authors to use alternatives to terms like crippled system, dummy variables, sanity checks, native features, and first-class citizen.

A short digression about cloud technology. (For tl;dr, you can skip to the discussion of terminology.) We work in the world of cloud computing, which has its own concepts and vocabulary. The cloud, as the jokey definition goes, is just using someone else's computer. More specifically, it means using someone else's one or ten or hundreds of computers, because a fundamental benefit of the cloud is that you can adjust computing power as needed. For example, if you have a retail business, your website might need medium power normally, low power during a summer slump, and heavy-duty computing power to handle your yearly sale. The point is that your need for computing resources goes up and down, and rather than having to build out your own data center to accommodate the maximum possible demand (capacity that might otherwise mostly sit idle), you build your system in the cloud and spin up computers (servers) when demand is high and take them down when they're no longer needed.

In this environment, computers are commodities. Any single computer is just a faceless worker in your overall computing infrastructure. Compare that to the computer sitting on your desk; you've probably personalized how it runs, installed many updates, and otherwise fussed over it. If one of the faceless computers in the cloud crashes, it's not a big deal; another one spins up and carries on the work. On the other hand, if your personal computer crashes, it's usually a disaster.
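The spin-up/take-down cycle described above can be sketched in a few lines of Python. To be clear, the function and server names here are invented for illustration; real cloud providers expose similar operations (autoscaling groups and the like) through their own APIs:

```python
# A toy sketch of cloud elasticity: add or remove commodity servers as
# demand changes. Nothing here calls a real provider; it just models
# the idea that servers are interchangeable and fleet size tracks load.

def scale_fleet(servers, current_load, capacity_per_server):
    """Return the server list after adjusting fleet size to demand."""
    needed = -(-current_load // capacity_per_server)  # ceiling division
    while len(servers) < needed:          # demand is up: spin up another server
        servers.append(f"server-{len(servers)}")
    while len(servers) > max(needed, 1):  # demand is down: take one down
        servers.pop()                     # any server will do; they're commodities
    return servers

fleet = ["server-0"]
fleet = scale_fleet(fleet, current_load=950, capacity_per_server=100)  # yearly sale
print(len(fleet))   # 10 servers
fleet = scale_fleet(fleet, current_load=120, capacity_per_server=100)  # summer slump
print(len(fleet))   # 2 servers
```

The detail to notice is in the comments: when the fleet shrinks, the code pops an arbitrary server, because in this model no server is special.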

Ok, back to terminology. To conceptualize the difference between the faceless fleet of computers in the cloud and your personal computer, technologists devised the metaphor "cattle, not pets."[1] If a computer is a quasi-anonymous something that can be replaced any time, it's "cattle"; if it's an indispensable part of your work, it's a "pet."

Some authors love this metaphor, because, undeniably, it has explanatory power. More than once it's appeared in a document I'm editing, and if I question it, I'll be pointed to how widespread the expression is in IT/cloud texts. And we try to use the vocabulary of our audience.

However. One of my colleagues recently pointed out what might be obvious to a lot of people: the expression "cattle, not pets" is … problematic. One of our principles is "avoid unnecessarily violent language," and once you think about it, you realize that there's implicit violence in the metaphor as pertains to the fate of the "cattle." Moreover, there are cultures in which no cow is just "cattle," and for whom the idea of animals being killed is abhorrent. Therefore, we've updated our guidance to "avoid the use of figurative language that relates to the slaughter of animals."

This sets up a tension we sometimes have when we tell writers to avoid terminology that's still common in a field. We might ask authors to avoid whitelist or cattle-not-pets, and they'll point out that these are well-understood expressions and that using some alternative form potentially confuses readers. ("Allowlist? Did you mean whitelist? Why not just say so?") And people might search for these terms and they'll have no joy if they don't appear in our documentation. Plus it can give off the vibe that we don't know our own field.

But change begins at home. Sure, these problematic terms are part of the industry lingo. And we can't just ignore them out of existence. What we often do, then, is to include the expression parenthetically on first mention of the preferred term. So we might suggest something like "Create an allowlist (whitelist)" or "servers as commodities (sometimes referred to as 'cattle, not pets')" to tie the old term in the reader's mind to a new, better one.

We hope that the collective work of consciousness-raising by many editors and writers over a period of time will gradually alter the perception of problematic terms. The work is never truly done, of course, but it has to start somewhere.

PS I should note that not everyone supports the idea of this type of language change; there's plenty of pushback in the IT world against changing to "so-called inclusive language." These things are not easy.

[1] Devised to explain scaling by Bill Baker, adapted for the cloud by Randy Bias.



  07:02 PM

Imagine that there's a big family reunion and you're charged with making dinner arrangements. You book a room at that Italian place you always go to. It's not your favorite by a long shot, but it's the one that everyone likes. After dinner, though, you're talking to some of your cousins, and you find out that actually hardly anyone likes that place. But you've all been going there because people thought everyone else really liked it.

This is a situation described by a term I learned this week: pluralistic ignorance. This is the idea that you think many others in the group hold different beliefs than you do. As one article puts it, it's the difference between actual norms in the group (what people really think) and perceived norms (what we believe others in the group think). Or to put it more succinctly, "When group members conform to what they think others want, they may end up doing what nobody wants." (The Wikipedia article on pluralistic ignorance cites the story of the Emperor's New Clothes as an example.)

Pluralistic ignorance can have unimportant consequences, like where the family goes for dinner. But it's often negative: in a group, it can prevent people from speaking up—for example, to ask a question, because they wrongly imagine that everyone else already knows the answer. It can lead people to believe that they're "different" and lead to things like impostor syndrome.

I think (I don't remember now) that I ran across this term while reading about the recent protests. The protests unmasked some pluralistic ignorance; when the protests turned out so big, many people discovered a heretofore unsuspected number of their friends and neighbors who thought like they do. Even support for an unpopular political candidate is subject to pluralistic ignorance; arguably, it helps explain the difference between the predictions and the outcome of the 2016 US election.

Surely one result of the internet is that it can help overcome pluralistic ignorance—you might think your views represent a small minority, but you can learn that there are others who think like you do. Maybe not in your family, or classroom, or workplace, or neighborhood, but Out There, at least.

For origins, a word whose history I learned from Edward Banatt on Twitter: reluctant. There are almost no words in English that are related to it, which is why its makeup isn't obvious. Some spelunking in the OED learns me that there was a verb, now obsolete, to reluct, which meant "to fight against." This begins to reveal the story.

The re- prefix is "against." And the -luct part is from Latin luctari, "to struggle"; together, reluctari means "to struggle against, resist." Like, I'm reluctant about (i.e., struggle against) getting out of bed in the morning.

As I say, there aren't English cognates readily at hand. But there are some in other Latin-derived languages. For example, the luct stem shows up in Spanish as luchar, "to fight." Are you a fan of the Mexican sport-theater known as lucha libre? Well, those fighters are—heh, heh—very reluctant. Get it?

Like this? Read all the Friday words.



  08:25 AM

We've been without a microwave oven for about 9 months now.

This is not because we have some sort of philosophical objection to microwaves; the reason is much more practical. Before we remodeled the condo, there was a built-in microwave above the stove, which doubled as the no-vent/recirculating range hood. We knew that we didn't want this 1980s-era microwave, and we replaced it with a dedicated range hood:

Our initial idea was that we would get a countertop microwave. But once we'd moved in and sorted out the kitchen, there wasn't really an obvious place to put a microwave, because there's just limited counter space, alas.

So we've been doing without. This means we've had to explore substitutes for how we used to use the microwave. Like, how do you reheat leftovers without a microwave?

We use our oven or toaster oven. The room we might have given over to a microwave is taken up by a small toaster oven. That appliance is more useful than a microwave, I think, because it can heat and toast and broil. I make toast all the time, and we have no toaster-toaster, so this is a daily-use device for me.

One of the advantages of a microwave is that it's fast. But heating things in our little toaster oven doesn't take that much longer. It's a little oven, so it heats up pretty fast. It also has a convection setting (which I think just means it has a fan that blows the hot air around). All in all, what might have taken, say, 2 minutes in the microwave takes maybe 8 or 10 minutes in the little oven.

We heat things on the stovetop. Anything that's got liquid—soup, stew, whatever—we can throw in a pot and heat on the stovetop.

There are definitely dishes that would be easier to reheat in a microwave. Leftover pasta, for example. What I've been doing is putting these into a pan on the stovetop with a glug of water and a lid. It's not perfect, since it inevitably steams whatever you're reheating, but it's tolerable.

We (mostly I) heat things in a frying pan. Some dishes can be fried or refried. Heat a splash of oil in a frying pan and toss in your leftovers. If I fried up hash yesterday and have leftovers (unlikely), I can reheat it by giving it a short version of the same treatment again. And if we've got something like leftover rice, I can steam it again, or I can make fried rice.

I never much liked defrosting things in the microwave anyway. These days, if I have to defrost a hunk of hamburger, for example, I'll put it in the Dutch oven on very low heat on the stovetop.

And then there's popcorn. I love me my popcorn, but I never used prepackaged microwave popcorn anyway. I did have a series of microwave poppers, but they have an unfortunate tendency to break. So I went to the discount store and got a cheap 6-quart pan that is my dedicated popcorn popper. I heat it on the stove and do that classic thing where you shake the pan as the popcorn pops merrily.

I don't miss the microwave as much as I thought I would. It's been interesting to, in effect, return to our pre-microwave days. I didn't grow up with a microwave (back when they were known as "radar ranges," ha), and it wasn't ingrained in me to rely on it. We might still get one someday, but I think that the longer we make do without it, the less likely it is that we'll want to house one more bulky appliance in our limited kitchen space.



  08:54 PM

I picked up a fun word from Twitter recently: infit. The meaning is clear once you understand the context, which is … contemporary.

When you dress for an occasion, you put on what? An outfit. Let's take a look at that word for a moment. An outfit is what you wear when you are equipped for something; the -fit part pertains to being fitted. The out- part does not in this case have a sense of "external" (outland, outbreak) or "exceed" (outdo, outlive). Instead, it seems to mean something like "completion": an outfit is a complete set of clothes or equipment.

But folks have come up with this nominal opposite to outfit, namely infit. What's an infit? It's what you wear indoors, and specifically, what you wear while hanging around inside under quarantine:

The clever part to me is in reanalyzing the out- part of outfit to mean "outdoors" so that the in- part of infit can mean "indoors." I'm easily amused that way, I guess.

I should acknowledge that infit is also used with other meanings:

  • There's an InFit app where the -fit part refers to "fitness," so a lot of the #infit hashtags on Twitter show people doing active-y things. There's a related #InFitness hashtag (often #InFitness&InLife).
  • According to a dubious entry in the dubious Urban Dictionary, infit is an outfit that's "in," meaning "stylish." We'd need more than that contributor's word for it though.
  • On Twitter, infit is also a surprisingly common typo for unfit.

On to origins. Who among us has not been obliged to write an essay, yea, verily, perhaps even the famed five-paragraph essay? But where does the word essay come from?

Yet another etymological surprise: essay is related to the word assay, which means "to examine or analyze." I don't think I'd use assay in a generic sense of examining a thing; I think of it as something done to or with, dunno, gold ore or something. And there is definitely a metallurgical sense of assay.

The verbs essay and assay were originally variations of the same idea, both referring to "test." Or if I read the OED right, to essay was a variant on to assay, which was based on French essayer; essay is actually the older form.

The word essay for the written form was apparently first used by the French writer Montaigne, who wrote a bunch of them. With his essays, Montaigne was indeed testing ("trialling") ideas. As per the article in Wikipedia, his essays …

did not aim to educate or prove. Rather, his essays were exploratory journeys in which he works through logical steps to bring skepticism to what is being discussed.

Apparently Francis Bacon brought both the idea and the word into English in his 1597 book Essayes. There is no particularly formal definition other than that it's usually in prose. If for some reason you have a teacher who insists that you write an essay that follows a rigid format, you can quote Samuel Johnson at them, who described an essay as "an irregular undigested piece." That should quiet them down.

Like this? Read all the Friday words.



  11:06 AM

The new words are coming fast and furious these days. Tony Thorne and Nancy Friedman have been tracking Covid-related terminology. But that was last week's news.

This week it's about protests. I saw a couple of related terms that emerged this week: optical allyship and ally theater. My understanding is that these mean essentially the same thing, namely talking the talk but not walking the walk. Another term is performative allyship. The term optical allyship was apparently invented at the beginning of May by Latham Thomas, who was observing that doing something like posting allyship messages on social media can look like allyship but isn't by itself the whole story.

There are some interesting things to examine here. First, there's allyship. There's a neutral definition ("The state or condition of being an ally"), but in the context of optical allyship it's defined this way:

Allyship is an active, consistent, and arduous practice of unlearning and re-evaluating, in which a person holding systemic power seeks to end oppressions in solidarity with a group of people who are systemically disempowered.

The disempowered in this context can be any minority, including LGBTQ people and people of color. (There are initiatives where I work for people who want to learn and practice allyship, which is good, because tech has its issues with privilege.)

Then there's the optical part. Optics has referred for a while to the appearance of a thing, as in the phrase bad optics. Ben Zimmer wrote a column 10 years ago in which he found a member of Jimmy Carter's administration in 1978 saying "It would be a nice optical step." It's easy to unpack optical allyship as allyship that's only for appearances.

I'm not aware offhand of other optical-type compounds like this (and I can't devise a search that finds such compounds), but I can see it being productive in forming new "only the appearance of" terms.

I also mentioned the synonym ally theater. This reminded me of the term security theater, which was coined by the security expert Bruce Schneier. Security theater refers to measures that look like they're providing security but aren't particularly effective—except perhaps at making people feel more secure. (The example people usually point at is TSA checks in airports.) Thus also ally theater, which might make people feel good but is not very effective.[1]

If you're interested in non-optical allyship, a web search will give you plenty to read. And if you know of other optical-type compounds or more [concept] theater terms, let me know.

Ok, origins. This week it's percolate. When I was a wee lad, people made coffee using a percolator, a method that ends up boiling the coffee, which probably makes a lot of people today shudder. There's also of course a metaphoric sense of "spread gradually."

The origin is almost clear from the word, it turns out. The per- prefix means "through." And the -colate part is "to strain." It's pleasing to me that we've seen this root before, in the word colander! Who knew. Although in the case of percolate, the word doesn't have that "excrescent N" that somehow found its way into colander.

And speaking of origins, here's another quickie word origin, one that's been in the news this week: loot (via Nancy Friedman) and looter (via Ben Zimmer in the WSJ, paywall).

[1] I have a grumpy feeling that some part of what we're seeing with the anti-COVID measures is "hygiene theater."

Like this? Read all the Friday words.



  07:47 AM

Today's new-to-me word was interesting to me in part because of the context in which I found it. I was reading a TPM article about this week's executive order that pertains to social media, and I saw the sentence "Trump revealed the kayfabe of the whole exercise." I wasn't sure I'd seen kayfabe before; it certainly doesn't come up much in my political readings.

The term apparently originated in the world of all-star a.k.a. professional wrestling. This is a type of sport—and I suppose that label is questionable—in which the participants seem to be engaged in a contest, but which is more of a performance. An important part of the culture of professional wrestling is that it pretends to be real; to paraphrase a different type of sport, the first rule of the sport of professional wrestling is that everyone pretends it's a real sport.

This see-through illusion is referred to as kayfabe. As the Wikipedia article on it says, kayfabe is the suspension of disbelief that surrounds all aspects of wrestling, from the actual performances to the personas and supposed rivalries. As near as anyone can tell, the word kayfabe is a Pig Latin version of the word fake, used to (supposedly) hide the word fake from outsiders.

Back to the TPM article. Here's the full context:

Appearing with Attorney General Bill Barr, Trump revealed the kayfabe of the whole exercise: “If you’re gonna have a guy like this be your judge and jury, I think you shut [Twitter] down, as far as I’m concerned,” he said, referring to Twitter’s fact checks.

I think that the writer is suggesting that the executive order is theater, that everyone—including the participants—knows that this gesture is about playing to the fans. It can be a bit hard to tell, though, in contemporary politics. In professional wrestling, when the match is over, everyone goes home and nothing changes. We'll see about this EO.

Origins question for you: is it weird that the word irony includes iron? Surely the metal can have nothing to do with irony as "conveying the opposite of a literal meaning"?

No, whew. Irony comes originally from Greek eironeía, which means something like "dissimulation"—feigning ignorance. We got the iron-ical spelling from French via Latin, which had pre-borrowed the word from Greek for us.

The sense of irony as a way of being witty goes all the way back to the Romans. But the original Greek sense wasn't just about saying "I love it when it rains on the day I want to go hiking." For example, Socratic irony is a technique in the Socratic dialogues (I guess?) where someone pretends ignorance not to be funny, but to lure an interlocutor into revealing their own ignorance.

I guess I'll note that there actually can be iron in irony, as I learned when I looked up the word; you can use irony to mean "iron-like" or "containing iron." The OED has a cite for this sense from 2009 ("the irony taste of blood"). People do get a little confused at times about what irony means, but I doubt they'd get these two senses mixed up.

Like this? Read all the Friday words.



  01:32 PM

When there are attention-getting events like COVID/Covid/covid, it's natural that technical vocabulary leaks into popular media. For example, a friend of mine asked me why it's the novel coronavirus, and the best explanation I had was that the "novel" appellation was used by virologists and epidemiologists and had made its way (probably unnecessarily) into news stories.[1]

In this vein, while reading an article in the New Yorker recently, I ran across the word nosocomial. Granted, they were using quotation marks and explaining the term, so they weren't trying to sneak it past us or anything. But still, when have we previously seen this term? Outside specialty literature, I mean.

Nosocomial refers to an illness that's spread in a hospital; "hospital-acquired." I could not guess from looking at nosocomial what it could mean. It's ultimately Greek; the constituent parts are noso, meaning "illness," and kom, meaning "care."

As with many infections, putting a lot of people into proximity[2] has the unfortunate tendency to make it easy for the infection to spread. Thus the nosocomial coronavirus, which has had high incidence in places like nursing homes and, yes, hospitals.

Hospitals are a particularly insidious vector because the healthcare professionals treating patients in one hospital can easily carry an infection to another. This means that the concept of nosocomial spread is related to iatrogenic, meaning you got sick from a doctor. I am reminded of a book I read not long ago, The Butchering Art, about Joseph Lister's efforts to introduce antisepsis to medical procedures in Victorian times. In those days, one place you definitely did not want to be treated was in a hospital.

Update: I asked my wife, who's in healthcare, if she knew nosocomial. "Oh, yeah," she said.

For origins this week, the word soldier. It doesn't appear to have obvious cognates that suggest where we got it from. So off we go to the dictionary.

Not surprisingly, the -ier ending tells us that it's from French. The sol- part is the interesting bit: it's a historical word that used to refer to a type of French money or coin. It goes back to the name of a Roman coin, the solidus, whose name is indeed related to the word solid. You want people to soldier for you, you'd better pay them with some solid money.

The French sol does have a modern descendant, namely the French sou. I guess that the sou is not in use anymore, but it does retain a metaphorical sense of "a coin or thing of very little value," sort of like the British use of farthing (?).

Anyway, a soldier is essentially someone who's paid for military duty. Not to be confused with a mercenary, who gets paid to soldier for other people, which is to say, whose loyalty is to the sol, not to the person/country/entity that's paying it.

Like this? Read all the Friday words.

[1] Decades of technical writing have hammered into me the value of using the vocabulary of your audience.

[2] I would have written "close proximity," but that would earn me an editorial spanking: pleonasm.