I'm Mike Pope. I live in the Seattle area. I've been a technical writer and editor for over 30 years. I'm interested in software, language, music, movies, books, motorcycles, travel, and ... well, lots of stuff.





Blog Statistics

First entry - 6/27/2003
Most recent entry - 11/16/2018

Posts - 2532
Comments - 2584
Hits - 2,096,273


  07:58 AM

Suppose that your computer gets disconnected from the internet. If you use Chrome as your browser, you might see an error like the following:

There's a little T-Rex on the page, drawn in an 8-bit, monochrome graphics style. As I found out just today, this little critter is known as the downasaur. Haha, get it? It's a dinosaur that tells you your connection is down.[1] I happened to come across it when reading about a technology that always works, even if you're offline. ("Reliable - Load instantly and never show the downasaur, even in uncertain network conditions.") I was struck that the reference was used without quotes or italics, which suggests that the writer expects the term to be well known.

The downasaur is one species in a veritable zoo of fail pets, which includes the GitHub Octocat and the Google Broken Robot:

But there's more! The downasaur isn't just a cute graphic—it's a game. If you happen to encounter the downasaur, press the spacebar on the keyboard.[2] The downasaur starts running, and you press the spacebar again to have it jump over obstacles like cactuses:

None of this is new, just new to me. (The game was added in 2014; you can read an interview with the creators on the Chrome blog.) There's apparently no official name for the game. The downasaur reference that I ran across is from 2017; the term might have been coined internally at Google and since leaked out into the world via pages like the one I was reading. I hope to investigate the name a bit more. But in the meantime, I know what to call it when Chrome tells me my connection is down.

And a short new-to-me origins today, although it's a kind of meta one. For some reason I got curious about where we got the word cognate. A cognate is a word that's similar in two (or more) languages, because it comes from a common root. For example, father in English is a cognate of Vater in German; calculate in English is a cognate of calcular in Spanish.

Well, it turns out that the etymology of cognate is sort of right there in the word. It comes from Latin co ("with") and gnatus ("born"). The gnatus part is related to words in English like genesis, generate, natal, nascent, and a bit further afield, genre, gonad, native, and pregnant. A productive root indeed.

Like this? Read all the Friday words.

[1] We could have a discussion, I suppose, about the -a- in the middle of downasaur and why it isn't an -o-.

[2] If you don't want to disconnect your computer just to try this, enter chrome://dino in the Chrome address box.



  07:41 AM

You are undoubtedly familiar with the expression "to blow [or toot] one's own horn." Suppose that you wanted a noun that captures the meaning of that expression: talking about one's own accomplishments. There's boasting, of course, but that has a more negative connotation than we want, perhaps, as is true for bragging and crowing.

The lexicographer Peter Gilliver might have solved this issue for us. In 2016, he published a book about the history of the Oxford English Dictionary (a.k.a. the OED, much cited here in Friday words). As he describes in a forum thread, he spontaneously invented the word autotrombation as an email subject line. He likes the word well enough that he's continued using it:

What's charming about the word is how it reduces "blow one's own horn" to a single word. Auto captures "one's own," and trombation is a made-up noun built on the root that gave us trumpet and trombone (i.e., a root meaning "horn").

However, there is one problem, namely that trombate, the notional (and invented) verb behind trombation, is a vulgar word in Italian. (Definitely don't search for this word while you're at work.) That issue aside, though, it's a word that deserves more widespread use.

On to origins. This week I attended a funeral and unexpectedly found myself acting as a pallbearer. Later on I got to wondering about the word. Bearer, sure, that's "one who bears." But what does pall mean?

It's an interesting case of transference. A pall is cloth, or a piece of cloth. It comes from the Latin word pallium, which meant a cover or cloak. In the church, the Latin word took on a more specialized meaning; for example, in the Catholic church, archbishops wear (wore?) a pallium.

In English, the word pall meant both a fine robe and a cloth spread over the altar or used in some other ceremonial way. One of those ways was a cloth laid over a coffin. Thus a pall-bearer was originally someone who held the edge of a cloth in a funeral procession. Here's a great example from 1834 that shows this meaning: "In addition to the six persons who supported the bier..there walked, on either side of it, the three others who were selected for the office of pall-bearers."

From the cloth, we transferred the meaning to the coffin itself. These days, a pallbearer is someone who carries or escorts the coffin. There might or might not be any cloth involved, despite the origins of the term.

In case you're curious, the word pall in an expression like "a pall of gloom descended" or "it cast a pall" is the same word. Covering something with a cloth is a good metaphor for a dark mood coming over people.

As an aside, the funeral was not at all gloomy.

Like this? Read all the Friday words.



  07:51 AM

I don't even remember what I was reading, but not long ago I ran across the term ninja rocks. This sounds kind of cool, doesn't it? These "rocks" are in fact pieces of broken porcelain that are useful for the commission of certain crimes.

Here's the deal. Take a handy automotive spark plug. This has a lot of metal bits that are partially encased in an extremely hard shell of porcelain:

Grab your hammer and break up the porcelain. (Note to the thrifty: do this with a used spark plug.) Then take these broken pieces and throw them at a car window. Remarkably, doing this can easily shatter the window, as this video shows:[1]

I didn't research this extensively—breaking into cars is not high on my list of priorities at the moment—but this works because of the way auto glass is designed. Namely, in order not to leave large, jagged pieces of glass if it does get broken, auto glass is built to shatter. If you can hit it hard enough with a small enough point, it will do just that. Enter ninja rocks.

I am amused that many descriptions of ninja rocks, including the Wikipedia page, feel the need to point out that ninja rocks "have no traditional association with the ninja." I duly include that disclaimer here.

The other day a colleague took a bite out of a cookie and made a face that was not one of extreme satisfaction. Someone asked him about that, and he responded, "Tastes like marzipan." Considering how widespread marzipan is, I'm surprised at how many people don't love it. But the incident did afford me an opportunity to look into where we got the word.

The first thing I learned is that for several centuries, English used the word marchpane, which seems like it's probably a folk etymology (as we saw with penthouse)—a change in the sound of a borrowed word to make it more native-like. But we re-borrowed it in the 19th century and this time the exotic foreign spelling stuck. We might have gotten it from German, but that language and a bunch of others seem to have gotten marzipan from Italian.

The trail is murky before that, but it's generally thought to lead back to Arabic, with various theories. It might have come from Martaban, the name of a city that was renowned for a type of pottery; the name of the city was attached to the jars and then in turn attached to contents in those jars. Another theory is that it referred to a type of coin, which in turn might have come from a word in Arabic that means "to remain seated" (possibly referring to an image on the coin). The OED calls this last theory "tenuous."

Which leads us back to that cookie, whose flavor my colleague likewise seemed to consider pretty tenuous. I'd probably agree on this last point.

Like this? Read all the Friday words.

[1] I picked this video in part because of the evil laugh at the end.



  06:08 AM

Suppose you have two researchers interested in Cat Studies. Researcher A performs an experiment using 12 cats that shows a remarkable correlation between how much TV cats are exposed to and how much they meow. Researcher B performs a similar experiment using 60 cats, but he finds no correlation.

Which of these experiments will get written up in Cat Studies Journal? It’s not hard to guess that it’s Researcher A who will publish an article and then go on to do interviews for TV Guide and NPR.

Researcher B didn’t even bother to submit an article to the journal, because he knew it wouldn’t get published. Who wants to read about experiments whose conclusion is “Nothing to see here”? Instead, Researcher B takes his videotapes and R programs and shoves them in a drawer.

Which brings me to today’s new-to-me word: the file drawer problem, also known as publication bias. I first learned this term from an old Planet Money podcast about why so many experiments can’t be replicated. The term file drawer problem was first used in 1979 in a paper whose abstract begins this way:

For any given research area, one cannot tell how many studies have been conducted but never reported. The extreme view of the "file drawer problem" is that journals are filled with the 5% of the studies that show Type I errors, while the file drawers are filled with the 95% of the studies that show nonsignificant results.

The file drawer problem has important implications for how research is reported. As noted, communication in the field is biased toward research that has positive results. Moreover, these positive results might have come about through defective methodology, or maybe just as a statistical fluke. (The PsychFileDrawer site lets researchers share the results of attempting to replicate published studies—for example, there's a page that collects studies about the efficacy of "brain training" games.) Anyway, the next time you hear in the popular media about some study that shows an intriguing correlation between, I dunno, playing Monopoly and success in the stock market, remember that there might be other studies in file drawers that fail to replicate that study.
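If you like, you can watch the file drawer fill up in a little simulation. Here's a sketch in Python (the numbers, like 12 cats per group and a 5% significance threshold, are just for illustration) that runs 1000 studies of an effect that doesn't exist and counts how many would nonetheless look publishable:

```python
import random

def run_study(n, effect=0.0):
    """Simulate one study: compare two groups of n cats and return True if
    the difference in mean meowing looks 'significant' (a crude z-test,
    assuming each cat's score is unit-variance noise plus the true effect)."""
    a = [random.gauss(effect, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    mean_diff = sum(a) / n - sum(b) / n
    se = (2 / n) ** 0.5                  # standard error of the difference
    return abs(mean_diff) > 1.96 * se    # ~5% false-positive rate

random.seed(1)
published = sum(run_study(12) for _ in range(1000))
print(published, "of 1000 null studies came out 'significant'")
```

Roughly 5% of the null studies clear the significance bar. If those are the only ones that make it into Cat Studies Journal, while the other 95% go into the drawer, readers of the journal see nothing but "remarkable" correlations.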

Ok. What do you suppose bachelors have to do with vaccines? Cows. Well, maybe; it's not certain where the word bachelor, which entered English from French in medieval times, came from. A possible origin is that in Latin there was a word baccalarius that referred to someone who worked on a baccalaria, a dairy farm; hence, a baccalarius could have been a cowhand. Bacca is a variant of vacca, a Latin word for cow (vaca in Spanish, vache in French). We also got vaccine and vaccination from this root, because when Edward Jenner did his groundbreaking work with smallpox vaccinations, he deliberately infected people with cowpox, a related but much milder disease.

In English, bachelor initially referred to a knight who was too poor or too young to have his own banner. It also referred to a junior member of a guild. By Chaucer's time, a bachelor was someone who earned the lowest (pre-master) degree at a university, or even a man who hadn't yet married—both of these senses appear in The Canterbury Tales (1386). While we contemplate the bachelor-cow connection, we can think about how to capitalize and punctuate a bachelor's degree. Fortunately, Grammar Girl is on the case.

Like this? Read all the Friday words.



  08:33 PM

The new-to-me word this week is a pretty new one altogether: mispronouner. This is a word invented in a blog post by the linguist Dennis Baron, who was making a logical extension of the verb to mispronoun. To mispronoun is to use the wrong pronoun for someone; this might be by mistake or might be as a way to harass someone. (Compare to deadname.) Thus someone who performs this act of using the wrong pronoun is a mispronouner.

I suppose that this is pronounced mis-PRO-noun-er, but I like to think that we might also pronounce it mis-pro-NOUN-er. Being linguists, we’ll have to wait and see which way the language goes.

BTW, Professor Baron has written lots about our attempts in English to come up with an epicene pronoun, and I recommend reading his blog for more information on this and many other language topics.

For today’s etymology, I investigated the word slide, in the sense of a photographic slide. For those who might not have personal experience with slides, photographic slides are pieces of 35mm film that are packaged in a square cardboard or plastic frame:

In slides, the image is “positive”: it shows the actual colors captured by the camera, as opposed to the opposite, or “negative,” image that’s captured in regular film. Photographing with negative film is a two-stage process. When you take a photo, light falls onto the film, which is clear stock that’s coated with a light-sensitive substance. The more light falls on the film, the darker the image gets on the negative. After you’ve processed the film, you make a print by shining light through the negative onto paper stock that’s also been coated with light-sensitive stuff. The darker the image on the negative, the less light gets to the print, so what was light in the original scene is light in the final print.

Color negative, slide, black-and-white negative

Slides—i.e., positives—start the same, with clear stock coated with light-sensitive emulsion. They also get darker the more light falls on the film. But in a bit of photographic sorcery, the image is reversed directly on the film during processing. Ergo, when the slide has been processed, it has a positive image on it. Back when movies were made on actual film, this was the process that they used to create movie film.
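The two inversions are easier to see with numbers. Here's a toy sketch in Python using 8-bit grayscale values (0 is black, 255 is white); real film chemistry is analog, of course, but the arithmetic of the reversal is the same:

```python
scene = [230, 128, 20]   # bright, mid, and dark areas of the original scene

# Exposing negative film: the more light, the darker the film gets.
negative = [255 - v for v in scene]

# Making a print: shining light through the negative inverts it again.
print_image = [255 - v for v in negative]

print(negative)       # dark where the scene was bright
print(print_image)    # matches the scene: light stays light
```

Two inversions get you back to the original; slide processing arrives at the same place with a single reversal done directly on the film.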

This, finally, brings us back to the actual word slide. Slides are intended to be projected onto a screen, and they actually pre-date photography. The original slides were painted onto glass, and then often covered with another piece of glass. They were projected using the charmingly named magic lantern:

As you can see from the photo, the lantern had a slot in it into which you could … ready? … slide the piece of glass. (“His history..passes before us like a series of slides in a magic lantern.” [1858, via the OED].) Slides sometimes had a series of images on them, so you could go from image to image. (If you remember the View-Master, that’s basically the same idea, except that the View-Master slides were round, with something like 7 stereoscopic images on them.)

Aside: the slide used in microscopy captures the idea of an image—or, well, something—sandwiched between two pieces of glass.

Magic lanterns evolved into slide projectors, and the original glass slides evolved into the cardboard squares. (A brilliant scene in the TV series Mad Men (s1:e13) shows the ad exec Don Draper pitching the Carousel slide projector, which many of us olds know very well.)

Back to the present. The word slide also evolved into the all-too-familiar format of PowerPoint. Altho we left behind the last vestiges of photography in this new meaning of slide (unless you include a kitty picture in your slides, as I always try to do), we have hung on to the original sense of “sliding” from one image to the next. But the idea of being in a darkened room subjected to someone yacking about whatever’s projected on the screen—well, that seems to go back quite a ways.

Like this? Read all the Friday words.



  07:00 PM

I commute to work on crowded mass transit, and when I get there, I work in an open office. So I consider good headphones an essential part of my gear. My employer apparently agrees; they subsidize headphones for us. I’ve appreciated the pair I got: over-ear, noise-reducing, Bluetooth headphones. I use them for hours a day every workday.

But the daily use has taken a toll. A few months ago I noticed that the headphones seemed loose on my head. Close examination revealed that the plastic arch between the earpieces had cracked. Thus began an ever more involved effort to save these lovely headphones.

Bridging the crack

My first thought was to patch over the crack. I found a washer that was about the size of a quarter, used epoxy to glue the washer across the crack, and then taped it over to salvage some semblance of aesthetics. (Ha.) I was a little dubious about this, but it actually worked ok.

However, a few weeks later the headphones were loose again. I thought my patch had failed, but no—a second crack had appeared at a different point. It seemed clear that there were stress points in the headphones:

I tried a second patch like the first one, but a third crack developed.

Repair or replace?

After this discouraging development, I spent some hours online looking for a replacement for my headphones. I looked and looked, but two things ultimately stopped me from buying a new pair. One was that omg, headphones that have all the features I want (NR, Bluetooth, over-ear, decent audio) are expensive. And to add to this disheartening discovery, reviews suggested that many other brands of headphones were probably just as prone to breakage as the ones I already had. So I returned to the idea of trying to engineer a fix for the pair I had.

Brothers of bands

After the severally patched cracks had failed, I kept thinking that I needed, in effect, to make a new arch for the headphones. I needed some sort of spring-like band of material that I could attach to the headphones. (Some people might already have thought of an obvious solution, which I arrived at later; bear with me a few moments.) I kept thinking about some sort of plastic, but couldn’t come up with a material that was both flexible enough and had enough spring. What I eventually did was to cut apart the plastic jar from a well-known brand of popcorn and laminate four layers together. This seemed to provide the right amount of spring:

I then taped this ad-hoc spring to the headphones with lots of tape, even further reducing their visual appeal:

(I swear that I catch people on the train looking at my jury-rigged headphones and wondering “What the heck is that?”)

Is there a spring for the head?

This worked pretty well for a couple of months. But inevitably, my plastic spring started losing some of its sproing, so I was back to thinking about a better way to make this fix. It finally occurred to me that there is a device that is pretty much designed for this exact purpose: headbands for hair. I betook myself to the beauty section of the local drugstore and pondered my many choices. I ended up with a set of thin metal bands:

I disassembled and reassembled the headphones, this time adding one of the metal headbands to the arch. (I don’t want them to be too springy, because I wear the headphones for long periods and don’t want to squash my ears.)

And that’s where I am today. I’m hoping that this repair, or if necessary, another one like it, will hold until the electronics fail, or I step on them accidentally, or I have some other reason to buy a new pair. And next time I’ll have a head start on ways to fix the headphones when they start cracking.



  07:09 AM

This week's new-to-me word is Voldemorting, which I learned about from an article in Wired. As you can guess, it originated in the Harry Potter series, where characters avoided saying the name of He Who Must Not Be Named. (As it turned out, avoiding the name was wise, because simply uttering the name Voldemort had consequences.)

The new use of Voldemorting is also about avoiding names—but in this case, it's so as not to give the name more search "juice" on the internet. In its original use, the idea was to avoid naming trashy celebrities, thereby not helping build their fame (and search result ranking). As the Wired article explains, this has extended into the political realm, where people use euphemisms and work-arounds to avoid naming politicians they disapprove of, and (potentially) to avoid being tracked by people who track mentions of particular names.

What I find fascinating about this term is how it ties into our collective unconscious about the power of language. Probably since language was invented, people have felt that certain language was special: using certain words (abracadabra) or names (YHWH) had powerful, perhaps mystical effect. We still have societal customs around performative language ("I now pronounce you man and wife," "You're out!"), and even in these latter days, there are words that are so powerful that they're simply taboo: we may not utter them for fear of consequences.[1]

Voldemorting extends this sense that certain words have special powers, but with a new and technological twist. Using certain words (on the internet, anyway) has actual, trackable effects. And as with taboos of the past, we can fend off certain undesirable outcomes by avoiding those words. You don't want to accidentally summon any demons, right?

Origins. You might know that when I wear my editor hat, there are certain words that I'm constantly trying to remove from technical documents. Among them are consider ("Consider increasing the memory size") and desire ("Specify the desired operating system"). I found out just recently not only where these words come from, but that they have a common root.

From the lexicographer Serenity Carr, I learned recently that desire is de ("removed, away"), and the -sire part is really sider, a Latin word for a star or heavenly body. (Compare sidereal.) The connection between "long for" and heavenly bodies isn't entirely clear ("the sense-history is unknown"—OED), but being away from a star makes you long for things. I guess.

And we can now see the connection to consider, whose origins I learned just this week. Consider is of course con ("with") and sider ("star" again). To consider is to "examine closely"; as the OED says, the relation to stars "might thus be originally a term of astrology or augury." But they're not sure.

Notwithstanding these interesting origins, I'm still going to try to spike every instance of desire and consider that I encounter in the docs I edit. Fair warning.

[1] The list of taboo words of course changes over time. Words that we can freely speak today were taboo in previous times; words that our forefathers used routinely are now taboo.

Like this? Read all the Friday words.



  07:35 AM

I have some new terms today, but I also need to do a little housecleaning, so to speak. I have a couple of new-to-me terms in my queue that got a good airing in other forums recently. So why not just let those folks do the talking? Here you go:

himpathy. In the New York Times, Kate Manne discusses this word in terms of the back and forth during Brett Kavanaugh's confirmation battle to be seated on the US Supreme Court. She defines himpathy as "the inappropriate and disproportionate sympathy powerful men often enjoy in cases of sexual assault, intimate partner violence, homicide and other misogynistic behavior." I like wordplay, but I’m always leery that new words based on rhymes (for example) might not have staying power. But who knows! Seems like a useful term.

HODL. This is a misspelling of "hold" that originated in the world of cryptocurrencies like Bitcoin. (As in, hold, don't sell.) Nancy Friedman has a great write-up in which the original "HODL" misspelling is only the beginning.

Ok, I can cross those off my list!

I have to confess that the two new-to-me terms that I have today grabbed me because I liked the way they sounded when I learned them via a tweet. The terms are the jingle fallacy and the related jangle fallacy. The jingle fallacy is when multiple concepts are considered the same (or lumped together) because they have the same name. An early example was the term college students: people think of college students as constituting a more or less homogeneous population, but any such population will include part-timers, people who end up dropping out, a student who's there only for a year, plus of course full-time students. Another commonly cited example is the word anxiety, which is often used to also cover the separate condition of fear.

The jangle fallacy is sort of the inverse: when people think that concepts are different because they have different names. Here are some examples that also show why the jangle fallacy is problematic:

A particular attribute may be labeled a “skill” by an economist, a “personality trait” by a psychologist, a certain kind of “learning” by an educationalist, or a “character” dimensions by a moral philosopher. Each may have the same concept in mind, but miss each other’s work or meaning because of the confusion of terms.


Both terms seem to be largely confined to psychology and cognitive studies, at least for now. I was a little surprised to read that jingle fallacy goes back to 1902. In the original discussion, H.A. Aikins uses jingle in the sense of something catchy. His alternative term was the reflex fallacy, because the fallacy was the result of reflexive thinking.

On to origins! If you want to win, you need a strategy. You know, a plan. And who makes those sorts of plans? Leaders! High-up type leaders, like maybe generals. Indeed, as Friend Ben noted to me not long ago, the word strategy comes from a Greek word for a general. (Tho as the OED says, in English this is "Partly a borrowing from Latin. Partly a borrowing from Greek.") The constituent parts in Greek are stratos ("army") + agein ("to lead"). As a bonus, I learned that stratocracy means "government by the army." I can't say that that would be, haha, a good strategy.

Like this? Read all the Friday words.



  08:09 AM

Let's go back to language words this week. I have two new-to-me words that are actually new—so new that a toothpick stuck in the middle doesn’t yet come out clean.

The word kerning, as you might know, refers to adjusting the horizontal space between letters. Here are examples of text that is tightly kerned and loosely kerned:

There are times when badly done kerning results in, um, suboptimal text. If you search the web for kerning fail, you'll get many amusing results, like this one:

Last week, the author and puzzlemaker David Astle proposed the word keming for "any word misread due to improper kerning." See what he did there? So clever.

Update: A number of people have noted that keming was coined in 2008 by the photographer David Friedman.

Moving on. A couple of weeks ago, an xkcd cartoon explored compound words that could be reversed, so to speak:

The gang of linguists that hangs out at the Language Log had some fun with this, trying to come up with as many of them as they could in a short time. There really are tons of them, as the commenters showed, especially when you allow non-closed compounds: shotgun/gunshot, cathouse/housecat, offhand/handoff, racehorse/horse race, etc.

The English professor David-Antoine Williams took a systematic approach and scoured the dictionary (the OED) for pairs like this; he came up with 2,568 pairs (!). He wrote a blog entry about them, and among the points he ponders is what we should call them. One candidate is chi compounds, based on the Greek letter chi (Χ). (Compare the chi in the name of the literary device chiasmus.) Another proposal is bidirectional compound terms. Descriptive but maybe a little dull? The name that Williams settles on is the simplest: boathouse words, after boathouse/houseboat. I vote for this term. It's easy to remember, and there's precedent for using a name like this. For example, when Brianne Hughes was exploring compounds that consisted of a verb+noun (spitfire, pickpocket), she named them cutthroat compounds after a good exemplar of the genre.[1]
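Williams worked from the OED's data, but you can play the same game with any word list. Here's a rough sketch in Python (the tiny word list and the minimum part length are made up for the demo) that finds compounds whose two halves are also a valid compound when swapped:

```python
# Given a word list, find "boathouse words": closed compounds that are
# still valid words when their two halves trade places.
words = {
    "boat", "house", "boathouse", "houseboat",
    "shot", "gun", "shotgun", "gunshot",
    "cat", "cathouse", "housecat", "pepper",
}

def boathouse_pairs(words, min_part=3):
    pairs = set()
    for w in words:
        # Try every split of w into two halves of at least min_part letters.
        for i in range(min_part, len(w) - min_part + 1):
            a, b = w[:i], w[i:]
            if a in words and b in words and b + a in words:
                pairs.add(tuple(sorted((w, b + a))))
    return sorted(pairs)

print(boathouse_pairs(words))
# [('boathouse', 'houseboat'), ('cathouse', 'housecat'), ('gunshot', 'shotgun')]
```

Point it at a real dictionary file instead of this toy set and you'll turn up the shotgun/gunshot and cathouse/housecat pairs the Language Log commenters found by hand, plus plenty more.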

So much discussion of new words that we're almost out of time for origins! A quick one today: where does the word zombie come from? The received story is that it's from a Bantu language in West Africa; there are comparable words nzambi ("god") and zumbi ("fetish"). Douglas Harper repeats this theory, but also suggests that it might come from the Spanish word sombra ("shade") used to mean "ghost." No one else says this, so it would be interesting to learn what Harper's source is. Fun zombie fact: in the film Night of the Living Dead, the characters never speak the word zombie; the script [PDF] refers once to zombies, but mostly refers to the undead as ghouls. Wikipedia: "How the creatures in contemporary zombie films came to be called 'zombies' is not fully clear."

[1] Not to mention shitgibbon compounds.

Like this? Read all the Friday words.



  10:00 PM

This is my maternal grandfather, known as Opa because he was German:

I don't know a lot about this portrait, other than that it was made in 1956. I guess it's done in conté, a type of artist's crayon. I suspect that it was a birthday gift from family or from colleagues.

Ever since I was quite young, people have told me that I look a lot like my Opa. For example, when I was 14, we visited one of my grandfather's friends, and the friend couldn't stop laughing at the resemblance. To my 14-year-old mind, looking like an old guy seemed literally impossible. I imagine that it's hard for people to see their resemblance to someone else; I have never really seen it. Still, my mother shared this belief, and a few years later, she took a photo of me next to the portrait so she could show distant relatives this supposed resemblance:

Ok. About a year ago, I watched a video by the artist Eric Chapman, a time-lapse of him doing a portrait:

While I watched the video, it occurred to me that this was something like my Opa's portrait. And this led to what might have been the most vain thing I've ever done: I contacted Eric and asked about having a portrait done that was complementary to my Opa's. Sure, no problem, he said, after he'd seen a photo of the original.

I got my daughter to take a series of photos, which I sent off to Eric. I had to make some decisions—size? show all the hair or not?—but those having been made, after a couple of weeks Eric was all done:

When I got the portrait, I had it framed, and now Opa and I occupy a wall together:

I had a funny moment when I finally saw the pieces side by side—I realized that I'm actually a year older in my portrait than he was in his. But no matter how old I get, I'll always think of him as the old guy.

My own kids seem to be ok with all this. In fact, my son mentioned that maybe he'd have a portrait done as well. People tell me that he resembles me, hmm.
