A simple solution to the confusion between "testing" and "checking"

October 3, 2021

Recently I mentioned the concept that testing and checking are distinct activities. I find this distinction valuable. However, I don’t find this terminology to be valuable.


It’s confusing. And for no good reason.

I’ll elaborate.

  1. This use of these two terms (without confusion) is only possible among the initiated. If you’re at some sort of tester conference, you may be able to use these terms freely without confusion. If you’re speaking in almost any other context you’ll get confused stares (or worse) if you say something like “It’s impossible to automate testing!”

    For most practical purposes, you may as well say “It’s impossible to cross the street!” while silently intending your own secret definition of “street” that somehow refers to a thing that is uncrossable.

    The normal definitions of “test” and “check” don’t have this distinction, and expecting developers, project managers, or CEOs to adopt this new terminology is a lost cause.

  2. “Check” is a horrible alternative to “test.” As if “test” wasn’t already vague enough, “check” is even vaguer. At least when we talk about testing (in normal, everyday usage), we usually have a sense of “correctness.” “Check” doesn’t even do that! “Check on the baby” or “check the oven.” “Check” is such a hopelessly vague term that I discourage it in function names, too.

  3. Even the proponents of this “check” vs “test” nomenclature can’t use it consistently.

    I was recently involved in a conversation in which someone tried to convince me that “Writing tests is not testing.” What? If you want to dictate that these are checks, not tests, let’s also mandate that we say “writing checks.”

So what’s the alternative?

I’m sure I’m not the first to propose this. Even so, I’m sure my opinion is contentious to some. So be it. In my estimation, all evidence points to the conclusion that the “test” vs “check” terminology battle was lost before it ever began. Granted, languages do evolve over time. But there’s too much existing meaning in these terms to expect any meaningful change in anything less than decades, if even then. We need terminology that makes sense now.

My proposal: Let people use the word “test” to mean what it already means to the masses. That is, let it continue to encompass both manual and automated testing. Let it encompass unit testing, acceptance testing, API testing, end-to-end testing, exploratory testing, load testing, fuzz testing, Test-Driven Development, and all the other types of testing you can imagine. The term is already broad. Trying to add another, more specific meaning to this already overloaded word only makes the problem worse.

Instead, when it’s important to be clear about which type of test we’re doing, we can use accepted English words or adjectives to clarify. “Exploratory testing”, for example, is not an exact synonym for “test” as defined above, but it’s close enough to probably work as a stand-in in most conversations. And in place of “check”, why not use the widely understood terms “validate” or “verify”, for example?
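The same principle applies to code. As a hypothetical illustration (these function names are invented for the example), a vague name like `check_order` tells a reader nothing about which property is being enforced, while a name built on “validate” plus the property does:

```python
# Hypothetical example: a vague "check" name vs. a descriptive "validate" name.

def check_order(order):
    # Vague: check for what? Existence? Correctness? Freshness?
    return bool(order)

def validate_order_total(order):
    # Clear: verifies one specific property -- the total matches its line items.
    expected = sum(item["price"] * item["qty"] for item in order["items"])
    return order["total"] == expected

order = {"items": [{"price": 5, "qty": 2}, {"price": 3, "qty": 1}], "total": 13}
print(validate_order_total(order))  # True: 5*2 + 3*1 == 13
```

The second name tells the reader, at the call site, exactly what is being verified, with no secret definitions required.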

Rather than confusing, or worse, berating people who, through no fault of their own, are using the English language perfectly correctly, just not according to some esoteric standard, we should change our standards. If you can’t beat ‘em, join ‘em.

Keep the good distinctions. Lose the bad terminology.
