
Simplicity vs Oversimplification

The idea for this post came from a moment when I was once again told I was overcomplicating things and my approach went beyond what was actually necessary.

Even though I was confident that my approach was justified, I couldn’t help but revisit the familiar thoughts: “This is too complicated. Let’s keep it simple.” or “No point spending time on this — it’s never going to happen anyway.”

These thoughts require active effort to push back against. It got me thinking: what role does Simplicity actually play in our craft? And how is this concept used (or misused) in different contexts?

Agile-based software development methodologies have inherited a heavy emphasis on the concept of Simplicity, a principle deeply rooted in design and engineering traditions. Cultural ideas like the KISS principle and "less is more", as well as Lean Thinking, have been reinterpreted over the years and eventually became part of the foundation for Agile itself. One of the 12 Agile principles states it clearly:

Simplicity — the art of maximizing the amount of work not done — is essential.

And indeed, not writing code that no one uses, not releasing features nobody needs, and not generating artifacts that don’t influence decisions: these are all critically valuable mental models in today’s world. Sounds simple enough, but let’s take a quick side road for a moment.

Side track story

Back in my younger days, I was part of an amateur basketball team. We trained together, competed in local tournaments, and just had fun. At one point, someone in our team chat brought up an idea: “Why do we only say happy birthday? Why don’t we actually give presents?” I was quick to respond that while it sounded nice, it wasn’t necessarily a good idea, or at least not as straightforward as it seemed. But group enthusiasm took over, and the general sentiment was: “Don’t overcomplicate it! It’s obviously a great idea. Celebrating each other, sharing positive emotions, building team spirit — what could go wrong?” So we went for it.

And… less than a year later, we dropped the practice entirely.

Here’s what happened: the first few birthdays went well. Someone would take the lead, create a separate chat without the birthday person, suggest a gift, collect money, and buy it. Then we’d all present it together.

Over time, though, the initial enthusiasm faded. Fewer people wanted to organize the process. Even though the team had fewer than 15 members, suitable gift ideas quickly ran out. Being a basketball team, we went through the obvious options first: balls, uniforms, accessories. After that, it became repetitive.

Eventually, we had an honest conversation and admitted that the practice no longer made sense. It had turned into something we kept doing simply because it had been started; we were forcing ourselves to keep it going for the sake of tradition. We decided to stop. The entire tradition lasted less than a year, and a few teammates who participated in every birthday never received a gift themselves.

In the end, it turned out I wasn’t the one overcomplicating things. We were actually oversimplifying the effort, the responsibility, and the expectations needed to sustain that practice. And once we looked at it honestly, the illusion of simplicity quickly unraveled.

Back to our craft

Take a moment to reflect on these statements:

See bad code? Just refactor it.
Want developers to implement requirements properly? Just describe them clearly and add acceptance criteria.
Need high quality? Use TDD. Write failing tests, then write code to make them pass. That’s it: no more bugs.
Deployment too slow and painful? Just automate everything.
Productivity issues? Just remove every single impediment.
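To be clear, the TDD cycle the third slogan alludes to is a real and useful practice. Here is a minimal Python sketch of its red-green-refactor loop (the slugify function and its spec are my own illustration, not from any particular codebase):

```python
import unittest

# Red: write a failing test first that pins down the expected behavior.
# (slugify does not exist yet at this point in the cycle.)
class TestSlugify(unittest.TestCase):
    def test_replaces_spaces_and_lowercases(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

# Green: write just enough code to make the test pass.
def slugify(title: str) -> str:
    return title.strip().lower().replace(" ", "-")

# Refactor: clean up while the test stays green.
result = unittest.TextTestRunner().run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestSlugify)
)
```

Notice, though, that the test passes while punctuation, Unicode, and empty titles remain entirely unspecified. TDD gives you confidence only in the behaviors your tests actually assert, which is exactly why “no more bugs” is an oversimplification.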

For some, this might sound like a reasonable approach to problem-solving. But for those who’ve truly experienced these struggles, such statements often sound no different than populist slogans, like:

To solve alcoholism — just ban alcohol.
To solve hunger — just give food to the poor.
To stop wars — just forbid people from fighting.

Why does this kind of thinking persist? I believe there’s a deeper pattern behind it: we have a built-in cognitive bias to overestimate (and overcomplicate) our own craft while underestimating (and oversimplifying) the work of others. We see the same mechanism in discussions about AI replacing jobs.

With the rise of AI, we’re constantly hearing predictions about entire professions being replaced by artificial intelligence. And yet, whenever you read an interview with someone from a particular field, they never say their own profession is at risk. Never. Designers suggest that testers and developers might be replaced but not them, because their work is uniquely creative and handcrafted. Developers claim that product managers can be automated. Product managers, in turn, believe developers are the first to be optimized away.

Psychologists even have names for this: the illusion of explanatory depth, our tendency to think we understand complex systems better than we actually do, and the bias blind spot, our tendency to recognize biases in others while failing to see them in ourselves.

What Simplicity Really Means

The issues described above can often be summed up like this:

Sometimes people say:

“Let’s keep it simple,”

when what they really mean is:

“Let’s not think too much about it.”

True simplicity is the result of deep understanding, not a substitute for it. The real challenge lies in the ability to distinguish between simplicity and oversimplification, and between complexity and overcomplication.

Our world is complex. Human nature is complex. Look around: some things simply don’t have clear, definitive solutions, and those who claim otherwise either don’t fully understand the problem or want to sell us their solution.

To simplify doesn’t mean to remove or hide complexity; it means to understand it deeply enough to handle it wisely. A great example of this is found in the laws of physics: they’ve stood the test of time not because they’re simplistic, but because they capture the essence of complex phenomena in a clear, usable way. They may sound simple, but behind them lies an enormous amount of observation, thinking, and refinement.

Strategy for Preventing Oversimplification

As a tester, by craft and by mindset, I tend to look for patterns in how mistakes happen and how low-quality products emerge. One of the recurring patterns I see is the failure to recognize and mitigate our built-in cognitive biases, and oversimplification is one of the typical outcomes of that failure. I use the following principles as a strategy for preventing oversimplification:

Understand the Context and the Goal

As mentioned earlier, we tend to oversimplify things we don’t fully understand. Thus, before accepting someone’s “simple” explanation or plan, ask yourself: Does this person truly understand the topic? If not, the risk of oversimplification increases dramatically.

Sufficiency Check

Good decisions must consider risks and potential consequences. Sometimes it’s easy to estimate the worst-case scenario that a simplified approach might lead to — and decide based on that. For example, when I buy something relatively inexpensive, I don’t spend a lot of time comparing every option. In the worst case, I lose some money and I’m fine with that. I’m aware of the risk and I consciously choose to simplify the process.

But in other situations, oversimplification can lead to serious long-term problems. Imagine choosing a database technology for a product expected to scale. Picking the one you’re most familiar with just because “it works for now” might feel like a time-saver. But if it can’t handle future performance needs or lacks critical features, you’ll face painful migrations, outages, and technical debt later.

Having a Complexity Advocate

This might be the kind of person who insists on packing a first aid kit before a road trip. Or your friend who tells you to buy tickets in advance for that rock concert because they know what happens when you don't. In software development, the role of a complexity advocate is best filled by testers. It’s often their job to paint vivid pictures of terrible futures — the ones where we didn’t fix a problem, didn’t validate a scenario, or ignored a warning sign.

Retrospectives to Learn

Much of our modern world is shaped by lessons from past mistakes. For instance, the policy of removing the appendix before polar expeditions appeared only after numerous deaths from acute appendicitis. Likewise, any long-running software project has a trail of decisions behind it. If we look back and analyze whether more problems came from oversimplification or overcomplication, we can recalibrate how we approach future decisions.

Collect Arguments to Protect Your Craft

One last thing worth mentioning: survival in the post-truth era. As mentioned above, narratives already exist claiming that certain professions are less valuable or can be fully outsourced to AI. They spread not because someone is genuinely seeking truth, but because shaping a narrative is often the first step toward reshaping reality. The best weapon against this is the systematic, well-reasoned dismantling of these false assumptions.

Here’s an example. When I hear someone say that "to ensure software quality, all you have to do is fix all the bugs and prevent new ones from appearing," I usually recommend a simple exercise: Go read the release notes of major software products (provided they’re publicly available). In almost every release, you’ll find lines like: “Fixed an issue where X didn’t work as expected.” or “Resolved a minor inconsistency in Y.”

And this isn’t because the teams behind those products are using the wrong methodology. It’s because the nature of bugs is complex and multifaceted. There are ways to keep the process under control, but there is no such thing as a serious product completely free of defects.