Dealing with ambiguity is one of the key skills sought after in software engineers. The more senior you are, the more you are expected to know how to handle it. But even outside of work we have to make decisions in uncertain situations. In this post I would like to share some thoughts on this, go over existing frameworks, and synthesize a new one.

Before we dive in, full disclosure: I don’t like uncertainty and ambiguity that much myself. I would also like to briefly revisit a point from a previous post: a person would have to be a psychopath not to worry about uncertainty at all, but at the same time uncertainty is a core part of our lives, and we should learn to embrace it.

Baselining

Let’s first expand a bit, mainly because “dealing with ambiguity” has some ambiguity to it. The word can mean uncertainty, but it can also mean something that bears multiple meanings. Strictly speaking, the latter (multiple interpretations of the same thing) is the more precise sense of the word, but:

  • If you interpret it as purely ambiguity (multiple interpretations), you might miss the fact that your company also expects you to handle incomplete, uncertain futures.
  • If you interpret it as just uncertainty, you might miss the expectation that you actively clarify, simplify, and drive alignment when things are already vaguely defined.

So for the sake of this post it is both: uncertainty and multiple interpretations.

Expanding

What are some existing frameworks and approaches for dealing with ambiguity?

  1. Cone of Uncertainty
    1. What: Early estimates are highly uncertain but converge as more information is discovered.
    2. Personal take: I really like to point to this one whenever someone isn’t too sure about a project. I frame it not only in terms of estimates: overall uncertainty and ambiguity reduce as time passes. This also helps me put any project into perspective and accept early-stage fuzziness as normal rather than as a sign of trouble.
  2. Double Diamond
    1. What: The Double Diamond is a structured approach to tackle design or creative challenges in four phases:
      1. Discover/Research → insight into the problem (diverging)
      2. Define/Synthesis → the area to focus upon (converging)
      3. Develop/Ideation → potential solutions (diverging)
      4. Deliver/Implementation → solutions that work (converging)
    2. Personal take: Honestly, the Double Diamond isn’t something I knew about before writing this post, but I can see how it is similar to what I’m used to doing anyway when designing a system at work or approaching something creative, like writing a blog post.
  3. OODA: Observe–Orient–Decide–Act
    1. What: Borrowed from the military: a framework for fast decision making in fast-moving, changing situations. It is particularly useful in competitive environments.
    2. Personal take: While software engineering isn’t as dynamic and competitive, some elements of this framework are useful, especially quick re-orientation based on new observations.
  4. Cynefin framework
    1. What: Cynefin offers five decision-making contexts or “domains”:
      1. Obvious (Clear): Cause–effect obvious → apply best practices.
      2. Complicated: Cause–effect requires expertise → apply good practices.
      3. Complex: Cause–effect only clear in hindsight → probe, sense, respond.
      4. Chaotic: No cause–effect → act to stabilize, then respond.
      5. Confusion (Disorder): Domain unclear → break down into parts.
  5. First Principles Thinking
    1. What: When ambiguity comes from assumptions, noise, or convention, break down a problem to its fundamental truths, then build reasoning from the ground up.
    2. Personal take: this seems to be the best approach when you observe that people talk about the same thing in different ways, or use the same words but mean different things.
  6. Lean / Agile
    1. What: Build → Measure → Learn loop to reduce uncertainty with fast experiments.
    2. Personal take: In a way, Agile was specifically created to deal with ambiguity by prioritizing adaptability over heavy upfront planning (waterfall). I personally was a big proponent of agile methodologies. Sometimes there are too many rituals associated with them, but otherwise they’re great.
  7. Others…

Map → Reduce by Priority

Can we synthesize these approaches into something that works specifically for us, software engineers? Maybe yes, maybe not: sometimes one particular tool works best in one situation but not so well in another. Regardless, I think there are enough similarities, so here is my attempt, based on my experience:

  1. Step 1: Baseline: quickly document known knowns, known unknowns, and a rough range for unknown unknowns. Something like the Cynefin categorization can potentially be used here, but I normally just go with whatever format works best for a given situation.
  2. Step 2: Expand: Go on a quest to discover and collect even more of the unknown. This step is actually hard to do, because instead of formulating hypotheses or making assumptions it asks you to seek out more uncertainty. Practically, most of the time, this boils down to talking to more people and asking them for further points of contact to talk to. Obviously, the more people you talk to, the more perspectives you get. I think this overlaps with “Discover/Research” from the Double Diamond or “Observe” from OODA.
  3. Step 3: Map → Reduce by Priority: Analyze the information collected in the “Expand” step, identify key similarities, and group where possible, thus reducing the number of possible interpretations and meanings. One important thing is to see which of the collected unknowns might have an outsized effect on the project’s end result. For example, an increase in the latency of a service is uncertain, but it could simply be a complete blocker for the project, so this uncertainty should be prioritized: there is no point in reducing uncertainty elsewhere if this one shows the entire idea is a no-go. There might be multiple tracks of uncertainty reduction here, and they can be parallelized (similarly to map-reduce). This roughly maps to “Decide → Act” from OODA, the “Define”/“Deliver” phases from the Double Diamond, or the “Measure” phase from Lean/Agile.
  4. Step 4: Synthesize and Iterate: The most critical uncertainties and ambiguities were reduced in the previous step, so now it is a good time to converge on a revised state of things and suggest a course of action based on the knowledge gained. Practically this means answering all of the questions about the most critical unknowns and explaining how the remaining unknowns will be mitigated.
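Since the “Map → Reduce by Priority” step leans on a map-reduce analogy, here is a playful sketch of it in Python. This is purely illustrative, not a real tool: all names (`Uncertainty`, `investigate`, the example items and impact scores) are made up. The point is just the shape of the step: rank unknowns by potential impact, resolve potential blockers first, and run the remaining independent tracks in parallel.

```python
# Illustrative sketch of "Map -> Reduce by Priority": rank uncertainties,
# investigate blockers first, then resolve independent tracks in parallel.
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass

@dataclass
class Uncertainty:
    name: str
    impact: int      # how badly an unresolved answer could derail the project
    blocker: bool    # could this single-handedly make the idea a no-go?

def investigate(u: Uncertainty) -> str:
    # Stand-in for the real work: talking to people, prototyping, measuring.
    return f"{u.name}: resolved"

uncertainties = [
    Uncertainty("service latency increase", impact=9, blocker=True),
    Uncertainty("naming of the new API", impact=2, blocker=False),
    Uncertainty("data migration effort", impact=6, blocker=False),
]

# Reduce by priority: potential blockers and high-impact items come first.
ranked = sorted(uncertainties, key=lambda u: (u.blocker, u.impact), reverse=True)

# Resolve blockers first; if one kills the idea, stop before spending effort elsewhere.
blockers = [u for u in ranked if u.blocker]
rest = [u for u in ranked if not u.blocker]
for u in blockers:
    print(investigate(u))

# Independent, non-blocking tracks can run in parallel (the "map" phase).
with ThreadPoolExecutor() as pool:
    for result in pool.map(investigate, rest):
        print(result)
```

In practice the “investigation” is human work, of course; the sketch only encodes the ordering decision: blocker-level unknowns get resolved before any parallel effort is spent on the rest.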

Synthesize and Iterate

The Framework:

  1. Baseline: List what’s known and unknown (use Cynefin or a simple format).
  2. Expand: Actively seek more perspectives and data to surface hidden unknowns.
  3. Map and Prioritize: Group findings, reduce interpretations, and tackle the highest-impact uncertainties first.
  4. Synthesize and Iterate: Converge on a clearer picture, decide actions, and repeat as needed.

Conclusion

In fact, I used this same framework while writing this blog post: baselining what I knew, expanding with research, mapping and prioritizing the frameworks, and finally synthesizing a new approach. Ambiguity is unavoidable in code, projects, and life. I’d love to hear how you personally deal with ambiguity: do you use any structure or framework for it?