_[[Beliefs ≠ Facts|There is an ancient parable that tells of several blind men encountering an elephant for the first time. One touches the trunk and declares it's a snake. Another feels the leg and insists it's a tree. A third grasps the tail and argues it's a rope. Each man experiences a real part of the elephant, yet none sees the whole.]]_
In diagnostic product development, we are often the blind men from the story. R&D touches the technical performance and sees innovation. Marketing feels the competitive landscape and sees differentiation. Sales grasps customer feedback and sees market demand. But the elephant in the lab—the actual user experience—remains largely invisible until it's too late.
Here's the harsh truth that most of us are reluctant to acknowledge: **Your users care less about your breakthrough than you think they do.**
## The Interest Assumption Trap
I've spent more than a decade building tools for [[On Science|science]] and [[diagnostics]], including serving on a team that built an [[On Oncology|oncology]]-focused liquid biopsy platform. The system isolated cells from whole blood samples and was cleared as a medical device in the United States and in markets around the world. Having been involved from an early stage, I remember the passion we had for solving technical challenges and for working at the leading edge. Like most teams, we assumed our customers would share this passion for our particular innovation.
However, the reality is more nuanced, and discovering it is what inspired my shift from R&D toward a more strategic commercial focus on products and the way we support them. In reality, users aren't loyal to tools; they're loyal to workflows that protect what matters to them. The lab technician running the assay wants to take a lunch break and leave at a reasonable time. The pathologist interpreting results has seventeen other cases waiting. The researcher most likely has three grants to manage and a paper deadline looming.
If your product doesn't make their work easier, faster, or more reliable, it becomes a net negative regardless of its scientific elegance. Across the industry, I've watched brilliant technologies collect dust because they solved problems customers didn't actually have. Simple often beats brilliant when brilliant creates friction.
## Instructions Are Rarely Enough: A Nuanced Problem
Early in my time working with customers to support their custom workflows, I found it easy to forget the complexity gap between the controlled development environment and real-world deployment. Product teams often hope that customers will read instructions carefully, follow protocols precisely, and possess the same level of training the developers had.
This assumption creates a huge blind spot when it comes time to design effective customer experiences.
For example, for [[Liquid Biopsy Sample Collection Considerations|liquid biopsy]] (particularly when isolating rare cells), the devil is in the details. Take sample handling: customers will default to standard good laboratory practice (GLP), having been trained that thorough mixing ensures sample homogeneity. The nuance that _LBx samples_ often require gentler handling tends to get lost in translation. Likewise, many lab technicians miss the nuance of proper pipetting technique. Especially for rare cell workflows, pipetting is critical, and an aggressive technique can be a major source of variation across users.
**The economic reality compounds this problem.** When ease of use, UX, and process controls are traded away in favor of relying on a more skilled operator, the result is a higher barrier to adoption for the product. Every critical step that relies on operator nuance limits your addressable market and increases support costs. The solution isn't more detailed instructions; it's designing out the nuance wherever possible and providing robust risk mitigation for the steps where nuance remains essential.
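To make that compounding effect concrete, here's a back-of-envelope sketch in Python. The step counts and the per-step reliability figure are illustrative assumptions, not measured data; the point is only how quickly operator-dependent steps multiply against you.

```python
# Back-of-envelope model of how operator-dependent steps compound.
# All numbers here are illustrative assumptions, not measured data.

def first_pass_success(n_nuanced_steps: int, per_step_reliability: float) -> float:
    """Probability that a run completes with every nuanced step done
    correctly, treating the steps as independent."""
    return per_step_reliability ** n_nuanced_steps

for steps in (2, 6, 10):
    # Assume each nuanced step goes right 95% of the time in the field.
    rate = first_pass_success(steps, 0.95)
    print(f"{steps:>2} nuanced steps -> {rate:.0%} first-pass success")

# Output:
#  2 nuanced steps -> 90% first-pass success
#  6 nuanced steps -> 74% first-pass success
# 10 nuanced steps -> 60% first-pass success
```

Every nuanced step you design out pushes that first-pass rate back toward 100%; every one you leave in shows up later as failed runs, support tickets, and churn.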
## The Lab Reality Check: Does Your Diagnostic Approach Survive Contact With The Lab?
As I transitioned away from development and toward commercial roles, I became fascinated with better understanding customer requirements. All of a sudden, I was working to answer questions that were new to me but fundamental:
- What job is the customer actually hiring this tool to do?
- What clinical decision does this make easier for the ordering physician?
- How does this fit into existing workflows rather than requiring entirely new ones?
Through the years I have watched a number of brilliant technologies fail. Many solutions are scientifically interesting; they just fail to address clinically urgent problems. Often, practical constraints that seemed mundane on a development requirements document became deal-breakers in deployment. Here are a few examples:
**Floor space efficiency:** I've seen purchase decisions hinge on whether an instrument could fit into available bench space. Working clinical labs operate with spatial constraints that R&D environments rarely face. It doesn't matter whether a customer loves your technology if they can't justify displacing existing equipment that handles higher sample volumes.
**Workflow integration:** Each lab will have a core competency. If their preferred analytical readout is slide-based, and they have dialed in their workflow, they may not have interest in a system that requires a paradigm shift in how samples are processed.
**Shift compatibility:** Will your assay fit into a standard work shift? Can it be paused if needed? I learned these weren't just convenience factors; they were fundamental requirements that determined whether a lab could even consider adopting new technology (see the sketch after these examples).
**Technician satisfaction:** This metric rarely appears in product requirements, yet it determines long-term adoption success. From experience I can say that the labs where technicians found our system interesting and fulfilling became our biggest advocates. The ones where it added tedium tended to churn quickly.
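Returning to the shift-compatibility question above: here's a minimal sketch, again in Python, of how a team might reason about it explicitly during design. The step names, durations, and pause points below are entirely hypothetical, not drawn from any real assay.

```python
# Illustrative check: does a protocol fit an 8-hour shift, and where can it pause?
# Step names, durations, and pause points are hypothetical, not a real assay.
from dataclasses import dataclass

@dataclass
class Step:
    name: str
    minutes: int
    pause_safe_after: bool  # can the sample sit safely once this step ends?

protocol = [
    Step("Accession and prep", 30, True),
    Step("Sample processing", 90, False),
    Step("Staining", 60, True),
    Step("Scan and review", 120, True),
]

SHIFT_MINUTES = 8 * 60
total = sum(s.minutes for s in protocol)
pauses = [s.name for s in protocol if s.pause_safe_after]

print(f"Total run time: {total} min (shift budget: {SHIFT_MINUTES} min)")
print("Fits a single shift." if total <= SHIFT_MINUTES else "Spills into a second shift.")
print("Safe pause points after:", ", ".join(pauses))
```

The value isn't in the code itself; it's in forcing the conversation about total run time and safe stopping points to happen during design rather than during a customer's night shift.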
## Building for Humans, Not Just Science
After a decade in this space, I know that to achieve success, you have to:
1. *solve an interesting technical problem*
2. *ensure that solving it also addresses an unmet medical need*
3. *navigate fitting the technology into a complex system of people, processes, and practical constraints that determine whether innovation turns into impact.*
Over the years, I've spent significant time thinking about and documenting a diverse range of customer needs and workflows. Each time, the exercise is eye-opening. As a developer and product professional, I do my best to create a great customer experience. Even so, I continue to be amazed by the workarounds and shortcuts that were never intended but that make complete sense from the customer's point of view.
As I strive to improve and avoid these blind spots, I've learned that diagnostic teams need structured approaches to see the whole 'elephant':
- **Get out of the lab and into customer sites regularly.** Not for demos, but to watch real workflows with real-time pressures.
- **Challenge comfortable assumptions about user behavior.** That elegant protocol you designed might not survive contact with a busy Tuesday afternoon.
- **Build feedback loops that capture the mundane frustrations,** not just the major failures. Small annoyances compound into big headaches.
The diagnostic industry stands at an inflection point. Technologies like AI and computational pathology promise to revolutionize patient care, but only if we design them for the humans who must adopt and use them daily. We need to make these new technologies easy to understand and implement, and we need to ensure they interface with existing workflows. The companies that succeed will be those that don't just touch one part of the elephant but work deliberately to understand the whole.
The blind men in the parable eventually stopped arguing and started sharing their individual experiences. Only then could they begin to understand the elephant. In my experience, the best diagnostic products emerge when teams do the same—when R&D talks to sales, when engineering listens to customer support, and when everyone spends time watching real users in real environments.
> [!INFO] What Blind Spots Have You Discovered?
> What blind spots might your team be missing? I'd love to hear your experiences with the gap between laboratory innovation and clinical reality.
P.S. If you're interested in a structured approach to revealing blind spots with your team, I've created a simple card-based exercise that can produce powerful results. *[[What are Blind Spot Revealer Cards?|Check out the Blind Spot Revealer Cards here.]]*