Greater than the sum of its parts

Technology Jan 23, 2025 10:15:00 AM Kirby Bloom, CTO & Co-founder 3 min read

In life, we all have those moments of awe that are followed by the revelation of reality - like when you first learned the truth about Santa Claus or the Tooth Fairy. (If I just spoiled that for you, my apologies, but congratulations on your reading level!) While these moments can feel heavy, I consider myself fortunate to have spent my career working with technologies that evoke a similar sense of wonder, only to reveal complexities that defy initial expectations. Why fortunate? Because as a technologist, I’ve found that these moments often carry opportunities to drive meaningful change. Let me walk you through one of mine.


Back in 2007, I was wrapping up a project for Homeland Security, where we built a platform to view and annotate multimodal imaging data from instruments scanning shipping containers bound for the U.S. While it was cutting-edge work, I struggled with the slow-moving pace and bureaucratic nature of the public sector. Living in San Diego at the time, I faced a tough realization: if you weren’t working in defense, your career options were pretty limited. During my search for a new role, I stumbled across an intriguing job post on Craigslist. 

I’d love to tell you that I fully understood what Illumina was doing when I applied. Spoiler: I didn’t. It just sounded fascinating. As someone whose biology education stopped at high school, I assumed I had no shot at getting the job once I learned it involved building a platform to design oligonucleotides. But apparently, my experience designing Lightwave tracking systems for fiber optic networks struck a chord with the team. At the time, I didn’t grasp the scale or complexity required to evaluate and operationalize all the permutations for microarray designs. But I soon realized that the optimization challenges I’d faced with fiber optic networks - managing data throughput and scale - were surprisingly relevant to designing synthetic DNA strands. Who would have thought? 

Fast forward a couple of years at Illumina, and I was constantly in awe of my colleagues’ brilliance and the groundbreaking technologies being developed. I became hooked on the idea of personalized medicine, even though I was still pretty naïve about the intricacies of biology. One day, I remembered something from high school: different cells perform different functions. So I asked, “How do we know which cell this reported mutation is in? And what happens if you have a tumor sample with a mix of healthy and cancerous cells?” The response I got was something like, “Well… single-cell technology might help, but even if it works, the analysis will be a huge bottleneck.”

It was a classic “complaining about Wi-Fi on a plane” moment for me - I was definitely taking the complexity of what we had built for granted.  But when I realized genetic sequencing results often rely on averaging variances across many cells, the promise of precision medicine suddenly seemed shaky. How could we call it “precision” when we’re essentially averaging and smoothing over the details? I was left with more questions than answers. Would it ever be possible to truly deliver on the promise of personalized medicine?

As I've moved closer to healthcare applications of this type of technology, I've watched the advances in single-cell and multi-omics technologies with interest. These advancements have turned spatial biology into a tangible reality. The progress over the past few years hasn't brought back my childhood Tooth Fairy, but it has reignited my belief that we can achieve the level of precision that personalized medicine demands. Unlike a decade ago, the key pieces to enhance resolution in complex, heterogeneous environments are finally within reach. The challenge now is to build a framework that integrates these tools and processes vast amounts of data in a way that's not only actionable but also accessible for clinicians. Managing the "data exhaust" and the massive number of permutations produced at this level of resolution is daunting - if not impossible - for any human.

Fortunately, the explosion of large language models (LLMs), agents, and knowledge graphs offers a path forward. I’m not one to buy into hype, but I’ve seen firsthand how a well-scoped agent, paired with a high-performing model appropriate to the field, can process and interpret data at a speed and proficiency we could only dream of a few years ago. Thoughtfully designed agents can automate tedious processes like data collection and analysis, uncovering patterns and correlations that might otherwise remain invisible. Meanwhile, knowledge graphs can organize and interlink vast datasets, providing a comprehensive map of relationships - like how proteins interact across cellular landscapes.
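To make the knowledge-graph idea a bit more concrete, here is a minimal sketch in Python of typed relationships between entities. The protein names, regions, and relation types are invented placeholders for illustration, not real biological annotations or any particular product's data model:

```python
from collections import defaultdict

class KnowledgeGraph:
    """A tiny knowledge graph: nodes are entities, edges are typed relations."""

    def __init__(self):
        # (subject, relation) -> set of objects
        self.edges = defaultdict(set)

    def add(self, subject, relation, obj):
        self.edges[(subject, relation)].add(obj)

    def query(self, subject, relation):
        """Return all objects linked to `subject` by `relation`."""
        return self.edges.get((subject, relation), set())

# Illustrative, made-up facts
kg = KnowledgeGraph()
kg.add("ProteinA", "interacts_with", "ProteinB")
kg.add("ProteinA", "expressed_in", "tumor_region_1")
kg.add("ProteinB", "expressed_in", "tumor_region_1")

# Which proteins does ProteinA interact with?
partners = kg.query("ProteinA", "interacts_with")
```

Even this toy version shows the appeal: once relationships are stored as typed edges rather than flat tables, questions like "which interaction partners are co-expressed in the same region?" become simple traversals instead of bespoke joins.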

Thanks to these technological breakthroughs, we’re on the cusp of transforming spatial biology from a research-focused discipline into a critical tool for clinical practice. The potential of integrating these technologies is far greater than the sum of their individual contributions. That’s why I’m so excited to join forces with Jay and Brian to bring spatial proteomics into the clinic. Together, we're embarking on a journey to make this vision a reality.