When it comes to leveling up our human workers, we face a fundamental choice: do we give them tools to become more effective, or do we build those capabilities directly into the systems they use? This isn’t just a theoretical question; it’s a practical dilemma with massive implications for the future of work, especially in critical fields like healthcare. The path we choose will determine the pace of innovation for years to come.
Let’s explore this choice through a real-world example. My girlfriend is a nurse navigator, guiding people diagnosed with breast cancer through the overwhelming maze of the American healthcare system. It’s a role that requires immense organization and empathy, acting as a patient advocate in a system with countless actors, policies, and providers. A huge part of her job involves documenting every patient interaction into a charting system to ensure continuity of care—a vital but incredibly time-consuming task.
The Charting Challenge
In healthcare, charting is everything. It creates a record of advice given, symptoms reported, and concerns raised. When done well, it’s a fabulous system that ensures seamless care even if providers change. However, the process of charting ad hoc conversations, often held when emotions are high, is fraught with challenges. It’s hard for anyone to accurately recall every detail of a stressful conversation. This is where the opportunity for AI comes in, and it presents us with our two distinct paths.
Path 1: Augment the Human
The first option is to equip the human—the navigator—with tools that give her perfect recall. Imagine an AI assistant that can synthesize a conversation into an accurate, unbiased summary for her charts. This approach is about upgrading her, not the entire healthcare system.
This means we don’t need to change the healthcare system. We don’t need to upgrade her computer. We don’t need to integrate new AI into healthcare. We need to upgrade her.
The beauty of this approach is its simplicity and speed. If the tool is 100% private to the navigator and doesn’t send sensitive patient data to the cloud, it doesn’t violate HIPAA or require a massive legislative overhaul. We’re simply giving a professional a powerful new skill. From a legal perspective, we haven’t changed the dynamic; humans are still responsible for the sensitive information they possess.
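To make "private to the navigator" concrete, here is a deliberately toy sketch in Python; the function name, the keyword-tagging scheme, and the sample transcript are all invented for illustration. A real tool would run a local language model over the raw conversation, but the key property is the same: everything happens in one process on the navigator's own machine, so no patient data ever leaves it.

```python
import re
from datetime import date

# Headers mirror what good charting captures: advice given,
# symptoms reported, concerns raised.
HEADERS = {
    "symptom": "Symptoms reported",
    "advice": "Advice given",
    "concern": "Concerns raised",
}

def draft_chart_note(transcript: str) -> str:
    """Toy local-only summarizer: collects lines tagged as symptom/,
    advice/, or concern/ and formats them as a draft chart note.
    (Stand-in for a locally run model; nothing is sent anywhere.)"""
    buckets = {key: [] for key in HEADERS}
    for line in transcript.splitlines():
        m = re.match(r"\s*(symptom|advice|concern):\s*(.+)", line, re.IGNORECASE)
        if m:
            buckets[m.group(1).lower()].append(m.group(2).strip())
    parts = [f"Draft chart note ({date.today().isoformat()})"]
    for key, header in HEADERS.items():
        if buckets[key]:
            parts.append(f"{header}:")
            parts.extend(f"  - {item}" for item in buckets[key])
    return "\n".join(parts)

# Invented sample conversation notes:
note = draft_chart_note(
    "symptom: fatigue after second infusion\n"
    "advice: call the clinic if fever exceeds 100.4F\n"
    "concern: insurance pre-authorization still pending"
)
print(note)
```

The navigator still reviews and owns the note before it goes into the chart; the tool only drafts it.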
Path 2: Augment the System
The other option is to uplevel the entire healthcare system. This would involve creating something like a mandatory, secure call bridge for all patient communication, with a guarded AI that automatically analyzes and charts each conversation. While this sounds great in theory, the reality is starkly different.
The interesting thing is that the first solution could happen this year. The second may take years, even decades.
Large institutions like government, education, and healthcare are notoriously slow to adopt new technology. I’ve already seen people get the smackdown from leadership for using tools like ChatGPT in environments not yet comfortable with the data-sharing implications. System-wide change is a battle against institutional inertia, regulatory hurdles, and massive implementation costs.
The Race We’re Already Running
This brings us to a fascinating dynamic. On one hand, individuals who are willing to augment themselves through upskilling and adopting new tools will become more valuable, more marketable, and more effective in their roles. On the other, if we wait for our major institutions to officially regulate and integrate these tools before anyone can benefit, we could slow progress by years, if not decades.
We’re essentially creating a judo-like situation where individuals can use privacy-preserving tools to accelerate their own evolution, applying existing laws and precedents to move faster than the systems they work within. The implications are profound.
So, how is this all going to shake out? Will we empower our most vital workers to innovate from the ground up, or will we wait for slow-moving institutions to pave the way? What do you think?