The Human Element in the AI Age
This article originally appeared on Forbes in March 2025.
In the new world of AI, we've gained the remarkable ability to extract meaning from vast universes of data that were previously incomprehensible. During OpenAI's early open-source releases, I wrote about this capability as AI's most exciting promise.
Since then, in addition to making more sense of the data we already collect, I’ve watched the tech industry build more and more devices that collect data about the most personal aspects of our lives and then tell us about ourselves. They are replacing personal perception with algorithmic interpretation, and I think this shift deserves our careful attention.
Don’t get me wrong—I love data points. I trained as a scientist. I’ve spent my career building with technology. And I use all the monitors. My Oura ring tells me how I slept. My heart rate monitor tells me how hard I’m working. I have an AI agent sit in on my meetings and tell me how engaged everyone was.
Yet these experiences have revealed telling disconnects between technology's measurements and my lived experience:
• After my toddler woke me at 4 a.m., leaving me sleepless and foggy the next day, my Oura ring cheerfully reported: Score 77/100, "Pretty Good," "Readiness Solid." The cognitive dissonance was jarring—I was being gaslit by my device.
• During an Orange Theory workout, I reached my absolute physical limit during a sprint following multiple inclines. As I fought to complete the interval, convinced my heart might explode, my monitor had just reached “orange”—meaning it felt I was finally working kind of hard but certainly not my hardest. I...disagreed.
• In a focused co-working session with clients, we tackled complex systems and solved challenging problems. The meeting's seriousness and concentration—the very qualities that made it productive—led my AI assistant to assign it a low score, citing "low participant sentiment." The technology completely missed the value of this interaction.
We navigate two simultaneous realities: the measurable and the personal. One objective, one subjective.
Our culture privileges the measurable. The only goals worth setting are SMART—in other words, quantifiable. We question whether progress counts if it isn't backed by data, whether decisions are sound if they are not "data-driven." This bias isn't new, but emerging technologies amplify it and bring it into our minute-to-minute realities.
We are asking technology to tell us if we’re “good”: if we slept well, if we exercised hard enough, if we connected meaningfully on Zoom.
I see tech running fast on the firmly held belief that now that we can turn even our stress into ones and zeroes, we should. And further, that this is where the truer truth lies.
I see helpfulness in additional data points, but after using these products, and also building a few (one can do this in an afternoon on Replit), I know how fallible they are. Let’s take their feedback, but lightly. We know ourselves better than our monitors do.
As our experiences increasingly transform into data points, we must ensure that our perceptions, feelings and self-understanding maintain their rightful primacy. Technology should augment our self-knowledge, not replace it.