On thinking alongside machines
I keep returning to the same question: what does it actually mean to think alongside a machine? Not to delegate to it, not to be replaced by it — but to genuinely collaborate, in the way two people with different strengths might work through a hard problem. That edge, where the division of labor becomes interesting, is where I spend most of my time.
There are things AI does that still surprise me. Pattern recognition across vast, noisy datasets. Holding contradictory framings in parallel without discomfort. Producing a first draft at three in the morning without losing patience. But it also flattens. It reaches for the probable. It has no skin in the game. And those absences matter more than they might first appear.
What humans bring is harder to articulate but easy to notice when it's missing: genuine stakes, embodied experience, the ability to care about the outcome in a way that shapes how you reason. I think the most interesting work happens not when we try to make AI more human, or humans more machine-like, but when we understand the actual shape of each and design the interaction accordingly.
I don't have a clean thesis here. Mostly I'm curious, and I build and write in the direction of the questions. If any of this resonates, I'd be glad to hear from you.