Everyone Needs an AI Translator
Feb 10, 2026
You’re not bad at communicating. You’re just missing a translator.
I had one of those conversations with my partner last week. Two hours. We argued. We paced. We brought up examples. We used our voices and our hands and our tired faces. And at the end, after we’d each laid out our case like lawyers presenting evidence, we realized we’d been saying the same thing.
We ended up in the same place, but we’d taken completely different roads to get there. And neither of us had heard the other.
We’ve been together ten years. We know each other’s embarrassing stories, have inside jokes, and use “shorthand” phrases that mean complex things. And still, this happens. Not because we’re bad at talking. But because we’ve never really been talking in the same language.
When my partner says “bad day,” she means the kind where nothing went right, where the universe conspired, where she needs to decompress. When I say “bad day,” I mean I had two suboptimal meetings and my coffee was cold.
Completely different scales. Same word.
So when she tells me she had a bad day and I say “that sucks, want to talk about it?” I think I’m being supportive. She thinks I didn’t listen.
Nobody talks about this gap in granularity, but every person walks around with their own internal dictionary. Every experience, every culture, every boss you’ve ever had has shaped the words you use.
“Ambitious” means something different to a first-generation immigrant kid than to someone who grew up in a house where money was never the problem. “Family” means something different if you lost yours early. “Safe” depends entirely on where you first learned the word.
You don’t just speak English. You speak your English. Everyone around you speaks theirs.
We already know this between languages. We “get” translation apps. We hire interpreters. We accept that “je t’aime” doesn’t land the same way “I love you” does.
But within a language, we pretend words are universal. We assume the other person’s dictionary matches ours. When it doesn’t, we get frustrated. We blame the person.
The failure was never in the talking. It was in assuming that what I mean is what you hear.
So what does any of this have to do with AI?
Right now, large language models work with one embedding space. One way of mapping concepts to vectors. They learn from human text, which means they learn the averaged-out, aggregate meaning of words. Useful for general tasks. But the model is always thinking in someone else’s English. Not yours. Not mine.
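That “averaged-out, aggregate meaning” is easy to see in miniature. Here is a toy sketch (the vectors and the tiny `shared_embedding` table are entirely made up): in a single shared embedding space, a word maps to exactly one vector, no matter who says it, and a phrase just averages its words.

```python
import numpy as np

# Hypothetical toy vectors: one shared embedding table for everyone.
shared_embedding = {
    "bad": np.array([0.9, 0.1, 0.0]),
    "day": np.array([0.2, 0.8, 0.1]),
}

def embed(phrase):
    """Average the word vectors -- the aggregate meaning of the phrase."""
    return np.mean([shared_embedding[w] for w in phrase.split()], axis=0)

# Her "bad day" and my "bad day" land on the identical vector:
hers = embed("bad day")
mine = embed("bad day")
print(np.allclose(hers, mine))  # True
```

One table, one vector per word: the model has no slot for *whose* bad day it was.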
I think the future is personalized embeddings.
Instead of one model with one way of seeing the world, imagine everyone has their own. A file that represents how you specifically map concepts. The weights and associations you give to “ambition” and “family” and “bad day.” Your own little semantic universe.
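Concretely, a personal embedding would mean the same phrase gets a different vector per person. A toy sketch, with invented three-dimensional vectors standing in for each person's "semantic universe":

```python
import numpy as np

# Hypothetical per-person embeddings: same phrase, different vector
# depending on whose meaning-space it lives in.
her_space = {"bad day": np.array([0.9, 0.1, 0.4])}  # universe-conspired scale
my_space  = {"bad day": np.array([0.1, 0.9, 0.2])}  # cold-coffee scale

def cosine(a, b):
    """Cosine similarity: 1.0 means same meaning, near 0 means unrelated."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The same two words, pointing in very different semantic directions:
print(cosine(her_space["bad day"], my_space["bad day"]))  # well below 1.0
```

A shared model would collapse these into one point; separate files let the gap stay visible.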
Now put a translation layer in the middle. Call it a router, a bridge. You say something. The layer maps it from your meaning-space into mine. What I hear is not what you said, but what you meant, run through my lens so it actually lands.
This is technically hard. You’d need alignment procedures to build personal embeddings. Standardized interchange formats so translation layers work across architectures. Probably some clever fine-tuning to make the mapping meaningful rather than just smoothed-out noise. But the core idea is simple: stop forcing everyone through the same semantic bottleneck.
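One classical way such an alignment could work (a sketch, not a claim about how any production system does it) is the linear-map trick used for cross-lingual embeddings: take a set of "anchor" concepts both people roughly agree on, and fit a matrix W that carries your vectors onto mine. Everything below is synthetic toy data.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, n_anchors = 8, 50

# Synthetic stand-ins: your vectors for 50 anchor concepts, and mine,
# related here by an unknown linear map we will try to recover.
your_space = rng.normal(size=(n_anchors, dim))
true_map = rng.normal(size=(dim, dim))
my_space = your_space @ true_map

# Learn the translation layer W by least squares: your_space @ W ~= my_space.
W, *_ = np.linalg.lstsq(your_space, my_space, rcond=None)

def translate(vec):
    """Map a vector from your meaning-space into mine."""
    return vec @ W

# A new utterance of yours, rendered in my semantic coordinates:
new_vec = rng.normal(size=dim)
print(np.allclose(translate(new_vec), new_vec @ true_map))  # True
```

Real personal spaces would not be exactly linearly related, so the map would be lossy, and that lossiness is exactly where the fine-tuning and interchange-format work lives.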
We’re not going to agree on definitions. We never will. But we could at least stop pretending we share a dictionary. Your partner has their embedding. You have yours. The gap between them isn’t a bug. It’s just what two different people look like.
Now if you’ll excuse me, my partner and I have some translating to do.