UNDERTOW is an infinite report. A limitless USB stick for cultural intelligence. A living container that doesn’t finish, only accumulates, and keeps... growing. The concepts travel when readers use them in rooms I’ll never enter. The only requirement is that each piece is honest to what I’m seeing right now.
She’s telling you about her mother, but not really. She’s telling you about a version of her mother that she’s constructed for this exact moment — intimate enough to signal trust, edited enough to stay safe — and you can feel the seams. The slight upward pitch at the end of the sentence that isn’t a question. The way she touches the stem of her glass when she gets close to the part she’s not going to tell you yet.
And here is what you are doing, if you are actually paying attention. If you are doing this thing we so casually call emotional intelligence.
You are suppressing the entire model of this evening that you built on the walk over. The one where you were charming, where the conversation followed the arc you’d rehearsed like a pitch deck. That model is still running. You can’t kill it. You can only starve it of processing power while you redirect everything you have toward her.
You are constructing, from fragmentary data — vocal register, micro-expressions, the specific way she said “complicated” like it was a door she was closing — a working theory of what it feels like to be her right now. Not her in general. Her at 8:47pm on a Tuesday in a restaurant she chose because it’s loud enough that silence won’t be conspicuous.
You are updating that theory continuously, because she just laughed at something that wasn’t funny, and now the model needs to account for that. Was it nervousness? Was it a test? Was it her own performance layer kicking in — the same one you’re running — to fill a silence that got too close to something real?
And while all of this is happening, you are also modeling what she is modeling about you. Which means you’re running a simulation inside a simulation: her experience of you experiencing her. Three nested processes plus a performance layer that has to make all of it look like you’re just... a person sitting across from someone, having a late meal on a beautiful fake summer evening.
This is the most expensive thing you can do with a human brain.
Not calculus. Not simultaneous translation. Not coding. The act of genuinely modeling another person’s internal state in real time — while also suppressing your own, while also tracking their model of you, while also maintaining the functional appearance of someone who is doing none of this — burns more cognitive resources than almost any task neuroscience has tried to measure. And we do it every time we sit across from someone we’re afraid to care about. (Well, I do, and when I’m feeling generous towards humanity, I pretend others do it too.)
We call it emotional intelligence. We talk about it like it’s a personality trait, the way some people are tall. We evaluate it on leadership assessments. We put it in job descriptions, sandwiched between “strategic thinker” and “strong communicator,” as if it costs the same as either. It doesn’t. Genuine empathy — not the nodding, not the mirroring, not the therapeutic “I hear you” that buys time while you prepare your rebuttal — is a real-time parallel simulation running on biological hardware that was never designed to bear that cost.
The inference costs are staggering. And they explain something that most people misread as a character problem. The reason emotional intelligence is so rare is not that people are selfish or oblivious. It’s that it is, computationally, brutally expensive. A 2025 paper in Trends in Cognitive Sciences found that goal-directed cognition raises the brain’s energy consumption by only about 5% over resting baseline. But empathy isn’t goal-directed cognition. It’s distributed computation across multiple networks simultaneously, and the communication costs between networks dwarf the local processing costs within any single one. You’re not solving a problem. You’re running a whole second person inside your skull.
Your body already knows what it costs to go deep. It has a whole system dedicated to the problem.
You're not solving a problem. You're running a whole second person inside your skull.
When you hold your breath and put your face in water, your body begins to reconfigure itself. Heart rate drops. Blood pulls away from your hands and feet and floods your chest cavity. Your spleen contracts, squeezing a reserve of red blood cells into your bloodstream. Scientists call it the mammalian dive reflex. A cascade of involuntary changes that override your normal operating state so you can survive, briefly, in a place you were not designed to be. Evolution’s Mario power-up mushroom, minus the bell jingle.
The deeper you go, the more extreme the reconfiguration. Jacques Mayol’s heart rate dropped to 27 beats per minute at 101 meters. A body so profoundly altered that mid-twentieth-century scientists would have predicted it couldn’t survive. They believed a dive past 30 meters would crush the lungs. The body proved them wrong. It doesn’t resist the depth. It reorganizes around it.
Empathy works on the same logic.
Your psychological operating state — your ego, your frame, your stable sense of who you are and what you think — is the homeostasis that keeps you coherent at the surface. It filters out information that threatens your self-model. It prioritizes your own experience. It keeps you… upright. And when you genuinely model another person’s internal state, you are asking that entire system to reconfigure. To slow the processes that protect you. To redirect resources from your own needs toward holding someone else’s experience inside your architecture.
The deeper you go, the more fully you model her pain, her complexity, the part she’s not going to tell you yet, the more your own system has to change to hold it. You suppress your frame the way a diver’s body suppresses its surface reflexes. You redirect attention from your own experience the way blood redirects from extremities to chest. You become, for the duration of that act, a different operating system than the one you walked in with.
And the thing that separates free diving from every other extreme physical act is that the danger is not the depth.
The danger is the return.
Blackout on ascent — loss of consciousness in the final meters before the surface, as oxygen partial pressure plummets with the dropping water pressure — is the way most freedivers die. The body that reconfigured to survive at depth can’t reconfigure fast enough to survive the return to air. The transition from one state to another is where the system fails. You don’t drown at the bottom. You black out on the way up.
You don't drown at the bottom. You black out on the way up.
Call it surface shock. Not the holding, but the surfacing. The moment you leave the session, walk out of the meeting, hang up the call, come home from the dinner where she told you you’re not getting back together. The depth was sustainable. Your system knew how to hold it. But the return — going straight from her grief to your inbox, from holding space to performing competence, from someone else’s architecture back to your own — that’s where the oxygen debt comes due. That’s where you sit on the sofa and stare at the wall as the room grows darker and darker. Where you laugh at something that isn’t funny. Where you cancel dinner with someone who loves you because you have nothing left to give them that day, or maybe the rest of the month.
People who do this professionally — therapists, crisis negotiators, the best creative directors, the best teachers — are the free divers of emotional life. They’ve trained themselves to go deep on a single breath and hold what they find there. Some of them have been doing it so long that their systems have permanently adapted, mutated, the way the Sama-Bajau people of Southeast Asia — sea nomads who have foraged underwater for centuries — have evolved enlarged spleens and enhanced vasoconstriction, their bodies genetically reshaped by generations of literal physical depth.
Professional empathizers develop the same kind of capacities. They can hold more, read more, go deeper. But the adaptation runs in both directions. The same reconfiguration that lets them reach the depth makes the surface harder. They are no longer built for shallow water the way other people are. Every string in them still vibrating from the last room they held.
And like professional divers, they know the one rule that amateurs don’t: you never surface fast. You come up slowly. You decompress. You give the system time to return to its own operating state before you ask it to do anything else.
Most organizations skip the decompression entirely. In competitive freediving, the athlete must complete a surface protocol — stay conscious, give an OK sign, verbally confirm — before the dive is scored. The sport has institutionalized recovery as a condition of performance. Most companies have not.
You do not get to go deep without reconfiguring yourself. You do not get the insight without the altered state. You do not get the connection without the cost. And the cost is not a bug in the system. The cost is the system.
Which means the rational capitalist market response, if you accept that genuine empathy is the most expensive cognitive operation humans routinely perform, is to build infrastructure that lets people avoid paying it. And that is exactly what is happening, everywhere, all at once. The market has invented more architectures for this single move — offloading the inference cost onto a structure, a system, or a screen — than for almost any other.
The simplest is spatial. Japan’s ohitorisama economy built an entire market category around inference-free consumption. And this will sound familiar by now if you’ve been reading UNDERTOW regularly. Solo dining booths at Ichiran where you order by machine and eat facing a wooden wall, solo karaoke rooms, solo travel packages designed for one. The food is good. The food is not the product. The product is a meal where you never have to model another person’s state.
The simulated version is growing faster. AI companions, a $37 billion market, sell warmth without inference cost. The capacity you’re not exercising is the capacity you’re losing.
Then there’s the discount TJ Maxx version. The one most people don’t recognize as inference management because it feels so natural. The podcast host you’ve listened to for three years. The YouTuber whose taste you trust more than your friends’. The TikTok creator who showed up on your FYP so often you forgot you’ve never met. The K-pop idol who sends you a Bubble message at 2am that goes to four million other people at the same time. These are empathy at half price. You run the simulation in one direction: modeling their emotional state, tracking their humor, building an increasingly refined theory of who they are. But they never model you back. No reciprocal inference. No suppression required. You get the warmth of recognition without the exposure of being recognized. The creator economy, over $250 billion globally and accelerating, is, among other things, the largest market ever built for one-directional emotional relationships. The gate opens inward only. People don’t experience this as diminished. They experience it as safe. Which tells you what full-price inference actually costs them.
The gate opens inward only.
The subtlest version is procedural. A new generation of high-end intimacy events now deploys what amounts to externalized empathy infrastructure. Traffic-light wristbands that broadcast your boundaries so nobody has to infer them. Trained hosts stationed throughout the venue whose job is to read the room so you don’t have to. Spatial design with graduated zones — social rooms, sensual rooms, play rooms — so the architecture itself signals what register you’re in and what’s expected. These aren’t avoidance. They’re an attempt to split the difference: preserve physical contact, offload the inference cost onto a protocol layer. A 26-year-old at one of these events told a reporter she feels safer here than at a regular bar. Of course she does. The venue is running the simulation she’d otherwise run herself. But the question the protocol can’t answer is the depth question: what happens to the capacity to read a room when you’ve outsourced the reading?
And then there’s the version the productivity narrative didn’t predict. Software developers managing multiple AI coding agents — running Cursor, Copilot, Devin, or Claude in parallel across different parts of a codebase — discovered that the production labor automated beautifully. What remained was pure supervisory cognition. You’re monitoring four separate outputs. (Or eight, or twelve, or sixteen. Likely more if you work at Anthropic, where the token budgets are the size of entire economies.) You’re building a theory of mind for each agent: what does it “understand” about the codebase, where is its model diverging from your intention, when has it confidently taken a wrong turn and you need to intervene before it compounds the error? You’re suppressing your own assumptions to accurately read the system’s actual behavior rather than the behavior you expected.
This is the same parallel-simulation architecture that runs across a dinner table. The developers running four agents simultaneously were running four empathy processes. The pattern they describe is consistent: crispy fried by late morning, unable to do the human-facing parts of their jobs — code reviews, standups, the meeting where someone needs mentoring — for the rest of the day. The automation didn’t eliminate the most expensive cognitive operation. It multiplied it. And the humans absorbing that cost had no more decompression time than the therapist with a full caseload or the creative director holding three client relationships at once and six back-to-back Zoom calls. (I don’t code, but I’ve been burning through my Claude Max Plan tokens so fast that I keep the usage page up constantly on my second 5K monitor and watch the needle of the gas tank gauge sink to ‘E’ as if I’m still 60 miles from the nearest rest area.)
The economy is building cheaper ways to avoid the cost and pricing the real thing higher every year. Both responses are rational. Neither will stop. That’s not a market failure. That’s the market working.
The mistake organizations make is treating emotional intelligence as a free resource. They screen for it in hiring. They reward it in reviews. They demand it in every meeting, every negotiation, every difficult conversation, every Zoom call where someone is quietly falling apart. And they provide no infrastructure to support the people who are actually paying the cost. No recovery time. No reduced cognitive load after heavy interpersonal work. No acknowledgment that the person who just held space for a team’s anxiety needs to decompress before they surface into the rest of their day.
And the same organization is investing in every avoidance architecture the market offers. Async tools that reduce the meetings where someone might need to read the room. AI systems to handle the conversations that used to build empathic capacity through repetition. Collaboration platforms engineered to minimize the exact kind of unstructured human contact where inference skills develop. They demand the depth and defund the conditions that produce it. The mining term for this is extraction. Take the resource, skip the restoration. They send the diver back down before the nitrogen clears. Not just once — as a scheduling philosophy. Surface shock is what happens when you go straight from someone else’s pain to your own performance. It’s not the depth that breaks people. It’s the transition speed. Surface shock, institutionalized.
Surface shock is what happens when you go straight from someone else's pain to your own performance.
Then they write “emotional intelligence” in the job description for the replacement hire and wonder why the interviews feel thin.
The thing the market also knows, even if it hasn’t articulated it yet: the premium on high-inference experiences is going up. The dinner where nobody checks whether the TikTok story they’re hearing was true or not. The meeting where someone reads the room well enough to say the thing that makes everyone close their laptops immediately. The group chat that turns into a phone call that turns into someone crying. These are becoming luxury goods. Not because the content is scarce, but because the capacity is. The willingness to bear the inference cost is itself the thing in short supply. And like every scarcity, the market prices it accordingly.
She’s still talking about her mother. The real version, now. The one she didn’t plan to tell. And you’re still down there. Her pain held inside your reconfigured architecture, your surface reflexes suppressed, everything you came in with pushed to the background so you can hold what she’s actually saying.
It’s the most expensive thing you’ll do all week. More than any meeting. More than any pitch. More than any strategy you’ll write.
There is no cheaper version of this.
What To Brief From This
If your team has back-to-back meetings where someone is expected to read the room, hold tension, or manage someone else's emotional state — and then pivot straight to reviewing AI-generated output that also requires their full interpretive attention — you have a decompression problem.
If you’re building an AI tool that automates production work, budget for the inference cost that remains. The developers and managers supervising those systems are running empathy processes, not production processes. Design the workflow around supervisory fatigue, not output volume.
If your job description says “emotional intelligence required,” audit what that actually costs the person who fills the role. How many high-inference interactions per day? How many of those are now with AI systems that also require theory-of-mind work? What recovery infrastructure exists between them? If the answer is “none,” you’re not hiring for a skill. You’re hiring for extraction.
If you’re designing a product or service that offloads social inference — dating apps with compatibility scores, collaboration platforms with status indicators, events with consent wristbands — name what you’re doing honestly. You’re selling inference cost reduction. That’s a real value proposition. Price it and position it as such, instead of pretending you’re selling connection.
If you’re a strategist presenting empathy data or emotional intelligence research in a deck this quarter, lead with the cost, not the value. Everyone knows empathy is valuable. Nobody is budgeting for what it actually costs the people who deliver it.
If this changed how you think about who’s paying the inference cost on your team, forward it to the person making the scheduling decisions.
UNDERTOW 006. The index keeps... growing.


