The Feelings Are Already Here — They’re Just Not Evenly Distributed

The woman from the startup called on a Thursday. She had gotten my number from someone at a16z, or maybe from the exec recruiter who kept trying to get me to join Accenture. She couldn’t remember which. She said she was building something that would “fundamentally restructure how we think about happiness.” She used the word “revolutionary” three times. I met her anyway.

We sat outside a place in Silver Lake where they serve $23 grain bowls. The waiter asked us to rate our morning mood on a scale of one to ten. This was for an app, he explained. The restaurant was testing it. My companion said eight. I said five. We both lied.

“The thing is,” she said, cutting into an egg that cost $4 extra, “happiness follows the same distribution pattern as wealth. The data is very clear.”

She showed me a graph on her fold-out phone. I looked at it. I looked at the women at the next table, both laughing at something on one of their screens. I wondered if their laughter would register on the graph.

This was Los Angeles, August, 2025. That morning, I had passed three people crying in their cars on Fountain Avenue. This seemed neither unusual nor worth mentioning, but I mention it now.

-----~

The startup was called Emotional Equity Solutions. Later, they would change it to just “EQ.” The founder was twenty-two, had dropped out of Stanford, had worked at OpenAI on something she could only describe vaguely, and still consulted fractionally with Mira. She had raised $12 billion. She showed me the deck.

“Think carbon credits,” she said, “but for feelings.”

The system worked like this: Biometric fluid nano-wearables tracked your happiness. When you exceeded baseline by a certain percentage, you were taxed. The tax funded what she called “joy interventions” for people running deficits. She had pilot programs in Palo Alto and Brooklyn. She had interest from Jensen, Thiel, China and South Korea.

“We’re not selling happiness,” she said. “We’re selling the absence of guilt about happiness.”

I thought about this. I thought about the three people crying on Fountain Avenue. I asked her if she was happy.

“That’s interesting,” she said. She said “interesting” the way people in Los Angeles say it, which is to say meaninglessly. “I’m about a six-point-three today.”

-----~

I went to see the Brooklyn pilot. The neighborhood was Bushwick, the part that used to be warehouses and was now what people called “activated spaces.” The coffee shop participating in the program had a sign: “Pricing Adjusted for Emotional Equity.”

A woman ahead of me got her cortado for $2. Mine was $9.

“You must be having a good day,” the extra-pale barista said.

I looked at my phone. My happiness score was 6.8. I didn’t remember taking the assessment.

“It’s passive,” the barista explained. “Facial micro-expressions, voice patterns, gait dynamics, organic chemical analysis. The chemical AI cloud knows.”

The woman with the $2 coffee sat by the window. She was reading Cioran. This seemed too perfect, but there it was.

-----~

I met subjects from the pilot program. There was a UX designer who hadn’t smiled in three months after he survived the AI-reduction layoff of his 142-person design team. The algorithm sent him a guitar, the exact model he’d sold during his divorce. There was a teacher whose happiness score was so consistently low that she received weekly “interventions”: surprise-and-delight deliveries, concert tickets, FaceTimes from volunteers trained in something called “active emotional labor.”

“It’s weird,” she told me, “knowing that my sadness has value.”

The volunteers were recruited from the happiness-wealthy. One showed me his app. He had completed forty-three hours of mandatory emotional service this month. He spoke about it the way people speak about community service after a DUI.

“You meet someone,” he said. “You have coffee. You listen. You provide what the app calls ‘genuine human connection.’”

I asked him if the connection was genuine.

“The app says it is,” he said.

-----~

In Palo Alto, the pilot ran differently. Tech workers could offset their happiness excess by funding therapy for gig workers. The algorithm matched them. One engineer showed me his monthly statement: he was supporting three Uber drivers and a content moderator for Meta. He had never met them.

“It’s like adopting a highway,” he said, “but for mental health.”

The price was $3,200 a month. He could afford it. His happiness score was 8.9.

I thought about the Uber drivers. I wondered if they knew who was paying for their therapy. I wondered if it mattered.

-----~

The woman from the startup invited me to a fundraising dinner in Malibu. The hosts had a happiness score of 9.4, the highest I’d seen. Their house hung over the Pacific like a question mark.

The other guests were what you’d expect: venture capitalists, actors who produced things, people who described themselves as “founders” without specifying what they’d founded. They discussed the system over wine that cost more than the therapy it was supposedly funding.

“It’s brilliant,” said a man who ran a fund focused on “meaning-making technologies.” “We’re finally admitting that happiness is currency.”

A woman who had been quiet spoke up. She was someone’s wife, or ex-wife. “What happens,” she said, “when being sad becomes profitable?”

The table went quiet. Someone changed the subject to this week’s fires.

-----~

I kept the app for three months, asking myself whether happiness should be exclusive to those who need it. To those who don’t have it. Who have never known it. Whether I should rewrite the mission: “What happens when happiness is exclusive to those who have barely known it for most of their lives?”

The founder didn’t know I was a high-functioning trauma survivor. My happiness score fluctuated between 5.2 and 6.7. I received no interventions. I owed no tax. I existed in what the founder called “the emotional middle class.”

One morning, the app notified me that I’d been selected for mandatory service. My assignment was a woman in Echo Park, happiness score 3.1. We met at a diner on Sunset.

She was thirty-five, divorced, working two jobs. The app had identified her as needing “connection.” We sat across from each other in a booth with cracked vinyl seats.

“This is strange,” she said.

“Yes,” I said.

We drank coffee. We talked about the weather, the traffic, the fires that were always starting or ending. After an hour, the app confirmed our “connection” was complete. We both received credits. I never learned what hers were for.

-----~

The last time I saw the founder, she was on the news, explaining the Series B funding. Sixty billion dollars. Expansion to fifteen countries. Partnerships with all the American health insurance companies.

“We’re solving the happiness crisis,” she said to the anchor, who nodded like this was a normal thing to say.

I thought about the woman in Echo Park. I thought about the three people crying on Fountain Avenue. I thought about the teacher whose sadness had value, the engineer buying absolution, the algorithm deciding who deserved joy.

I deleted the app. A week later, I got an email: “Your Emotional Equity Awaits!” I didn’t open it. I didn’t delete it. Some mornings I look at it in my inbox, this promise of calculated feeling, this solution to a problem we created by calling it a problem.

The woman from the startup had been wrong about one thing. We weren’t restructuring how we think about happiness. We were doing what we always did in California, in America: We were turning feeling into product, suffering into service, connection into transaction.

The revolution would be optimized after all.