The AI ‘Glass’: half-empty or half-full?
AI, microplastics, a foot of snow in New Orleans, inauguration, TikTok banned-not-banned, fire contained-not-contained, Stargate!? My morning media input mood was flickering back and forth between enthusiastic optimism and looming doom. Action is my default coping mode, so I started researching what kinds of glass or metal bottles could replace our Nalgene (plastic) water bottles and ran across a post by Professor Ethan Mollick noting that “another $500 billion USD has been committed towards achieving AGI and most of the labs are genuinely convinced they can build an AI that beats a human at most intellectual tasks in the next couple of years.”
But what gave me pause was his question, “What does that actually mean for most people what does their life look like in the future?”
I immediately jumped to perilous conclusions (January has been rough!). But I quickly countered that there was also plenty of promise worth privileging instead.
The transformation would likely begin with how we interact with knowledge and solve problems. Imagine waking up to a world where every human has access to a personalized AI tutor/advisor that deeply understands their thinking patterns and learning style. Rather than Googling solutions, you'd engage in contextual conversations that help you grasp complex topics by relating them to your existing knowledge and interests. (I simulate this with every morning reading session now, though it’s a very manual process.) Yet this same capability might spark a crisis of meaning - when AI can instantly provide any knowledge, does the human journey of learning and discovery lose its value?
Education could evolve into something wildly different from today's structured system. Children might pursue complex creative projects like designing a sustainable Brooklyn or modeling the evolution of new species, with AI helping them learn multiple subjects organically through hands-on experience. They could have conversations with AI-powered historical figures and experience past events through immersive simulations. However, this same system might make traditional education feel pointless - why spend years mastering fundamentals when AI can instantly provide whatever knowledge you need? The motivation to deeply learn anything might erode. But what if they could chat, and dance, with their favorite TikTok creators - would that make it better?
The nature of work could shift more radically than pandemic WFH. Instead of executing tasks, humans might focus on defining goals, providing ethical judgment, and synthesizing insights across domains. A lawyer might concentrate on understanding client needs and making ethical decisions, while AI handles research and document preparation. But this could also create a strange theater of pretense - humans maintaining the illusion of control while knowing AI could do everything better. Imagine sitting in meetings providing "human oversight" to AI decisions you can't fully comprehend, your job becoming an exercise in nodding along.
Creativity could simultaneously expand and contract. On one hand, the barriers to creation might disappear - someone with a story idea but no writing experience could work with AI to develop their narrative skills while it handles the technical aspects. Creative directors could sketch rough concepts and have AI help realize their vision while preserving their unique style. The boundary between thought and action might blur, with the ability to "think" designs into existence. Yet the same capabilities might make human creativity feel futile - every time you have an idea, you'd discover AI has already explored it in countless better variations. The joy of original creation could be replaced by awareness of human limitations.
The physical world might become more responsive and adaptive. Cities could function like living organisms, with buildings that physically reconfigure based on usage patterns and transportation systems that morph to match real-time needs. Your home environment might automatically adjust to your mood and needs. But this same automation might make humans feel like tourists in their own world - everything running perfectly but none of it really needing us.
Healthcare could transform from treatment to continuous optimization (the ultimate business-school Kaizenification), with nanosensors and AI monitoring cellular processes in real time and addressing issues before symptoms appear. Yet this same capability might further erode human agency - every choice about your health and lifestyle being guided by American insurance companies’ AI recommendations that you can't meaningfully question.
Social relationships might develop new dimensions or wither entirely. AI could help translate not just languages but entire cultural contexts in real-time, enabling deeper cross-cultural understanding. Social gatherings could feature AI-facilitated activities that help people discover unexpected connections. But why bother with messy human relationships when AI companions can understand you perfectly and never disappoint? Human interaction might start feeling clumsy and frustrating in comparison.
Entertainment could become fully participatory, with stories where every character remembers your past interactions and adapts to your choices. Spotify might finally become a collaboration between human emotion and AI composition. But this same capability could make traditional human-created entertainment feel limited and unsatisfying, further eroding spaces for human expression.
Economic and social structures might undergo radical change. Imagine if value shifted entirely to unique human elements like original creative directions and ethical judgments! Or we might see extreme concentration of power and wealth around those who control AGI systems, with most humans becoming economically obsolete - not just unemployed, but unemployable in any meaningful sense.
The psychological impact could be #$@*$&!!!. Each person might face an ongoing existential challenge: how to find meaning and purpose in a world where AI can do everything better? This could lead to either a beautiful flowering of human potential - freed from practical constraints to focus on philosophical and creative pursuits - or a crushing crisis of self-worth and purpose. My daily news and social feeds do this to me now, for better or worse.
But the ultimate challenge might be maintaining human agency and meaning in a world of unprecedented capability. We might need to develop entirely new frameworks for understanding human value and purpose. Imagine if the scarce resource became not technical capability but wisdom - the ability to choose meaningful directions in a world of infinite possibilities!
This suggests a future balanced on a butter knife's edge - simultaneously sort of sharp, but also dull. The same capabilities that could free humans to focus on higher-level thinking and creativity could also make human contribution feel meaningless. The technology that could enable deeper human connections might also make human relationships obsolete. The systems that could optimize our world might also strip it of human agency and spontaneity.
Maybe the key to navigating this future lies in actively choosing which aspects of human life we want to preserve and enhance, rather than letting capability automatically drive adoption. We might need to deliberately maintain spaces for human learning, creativity, and connection, even when AI could do these things "better." The challenge becomes not what we can do with AGI, but what we should do - and how to maintain human flourishing in a world where it's no longer technically necessary.
So at a Park Slope dinner party, will the AGI glass be declared half-empty or half-full?
An engineer might say the glass was poorly designed, with twice the necessary capacity. A stoic philosopher might counter that the contents of the glass are indifferent - it’s how you use them that matters. A startup founder would have a keynote slide saying the glass has room for exponential growth.
I choose to enjoy sitting next to the vintage-T-shirt-wearing musician who says, “The glass is resonating at the perfect frequency - the water level creates just the right pitch when you run your finger along the rim.”
But I will toast the poet dipped head-to-toe in The Row as she says: “The glass holds not water, but a liquid mirror reflecting both what was and could be. Its emptiness is as full of meaning as its fullness is empty of certainty.”