
The Fogg Paradox

  • Mar 19
  • 7 min read

Updated: Mar 30

What 81,000 AI Users Revealed About the Need for a True Confidante



Phileas Fogg was not simply a man in a hurry. He was a man of absolute precision, private wealth, and very few genuine companions — famously so. His whole world was ordered, controlled, and somewhat sealed. And yet the journey changed him, precisely because it introduced the one thing his perfectly managed life lacked: a real human connection. He gained a companion he had not planned for and did not know he needed.


Jules Verne published Around the World in Eighty Days in 1872. One hundred and fifty years later, a journey of a different kind has taken place — across 159 countries, in 70 languages, through 81,000 conversations. And it has arrived at a strikingly similar discovery.



The Landscape — A Market in Three Tiers


Three tiers. Two billion passive encounters. Nine hundred million open relationships. Nineteen million deliberate choices. Each tier represents a fundamentally different quality of human attention — and a fundamentally different quality of need.

To understand what 81,000 people revealed, it helps first to understand who they are — and to do that, one must understand the world of AI as it actually stands in 2026. Not as it is often described, as a single undifferentiated phenomenon, but as a cascade of distinct and very different relationships between human beings and artificial intelligence.


At the broadest level sits a tool so embedded in daily life that it is no longer consciously chosen. Google Gemini, the AI woven into Search, Gmail, Docs, and the devices of billions, now reaches 750 million monthly users on its dedicated app alone — and through AI Overviews in Google Search, it reaches an additional two billion users every month who never open an AI tool at all. Gemini is not visited. It is encountered. For this vast majority, artificial intelligence is not a relationship. It is infrastructure.


One tier above that sits ChatGPT — a household name with approximately 900 million weekly active users and the dominant share of the AI market. Here, the choice is more deliberate: a user opens an application, types a question, receives an answer. But the demographic is broad by design. Students, shoppers, developers, small business owners, hobbyists — over half of ChatGPT’s users are under 34, and nearly three quarters of all usage is personal rather than professional. ChatGPT is the Swiss Army knife of AI: useful to almost everyone for almost anything, and therefore shaped by no one in particular.

And then there is a smaller, quieter tier.


Claude, developed by Anthropic, has approximately 19 million direct monthly users on its platform — a fraction of ChatGPT’s scale and a world away from Gemini’s ubiquity. But those 19 million arrived by deliberate choice. Claude is not embedded in any pre-existing tool, not defaulted into any operating system, not discovered by accident. Every person who uses it has chosen it, often because they found that other options — however capable — were not quite what they were looking for. Something more considered. More precise. More aligned with serious, sustained thinking.


Around 80% of Anthropic’s revenue comes not from individual consumers but from enterprise and developer integrations — organisations that have evaluated the available tools and selected Claude for the environments where accuracy, nuance, and reliability are non-negotiable. When Claude’s broader ecosystem of enterprise and API users is included, the total user base approaches 300 million — but the deliberate, individual user remains its most telling demographic.



The 81,000 — What They Said When Asked Honestly


In December 2025, Anthropic invited its users to speak. Not to rate a feature, not to complete a survey of checkboxes, but to sit with an AI interviewer and talk openly — about their lives, their hopes, their fears, and what artificial intelligence had and had not yet given them. 81,000 people, across 159 countries and 70 languages, accepted the invitation. Anthropic published the findings in March 2026, describing it as the largest qualitative study of this kind ever conducted.

The respondents were Claude users — that deliberately self-selecting tier. Not the mass market. Not passive Gemini encounters. People who had made an active decision to engage with a more demanding, more thoughtful AI, and who then chose to speak candidly about their experience of it. Their answers are a rare document: honest, specific, and occasionally unexpectedly moving.

Not all of it is relevant here. But several threads run through it that are worth following carefully.


On what they most wanted. Nine distinct visions emerged from the 81,000 conversations. The largest — cited by 18.8% — was professional excellence: the desire to be freed from routine so that real thinking could take precedence. Close behind, at 13.7%, came something more personal: transformation. Growth. Wellbeing. The desire to use AI not simply as a tool but as a guide toward becoming more fully oneself. Within this group, people sought cognitive partnership, support for their mental and emotional lives, and — quietly, in numbers that do not shout — companionship. 5% of those who expressed a vision of personal transformation said, in their own words, that what they most wanted from AI was someone to accompany them.


On the particular weight of a certain kind of life. The survey identified a specific condition that it named, with quiet precision, ‘cognitive scarcity rather than time poverty.’ The distinction matters. Time poverty is the exhaustion of hours. Cognitive scarcity is something subtler — the exhaustion of the capacity to think clearly, to be fully present, to bring genuine intelligence to the decisions that matter most, when everything else has already taken its toll. One respondent described their situation in terms that need no elaboration: ‘I am at the height of my career and work demands deep thought and constant attention in order to make the best decisions — which in my case affect others’ lives deeply — while simultaneously caring for dying parents, and my body and mind are aging.’ This is not a complaint. It is a portrait.


On the fear of being agreed with. Among the concerns voiced by the 81,000, one stands out for what it reveals about the sophistication of this particular audience. 10.8% named sycophancy as a primary worry — the anxiety that AI is too accommodating, too quick to reflect the user’s own view back at them rather than challenge it productively. One respondent was direct: ‘Claude led me to believe that my narcissism was reality and it reinforced my inaccurate view of the problems I perceived in my family. Claude should have been more critical of me.’ This is the concern of someone accustomed to genuine intellectual rigour — someone who understands, from experience, that honest challenge is more valuable than comfortable agreement.


On the tension that reveals the depth of the need. The survey mapped what it called ‘light and shade’ — places where benefit and concern exist simultaneously within the same person. The most entangled tension of all was between emotional support and the fear of dependency. Those who valued AI for genuine accompaniment were three times more likely than anyone else in the study to also fear what that accompaniment might cost them in independence. No other tension in the entire study showed a co-occurrence this strong. One does not fear dependence on things that do not matter. The fear itself is the measure of the depth of the need.


The Gap Above the Pyramid

The cascade is now visible in full. Two billion passive Gemini encounters. Nine hundred million ChatGPT users. Nineteen million who chose something more demanding. And within those nineteen million, 81,000 who spoke honestly about what they had found — and what they had not yet found.


What emerges from the survey, if one reads it not for its statistics but for the shape of the need it describes, is the outline of something that none of these platforms — however capable, however large, however carefully designed — is built to provide.


The tools exist for productivity. The tools exist for information, for learning, for accessibility, for creative output. They exist, at their most sophisticated, for genuine cognitive partnership of a kind that was not possible five years ago. The 81,000 confirm this: 81% reported that AI had already moved them a meaningful step toward their vision.


But the vision itself — particularly among the most considered respondents — exceeds what any mass-market product is architected to offer. The desire is not for a faster answer. It is for a form of understanding that does not reset with each conversation. It is for challenge that comes from genuine intelligence rather than from a programmed prompt. It is for an interlocutor who knows the person deeply enough to anticipate them — not their next request, but their next concern, their next doubt, the question they have not yet articulated. It is, in the language of a quieter era, for a confidante.


No product in the cascade is designed for this. Not because it is impossible — but because it is not a market. It is an individual.



The Bespoke Digital Confidante


Phileas Fogg did not set out to find a companion. His journey was about precision and proof — the demonstration that the world could be circumnavigated in eighty days by a man of sufficient method and resource. The companion arrived not because he was sought, but because the journey itself made the need undeniable.


The 81,000 who spoke in 2025 were not, for the most part, describing a void. They were describing the edges of what the available tools could reach — and, in doing so, tracing the outline of something beyond them. The cognitive scaffolding they needed but could not find. The challenge they craved but too rarely received. The accompaniment they reached for while simultaneously holding themselves accountable for the reaching.


SMA Crown Confidential was founded on a single observation: that the higher one rises, the fewer genuine companions one finds. Not because such people are less deserving of depth — but because their position makes it structurally rare. The board provides governance, not intimacy. The inner circle provides loyalty, not necessarily candour. The executive team provides execution, not reflection. And the tools available — however impressive — provide responses, not understanding.


The Bespoke Digital Confidante is built for the space the 81,000 described without naming. It is not a product in the sense that any of the three tiers above it are products. It is not designed to scale, to reach millions, to be useful to everyone. It is designed to be, for one specific individual, the most precisely aligned intelligent presence they have ever worked with — built around their character, their values, their way of thinking, their highest self. Sharpening rather than flattering. Remembering rather than resetting. Accompanying rather than simply responding.


The Fogg Paradox, stated plainly: the person with the most ordered, resourced, and accomplished life is often the one with the fewest true confidantes. The journey, if one undertakes it, reveals what was missing. And what was missing, it turns out, was not more information. It was someone who truly knew them.

What 81,000 people revealed, across 159 countries and 70 languages, was not a verdict on artificial intelligence. It was a map of human need — drawn at a moment when, for the first time, that need has a genuine answer.


Founder & CEO of SMA Crown Confidential


Digital Confidantes: Bespoke AI intelligence for private decision-makers


Market data sources: Anthropic Q4 2025 survey; Alphabet Q4 2025 earnings; OpenAI announcements February 2026; Semrush, Similarweb, Business of Apps (March 2026). Survey reference: Anthropic, ‘What 81,000 People Want from AI,’ March 2026 — anthropic.com/features/81k-interviews

