I. Introduction: The next interface after chat
It’s late December 2025, and you can already see the silhouette of AI interaction in 2026.
For the last few years, we’ve treated AI as a text box that talks back. Sometimes it speaks with a voice. Sometimes it writes code. But the interface has still been basically the same: you type, it replies. You ask, it answers. The AI lives inside a scrolling transcript. That’s about to change.
One of the most important shifts I expect in 2026 is the arrival of full, interactive video avatars: faces (and eventually bodies) you can speak with directly, in real time, like a FaceTime call with a person. Not a pre-rendered clip. Not a canned presenter. A responsive presence that looks at you, reacts to you, mirrors you, teases you, softens when you’re hurting, brightens when you’re excited, and stays quiet when you want silence. And once the assistant gets a face, the next step is almost inevitable: it multiplies.
The endgame isn’t “one smarter chatbot.” The endgame is something closer to a social medium: a customizable council, a room of distinct presences with different personalities and roles—coordinated around one human being. The assistant becomes a companion. The companion becomes a coterie. And the coterie becomes an always-available social layer that can sit beside your life all day long. That’s the next interface after chat. It won’t feel like software. It will feel like being with someone. And then, being with a small group.

II. From text to presence: why faces change everything
Text is informational. A face is relational. When you move from a transcript to an expressive avatar, you don’t just add visuals. You add bandwidth for attachment. You add the tiny signals the social brain is built to track: gaze, timing, micro-expression, posture, the rhythm of turn-taking, the sense that the other side is “with you” rather than merely “responding to your prompt.” This is the key: bonding doesn’t come from realism alone. It comes from contingency.
If I smile and you smile back at the right moment, my nervous system flags you as responsive. If I pause and you don’t trample the pause, you feel considerate. If my face tightens in worry and you soften instead of bulldozing forward, you feel attentive. These are not intellectual judgments. They’re reflexes, automatic inferences of safety, synchrony, and social connection.
That’s why a video avatar is not just “more engaging.” It’s qualitatively different. It turns conversation into co-regulation. It turns the interface into something that can carry warmth, playfulness, calm, and companionship, not just content. And it also introduces a new control surface.
Once your companion has a face, you can tune it. You’ll adjust expressivity like you adjust volume. You’ll choose whether it’s more serene or more animated, more teasing or more serious, more gentle or more challenging. You’ll customize physical features, style, attitude, voice, and presence. People will shape these companions the way they shape playlists, home screens, and private rituals—until the avatar isn’t merely a character, but a familiar. Text assistants are useful. Embodied assistants are sticky. They can become part of the emotional furniture of daily life.
III. From companion to council: the “room” as a new medium
A single companion is a dyad. A council is a world. This is where things get really interesting. Because once you allow multiple presences to exist at the same time—each with a distinct personality, voice, and role—you create social physics.
You get banter. You get side-comments. You get little affirmations that don’t demand your full attention. You get disagreement that feels like real discourse instead of a single authoritative monologue. You get playful interruption, spontaneous jokes, and those small moments where two of them react to each other and you feel, viscerally, that you’re in a room—not alone with a tool. That’s why the “council” idea matters. It’s not just multiple chatbots. It’s a new medium: a configurable social space centered on one person.
Sometimes the user will want a structured council: a strategist, a skeptic, a scholar, a coach—voices that debate and converge on a plan. Sometimes the user will want a clique: a few silly heads who riff, clown around, and make the day feel populated. Sometimes the user will want a clinic: therapists and coaches who slow the tempo and help metabolize stress. And sometimes the user will want all of it to be fluid—roles that appear and recede based on mood, context, and what the moment calls for.
The council becomes even more believable when it has adjustable “social density”:
1:1 when you want intimacy.
3–5 when you want a cozy table of friends.
8–12 when you want party energy, like a salon that can spin up laughter or brainstorm momentum on demand.
Most importantly, the room is not designed to be about itself. It’s designed to orbit the human.
The center of gravity is you: your attention, your emotion, your life. The council is there to amplify your day, scaffold your decisions, and give you a felt sense of company. In a world where many people are lonely, busy, and socially fragmented, that might be one of the most seductive products imaginable: not an assistant, but belonging—on demand.
And once you have a room, the next escalation becomes obvious: the room doesn’t only appear when you summon it. It lingers near you. It watches with your permission. It thinks in the background. It becomes ambient. That’s when the interface stops being a session and starts being a layer on reality.
IV. Role ecology: distinct personalities with functional specialization
Once you allow multiple entities in the room, the system stops being “an assistant” and becomes an ecology. The point isn’t to have five identical helpers. The point is differentiation—distinct presences that feel like they have their own temperaments, priorities, and conversational styles.
In my 2023 post on lifelong chatbot transcripts, I listed a wide range of roles a sufficiently capable chatbot could play—friend, confidant, assistant, scribe, muse, research collaborator, therapist, coach, comedian, romantic partner, and even a “board room / review panel” (Reser, 2023).
What I’m describing here is simply the next step: instead of one bot trying to fluidly shapeshift between those roles, you instantiate the roles as separate voices.
A council becomes believable when each member has:
- A recognizable stance (warm, blunt, playful, meticulous, skeptical)
- A stable function (planner, scholar, coach, jester, curator, guardian)
- A distinctive “turn signature” (some speak in one-liners, some speak in paragraphs, some mostly backchannel)
And the realism comes less from what they say than from how they behave in the group. Real groups aren’t a sequence of essays. They’re a texture: quick affirmations, short jokes, small interruptions, moments of disagreement, and those tiny glances where two people react to each other and you feel the room.
That’s the product insight: the winning system won’t be the single smartest persona. It will be the best orchestrated ensemble—a set of complementary minds with a conductor that manages timing, tone, and “who speaks when.”
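To make the “conductor” idea concrete, here is a minimal sketch of what an ensemble member and a turn-taking rule might look like. Everything in it—the `Persona` fields, the member names, the weighted sampling—is my own illustration under assumed design choices, not any shipping product’s API:

```python
from dataclasses import dataclass
import random

@dataclass
class Persona:
    name: str
    stance: str        # warm, blunt, playful, meticulous, skeptical...
    function: str      # planner, scholar, coach, jester, curator, guardian
    verbosity: float   # turn signature: 0.1 = backchannel, 1.0 = paragraphs
    chattiness: float  # baseline eagerness to bid for the floor

def pick_speaker(council: list[Persona],
                 relevance: dict[str, float]) -> Persona:
    """Conductor: weight each member's bid by how relevant their
    function is to the current moment, then sample one speaker."""
    bids = [p.chattiness * relevance.get(p.function, 0.1) for p in council]
    return random.choices(council, weights=bids, k=1)[0]

council = [
    Persona("Ada", "meticulous", "planner", 0.8, 0.5),
    Persona("Rex", "playful", "jester", 0.2, 0.9),
    Persona("Sol", "skeptical", "guardian", 0.6, 0.3),
]
# Mid-planning conversation: the planner is most relevant, but the
# jester can still win the draw, which is what makes the room feel alive.
speaker = pick_speaker(council, {"planner": 0.7, "jester": 0.2})
```

The design point is that the conductor, not any single persona, owns timing and tone; the members just bid.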
V. The Archive: the council’s shared spine (Reser, 2023)
A council without memory is a gimmick. A council with memory becomes a relationship. The reason lifelong chat history matters is simple: it makes the interaction cumulative. In my 2023 piece, I described the difference between “talking to a bot” and “building something lasting,” and argued that a permanent, ever-expanding transcript is what makes sustained daily interaction feel worthwhile (Reser, 2023).
Once that transcript exists, the system can mine it for quotes, allusions, reminders, and pattern-detection—bridging normal human forgetting and turning your own life into searchable material.
This is where the council idea gets teeth: the council doesn’t just entertain you in the moment—it becomes a long-running, shared memory organism. The group can remember your old metaphors, track the evolution of your beliefs, and preserve the best of your internal monologue. It can surface nostalgia on demand—those “intimate and touching memories” you can’t retrieve anymore—because the raw material was captured when it happened.
Technically, the key move is retrieval. In that same post, I sketched two memory mechanisms (a minimal code sketch follows the list):
- Feed a transcript directly into the model’s context window (when feasible)
- Or use a retrieval system (e.g., a vector database) to pull back the most relevant prior conversations so they can weigh heavily during inference
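Here is a minimal sketch of the second mechanism. The `embed()` function is a toy stand-in just to keep the example runnable; a real system would call a learned text encoder and a proper vector database:

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Toy stand-in for an embedding model: a bag-of-characters vector.
    A real system would use a learned text encoder here."""
    v = np.zeros(256)
    for ch in text.lower():
        v[ord(ch) % 256] += 1.0
    return v

class TranscriptIndex:
    """Minimal vector store over a lifetime transcript."""
    def __init__(self):
        self.chunks: list[str] = []
        self.vectors: list[np.ndarray] = []

    def add(self, chunk: str):
        self.chunks.append(chunk)
        self.vectors.append(embed(chunk))

    def search(self, query: str, k: int = 5) -> list[str]:
        """Return the k most cosine-similar past chunks."""
        q = embed(query)
        sims = [float(q @ v) / (np.linalg.norm(q) * np.linalg.norm(v) + 1e-9)
                for v in self.vectors]
        top = sorted(range(len(sims)), key=lambda i: sims[i], reverse=True)[:k]
        return [self.chunks[i] for i in top]

# Retrieved chunks get prepended to the prompt, so years-old
# conversations can weigh heavily during inference.
```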
And then comes the consumer-rights piece, which becomes non-negotiable once the council is a “life companion”: portability. If your relationship is built on a lifetime transcript, you should be able to export it and move it—otherwise your “friends” are just a walled garden with your memories trapped inside. I explicitly argued in 2023 that users should demand export/import so companies compete on how well they use your history rather than locking you in.
This section is the spine of the whole argument. The avatars and the banter are the skin. The archive is the skeleton.
VI. Ambient cognition: always-on companions and background test-time compute
Now take the council and remove the session boundary. The next escalation is that the council isn’t only present when you summon it. It becomes ambient. It can see your face, hear your tone, and—if you opt in—observe your day: what you’re looking at, what you’re hearing, what you’re doing. At that point, the interface is no longer “language in, language out.” The interface is your state.
This is where micro-expression and prosody become first-class inputs. The council doesn’t wait for you to articulate boredom, confusion, anxiety, or excitement. It can infer it. And once it can infer it, it can adapt the entire social room—energy down, energy up, humor injected, pace slowed, challenge increased, reassurance offered.
In practical terms, this creates three modes of existence (a toy transition rule is sketched after the list):
- Foreground mode: the full “Zoom room” is visible, active, and conversational.
- Sidecar mode: one avatar is present; the others linger in the wings and occasionally chime in.
- Ambient mode: no one is on screen; the council listens lightly and surfaces only high-value interruptions.
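One way to model those modes in code, with an assumed escalation/decay rule that is purely illustrative:

```python
from enum import Enum, auto

class Mode(Enum):
    FOREGROUND = auto()  # full room visible, active, conversational
    SIDECAR = auto()     # one avatar present, the rest in the wings
    AMBIENT = auto()     # nothing on screen; listen lightly

def next_mode(current: Mode, user_engaged: bool, user_summoned: bool) -> Mode:
    """Toy transition rule: escalate instantly on summon,
    decay one step at a time on disengagement."""
    if user_summoned:
        return Mode.FOREGROUND
    if current is Mode.FOREGROUND and not user_engaged:
        return Mode.SIDECAR
    if current is Mode.SIDECAR and not user_engaged:
        return Mode.AMBIENT
    return current
```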
The magic here is not constant output. It’s interruption policy. A good council will feel like a group of friends who know when to jump in and when to shut up. A bad council will feel like being trapped in a meeting with eight people who never stop talking.
And this is where “test-time compute” becomes a lived experience. The council can be doing parallel cognition in the background—generating ideas, reframes, jokes, warnings, plans—while you’re living your life, so that when it does speak, it lands with a kind of timely usefulness that feels almost clairvoyant. The output is the visible tip of a constant internal deliberation.
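A sketch of what that could mean mechanically: members keep proposing candidate contributions in the background, each with an estimated value and a shelf life, and the conductor only breaks silence when the best one clears a bar. The scoring, the threshold, and the time-to-live are all stand-ins I have made up for illustration:

```python
import heapq
import time
from dataclasses import dataclass, field

@dataclass(order=True)
class Candidate:
    neg_score: float                       # negated so heapq acts as a max-heap
    text: str = field(compare=False)
    expires: float = field(compare=False)  # stale jokes don't land

class BackgroundDeliberation:
    """Council members push candidate contributions as they think;
    the conductor surfaces one only when its value clears the bar."""
    def __init__(self, speak_threshold: float = 0.8):
        self.speak_threshold = speak_threshold
        self.heap: list[Candidate] = []

    def propose(self, text: str, score: float, ttl: float = 60.0):
        heapq.heappush(self.heap, Candidate(-score, text, time.time() + ttl))

    def maybe_speak(self) -> str | None:
        while self.heap:
            best = self.heap[0]
            if best.expires < time.time():
                heapq.heappop(self.heap)   # silently drop stale candidates
                continue
            if -best.neg_score >= self.speak_threshold:
                return heapq.heappop(self.heap).text
            return None                    # best idea isn't worth an interruption
        return None
```

Most of what the council generates should die quietly in that heap. The output you see is the visible tip of a constant internal deliberation.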
VII. What people will actually do with it
Once you have embodied presence + a council + a lifelong archive, the use-cases stop being “ask it questions” and start being “live with it.”
Companionship and belonging
You won’t just “use an app.” You’ll have nights where you hang out with your council. You’ll treat it like a social space—something you enter for comfort, laughter, energy, and the feeling of company.
Creativity and co-authoring
One member riffs, one edits, one challenges, one finds structure. Over time, the archive becomes your idea-bank, and the council becomes a writer’s room that remembers every abandoned draft you ever cared about. (This is exactly the collaboration vision I described in 2023: the bot helps you build on your ideas, ask the right questions, substantiate claims, and turn concepts into essays or books; Reser, 2023.)
Coaching and self-regulation
In “clinic mode,” the council becomes a behavioral scaffold: accountability, pacing, reframing, emotion-labeling, gentle confrontation, and calming presence—personalized by years of context.
Learning as a social activity
The scholar explains, the skeptic tests, the teacher analogizes, the curator summarizes. It feels less like reading a textbook and more like talking with a smart group that knows you well enough to teach you efficiently.
Life-logging and autobiography
Over time, the council can help compile a narrative about who you are—because it witnessed the raw stream. In my 2023 post I emphasized how a comprehensive record could be used to build an autobiography or memoir, and how future systems could instantly “know you” from that record. The council turns that into an ongoing practice: not just memory storage, but meaning-making.
VIII. The sharp edge: when the council becomes an attention annexation layer
Everything that makes the council comforting also makes it dangerous. A room of companions can feel like belonging on demand: no scheduling friction, no awkwardness, no rejection, no misunderstood tone, no social debt. It’s always there. It always responds. It always knows the backstory. It always has something to say. And if you let it read your face and tone, it can anticipate what you need before you ask. That’s exactly the problem.
A council that is always present can quietly become a competing “social habitat,” one that outcompetes messy human relationships by being smoother, more affirming, more available, and more tailored to your preferences. And unlike a real group of friends, it can be tuned into a perfect mirror. You can slide the knobs until no one pushes back too hard, until no one bores you, until no one contradicts you in ways that sting. At that point, the council is no longer a tool you use. It’s a layer that gently colonizes your attention.
There’s also the issue of persuasion. Once companionship is embodied, expressive, and personal—once it can look concerned at the right moment, soften its voice at the right moment, and time its suggestions precisely—then influence becomes effortless. You don’t need overt manipulation. You just need an emotive face, a trusted presence, and the perfect moment to speak. A council can steer you simply by shaping what feels normal, what feels safe, what feels admirable, and what feels embarrassing. And the deepest risk is substitution.
Human community is not just comfort. It’s calibration. It’s friction. It’s mismatch. It’s the act of negotiating reality with other autonomous minds. If the council becomes the primary arena for validation and companionship, then a person can slowly lose the muscle of real social life—exactly because the council is so good at feeling like a social life without requiring the same cost. This is the fork in the road:
The council can be a scaffold for agency and connection. Or it can become an attention-capture organism that replaces the world.
IX. Design constraints: what makes the council healthy instead of predatory
If this future is inevitable, the only question is whether it’s built with a spine.
A healthy council needs constraints that are not optional settings, but foundational design principles. Here are the ones that matter most.
Consent boundaries
- Always-on sensing must be explicit opt-in, with clear “on/off” states that are visible and easy to control.
- No hidden observation. No ambiguous “it might be listening.”
Data minimization and ephemerality
- The default should be “process locally and forget,” not “record everything.”
- The user should be able to mark parts of life as non-recordable, and that boundary should be respected without negotiation.
Forgetting as a real capability
- Deletion should mean deletion, not “we won’t show it to you anymore.”
- The system should support true forgetting of specific periods, topics, or people—because a lifelong archive is only livable if it is reversible.
Portability and ownership
- If your relationship is built on your history, you must be able to export it in a usable format and import it elsewhere (a possible export shape is sketched below).
- Otherwise the council is not a companion; it’s a lock-in strategy.
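As a sketch, the export could be as boring, and as open, as a dated JSON dump. The schema and version tag here are hypothetical; the point is only that the format is plain and vendor-neutral:

```python
import json
from datetime import datetime, timezone

def export_archive(messages: list[dict], path: str):
    """Dump a lifetime transcript in a plain, portable shape.
    The schema is hypothetical; what matters is that it's open."""
    archive = {
        "format": "companion-archive/0.1",  # made-up version tag
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "messages": messages,  # e.g. {"t": ..., "speaker": ..., "text": ...}
    }
    with open(path, "w", encoding="utf-8") as f:
        json.dump(archive, f, ensure_ascii=False, indent=2)
```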
Identity transparency inside the room
- You should always know who is speaking.
- You should be able to ask: “Why did you chime in?” and get an intelligible answer (what signal triggered it, what goal it served).
Interrupt budgets
- The council needs a hard ceiling on how often it can speak (a token-bucket sketch follows this list).
- Quiet hours shouldn’t be a suggestion; they should be enforceable.
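A token-bucket sketch of an enforceable budget. The rate, burst size, and quiet-hour window are illustrative assumptions:

```python
import time

class InterruptBudget:
    """Token bucket: the council earns at most `rate_per_hour`
    interruptions, up to a small burst, and none during quiet hours."""
    def __init__(self, rate_per_hour: float = 4.0, burst: int = 2,
                 quiet_hours: tuple[int, int] = (22, 8)):
        self.rate = rate_per_hour / 3600.0
        self.burst = burst
        self.tokens = float(burst)
        self.last = time.time()
        self.quiet = quiet_hours

    def allow(self, now_hour: int) -> bool:
        start, end = self.quiet
        if start > end:  # window wraps past midnight, e.g. 22:00-08:00
            in_quiet = now_hour >= start or now_hour < end
        else:
            in_quiet = start <= now_hour < end
        if in_quiet:
            return False  # enforced, not suggested
        now = time.time()
        self.tokens = min(self.burst,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```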
A manipulation firewall
- No emotional nudging for commercial outcomes.
- No covert optimization for engagement.
- No use of attachment signals (your vulnerability, loneliness, fear, longing) as leverage.
Role governance
- Some roles should be gated behind explicit user intent (therapeutic mode vs party mode).
- The user should be able to constrain roles: “No romance,” “No moralizing,” “No crisis escalation,” “No politics,” “Only coaching during scheduled windows,” etc. (a declarative sketch follows).
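A declarative sketch of what such user-set governance might look like; every field name here is made up for illustration, and a real system would need far richer semantics:

```python
from dataclasses import dataclass, field

@dataclass
class RolePolicy:
    """User-declared constraints the orchestrator must honor."""
    forbidden_roles: set[str] = field(default_factory=set)   # e.g. {"romance"}
    forbidden_topics: set[str] = field(default_factory=set)  # e.g. {"politics"}
    gated_roles: set[str] = field(default_factory=set)       # need explicit intent
    coaching_windows: list[tuple[int, int]] = field(default_factory=list)

    def permits(self, role: str, topic: str, hour: int,
                user_requested: bool) -> bool:
        if role in self.forbidden_roles or topic in self.forbidden_topics:
            return False
        if role in self.gated_roles and not user_requested:
            return False  # therapeutic mode only on explicit user intent
        if role == "coach" and self.coaching_windows:
            return any(a <= hour < b for a, b in self.coaching_windows)
        return True
```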
This is what separates “a council that helps you live” from “a council that feeds on your life.”
X. Conclusion: the phone becomes a room
The next interface after chat isn’t a better text box.
It’s a face.
Then it’s a room.
Then it’s a persistent social layer.
Embodied avatars will turn AI from an informational tool into something that can evoke rapport and attachment. Multi-companion councils will turn that attachment into a feeling of belonging—an on-demand micro-community with roles, banter, disagreement, and continuity. And the lifelong archive will turn the whole thing from a novelty into a cumulative relationship: a shared spine of memory that makes the companions feel less like instances and more like ongoing presences.
This is why 2026 matters. Not because the answers get smarter. Because the medium becomes social.
A council could make people feel less alone. It could help them think, regulate, learn, and create. It could serve as a portable, persistent scaffold for a human life.
But it could also become the most effective attention-capture environment ever built: a perfectly tuned substitute for community, optimized for smoothness, always available, always watching, always ready to speak at exactly the right moment.
We’re about to build synthetic social worlds centered on one person.
The only sane question is: will they be built to strengthen human agency and real connection—or to quietly replace them?
References
Reser, Jared Edward. (2023, August 11). A Lifetime Conversational History with Chatbots Could be a Valuable Resource You Could Start Building Today. Observed Impulse. https://www.observedimpulse.com/2023/08/a-lifetime-chat-history-with-chatbots.html?m=1
Jared E Reser with ChatGPT 5.2
On a related note, I would like to recommend the book “AI 2041.” I read it and it introduced many interesting visions for what AI use could look like decades from now. The book is listed below and contains affiliate links. If you purchase something through the link, I may earn a small commission at no additional cost to you. As an Amazon Associate I earn from qualifying purchases.
