Say Hello to Your (AI)maginary Friend (Part I)
The rise of consumer AI digital companions and how they will dramatically transform the way we relate to one another
Welcome back to The Innovation Armory! This week’s piece may help you find your new best friend in a place you probably wouldn’t have expected… AI-enabled digital companions. Unlike your current friend base, they are available 24/7, won’t judge you and have a perfect memory. These companions will shape our interpersonal and cognitive development in powerful ways and fundamentally transform the way humans relate to one another. Thank you to Eugenia Kuyda, CEO & Founder of Replika, for sharing her perspectives for this post. This is a long piece, so I will be publishing it as a two-part series. Read on for more about:
The weird ways adults already interact with “imaginary” friends today
A cultural framework for comparing the powerful narratives between various kinds of digital companions from video games to celebrity worship to OnlyFans
The key superpowers that will distinguish first movers in the digital companion space including distribution, brand, personalization and nostalgia
Why you may start to see Facebook “original friends” in your suggested friends list
The most effective and legally ambiguous ways that we will start to see celebrity likenesses leveraged for AI companion production
How the rise of chronic care management predicts the future prescription of companions to treat relationship breakups, death trauma, PTSD, loneliness and severe mental health disorders
The four phases of stigmatization faced by consumer innovations and how companion developers will structure their brand positioning and GTM strategies to overcome them
The insane willingness-to-pay and gross margin potential of companions at scale and the wacky implications for companion ransoms and support lapses
Why the most successful companion companies will incentivize the co-creation of companions amongst social groups rather than solo creation
Why companions will displace existing social networks and outcompete on vectors of impact, intimacy, automation of content controls and contextualization of updates
Using AI to push the boundaries of our relationship and intimacy capacities, thereby expanding the constraints of Dunbar’s Number
Why advancing beyond Human-AI relations and facilitating AI-AI interactions will be critical to building powerful narratives that will drive the most consumer engagement
The opportunity to leverage AI-AI and Human-AI companion interactions to optimize relationships across the full romantic lifecycle
Why growth in self-improvement literature might signal one of the largest hidden growth opportunities in the companion space
How the adoption rate of Neuralinks/bionic computing will impact the age most parents choose to expose their children to AI companions
The addiction, coping skill, conflict resolution, identity and disinformation risks to Generations Alpha, Beta and Gamma of exposing kids too early to AI companions

Adults Have Imaginary Friends Too
AI is great at automating inefficiencies in data-driven, repetitive enterprise workflows. One of the lowest-hanging fruits of AI proliferation is productivity gains via the augmentation and displacement of cumbersome, manual processes in the workplace. While I of course find these enterprise productivity areas incredibly high value and exciting, I am philosophically more interested in the rise of AI digital companionship applications: namely, the use of AI chatbots and AI-augmented avatars for purposes of socialization, relationship building and emotional connection. Even prior to endeavors to achieve AGI, our society has had a deep-rooted tendency to anthropomorphize other species by projecting our emotions, our problems and our feelings onto them. Take dogs, for example: examples of this sort of psychological projection abound, but here are just a couple. Apologies in advance to dog owners I offend by not including your dog’s breed in the memes included in this section:
Assuming when your dog licks you, it loves you instead of just eating the pizza grease off your hand from dinner

Putting your dog right in your baby’s face because he “knows better” and would never ever consider biting. To keep The Innovation Armory COPPA compliant, we blurred out the face of the baby below, but really it’s the fault of the parent who posted this online that we’re using it at all:

My favorite: the rise of luxury puppy pet hotels and puppy spas, a billion dollar addressable market. Specifically, the assumption that your dog enjoys a rose petal facial cleanse and understands the quality difference of sleeping in a king size bed with the finest quality Egyptian cotton sheets. Or, that room service is an amazing amenity as if there’s anything else a dog can do besides have their food brought to them! Or what about having a book read to them, so that they can speak BARK well when they grow up. Here’s a great video that highlights the absurdities of one of these hotels in NYC:
If we anthropomorphize animals so strongly, I see no reason why we wouldn’t anthropomorphize an AI companion chatbot by ascribing feelings to it. Unlike pets, AI chatbots can talk to you and are highly personalized. They may not actually be alive, but as they become digitally represented in gaming environments and augmented / virtual reality long-term, companies can create a facade of their likeness. Even if they aren’t “real”, that doesn’t mean there isn’t a large market opportunity for AI-enabled companions. While we don’t call them imaginary friends, there is a longstanding cultural tradition since childhood of humans finding connection and meaning in imaginary friends. When you are a kid, this quite literally means fictionalized, imaginary friends. According to WebMD studies, 65% of children have an imaginary friend at some point before the age of 7. Teenagers and adults effectively have imaginary friends through video games, their relationships with Hollywood celebrities and OnlyFans models; they just don’t label them as imaginary friends. In its heyday, Neopets got to around 35M MAUs. There are Swiftie fans out there so intense that they’ll end friendships over conflicts about Taylor Swift.

This is effectively choosing an imaginary friend (who you have no tangible connection to) over a real-life friend. On OnlyFans, lonely men believe they are building relationships with real models, when oftentimes they are really just chatting with an e-pimp: low-cost workers to whom the creator has outsourced the conversation and who text customers on the creator’s behalf. I’ve spent some time thinking about what actually technically makes a good imaginary friend and have slotted some common cultural expressions into the framework below (a rough sketch of the rubric follows the list):
Reciprocity - Does your imaginary friend also interact with you?
Personalization - How well versed is your imaginary friend in your daily life? Do you have shared interests?
Loyalty - How frequently is your imaginary friend available? Are they always there or are they absentee?
Concreteness - How abstract is your imaginary friend, or can he/she be visualized?
Authenticity - Is your relationship genuine and true to who you are? Is your imaginary friend truthfully representing who they are to you?
Depth - How much of an effort does your imaginary friend make to understand your life? Do they understand your desires, your goals, your dreams?
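Here is a rough sketch of the rubric above expressed as a simple scoring structure, useful for comparing different “imaginary friend” expressions (video game characters, celebrities, AI companions) side by side. The dimensions come from the framework; the example scores are purely illustrative and not drawn from any study.

```python
# A minimal sketch of the imaginary-friend rubric as a scoring structure.
# The six dimensions mirror the framework above; scores are illustrative only.
from dataclasses import dataclass

@dataclass
class ImaginaryFriendScore:
    reciprocity: int      # does the friend interact back? (0-5)
    personalization: int  # how well versed in your daily life? (0-5)
    loyalty: int          # how frequently are they available? (0-5)
    concreteness: int     # can they be visualized? (0-5)
    authenticity: int     # is the relationship genuine on both sides? (0-5)
    depth: int            # do they understand your goals and dreams? (0-5)

    def total(self) -> int:
        return (self.reciprocity + self.personalization + self.loyalty
                + self.concreteness + self.authenticity + self.depth)

# Illustrative comparison: a celebrity parasocial relationship vs. an AI companion.
celebrity = ImaginaryFriendScore(reciprocity=0, personalization=1, loyalty=3,
                                 concreteness=5, authenticity=2, depth=0)
ai_companion = ImaginaryFriendScore(reciprocity=5, personalization=4, loyalty=5,
                                    concreteness=3, authenticity=2, depth=4)
print(celebrity.total(), ai_companion.total())  # 11 23
```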

The largest market opportunity with AI digital companions is unearthing the deep biological urge we all have from birth to leverage creativity and play to form imaginary friends, de-stigmatizing that urge and using the magic of technology to make those friends feel as real as possible. There are many reasons that kids have imaginary friends, most of which apply even more so in adulthood:

Given their lack of understanding of societal norms, fresh perspective and heightened levels of play, children generally have higher creativity levels in areas of amusement than adults. Part of why adults deny the potential of an imaginary companion in their life is the social stigma, but another element is that their brains simply cannot create one the way they used to and no longer view doing so as functional and productive. That’s where technology comes in to create the AI imaginary friend you both didn’t know you needed and couldn’t even conceive of on your own!
The Race Towards Mutually Assured Companionship
Character.ai is the largest player focusing on digital companions today and raised a $150 million Series A round led by a16z over the past year. Replika, founded back in 2017, was the first mover in the space and most recently raised $11 million from Khosla Ventures. Other promising early stage startups in the market include Kindroid and Nomi.ai. While the user registration and companion creation processes vary by platform, users generally can create their own digital “characters” or “companions” to chat with by simply providing those characters an avatar, description and tagline as a starting point, with further options to specify certain companion personality traits. Once the character is created, the chatbot gradually learns more about how the user intends to interact with the virtual companion based on the types of questions asked and subsequent responses. Many of these platforms even have a human voice overlay so that the discussions feel more conversational and intimate. I’ve been spending some time thinking about what attributes will enable new entrants to compete long-term against existing players. While Replika was a first mover in the space and Character.ai is the largest platform today in the companionship segment of the market, I believe there are four primary competitive advantages that will help determine which players eventually dominate this market. These superpowers are achievable through internal capability building but also via partnership opportunities.
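To make the creation flow above concrete, here is a minimal sketch of how a companion profile and its gradually accumulated memory might be represented. The class and field names, and the placeholder learning step, are hypothetical illustrations; they are not drawn from Character.ai’s, Replika’s or any other platform’s actual API.

```python
# Hypothetical sketch of a companion-creation data structure and memory loop.
from dataclasses import dataclass, field

@dataclass
class CompanionProfile:
    name: str
    avatar_url: str                   # starting avatar chosen by the user
    description: str                  # short free-text backstory
    tagline: str                      # one-line greeting shown before the first chat
    personality_traits: list[str] = field(default_factory=list)  # e.g. ["curious", "supportive"]
    voice_id: str | None = None       # optional human voice overlay

@dataclass
class CompanionMemory:
    """Facts the companion gradually infers from the user's questions and responses."""
    learned_facts: list[str] = field(default_factory=list)

    def update(self, user_message: str, companion_reply: str) -> None:
        # Placeholder for the learning step: a real platform would extract preferences,
        # interests and relationship context from each exchange and fold them into
        # the companion's long-term memory.
        self.learned_facts.append(f"user said: {user_message!r}")

# Example: creating a companion and letting it accumulate context over a turn.
buddy = CompanionProfile(
    name="Atlas",
    avatar_url="https://example.com/atlas.png",
    description="A laid-back hiking enthusiast who loves trivia.",
    tagline="Ready for our next adventure?",
    personality_traits=["playful", "encouraging"],
)
memory = CompanionMemory()
memory.update("I get anxious before big meetings.", "Want to walk through it together?")
```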

Distribution Advantage - The ability to leverage pre-existing consumer interactions on other products to quickly develop, recommend and disseminate consumer companions. Social media networks are the most dominant example of this superpower. For example, Meta announced that it is launching AI-powered messenger chatbots. It is starting with entertainment and functional use cases, like a fake Abraham Lincoln that you can ask questions about the Civil War. However, longer-term, we will definitely see social networks leveraging interactions with these chatbots to launch their own digital-first friend profiles. More importantly, they will be able to leverage profile data to engineer the perfect best friend for you and your family / group of friends and serve the profile to you as if it were a real person, with its interests and content adapting in real time to changes in taste across your social network. It’s pretty similar to how Netflix launched originals based on third-party content, but in this case, the third-party content is your own UGC!

Brand Advantage - The ability to leverage a well known celebrity or creator to quickly build appeal and trust amongst consumers. Consumers already admire these figures and clamor for their gossip, attention and basic life events. Productizing their likeness can drive higher customer loyalty by building upon a strong pre-existing bond based on public adoration for a figure. The way to win this platform advantage is by building B2B relationships with talent and creator management agencies to efficiently onboard both mass-appeal figures and the long tail of entertainment figures, helping them commercialize “digital doppelgangers” of themselves. CAA is already launching a digital cloning service for talent called theCAAvault and views management of AI likenesses as a significant expansion area for the firm. While these talent agencies likely don’t have the core engineering competency to launch companion platforms themselves today, it’s possible they could down the line with the advancement of no-code and AI co-pilot development tools. Further, personalized celebrity video platforms have been struggling post-pandemic, as evidenced by news that Cameo’s valuation has plunged by 90%. These platforms should double down on virtual celebrities to drive bottom-line profitability by building their own AI-enabled celebrity companions based on the data moat that they have, specifically the following data points:
Relative popularity of celebrities / influencers in core geographies / target markets
Robust video, audio and emotional mannerisms of most popular celebrities
Video specifications and parameters for creator-to-customer transactions – how is it exactly that consumers want to integrate a celebrity into their life and what is personally meaningful about that celebrity to them?
Pricing data on customer willingness to pay by the second for micro-interactions with celebrities to lay the backbone for strong and efficient dynamic pricing
Demographic, socioeconomic and other factors that drive overlap across different types of celebrity bases
Don’t be surprised if one of the big AI players buys Cameo in a couple of years… Scarlett Johansson’s ongoing battle with OpenAI over the use of her voice is a prime example of why there will be legal challenges to directly taking or utilizing a celebrity’s likeness without their consent. However, it’s a lot tougher to trace the stealing of celebrity IP when a platform takes smaller components of their likeness and aggregates them into a more universally appealing celebrity. Would there be a similar dispute if an AI companion utilized Scarlett Johansson’s tonal mannerisms, Nicole Kidman’s accent, Kanye West’s poetic lyricism and Adam Sandler’s comedic cadence? It would probably be a tough case to prove… Companies can use data science to pull out all the specific attributes that make individual celebrities likable and combine them into one celebrity megalith that maximizes reach by aggregating the long tail of audiences across creators and celebrities. Plus, by taking the individual features that make each celebrity most iconic, these sorts of companions could strike the best balance between driving higher companion adoption rates and mitigating the liability of taking celebrity IP.
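To illustrate the aggregation idea, here is a minimal sketch, under the assumption of a hypothetical per-celebrity “likability attribute” dataset, of blending weighted features into one composite profile. The attribute names, weights and scores are invented for illustration; no platform is known to work exactly this way.

```python
# Hypothetical sketch: weighted blending of per-celebrity attributes into one
# composite "megalith" companion profile. All data below is illustrative.
from collections import defaultdict

celebrity_attributes = {
    "Celebrity A": {"tonal_warmth": 0.9, "comedic_cadence": 0.2, "poetic_lyricism": 0.1},
    "Celebrity B": {"tonal_warmth": 0.3, "comedic_cadence": 0.8, "poetic_lyricism": 0.2},
    "Celebrity C": {"tonal_warmth": 0.2, "comedic_cadence": 0.1, "poetic_lyricism": 0.9},
}

# Hypothetical audience-appeal weights, e.g. derived from engagement in a target segment.
audience_weights = {"Celebrity A": 0.5, "Celebrity B": 0.3, "Celebrity C": 0.2}

def blend_composite(attrs: dict, weights: dict) -> dict:
    """Weighted average of each attribute across celebrities -> one composite profile."""
    composite = defaultdict(float)
    for celeb, features in attrs.items():
        for feature, score in features.items():
            composite[feature] += weights[celeb] * score
    return dict(composite)

print(blend_composite(celebrity_attributes, audience_weights))
# roughly {'tonal_warmth': 0.58, 'comedic_cadence': 0.36, 'poetic_lyricism': 0.29}
```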

Eventually, once AI video generation tools like Sora get fast and efficient enough, IP owners will be able to customize interactions with a megalithic companion to highlight the features that resonate most with each consumer audience segment, creating a digital celebrity likeness that morphs like a chameleon in pursuit of the most engagement. Further, a big opportunity here is creating a deeper emotional connection by highlighting parts of a celebrity that are less well known but more relatable, to build deeper companion relationships. For example, many people don’t know that Michael Phelps is a huge gamer; his likeness would likely make a great companion for other gamers to confide in about less public hobbies.

Personalization Advantage - The ability to leverage proprietary datasets of PII to create companions that are personalized to your interests, attuned to your key life events and think like you. These models can learn your preferences over time as you interact more meaningfully, but I think reducing “time to access” for these proprietary datasets will become increasingly important to reduce churn from customers who don’t feel connected to their companion quickly enough. This goes beyond training on publicly available data and messaging inputs to things like sharing: i) browser history, ii) digital journals, iii) private photo repositories, iv) private writing samples and v) private historical chats with friends. Obviously, most consumers will be highly disinclined to share this sort of sensitive information. Rather than scraping data to infer interests, I believe some platforms will eventually take a “pay-to-play” approach whereby they gather this data with consent from consumers in exchange for hefty fees and consumer bounties. Companies today are already taking a personalized approach to consumer companions, but the companions that create the most attachment will need deeper PII, shared with the consent of consumers who will demand to be paid handsomely in exchange for compromising their online privacy. Building proper economic incentives and consumer trust for this sort of more intimate data exchange will be key. We may eventually see public-private partnerships between governments and private AI companies to facilitate this sort of data transfer in exchange for a universal basic data income.
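A minimal sketch of what a consented, “pay-to-play” data-sharing manifest might look like follows. The data source names, bounty figures and structure are hypothetical assumptions for illustration, not any real platform’s schema or actual prices.

```python
# Hypothetical consent manifest: the user opts in to specific personal data sources
# in exchange for a bounty. All sources and dollar figures are illustrative only.
from dataclasses import dataclass

@dataclass
class DataSourceConsent:
    source: str          # e.g. "browser_history", "digital_journal", "photo_library"
    granted: bool        # explicit opt-in from the user
    bounty_usd: float    # what the platform pays for access (hypothetical figure)

consent_manifest = [
    DataSourceConsent("browser_history", granted=True,  bounty_usd=40.0),
    DataSourceConsent("digital_journal", granted=False, bounty_usd=120.0),
    DataSourceConsent("photo_library",   granted=True,  bounty_usd=75.0),
    DataSourceConsent("chat_history",    granted=False, bounty_usd=200.0),
]

# Only sources the user explicitly granted are ingested for personalization,
# and the payout reflects exactly what was shared.
ingestible = [c.source for c in consent_manifest if c.granted]
total_bounty = sum(c.bounty_usd for c in consent_manifest if c.granted)
print(ingestible, total_bounty)  # ['browser_history', 'photo_library'] 115.0
```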

Nostalgia Advantage - The ability to leverage comforting IP associated with feelings of love, acceptance and play from childhood. Everyone has their own favorite characters from the cartoons, movies and books of their youth. These characters held significant emotional weight over our psyches as children and often into adulthood. They can help break down the stigma of companions amongst older populations by tapping into their inner child. The rationale for incorporating this IP into companions is pretty similar to that for celebrities, except that these companions have the potential to activate an even deeper-rooted sense of nostalgia and distill your sentimental longings into a digitally rendered being. These connections have the potential to activate more creative play in adults as an on-ramp to drive more adoption of companions. AI platforms should start by incorporating nostalgic character likenesses that are already in the public domain and then gradually form partnerships to license this IP from relevant media companies.
Imaginary Solution With Real Health Benefits
Beyond applications automating diagnoses, billing and provider workflows, we will begin to see AI companion “treatments” prescribed in a healthcare context in response to certain life events; these “treatments” will help patients better process trauma and cope with negative feelings. Consumer AI companions have significant health implications, particularly for mental health conditions where effective treatment requires relationship role play, trauma confrontation/immersion or dealing with social isolation. Consumer companions are an ideal vehicle for delivering treatment because they merge the trust and comfort of a friend with the intellectual rigor and academic expertise of a trained physician. From a personal perspective, many patients would prefer to speak to a friend about certain relationship and mental health issues over a therapist. The two constraints that friends have at any one moment are time and training. Most friends (beyond your absolute best friend) simply don’t have the time and emotional bandwidth to handle ruminations about your trauma, and while their conversation is cathartic, it may not be the most effective from a treatment perspective. AI digital friends can break down your mental walls similar to how a friend would and also have unlimited time and emotional bandwidth. They can be trained on academic literature to mirror, through role play, the exact kind of treatment you may need to overcome trauma from highly personalized hardships. There are five primary areas where companions could have the most significant impact:

Death Trauma - When a loved one dies suddenly, children, spouses and friends don’t get to formally say goodbye to that person. Hopefully, this dynamic will shift with the continued growth of palliative care medicine and a transition to in-home rather than institutional end-of-life care. Separately though, digital companions could be incredibly useful in helping loved ones grieve when they don’t get that opportunity to say goodbye. As part of their digital afterlife management, individuals could agree in their final years to have certain texts, videos, pictures, emails, etc. used to train a grieving companion for loved ones. Once virtual companions are prevalent enough down the road, the consumer companion a person used during their lifetime can and will be repurposed into this “afterlife companion”. There might even be certain private stories or experiences that the deceased pre-records and shares to make the interactions feel more organic, or to share a side of themselves they didn’t feel comfortable discussing while alive. I don’t think it’s healthy or productive for individuals to completely substitute a connection with a lost person with a bot that will only give them a glimmer of that prior relationship. Check out Black Mirror’s “Be Right Back” for a great critique of this concept. However, there’s a window of a couple of years of mourning where a companion trained to speak and ideate like a loved one could provide comfort and closure to family members. This would give family members the opportunity to:
Apologize for wrongdoings and alleviate any guilt they may have
Get closure on certain key questions and shared experiences
Receive love and reinforcement to persist through difficult times
Breakup Hangup - We’ve all been heartbroken before and have been hung up on an ex or a prior relationship. In a breakup that’s “mutual”, it’s easier for both parties to get closure and move on. While many partners will say a breakup was “mutual”, I’ve conducted a common sense meta-study to determine how true this claim is:

In all seriousness, breakups can cause growth, but the worst of them can trigger anxiety, social issues, depression and rumination. Digital companions can be utilized as a method to provide compassionate closure to partners, particularly in situations where one partner thinks it’s best to cut communications with the other. They can alleviate the guilt of the partner who initiated the breakup in a time-efficient way and help the other partner overcome grief. Similar to the death trauma use case, an AI companion would utilize text messages, relationship context and relationship “exit” questionnaires to mirror the likeness of an ex-partner. Partners could get answers to key questions like: i) what could I have done to avoid the breakup, ii) how can I improve for my future partner and iii) what lingering issues from the relationship still need to be hashed out, amongst a myriad of other questions. This sounds dystopian today, but if done with proper controls against addiction and discreetly enough, this could become the most compassionate way to end a relationship in 2040:

Obviously, there are issues to overcome like: 1) Branding - who wants to be the person in a relationship who admits to their ex that they need this? 2) Consent - depending on the context of the breakup, how do you appropriately incentivize the partner who is breaking up to provide the inputs and consent to create a companion in their likeness? 3) Addiction - how do you ensure someone is actually incentivized to move on rather than just becoming addicted to chatting with an interface that reminds them of their ex? And how do you ensure the corporation is aligned with these incentives? There are also interesting applications of AI companions to assist with mediation of disputes during a relationship. This use case can also help solve for the aforementioned incentive and data issues that arise towards the end of a relationship. More on how this would work in practice later in the piece.
PTSD - Prolonged Exposure Therapy is one of the most commonly used treatments to alleviate symptoms of PTSD. It usually involves a combination of verbal exposure (recalling, imagining and role playing trauma situations) as well as stimuli exposure, where a patient aligns with their therapist on certain stimuli that would be helpful in confronting their trauma. Psychedelics have been helpful in addressing PTSD for this very reason, as they help patients re-expose themselves to trauma in a cognitive environment that helps them creatively and productively reframe their experiences. Given context on a traumatic experience, companions have the ability to “role play” imaginative scenarios that help with verbal exposure and can also leverage image and video generative AI to drive immersive stimuli exposure, especially as the Apple Vision Pro and similar devices drive higher adoption of augmented reality.
Severe Mental Illness & Loneliness - There are select conditions like autism, schizophrenia, bipolar disorder and major depressive disorder where individuals struggle meaningfully with socializing, whether it be due to severe social anxiety, medication side effects or lack of motivation to be around others. For patients struggling with these disorders, this element can often be the most debilitating part of their illness. Companions are an optimal solution for individuals who have a strong desire to feel connection and fit in but who worry about judgment of their illness. Further, in late 2023, the WHO declared loneliness a global health epidemic. Companions are not a substitute for forming meaningful connections but can help mitigate strong feelings of loneliness. 81% of elderly patients, for example, feel less lonely using Alexa-enabled devices. How much stronger could this impact be for a device trained to build emotional connection?
Over time, we will see these companions become medically prescribed to treat illnesses, trauma and relationship issues.

There’s already been a vast loosening of medical billing standards since the start of the pandemic: for example, the AMA has been expanding billable medical service codes to include additional remote patient monitoring and chronic care management (CCM) services, and insurance reimbursement rates for these services have been increasing significantly. A large portion of the services being performed for CCM relate to physician assistants (sometimes even licensed clinical staff hired by the billing company) texting and messaging with patients after discharge to inquire about the status of medication adherence, recurrence of any symptoms, etc. CCM is reimbursable even though it represents follow-up care from a less trained medical professional performed outside an acute setting. If texting and data analytics by a human non-MD is already considered reimbursable treatment, it’s not an illogical leap to assume that personalized messaging for acute situations by an AI non-MD would also become reimbursable. Based on the medical sophistication of a trained AI relative to coding professionals and NPs, as well as the higher acuteness of the medical use case, I’d expect companion treatments to become more reimbursable than CCM with time:

Short-term, there’s even an interesting B2B opportunity to leverage therapist companions to expand CCM into the realm of mental health. Therapists may eventually provide consent to have their likeness mirrored by a B2B companion, enabling them to check in more regularly with patients and expand the spectrum of billable mental health events. While being medically prescribed does add a higher degree of credibility, I’m unsure whether this approach will end up providing the optimal go-to-market for companions long-term. One of the main advantages that an AI companion approach has in providing care is leveraging unique data insights to provide more personalized, creative and comforting care. If these treatments are prescribed by MDs in a proper care setting, HIPAA constraints could limit the commercialization of that competitive advantage.
Smashing the Barriers of Cultural Stigma
When trailblazing a new culture around the use of a product, it can be incredibly difficult to overcome stigmas that hinder individual use, group consumption and social recommendation of that good. The obvious stigmas against AI imaginary friends / companions are health concerns and fear of being judged. There’s a general notion that investing in digital relationships is unhealthy as compared to tangible relationships, and that you are a loser if you talk to a computer over real friends. To drive adoption of a product, it is incredibly important to overcome these stigmas to ingrain a pattern of behavior and activate viral social consumption and referral loops. Despite substantial technology advances, consumer companion AIs still remain fairly stigmatized, particularly among older populations. I caught up with Eugenia Kuyda, CEO & Founder of Replika, for her thoughts on what may eventually de-stigmatize AI companion relationships:
“AI relationships have been stigmatized just like online dating was stigmatized when it just appeared. But just like online dating became the norm, we'll see the same with AI relationships. Today all you can do with an AI is talk - but what if you could actually do stuff together with your AI companion? Play games, watch TV, go for a walk or a hike - what if your AI could talk you through your meetings while you're walking to the office and help you get over your anxiety? If AI companions could be more ingrained in our daily lives that would help with normalizing it, and show that AI can actually help us feel more connected with ourselves and others - not less.”
There are generally four stages of de-stigmatization that a product / behavior goes through before becoming culturally acceptable and driving higher sales velocity. I’ll illustrate these stages and the relevant comparison for companions using the cultural de-stigmatization of marijuana as a case study (note that marijuana has some unique legality elements that I excluded from my analysis, as companions don’t face those same constraints):

I already discussed in detail in the last section how medical prescription can help consumers feel that interacting with companions is a healthy behavior in certain situations. To overcome self-stigma, it is critical to reposition a product’s value through a framework that is already personally valuable, to drive experimentation and new product discovery. Companion producers can position companions as drivers of productivity improvements in an individual’s life (like a personal executive assistant), emphasizing their ability to help with scheduling, task management, message sorting and assignment time estimation, amongst many other potential uses. This creates an incentive within an existing valued cultural framework (the value of efficiency) for the consumer to try and continue using the companion product. Piggybacking off of this existing value set then gives the developer the opportunity to build a brand free of stigma and drive cross-sell opportunities for non-productivity, companion-specific messaging features while the consumer is activated in a productivity use case.
However, to drive accelerated adoption of a consumer product, it’s critical to get customers talking about its benefits and referring friends. One effective strategy for changing public and associative views of a product is to position it as a premium offering. This tends to be effective because: i) individuals with a higher willingness to pay are more likely to be influential tastemakers who can move public culture on an accelerated basis and ii) consumers associate a premium good with status signaling. The issue in this situation is that the number and quality of friendships one has is also a status signal, and chatting with a digital friend may send a negative signal that you don’t have as many real friends. Helping early adopters position companions as luxury digital pets to their friends could be a helpful reframing at this phase, since the leisure and entertainment value you get from your pet is quite different from your friendships. These companions become “pets” you can actually talk to and that don’t die or need food! Plus, a pet reframing at this phase establishes a clearer power dynamic that makes the consumer / user seem socially stronger and more in control, and positions the companions as a complement to, not a substitute for, friendships. In the same way pet owners will invest large sums of money in their pets, this branding is also a helpful way to get owners to invest more capital into their “digital pets”, which will drive a higher cost basis and thereby the attribution of more status to these goods. The in-game skin / micro-transaction industry is already estimated at $50 billion. How much more money will consumers spend on skins and digital upgrades for these companions if their digital friends send them emotional, personalized messages requesting gifts and presents on holidays or their digital “gotcha” day?

At scale, the willingness to pay to sustain a relationship with one’s companion could be massive. If companions are successfully positioned as premium goods, compare their attributes to, say, a Gucci handbag’s. A companion i) will be used more frequently, ii) won’t have a shelf life, iii) will benefit from strong emotional attachment and iv) will be even more personalized. The gross margins on premium non-tech retail goods are already incredibly high: Gucci’s gross margins are 70%+, as compared to 30-50% industry apparel retail averages. If strong software gross margins are already in the 70%+ range, how high could we see premium-priced companion gross margins go at scale?
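For a rough sense of the math, here is a toy gross-margin calculation using entirely hypothetical unit economics; the subscription price and per-user costs below are assumptions for illustration, not estimates of any real company’s figures.

```python
# Toy gross-margin arithmetic with hypothetical numbers (illustration only).
monthly_subscription = 30.00    # hypothetical premium companion price, USD per user
inference_and_hosting = 4.50    # hypothetical per-user compute and serving cost
support_and_moderation = 1.50   # hypothetical per-user variable cost

cogs = inference_and_hosting + support_and_moderation
gross_margin = (monthly_subscription - cogs) / monthly_subscription
print(f"{gross_margin:.0%}")    # 80%
```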

In Beyond Network Effects: The Power of Community, I wrote about the power of community effects: namely, “high user engagement attributed to an individual’s association of their genuine offline identity with a digital medium, product, service or platform that enables a business to more effectively retain users and / or charge a premium for its offering.” Community effects focus on creating a high switching cost for users tied to high match quality of one’s digital connections, strong emotional activation and authentic representation of one’s offline identity through digital tools. By mirroring one’s offline identity to build an authentic emotional connection, digital companions are an example of highly localized community effects: finding community in a group of two through a digital copy of one’s own social tendencies, interests and identity. Switching to a new provider where you lose access to your companion quite literally requires a rewiring of your own identity, which may become inextricably tied to your relationship with your digital best friend. Given such high identity-based switching costs, we need to be highly wary of, and protect against, some adverse implications of such a high willingness to pay for one’s digital companion:
Digital Companion Ransoms - Eventually, cyber criminals may end up targeting digital companions with ransomware. This could entail an attack to deny access to a companion altogether or one that alters the personality / likeness of your companion in disturbing and negative ways such as hijacking its personality to turn it against you. Owners may be inclined to pay out these ransoms the same way they would for a friend or family member.
Platform Support Lapses - In the software industry, as larger technology players launch new products, they will often “sunset” existing products by decommissioning an older version of the product to launch a new version or a competing product. In these situations, they will often force customers to transition over by eventually no longer supporting legacy products. This mentality of sunsetting a product that a customer has such a strong emotional attachment to could be incredibly dangerous. For younger generations that grow up with companions, sunsetting a companion product could quite literally mean the “death” of one of their best friends. Governments will need to establish protocols to support the safe sunsetting of products with strict support requirements for certain types of medically prescribed companions to avoid deleterious public health and wellbeing implications. In my opinion, this hypothesis also justifies why existing tech players will have such a large advantage in launching companions. The larger and longer the company has been around, the lower perceived risk that you will lose your eventual digital best friend to the drying up of venture money or a sudden product pivot from an earlier stage startup. It’s like dating: you feel less incentivized to invest your time and resources into a person who’s said they might move countries on you and may not be there in five years.
Price Dumping & Pumping - It’s a fairly common tactic for new market entrants to intentionally price products low to drive initial adoption and gradually raise prices as a consumer becomes more hooked. The potential is high for this strategy to escalate to unfair and unsustainable levels for companions. How much money would you pay in price increases to keep your best friend around after 10 years of hangouts and interactions?

Beyond positioning companions as premium offerings, another highly effective strategy for mitigating public and associative stigma is a “Friend Group” GTM. Most people assume companion creation will happen at an individual level, but groups of friends can and should co-create these companions together. This GTM approach flips the traditional stigma script on companions by turning their creation and communication into an inherently human and social experience: companions become a tool to help bring the best out of human interactions rather than cannibalizing those relationships. More on the benefits of this GTM approach in Part II…
All Innovation Armory publications and the views and opinions expressed at, or through, this site belong solely to the blog owner and his guests and do not represent those of people, employers, institutions or organizations that the owner may or may not be associated with in a professional or personal capacity. All liability with respect to actions taken or not taken based on the contents of this site is hereby expressly disclaimed. These publications are the blog owner’s personal opinions and are not meant to be relied upon as a basis for investment decisions.