If you listen to the hype, you might think AI is poised to revolutionise every aspect of our lives. From self-driving cars making taxis and buses a thing of the past to generative AI rendering entire industries obsolete, we are entering an era of incredible automation. At least, that is what is being promised. But can it automate dating? Bumble founder and CEO Whitney Wolfe Herd thinks so. She recently proposed that AI dating personas can make dating happier, safer and easier. Sadly, if you scratch beneath the surface, I doubt that is her motivation for injecting the dating world with AI. Let me explain.
You see, Herd wants to make an AI dating concierge chatbot that is part personal avatar and part therapist. She recently said, “There is a world where your dating concierge could go and date for you with other dating concierges … and then you don’t have to talk to 600 people.” The idea is that you would talk to this chatbot, which would learn to mimic you. Your chatbot would then go on virtual dates with other people’s chatbots, with the AIs acting as the humans’ avatars, and use those virtual dates to recommend which people you should actually meet in real life. However, Herd also said she wants AI to help “create more healthy and equitable relationships” by acting as a dating therapist. She explained, “For example, you could in the near future be talking to your AI dating concierge, and you could share your insecurities.” You could tell the AI, “I’ve just come out of a break-up, I’ve got commitment issues,” and “it could help you train yourself into a better way of thinking about yourself.”
This idea has seemingly divided the internet, with some understandably loving it and others hating it. But there is a lot to unpack with such a concept, far more than just ‘will the public like this?’
Let’s start by clarifying that AI on dating apps isn’t inherently bad. Bumble itself already has some brilliant tools, such as its “For You” match recommendation page, its spam profile detector and its lewd image blocker. But these are tools to augment your dating life, not to do it for you or to act as a trusted therapist. Using AI in this far more direct way could be incredibly risky.
How can an AI chatbot replicate you in a virtual dating scenario? It can’t.
Chatbots can quite accurately replicate what a specific human would say in a particular situation. But this ability doesn’t come from understanding the person’s motivations, fears, wants or personality. Instead, the chatbot processes examples of how that person has reacted in similar scenarios, over and over again, and builds up a statistical model of which words and phrases they are most likely to say next. In this way, AI chatbots are more akin to an advanced text-prediction system than an intelligent avatar.
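To make that concrete, here is a minimal sketch of the principle: a toy bigram model that “predicts” the next word purely from counts. This is an illustration of statistical next-word prediction in general, not Bumble’s actual system; the training lines and function names are all made up for the example.

```python
from collections import Counter, defaultdict

# Toy bigram "chatbot": count which word most often follows each word in a
# handful of example sentences, then predict purely from those counts.
# Purely illustrative -- real chatbots use vastly larger models, but the
# principle (statistics over text, no understanding) is the same.

training_lines = [
    "i love hiking on weekends",
    "i love hiking in the mountains",
    "i love cooking italian food",
]

follows = defaultdict(Counter)  # follows[word] counts what comes after it
for line in training_lines:
    words = line.split()
    for current, nxt in zip(words, words[1:]):
        follows[current][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next word, or None if unseen."""
    candidates = follows.get(word)
    if not candidates:
        return None  # a novel input: no data, no sensible prediction
    return candidates.most_common(1)[0][0]

print(predict_next("love"))       # -> "hiking" (seen twice vs. once)
print(predict_next("skydiving"))  # -> None: the model has never seen it
```

Notice that the model never decides it “likes hiking”; it has simply seen “hiking” follow “love” more often than anything else. Scale the same idea up enormously and you have the skeleton of a modern chatbot.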
So, there are some issues here right away. Firstly, how will this AI dating concierge get data on what you would say on a date? It can’t simply scrape your messages to potential suitors on Bumble, as messaging someone and going on a date with them are socially very different situations. Instead, you would have to go on dates, and your phone would have to record how you and your date interact. Data privacy issues aside, you would have to do this hundreds of times before the AI had enough data to mimic you in these situations.
But even then, it won’t be close enough. Dating is a wild game, and curveballs come at you constantly. Suitors will come along with hobbies, beliefs, disabilities, sexual preferences, jobs, family situations or mental health issues that aren’t similar to anyone you have previously dated. It can be hard enough for you, a sentient, cognitive, intelligent being, to decide if these are things you like or are willing to accept, let alone an AI.
AI famously struggles with novel situations that don’t align with anything in its training data. In these novel or “edge case” situations, AI can break down and produce incoherent gobbledygook, because it is trying to produce a response based on statistical analysis of data that is entirely irrelevant to the situation at hand. This is why self-driving cars suck so much: even simple driving tasks can throw up novel situations that require the driver to have an innate understanding of what is going on and a mental framework for deciding what to do, and AI simply has neither. AI is not intelligent; it doesn’t think; it is purely statistical, and it can’t cope with novel situations because of that.
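As a toy illustration of that breakdown (again hypothetical, not any real system): when a purely statistical model has no data for the current context, a naive fallback is to sample from words it has seen anywhere. Every word it emits is “plausible” in isolation, yet the whole reply is meaningless.

```python
import random

# Hypothetical sketch of an "edge case" failure: with no data for the
# current context, a naive statistical fallback samples from words seen
# anywhere in training. Each word is likely on its own; the reply as a
# whole is gobbledygook, because nothing here understands the situation.

seen_words = ["i", "love", "hiking", "cooking", "italian", "food", "weekends"]

def babble(length=6):
    """Generate a 'reply' by sampling seen words at random."""
    return " ".join(random.choice(seen_words) for _ in range(length))

print(babble())  # e.g. "food i weekends love italian hiking"
```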
So, if AI can’t yet reliably cope with steering a self-driving car around a few pigeons in the road, how on earth can it accurately replicate your reaction to unique or novel suitors? Simple: it can’t, and it never will.
Okay, so the AI avatar virtual dating idea is likely a road to nowhere. But an AI dating therapist could be useful. After all, that is a much more confined situation, and there are reams of clinical literature on how a therapist should treat someone struggling with dating, which could be used to train the AI to give medically sound care.
Well, there are two significant issues with using Bumble’s proposed AI dating concierge in this way. Firstly, while a therapy chatbot can say the medically correct thing to a patient, that isn’t the same as providing good therapy. Many forms of therapy require a deep and personal connection between therapist and patient. For example, trusting a living, breathing human enough to confess true emotions to them is part of the process. It can remove mental blockages and help unravel issues. Confessing emotions to a distant chatbot doesn’t elicit the same response, at least for many people. As such, many therapists are worried chatbot therapists can actually intensify issues associated with loneliness. Considering many people use dating apps to address their loneliness, this is a massive issue for Bumble’s proposed AI!
But the second issue is quite possibly the biggest: data privacy. For a chatbot to give you therapy, you need to provide it with incredibly personal information. Such information could cause embarrassment, stigmatisation, discrimination and financial harm to the user if it were leaked. What’s more, if this data were sold to the wrong people, it could be used to create incredibly targeted, predatory adverts and media that exploit the user’s mental state. Imagine how powerful political adverts could be if they knew the details of your mental state, your fears and your motivations.
These potential problems are far from unrealistic or hyperbolic. Dating apps like Bumble and Tinder already collect and sell people’s data to marketing service providers (without many users realising it). This is actually a significant income source for these apps. As such, they collect a lot of data! One reporter requested their data from Tinder and got back an 800-page dossier! They also frequently have data leaks; in fact, there is currently a class action lawsuit against Bumble for a data leak. So, these nightmare scenarios of your Bumble AI dating therapist informing which adverts you see or their notes on you leaking online are 100% possible.
So, why is Herd pushing this? Well, Bumble is now a publicly traded company and has only just reached profitability. Herd, a 17% shareholder, is motivated to make the shares worth as much as possible, and I am happy to theorise that this AI is her way of doing just that. Training the AI for these virtual dates and delivering the dating therapy will likely be one and the same process. It isn’t feasible for the AI to gather direct data on you during an actual date (as we discussed), so it will instead likely ask you a series of questions to build up an idea of what you are looking for. This will be an even less accurate way to inform the automated dating AI we previously discussed, but it will feed perfectly into the AI doing dating therapy, as the questions will be very similar. This way, the user is motivated to go through this potentially difficult and emotionally painful therapy under the guise of training their automated dating AI to find their love interest. The AI will then do a below-par job, perhaps finding a handful of good-enough suitors while failing to filter out non-starters; meanwhile, Bumble now has more personal information on you than the NSA and Meta combined and can sell it to the highest bidder, bumping up its profits and share price.
Now, I’m not saying this is Herd’s only motivation. Having listened to several interviews with her, I genuinely feel she wants to make dating more pleasant and safer. But you can have moral and noble goals and still use horrifically immoral and damaging means to achieve them. After all, the utilitarian mantra of “by any means necessary” sums up the billionaire class that Herd is now a part of. But surely the way to make dating apps better, more profitable and more valuable is to make them more human, rather than putting an unthinking machine between people?
Thanks for reading! Content like this doesn’t happen without your support. So, if you want to see more like this, don’t forget to Subscribe and follow me on BlueSky or X and help get the word out by hitting the share button below.
Sources: NBC, Fortune, MarketWatch, The Conversation, Markkula Center, The Guardian, Mozilla Foundation, ClassAction.org, Forbes