Dating Platforms: An Industry Analysis
2 February, 2023
Written by Luke Patterson
Dating apps like Tinder, Hinge and Badoo are valued by many internet users for their capacity to expose people to a broader set of potential romantic and sexual partners. The number of dating-app users grew from 200 million in 2015 to over 300 million in 2021, and industry revenues rose from under $2 billion to over $4 billion over the same period. At EthicsGrade, we’d hope that the growth of the industry would be matched by a maturing framework for a gold standard of ethics and governance.
Yet online dating platforms are poor performers within the EthicsGrade database. This is concerning for at least two reasons. Firstly, the personal data collected by online dating platforms is both extensive and extremely sensitive. These platforms have access to all the photos and videos shared in their apps, and store information about our sexual orientation, gender, location, political affiliation, ethnicity, and religion. Secondly, the algorithms which determine the types of user profiles we’re exposed to have a meaningful impact on our romantic lives – they narrow down the pool of people we have the opportunity to get to know intimately. On an individual level, this has the potential to radically shift the trajectory of one’s personal development. For society, it has the potential to impact the genetic make-up of the entire population.
The problem of collaborative filtering
One would hope, then, that dating platforms are taking incredible care in creating responsible algorithmic processes for matching users – better yet, that they are publishing the code of these processes so we can verify the responsibility of their matching algorithms. Well, they aren’t, as you’ll probably guess from the industry-wide poor EthicsGrade scores.
What’s not widely understood is that dating platforms use the same collaborative filtering approach that video streaming services use to recommend films and TV series. Collaborative filtering logs a user’s initial activity and uses it to group them with other users who share their preferences, which in turn informs future recommendations.
Whilst this might work well for my Netflix account – I watch a Sylvester Stallone movie and Netflix’s algorithm recommends movies other Stallone buffs have watched – it’s not so great for dating. Our romantic preferences are far more complex and unpredictable than our movie preferences. Moreover, the consequences of an algorithm badly narrowing our pool of movie options are much less significant than those of an algorithm badly narrowing the pool of humans we can form a connection with. Unfortunately, the algorithms used by dating platforms often do badly narrow the pool of people we’re exposed to, and may therefore be screening out people who would expose us to new horizons of experience.
The fundamental problem with collaborative filtering is that early users of a dating app have disproportionate influence on the types of profile later users are exposed to. For instance, Megan, an early user, swipes right on User A, whom she likes, and swipes left on User B, who happens to be Jewish. If I later swipe right on User A, the collaborative filtering algorithm assumes that I, like Megan, would also dislike the Jewish user’s profile, and so won’t present it as an option. Though no line of code in a dating platform’s algorithm is explicitly discriminatory, using often sensitive data points to inform the decisions of a collaborative filtering algorithm likely works against the interests of those with uncommon preferences and of minority groups.
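This dynamic can be illustrated with a few lines of Python. The sketch below is a toy user-user collaborative filter, not any platform’s actual code; the users, profiles and similarity measure are all invented for illustration.

```python
# Toy user-user collaborative filter over swipe history.
# Illustrative only -- all users, profiles and the similarity
# measure are invented; real systems are far more elaborate.

swipes = {
    # user      -> {profile: 1 = swiped right, 0 = swiped left}
    "megan":    {"user_a": 1, "user_b": 0},   # early user: liked A, rejected B
    "newcomer": {"user_a": 1},                # later user: has only seen A
}

def similarity(u, v):
    """Fraction of co-rated profiles on which two users agree."""
    shared = set(swipes[u]) & set(swipes[v])
    if not shared:
        return 0.0
    agree = sum(swipes[u][p] == swipes[v][p] for p in shared)
    return agree / len(shared)

def predicted_score(user, profile):
    """Similarity-weighted average of other users' swipes on `profile`."""
    num = den = 0.0
    for other in swipes:
        if other != user and profile in swipes[other]:
            w = similarity(user, other)
            num += w * swipes[other][profile]
            den += w
    return num / den if den else 0.5  # no signal -> neutral score

# Because the newcomer agrees with Megan on user_a, Megan's rejection
# of user_b is inherited: user_b scores 0.0 and may never be shown.
print(predicted_score("newcomer", "user_b"))  # 0.0
```

The key point is in the last line: the newcomer has never seen User B, yet an early user’s single left-swipe is enough to screen that profile out entirely.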
Using collaborative filtering allows dating apps to create simple user interfaces by presenting users with a simple ‘yes’ or ‘no’ option for one profile at a time, as opposed to a more traditional browsing system. This adds to the fun, gamified nature of matchmaking on dating platforms and is preferred by many users. However, a simple and fun interface comes at the expense of the ability to match and form sometimes life-changing emotional bonds with people whom the algorithm mistakenly deems incompatible with you. The intricacies of human connection remain too complex to be hacked by an algorithm.
None of this is to say that collaborative filtering should be abandoned by dating platforms. It may well remain the best approach to automated matchmaking, and it may be refined over time. However, the level of transparency amongst dating platforms about how their algorithms work is disappointingly substandard for a $4 billion industry fundamentally rooted in digitalisation. Dating platforms must make the code behind their algorithms publicly available so that external researchers can review the effect of matching algorithms on the way people find romantic partners. Currently, our research shows no reporting on the algorithmic processes that underpin the matching systems of the dating platforms in our coverage.
Cybersecurity and Privacy
Another important insight from our research is that industry-wide improvements to privacy and cybersecurity are essential in the dating platform industry. Our report shows that whilst a couple of platforms are taking steps toward security best practice, on the whole the transparency of dating platforms in disclosing their cybersecurity processes is minimal. This makes it difficult to assess whether sufficiently robust cybersecurity measures exist at all, and given the number of reported cyberattacks on dating platforms in recent years (see Match.com, Tinder and Bumble, to name just a few), it’s likely that they don’t.
Thus, our analysis of each individual dating platform places particular focus on the strength of its cybersecurity and privacy measures. The user data collected by dating platforms is incredibly sensitive, which makes it essential that the industry does more to improve its cybersecurity processes. Whereas we could find no information from any dating platform on the algorithmic code behind its matchmaking, there was some variation in the strength of their privacy and security messaging. Nonetheless, all need to take steps to reach a gold standard of reporting that still eludes the dating platform industry as a whole. The following analysis is based on forthcoming research that will be published on our website later in Q1 2023.
Though not itself a platform, Match Group is the dating platform mothership. Worldwide, they own 45 individual dating platforms, including headliners Tinder, Hinge, Plenty of Fish, and Match.com, with a collective 11 million paid subscribers. A primary reason behind their decision not to centralise their dating platform offerings onto one or two apps is the constraints of collaborative filtering. By establishing many different dating platforms, each targeting a slightly different social demographic, Match Group narrows the parameters which affect automated matchmaking.
Hinge is the “dating app designed to be deleted”. Unlike Tinder, whose interface centres on swiping left or right in response to photos of potential matches, Hinge targets users who are looking for more meaningful relationships.
As with most of the Match Group dating platforms, we found absolutely no information on Hinge’s individual corporate governance structures – that is, on who sits on its executive team and its executive sub-committees. This is the first basic step in transparency reporting. Whilst there is detailed information on the executive committee of Match Group, there is none regarding the make-up of Hinge’s executive team.
Badoo is owned by Match Group’s biggest competitor, Bumble Inc., and is individually one of the largest dating platforms in the world.
Badoo has implemented an array of technological measures designed to protect the safety of their users, especially female users, by verifying the identity of profiles to avoid ‘catfishing’. For instance, before signing up, a user is required to upload a photo of themselves and have this verified by a moderator or facial recognition algorithm against a video of them performing a given prompt, such as raising four fingers.
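As a sketch, a verification flow of the kind Badoo describes might look like the following. This is a hypothetical outline, not Badoo’s implementation: the function names, prompts and matching threshold are all invented, and the facial-recognition step is a stand-in for a real model.

```python
import random

# Hypothetical sketch of a profile-verification flow: a signup photo is
# checked against a video of the user performing a randomly chosen prompt.
# Not Badoo's actual code; all names and the threshold are assumptions.

PROMPTS = ["raise four fingers", "touch your left ear", "hold up a peace sign"]
MATCH_THRESHOLD = 0.9  # assumed similarity cutoff

def face_similarity(photo, video_frame):
    """Stand-in for a facial-recognition model's similarity score."""
    # A real system would embed both images and compare the embeddings.
    return 0.95 if photo["person_id"] == video_frame["person_id"] else 0.2

def verify_profile(signup_photo, prompt_video):
    """Verify only if the requested prompt was performed AND faces match."""
    if prompt_video["prompt_performed"] != prompt_video["prompt_requested"]:
        return False  # wrong gesture: possibly replayed or stolen footage
    score = face_similarity(signup_photo, prompt_video["frame"])
    return score >= MATCH_THRESHOLD

prompt = random.choice(PROMPTS)
photo = {"person_id": "alice"}
video = {"prompt_requested": prompt, "prompt_performed": prompt,
         "frame": {"person_id": "alice"}}
print(verify_profile(photo, video))  # True: same face, correct prompt
```

The random prompt is what does the anti-catfishing work here: a stolen photo or pre-recorded clip cannot anticipate which gesture will be requested at signup time.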
User safety is especially important in the dating platform industry, but this needs to be measured against the protection of every user’s right to privacy. It is therefore essential that the measures implemented to protect user safety are matched with measures to protect the privacy of individuals. Many users may seek to create a barrier between their real identities and their dating profiles, and insufficiently protected user data could be used to match a user profile to their other social media accounts, even after a user has deleted their dating profile.
Beyond safety measures, Badoo has also implemented AI features which match profiles to a user’s favourite celebrity crushes. Amongst the most popular crushes were Kim Kardashian, Beyoncé, and Donald Trump... This raises ethical concerns surrounding the objectification of human users. Perhaps a user doesn’t want their personal data to be appropriated to match their features to those of a celebrity. If a dating platform insists on implementing AI features which use personal data in such a potentially objectifying way, it should be an opt-in feature, switched off automatically until a user chooses otherwise.
One thing that pleasantly surprised us was that Badoo give details of their data lineage controls on GitHub. It’s rare to find any information on the technical barriers to trust from any company in any industry. However, the information is not published on Badoo’s own website, and it is clearly aimed at remote-working freelance Badoo coders looking for online help. To improve transparency in their messaging, Badoo must therefore package this information in a way that is clear and comprehensible for users less tech-literate than Badoo’s coding cohort, and host it on their own website.
The functionality of Match.com heavily nudges users toward its subscription model. Paid users can find out when matches have read a sent message in their Mailbox, and can see who has viewed their profiles. A premium subscription therefore grants users greater in-app powers, as on other social media platforms such as LinkedIn. However, whether these kinds of features are appropriate for dating is up for debate: should users be able to pay for the privilege of surveilling the activities of other users to infer whether they like them?
Interestingly, to protect the privacy of premium users, Match provides a service called matchPhone, which provides a match.com generated phone number that people can use to connect without having to immediately share their personal phone number. This is a nifty feature. Yet in order to protect your privacy, and possibly safety, you have to pay a premium rate. Features designed to protect users should come free of charge.
Since 2020 Match has hosted events for premium users to meet each other in virtual settings. The VR market has suffered a stuttering start, with Microsoft sunsetting their AltSpaceVR project and Meta reporting that its Reality Labs division posted a $4.2 billion loss in the fourth quarter of 2022. However, if the VR market begins to deliver on its promised potential, this raises some interesting questions for the future of online dating. If wearable technologies, such as a VR headset, can collect biometric information and are used for immersive dating events, could this result in users – probably premium-only users – accessing information on whether another user is attracted to them, based on heart rate or pupil dilation?
Bumble is now the most downloaded dating app in the U.S. and, alongside Badoo, is the headline act of Bumble Inc., the primary competitor to the Match Group roster of dating platforms. Bumble is the champion of online safety in the dating world, and has open-sourced its AI algorithms designed to detect unsolicited nude images sent to users on the platform. As with Match.com, Bumble appear to be sensitive to the empowering quality certain digital features can provide for users. However, rather than ringfencing features and offering them only to users willing and able to pay a premium rate, Bumble has developed digital features which address the power imbalances of heterosexual relationships in both the on- and offline worlds. Women must be the first to message in heterosexual matches, and (as with Badoo) users can request a selfie from their match to verify their identity. Developing these features, and opening them up to non-paying users of their platforms, sets Bumble and Badoo apart as companies evidently taking steps to design their digital features for community benefit.
Grindr has made a name for itself as the largest LGBTQ+ social network in the world, and is the only dating platform included in this report that is not owned by either Match Group or Bumble Inc. Grindr is one of the few companies with detailed information about its corporate governance structure – the first small step towards sufficient transparency reporting. They have published information about the membership of their board of directors, and made governance documents such as committee charters, and their code of business conduct and ethics publicly available. Whilst this is true of the governance structures of the conglomerates Match Group and Bumble Inc., there is very little corporate governance information provided by their individual subsidiaries.
A recurring theme amongst all the dating platforms we have covered is that their privacy and cybersecurity standards do not meet the level one would expect of any company, least of all companies handling information as sensitive as that hosted by dating platforms. Grindr provides a sexual health section on a user’s profile, which allows them to display their sexual health status, such as their HIV status, and most recent test date. This is a hugely helpful feature, but again highlights how dating platforms can often host data on their sites that most users wouldn’t provide anywhere else.
Given that Grindr stores information on a user's sexual health status, alongside the fact that many of its users live in countries where homosexuality is highly stigmatised, illegal and in some cases punishable by death, it is especially important that their cybersecurity measures demonstrate a gold-standard not just for dating platforms, but compared to any and every industry.
Grindr is currently falling far short of this benchmark. Whilst they provide some detail of a bug bounty programme to stress-test their security measures, they fail to explain how they are reinforcing those measures in preparation for the increasingly sophisticated cyberattacks enabled by advances in AI capabilities. They do offer a comprehensive security guide to help users maximise their online and offline safety when using the app. However, this shifts the burden of responsibility for protecting personal data onto the user. Measures taken by the user to protect their information should represent the very last line of cyber defence, preceded by a comprehensive company cybersecurity process that is audited regularly both internally and externally and outlined transparently in a public-facing report.
Tinder is synonymous with online dating in a way that comes close to Google’s synonymity with online search, and is responsible for introducing the dating platform ‘swipe’ format that we’re now so accustomed to. Our research has registered some significant improvements in governance practices at Tinder over the past quarter, specifically in their cybersecurity messaging, which significantly surpasses that of all the other dating platforms included in this report, including their fellow Match Group subsidiaries.
Whilst all of Match Group’s subsidiaries have security webpages which provide a limited degree of detail on their cybersecurity practices, Tinder’s security webpage provides a much more thorough explanation of their internal security practices. Tinder commits to maintaining “a robust, transparent and accountable security programme”, and the measures they detail indicate that this is more than just hot air. They provide specific detail on each aspect of their security programme, including their internal information security programme, their application development security, their internal red teaming processes, and their threat monitoring systems. They are also the only dating platform to provide public access to their ISO 27001 security certification, and offer information on their yearly independent auditing to ensure compliance with industry-standard security requirements. Tinder also sets itself apart from its sister companies by publishing details of the make-up of its executive team.
Plenty of Fish is another Match Group offering. There is a frustrating lack of information from any of the Match Group subsidiaries on their individual governance practices, and this problem extends to Plenty of Fish – the overall worst performer in our dating platform study.
Whilst they have a dedicated security page, it falls short of the quality of Tinder’s, with very little detailed information about their cybersecurity practices or their internal and external auditing processes. The security page’s biggest strength is its passing mention of a bug bounty programme, but they offer no detail on how frequently it is carried out, or how they implement changes to their security measures in light of bounty insights. Their poorer security performance is reflected in the fact that whilst Tinder has achieved ISO 27001 certification, other Match Group companies such as Plenty of Fish haven’t. Given the size of Tinder, it’s certainly a good thing that they have raised their security standards. But it’s puzzling that Match Group haven’t dedicated resources to improving the cybersecurity measures of their dating platforms across the board.
To learn more about collaborative filtering, explore this webpage: https://monstermatch.hiddenswitch.com/