DATA INSIGHTS
Dating Platforms: An Industry Analysis
2 February, 2023

Written by Luke Patterson
Dating apps like Tinder, Hinge and Badoo are valued by many internet users for their capacity to expose people to a broader set of potential romantic and sexual partners. The number of dating app users grew from 200 million in 2015 to over 300 million in 2021, and industry revenues rose from under $2 billion to over $4 billion in the same period. At EthicsGrade, we’d hope that the growth of the industry would be matched with a maturing framework for a gold standard of ethics and governance.
Yet online dating platforms are poor performers within the EthicsGrade database. This is concerning for at least two reasons. Firstly, the personal data collected by online dating platforms is both extensive and incredibly sensitive. They have access to all the photos and videos shared in their apps, and store information about our sexual orientation, gender, location, political affiliation, ethnicity, and religion. Secondly, the algorithms which determine the types of user profiles we’re exposed to have a meaningful impact on our romantic lives – they narrow down the pool of people we have the opportunity to get to know intimately. On an individual level, this has the potential to radically shift the trajectory of one’s personal development. For society, it has the potential to impact the genetic make-up of the entire population.
The problem of collaborative filtering
One would hope, then, that dating platforms are taking incredible care in creating responsible algorithmic processes for matching users. Better yet, that they are publishing the code of these algorithmic processes so we can verify the responsibility of their matching algorithms. Well, they aren’t, as you’ll probably guess from the industry-wide poor EthicsGrade scores.
What’s not widely understood is that dating platforms use the same collaborative filtering approach that video streaming services use to recommend films and TV series. Collaborative filtering logs a user’s initial activity and uses it to group them with other users who share their preferences, and these groupings inform future recommendations.
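As a rough illustration of the mechanism (and emphatically not any platform’s actual code), a user-based variant of collaborative filtering can be sketched in a few lines of Python. The +1/-1/0 swipe encoding and the cosine-similarity weighting below are simplifying assumptions chosen purely to show the idea.

# A minimal sketch of user-based collaborative filtering, assuming a simple
# user-by-profile matrix of past reactions (+1 like, -1 dislike, 0 unseen).
# This illustrates the general technique, not any platform's implementation.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two users' reaction histories."""
    norm = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / norm) if norm else 0.0

def predicted_scores(history: np.ndarray, user: int) -> np.ndarray:
    """Score every profile for `user` as a similarity-weighted vote of the
    other users' reactions; profiles the user has already seen are masked out."""
    scores = np.zeros(history.shape[1])
    for other in range(history.shape[0]):
        if other != user:
            scores += cosine_similarity(history[user], history[other]) * history[other]
    scores[history[user] != 0] = -np.inf   # don't re-recommend seen profiles
    return scores

# A platform built this way would surface the highest-scoring unseen
# profiles first, e.g. via np.argsort(scores)[::-1].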
Whilst this might work well for my Netflix account – I watch a Sylvester Stallone movie and Netflix’s algorithm recommends me movies other Stallone buffs have watched – it’s not so great for dating. Our romantic preferences are far more complex and unpredictable than our movie preferences. Moreover, the consequences of an algorithm badly narrowing our pool of movie options are much less significant than those of an algorithm badly narrowing the pool of humans we can form a connection with. Unfortunately, the algorithms used by dating platforms do often badly narrow the pool of people we’re exposed to, and may therefore be screening out people who would expose us to new horizons of experience.
The fundamental problem with collaborative filtering is that early users of a dating app have a disproportionate influence on the types of profiles later users are exposed to. For instance, Megan, an early user, swipes right on User A and swipes left on User B, who happens to be Jewish. If I later swipe right on User A, the collaborative filtering algorithm assumes that I, like Megan, also dislike the Jewish user’s profile, and so won’t present it as an option. Though there is no line of code in a dating platform’s algorithm which is explicitly discriminatory, feeding often sensitive data points into a collaborative filtering algorithm likely works against the interests of those with uncommon preferences and of minority groups.
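To make the Megan example concrete, here is how the sketch above plays out on a tiny, invented swipe history; the users, swipes, and numbers are entirely hypothetical.

# Columns are User A and User B; rows are Megan, a second early user who
# happens to share her swipes, and "me", a later user. All values invented.
history = np.array([
    [ 1.0, -1.0],   # Megan: right-swipes User A, left-swipes User B
    [ 1.0, -1.0],   # another early user with identical swipes
    [ 1.0,  0.0],   # me: I have right-swiped User A and never seen User B
])

print(predicted_scores(history, user=2))
# -> [-inf, -1.41...]: User B receives a strongly negative score for me,
# purely because the users who share my one recorded preference rejected
# them. No attribute such as religion ever appears in the code, yet the
# profile is pushed to the back of (or out of) my queue.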
Using collaborative filtering allows dating apps to create simple user interfaces, presenting users with a simple ‘yes’ or ‘no’ option for one profile at a time, as opposed to a more traditional browsing system. This adds to the fun, gamified nature of matchmaking on dating platforms and is preferred by many users. However, a simple and fun interface comes at the expense of the ability to match, and to form sometimes life-changing emotional bonds, with people whom the algorithm mistakenly deems you incompatible with. The intricacies of human connection remain too complex to be hacked by an algorithm.
None of this is to say that collaborative filtering should be abandoned by dating platforms. It may well remain the best approach to automated matchmaking and be refined over time. However, the level of transparency amongst dating platforms about how their algorithms work is disappointingly substandard for a $4 billion industry which is fundamentally rooted in digitalisation. Dating platforms must make the code behind their algorithms publicly available so that external researchers can review the effect of matching algorithms on the way people find romantic partners. Currently, our research shows no reporting on the algorithmic processes that underpin the matching systems of any dating platform in our coverage.
Cybersecurity and Privacy
Another important insight from our research is that industry-wide improvements to privacy and cybersecurity are essential in the dating platform industry. Our report shows that whilst a couple of platforms are taking steps toward security best practice, on the whole the transparency of dating platforms in disclosing their cybersecurity processes is minimal. This makes it difficult to assess whether sufficiently robust cybersecurity measures exist at all, and given the number of reported dating platform cyberattacks in recent years (see Match.com, Tinder, and Bumble, to name just a few), it’s likely that they don’t.
Thus, our analysis of each individual dating platform’s performance pays particular attention to the strength of their cybersecurity and privacy measures. The user data collected by dating platforms is incredibly sensitive, which makes it essential that the industry does more to improve its cybersecurity processes. Whereas we could find no information from any dating platform on the algorithmic code behind their matchmaking, there was some variation in the strength of their privacy and security messaging. Nonetheless, all need to take steps to reach a gold standard of reporting that still eludes the dating platform industry as a whole. The following analysis is based on forthcoming research that will be published on our website later in Q1 2023.
Though not itself a platform, Match Group is the dating platform mothership. Worldwide, they own 45 individual dating platforms, including headliners Tinder, Hinge, Plenty of Fish, and Match.com, with a collective 11 million paid subscribers. A primary reason behind their decision not to centralise their dating platform offerings onto one or two apps is the constraints of collaborative filtering. By establishing many different dating platforms, each targeting a slightly different social demographic, Match Group narrows the parameters which affect automated matchmaking.
As covered in EthicsGrade’s privacy policy performance article, Match Group’s privacy messaging is notably more comprehensive than that of both their competitors and their subsidiaries. They detail the guiding privacy principles behind their work, and lay out their information very comprehensively. This is one of the factors behind their better overall EthicsGrade score. In February 2023, Match Group established a newly created Chief Technology Officer position. Will Wu takes on the role off the back of his time at Snap Inc. (developer of Snapchat), and will spearhead product development across Match Group’s various individual dating apps. Currently, dating platforms are some of the only tech companies not suffering declining revenues, and the appointment of Wu indicates that product innovation in the dating platform industry is on the horizon.
Hinge is the “dating app designed to be deleted”. Unlike Tinder, whose interface centres on swiping left or right in response to photos of potential matches, Hinge targets users who are looking for more meaningful relationships.
Hinge no longer requires a user to provide certain sensitive information about themselves, such as their last name or religious preferences. However, there is still some information that Hinge requires a user to include, such as their employer and occupation. Given the sensitivity of the data collected by apps such as Hinge, it is essential that they up the quality of their security and privacy messaging, so as to assure their users of the safety of their personal information. Currently, their privacy policy is passable, but as with all Match Group companies, it pales in comparison to the parent company’s privacy webpage. Restructuring their privacy page to match the comprehensibility of their parent company’s seems a small ask.
As with most of the Match Group dating platforms, we found absolutely no information on Hinge’s individual corporate governance structures: that is, no information on who sits on their executive team or on their executive sub-committees. Publishing this is the first basic step in transparency reporting. Whilst there is detailed information on the executive committee of Match Group, there is none regarding the make-up of Hinge’s executive team.
Badoo is owned by Match Group’s biggest competitor, Bumble Inc., and is individually one of the largest dating platforms in the world.
Badoo has implemented an array of technological measures designed to protect the safety of their users, especially female users, by verifying the identity of profiles to avoid ‘catfishing’. For instance, before signing up, a user is required to upload a photo of themselves and have this verified by a moderator or facial recognition algorithm against a video of them performing a given prompt, such as raising four fingers.
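Badoo has not published how this verification pipeline works internally, so the following is only a hypothetical sketch of how a prompt-based check might be wired together; facial_similarity, the thresholds, and the prompt list are invented stand-ins rather than any real API.

# Hypothetical sketch of a prompt-based photo verification flow. Badoo has
# not disclosed its pipeline; facial_similarity and the thresholds below
# are assumed placeholders, not a real library or service.

import random

PROMPTS = ["raise four fingers", "touch your left ear", "look over your right shoulder"]
MATCH_THRESHOLD = 0.90      # auto-approve above this similarity (assumed value)
REVIEW_THRESHOLD = 0.70     # route to a human moderator between the thresholds

def facial_similarity(image_a, image_b) -> float:
    """Placeholder for a face-matching model returning a 0-1 similarity."""
    raise NotImplementedError

def verify_profile(profile_photo, capture_video_frame) -> str:
    """Ask the user to perform a random prompt on camera, then compare the
    captured frame against their uploaded profile photo."""
    prompt = random.choice(PROMPTS)       # unpredictable prompt resists replayed videos
    frame = capture_video_frame(prompt)   # platform-supplied capture step
    score = facial_similarity(profile_photo, frame)
    if score >= MATCH_THRESHOLD:
        return "verified"
    if score >= REVIEW_THRESHOLD:
        return "manual_review"            # handed off to a human moderator
    return "rejected"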
User safety is especially important in the dating platform industry, but it needs to be balanced against the protection of every user’s right to privacy. It is therefore essential that the measures implemented to protect user safety are matched with measures to protect individual privacy. Many users may seek to create a barrier between their real identities and their dating profiles, and insufficiently protected user data could be used to link a user’s profile to their other social media accounts, even after they have deleted their dating profile.
Beyond safety measures, Badoo has also implemented AI features which match profiles to a user’s favourite celebrity crushes. Amongst the most popular crushes were Kim Kardashian, Beyonce, and Donald Trump... This raises ethical concerns surrounding the objectification of human users. Perhaps a user doesn’t want their personal data to be appropriated to match their features to those of a celebrity. If a dating platform insists on implementing AI features which use personal data in such a potentially objectifying way, it should be an opt-in feature, switched off until a user chooses otherwise.
Badoo’s use of personal data to both improve safety and add ‘fun’ features like their AI celebrity crush matcher makes it all the more important that the personal data they are collecting is protected by robust privacy and security measures. Our research found very little public facing information on the cybersecurity measures at Badoo, and their privacy policy isn’t designed to be an easy, comprehensive read for a user. After a bit of searching, we found a statement confirming that they retain identity verification scans for 3 years before the data is securely deleted, and users have no clear option to request early deletion. Safety features on a dating app are vital, but safety features which use sensitive personal information must be matched with a robust privacy framework that hands autonomy to the user over the way their personal data is used and stored.
One thing that pleasantly surprised us was that Badoo give details of their data lineage controls on GitHub. It’s rare to find any information on the technical barriers to trust for any company in any industry. However, the information is not published directly on Badoo’s own website, and is clearly aimed at Badoo’s own remote and freelance developers looking for technical guidance rather than at everyday users. To improve transparency in their messaging, Badoo should therefore package this information in a way which is clear and comprehensible for users who are less tech-literate than Badoo’s coding cohort, and host it on their own website.
Match.com considers itself the pioneer of the online dating industry, having been the first to launch, in 1995. As another Match Group subsidiary, Match.com’s reporting largely resembles that of its sister companies (it’s not very good). All Match Group companies share the same privacy policy, and the same useful safety debrief for users seeking to meet their matches in person. However, whereas Tinder performs well in terms of its cybersecurity messaging, Match.com’s security page is short on detail. This is especially confusing given that Match Group report that Tinder and Match.com have both achieved ISO 27001 certification. Match.com need to up the quality of their cybersecurity messaging, so that the first hits on a Google search are demonstrations of its cyber resilience, not a plethora of news articles detailing various cybersecurity breaches.
The functionality of Match.com heavily nudges users towards its subscription model. Paid users can find out when matches have read a sent message in their Mailbox and see who has viewed their profiles. A premium subscription therefore essentially gives users more powerful in-app abilities, much as on other social platforms such as LinkedIn. However, whether these kinds of features are appropriate for dating is up for debate. Should users be able to pay for the privilege of surveilling the activities of other users to infer whether they like them?
Interestingly, to protect the privacy of premium users, Match provides a service called matchPhone, which supplies a match.com-generated phone number that people can use to connect without having to immediately share their personal number. This is a nifty feature. Yet in order to protect your privacy, and possibly your safety, you have to pay a premium rate. Features designed to protect users should come free of charge.
Since 2020, Match has hosted events for premium users to meet each other in virtual settings. The VR market has suffered a stuttering start, with Microsoft sunsetting its AltSpaceVR project and Meta reporting that its Reality Labs division posted a $4.2 billion loss in the fourth quarter of 2022. However, if the VR market begins to deliver on its promised potential, this raises some interesting questions for the future of online dating. If wearable technologies, such as VR headsets, can collect biometric information and are used for immersive dating events, could this result in users – probably premium-only users – accessing information on whether another user is attracted to them based on heart rate or pupil dilation?
Bumble is now the most downloaded dating app in the U.S. and, alongside Badoo, is the headline act of Bumble Inc., the primary competitor to the Match Group roster of dating platforms. Bumble is the champion of online safety in the dating world, and has open-sourced its AI algorithms designed to detect unsolicited nude images sent to users on the platform. As with Match.com, Bumble appear to be sensitive to the empowering quality certain digital features can offer users. However, rather than ringfencing features and offering them only to users willing and able to pay a premium rate, Bumble has developed digital features which address the power imbalances of heterosexual relationships in both the online and offline worlds. Women must be the first to message in heterosexual matches, and (as with Badoo) users can request a selfie from their match to verify their identity. Developing these features, and opening them up to non-paying users, sets Bumble and Badoo apart as companies evidently taking steps to design their digital features for community benefit.
However, our research shows once again that Bumble need to do more to improve the quality of their privacy and cybersecurity messaging. The content of Bumble’s privacy policy is better than some: they provide information on the data used by their matching algorithms, and inform the reader that they use a mixture of automated processes and a team of human moderators to remove content which violates their community standards. However, it takes a lot of searching to find out that they retain any data collected on a user for three years for record-keeping purposes, with no clear option for deletion. Dating platforms collect and store incredibly sensitive personal information about their users, and it is therefore absolutely essential that they establish appropriate measures to protect the privacy and security of personal data, and communicate these measures in a way which is easy for users to access and understand.
Our research found nothing on Bumble’s cybersecurity measures, the layout of their privacy policy makes information hard to find, and the depth of the information provided is insufficient.
Grindr has made a name for itself as the largest LGBTQ+ social network in the world, and is the only dating platform included in this report that is not owned by either Match Group or Bumble Inc. Grindr is one of the few companies with detailed information about its corporate governance structure – the first small step towards sufficient transparency reporting. They have published information about the membership of their board of directors, and made governance documents, such as committee charters and their code of business conduct and ethics, publicly available. Whilst the same is true of the governance structures of the conglomerates Match Group and Bumble Inc., there is very little corporate governance information provided by their individual subsidiaries.
A recurring theme amongst all the dating platforms we have covered is that their privacy and cybersecurity standards do not meet the level one would expect of any company, let alone companies handling information as sensitive as that hosted by dating platforms. Grindr provides a sexual health section on a user's profile, which allows them to display their sexual health status, such as their HIV status and most recent test date. This is a hugely helpful feature, but it again highlights how dating platforms often host data that most users wouldn’t provide anywhere else.
Given that Grindr stores information on a user's sexual health status, alongside the fact that many of its users live in countries where homosexuality is highly stigmatised, illegal, and in some cases punishable by death, it is especially important that their cybersecurity measures demonstrate a gold standard not just for dating platforms, but for any and every industry.
Grindr is currently falling far short of this benchmark. Whilst they provide some detail of a bug bounty programme used to stress test their security measures, they fail to offer any detail on how they are reinforcing those measures in preparation for the increasingly sophisticated cyberattacks enabled by advances in AI capabilities. They do offer a comprehensive security guide to help users maximise their online and offline safety when using the app. However, this shifts the locus of responsibility onto the user to protect their own personal data. Measures taken by the user to protect their information should be the very last line of cyber defence, preceded by a comprehensive, regularly internally and externally audited company cybersecurity process that is outlined transparently in a public-facing report.
Tinder is synonymous with online dating in a way that comes close to Google’s synonymity with online search, and is responsible for introducing the dating platform ‘swipe’ format that we’re now so accustomed to. Our research has registered some significant improvements in the governance practices at Tinder over the past quarter, specifically in their cybersecurity messaging, which significantly surpasses the quality of all of the other dating platforms included in this report, including their fellow Match Group subsidiaries.
Whilst all of Match Group’s subsidiaries have security webpages which provide a limited degree of detail about their cybersecurity practices, Tinder’s security webpage provides a much more thorough explanation of their internal security practices. Tinder commits to maintaining “a robust, transparent and accountable security programme”, and the measures they detail indicate that this is more than just hot air. They provide specific detail on each aspect of their security programme, including their internal information security programme, their application development security, their internal red teaming processes, and their threat monitoring systems. They are also the only dating platform to provide public access to their ISO 27001 security certification, and they offer information on their yearly independent auditing to ensure compliance with industry-standard security requirements. Tinder also sets itself apart from its sister companies by publishing details of the make-up of its executive team.
However, Tinder's privacy policy is a carbon copy of the other Match Group platforms’ – that is, average and in need of a glow-up. It isn’t that the necessary information is missing from Tinder’s privacy policy, but that it isn’t laid out in a way which encourages users to engage with it.
Plenty of Fish is another Match Group offering. There is a frustrating lack of information from any of the Match Group subsidiaries on their individual governance practices, and this problem extends to Plenty of Fish – the overall worst performer in our dating platform study.
Whilst they have a dedicated security page, it falls short of the quality of Tinder’s, with very little detailed information about their cybersecurity practices or their internal and external auditing processes. The security page’s biggest strength is its passing mention of a bug bounty programme, but they offer no detail on how frequently it is carried out, or how they implement changes to their security measures in light of bounty insights. Their poorer security performance is reflected in the fact that whilst Tinder have achieved ISO 27001 security certification, other Match Group companies such as Plenty of Fish haven't. Given the size of Tinder, it’s definitely a good thing that they have upped their security standards. But it’s puzzling that Match Group haven’t dedicated resources to improving the cybersecurity measures of their dating platforms across the board.
To learn more about collaborative filtering, explore this webpage: https://monstermatch.hiddenswitch.com/