- The night prior to pre-announcing, a horrifying lawsuit was filed against MEET for gross negligence involving an alleged child kidnapping/rape incident stemming from a MeetMe encounter.
- Recent case precedent disfavors MEET – a similar lawsuit filed against Facebook in the same county survived a motion to dismiss in May 2019 and appears to be heading to trial in 2020.
- The lawsuit aims to hold MEET accountable for “knowingly and willingly allow[ing] children under the age of 13 to use the mobile application without any prevention or oversight”.
- In recent weeks, MEET quietly rolled back new safety features implemented in June 2019, causing us to question whether MEET is putting revenues ahead of safety.
- MEET plays up its involvement with a safety-oriented non-profit called FOSI – therefore a public statement on a July 2019 Northland Securities call from FOSI’s founder regarding young women and dating deserves significant scrutiny.
Meet Group Sued in Harris County
In recent months, The Meet Group (MEET) has repeatedly touted alleged enhancements to its platform safety. In fact, it has issued press release after press release since June 2019 promoting its safety measures, culminating in an October 3, 2019 press release in which the company claimed that the headwinds it faced from implementing new safety features are now largely behind it.
What the company did not mention in the October 3, 2019 pre-announcement is that one day earlier, on October 2, 2019, the company was sued in Texas court by the parent of a pre-teen victim of an alleged kidnapping and sexual assault that allegedly arose out of a MeetMe encounter. The lawsuit can be accessed here. The lawsuit is graphic in nature.
Source: Harris County, Texas Case 2019-72363 / Court: 215
We think this lawsuit highlights the risk of putting revenues ahead of user safety and the novel legal theories emerging to hold online platforms liable for vetting their users. The lawsuit also suggests that the company is far from being “off the hook” as it relates to its relationships with Apple (and to a lesser extent Google), as troubling press surrounding the company is clearly not going away anytime soon. Even the company’s recent share repurchase deserves significant scrutiny – in an effort to appease shareholders, the company has spent over $12 million on buybacks since June 2019, capital that could otherwise have been deployed to improve user safety.
The lawsuit is noteworthy because it closely resembles a suit brought against Facebook last year in the same county. A Jane Doe victim (14 at the time of the purported incident) sued Facebook in October 2018, alleging that Facebook was grossly negligent, effectively allowing her to become the victim of a sex trafficker. Facebook moved to dismiss the suit in March 2019. However, Facebook’s attempt to dismiss the suit was unsuccessful – the judge’s May 2019 ruling allowing the suit to move forward can be found here.
Source: Harris County, Texas, 334th Judicial District, No. 2018-69816
The analogous Facebook case appears to already be in discovery and is scheduled for trial next year:
It goes without saying that The Meet Group is not Facebook. The fact that Facebook – with arguably the world’s best attorneys – was unable to get a similar suit dismissed at the pleading stage does not bode well for MEET’s position. As a result, we believe the October 2 lawsuit against MEET may have teeth. The broader implications of the lawsuit may require substantially more investment in safety at MEET.
The question of holding online dating companies liable for the users they host is attracting increasing attention in society as online dating continues to grow rapidly.
If you invited people to a dinner party knowing your home was unsafe and did not properly disclose the safety risks to your guests, you could be held liable for injuries they suffer. Why should social media portals not be held accountable for knowingly hosting predators who go on to harm other guests?
With MEET Now Facing Lawsuit Over Its App’s Safety, Why Has It Quietly Rolled Back Key Safety Feature Touted in June 2019?
MEET’s pre-announcement attempted to paint the company’s safety issues as legacy problems; we think the company is far from done investing in safety. We also think the company is having a hard time implementing safety changes without impacting revenues. For example, in June 2019, the company touted its “one-click” reporting feature. Notably, while the iPhone version of MEET’s apps continues to allow users to report problems with one click, the Android app quietly rolled back the one-click feature in the past few weeks. As a reminder, Android (Google) represents a higher proportion of MEET revenues than iPhone (Apple). When the company first issued 3Q guidance, it called out the impact certain new safety features were having on the top line, making it clear that revenues were in fact impacted by safety tweaks.
Reporting content problems now brings up a second “are you sure” screen in the Android version of the company’s app that does not appear in the iPhone version of the app:
Source: MEET App on Android phone retrieved 10/8/19
Without mentioning that it has rolled back some safety features, the company now claims that its safety problems are behind it.
For a period of time following rumors that Apple was considering a MEET ban, the company also did away with its “New Streamer” tab, which, based on our time on the app, tends to be the most problematic of all tabs. New streamers appear more likely to stream vulgar or prohibited content. Often, new streamers were streamers who had previously been kicked or banned and were simply returning to MEET under a new user handle to bypass a prior account block. The company quietly reinserted the New Streamer tab months ago (but after 3Q19 guidance was issued) into both versions of its app. In light of the recent lawsuit against the company, we view this decision as a key risk. We also believe the reinstatement may be due, at least in part, to top-line pressure caused by removing this problematic feature – our review of message boards suggests that the New Streamer tab is popular with the users actually buying virtual currency on MeetMe, given the propensity of new streamers to take risks in order to make quick money.
We also just spent a whopping five minutes on MEET in the 36 hours prior to publishing this story to see if the company has actually cleaned up the content. Reality? Comment streams are still full of extremely vulgar content. Users are still regularly soliciting streamers to expose their body parts or are posting their sexual fantasies publicly. User profiles also contain language and emojis that any AI filter should have been able to catch as vulgar (e.g. phallic symbol emojis). One user bio even proposed a specific price tag for “video call sex”. While MEET appears to have added a “Modbot” warning post when you first open a stream, the alleged Modbot did not immediately kick users we saw who were making very obviously vulgar comments. We are not sure if Modbot is legitimate AI or just a psychological warning system.
Just this morning, we found four separate users with publicly displayed biographies that exclusively referred to word descriptors of their genitalia. We are leaving the screenshots out of the story but again encourage readers to just take a trip to the MEET family of apps, open a “New” streamer tab, and browse the profiles.
These issues are troubling because the content we found was largely static rather than dynamic. From our conversations with experts in social media content moderation, we are led to believe that filtering static content is significantly easier than filtering dynamic content because AI is more effective on static content. MEET has also publicly claimed that it uses AI systems to monitor content via regular screenshots – we have a hard time understanding how the content we identified above could have passed a robust content filter.
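To illustrate how simple static filtering can be, consider the minimal sketch below. This is entirely our own illustration – the blocked terms, the emoji set, and the function are hypothetical placeholders we made up for this story, not MEET’s actual moderation rules, which are not public:

```python
# Illustrative sketch only: a naive static-content filter of the kind that
# should trivially catch the profile text we observed. The term list and
# emoji set below are hypothetical placeholders, not MEET's actual rules.
import unicodedata

BLOCKED_TERMS = {"video call sex", "send nudes"}  # placeholder phrases
BLOCKED_EMOJI = {"\U0001F346"}                    # e.g. the eggplant emoji

def flags_static_bio(bio: str) -> bool:
    """Return True if a static profile bio trips the naive filter."""
    # Normalize lookalike characters and case before matching phrases.
    normalized = unicodedata.normalize("NFKC", bio).casefold()
    if any(term in normalized for term in BLOCKED_TERMS):
        return True
    # Separately scan for blocked emoji characters.
    return any(ch in BLOCKED_EMOJI for ch in bio)
```

Even a crude rules pass like this runs in microseconds per profile; the bios we quoted above would not survive it, which is why the claimed AI screenshot monitoring is hard to square with what we found.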
In light of the lawsuit that we found, it is also noteworthy that you do not need to look hard to find recent examples of MeetMe hosting streams with users who appear far too young to be on the platform. Because some of the content is incredibly sensitive and includes graphic images of men exposing themselves to what appears to be a minor, we are NOT posting screenshots in this story but recommend that readers who want verification of our claim visit Twitter and visit the profile for user @MeetMeMod. The particularly troubling posts are dated August 2019, after the company allegedly improved its safety features.
The posts from Twitter’s @MeetMeMod are graphic and include actual visuals of adult male genitalia being exposed to what appears to be an underage streamer. The stream apparently lasted for a long enough duration that the Twitter user was able to capture multiple screenshots, suggesting that moderators were slow to act.
In the recent pre-announcement press release, Geoff Cook MEET’s CEO stated:
“We believe that the moderation-related headwinds related to our recent safety enhancements are largely behind us.”
– Geoff Cook, CEO, MEET
This suggests to us that MEET does not believe it needs to do much more work on the safety side. Yet in this story, we have surfaced a host of actionable and recent examples of clearly objectionable content that should not have passed any robust filtering system, and surfaced evidence that MEET actually rolled back safety features after disclosing them to the market.
We believe part of the problem with MEET is the company’s fundamental philosophy on safety, as evidenced by the anecdote below relating to the Family Online Safety Institute (FOSI), an organization that MEET looks to for “safety guidance”.
Meet Group Touts Safety Commitment While Relying on Out-of-Touch Safety Expert
Stephen Balkam is the head of FOSI, the Family Online Safety Institute. FOSI is a non-profit that claims to focus on the “advancement of the protection of the public, particularly children and young people, to protect them from harm arising from contact with unsuitable media on the internet and similar media”.
For those unfamiliar, MEET has proudly touted its involvement in FOSI as a sign of its strong safety standards. Northland Securities, a brokerage firm, even hosted an online dating safety panel (replay of webcast HERE) during which Geoff Cook of MEET participated alongside Balkam. During the recorded webcast, Balkam played up his involvement with MEET, stating that he has been “involved with Skout and Meet Group since 2012” and that he is “very impressed with both the technical but also policy steps they’ve taken”.
While we disagree with that statement, we chalk it up to corporate puffery on the part of the company’s outside advisor. What particularly struck us, however, was a comment Balkam made during the July webcast (again, available for anyone to listen to), around the 19:30 mark.
As an aside: yes, what Balkam said on the call is absolutely fair game as it relates to MEET shares. Balkam appeared on a stock-oriented conference call that was set up by Northland Securities to appeal to individuals interested in MEET shares. MEET also PR’d the existence of the call. MEET continues to be plagued by safety concerns, and it was just sued for allegedly operating a social network in which “there exist no safety features built into the mobile application interface or functionality for the protection of minor children” (para. 8 of Washington et al v. Meet Group). Therefore, what Balkam had to say is absolutely relevant both to MEET shares and to the business culture at MEET.
Our transcription of the relevant excerpt is provided below (starts approx. min 19:30):
“We’re seeing a generational shift in the way in which young people are approaching dating apps in particular, but social media more generally… I think it would be extremely rare for a young woman to go and meet someone for the first time without taking a friend with her. That is just so common that they would look at you as if to say, well, of course… having said that, something that researcher Danah Boyd has often stressed is that there are at-risk kids and at-risk young adults that will play out in a dangerous way both online and offline, and those are the cases that of course no number of safety features will necessarily capture.”
– Stephen Balkam, Head of the Family Online Safety Institute, on Northland Securities recorded conference call (around the 19:30 mark)
When we were listening live to the July 2, 2019, Northland Securities “Safety Panel Discussion”, we almost fell out of our chairs in reaction to the statement above. There has certainly been a generational shift towards online dating. However, the notion that it is “extremely rare for a young woman to go meet someone for the first time without taking a friend” is totally out of touch and connotes shades of victim blaming. As members of the generation that participated in the shift to online dating, we can unequivocally state the following:
No, Mr. Balkam, it is NOT the norm for a young woman to take a friend to meet someone for the first time offline. Your out-of-touch comment says to us that you have never actually held any such conversations with young women.
We think that Mr. Balkam is by implication putting the onus on young women to remain safe rather than holding a) perpetrators of crimes accountable, and b) holding mobile apps that host predators accountable.
In our opinion, these types of views are completely and utterly out of touch and suggest that Balkam is an apologist for the tech industry’s bad actions rather than a safety advocate.
You can listen to Balkam’s actual words in this recorded webcast.
In our opinion, his comments fundamentally disqualify him from providing safety advice to a dating app.
Source: MEET Business Wire
We asked FOSI for comment on Mr. Balkam’s statement prior to publication but did not receive a reply. If Mr. Balkam changes his mind and provides a reply, we will post it on Twitter.
We believe that the Harris County lawsuit against MEET needs to be considered in context of Mr. Balkam’s statement.
We suggest juxtaposing Mr. Balkam’s comment about “no number of safety measures” being enough with the allegations in the lawsuit against Meet Group, in which the plaintiff alleges that the sexual predator was communicating with the victim for “several weeks” via MeetMe. Certainly some form of user communication monitoring could be implemented to significantly reduce these types of incidents. Certainly the app could require users to provide a credit card number in order to attempt to age-verify its users and at least better attempt to confirm their identities.
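As one hedged illustration of what such monitoring could look like, the sketch below flags prolonged message threads between a self-reported adult and a self-reported minor for human review. The data model, field names, and 14-day threshold here are entirely our own assumptions for illustration, not a description of MEET’s actual systems:

```python
# Hypothetical sketch: escalate adult-minor message threads that persist
# past a review window. The schema and threshold are our own illustration.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Thread:
    user_a_age: int        # self-reported age of first participant
    user_b_age: int        # self-reported age of second participant
    first_message: datetime
    last_message: datetime

def should_escalate(t: Thread, window: timedelta = timedelta(days=14)) -> bool:
    """Flag threads pairing a minor with an adult that span the window."""
    ages = (t.user_a_age, t.user_b_age)
    adult_minor = min(ages) < 18 <= max(ages)
    return adult_minor and (t.last_message - t.first_message) >= window
```

The point of the sketch is simply that a conversation allegedly spanning “several weeks” is exactly the kind of slow-moving signal a platform could catch with even rudimentary rules, contrary to the suggestion that no number of safety features would have helped.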
One thing is clear. The Meet Group has touted its ties to FOSI. It has trotted FOSI out as some form of “rubber stamp” that suggests Meet Group is on the cutting edge of safety issues. With Mr. Balkam making comments like the one above in public settings, we believe that he should be disqualified from serving in any safety advisory capacity to a dating website. We offered Balkam an opportunity to defend his comments and did not receive a reply.
Notably, CEO of MEET Geoff Cook was present on the conference call during which Balkam made the comments above and undoubtedly heard the same statements.
We think the right decision for Mr. Cook following that call was to pull out of FOSI.
Our research continues to show that MEET’s content moderation cannot even remove basic, static vulgar content from profiles in a timely manner. Users are able to bombard live streamers with vulgar comments, with MEET’s purported “Modbot” apparently unable or unwilling to kick these users from rooms. The company’s age verification appears to remain utterly lacking, with apparently underage streamers still appearing on the company’s social media platforms. MEET’s ties to FOSI are also troubling given Balkam’s July 2019 commentary. In light of the recent lawsuit against MEET, we think the coast is far from clear, and we continue to expect significant pressure on the company’s ability to provide a safe network for users. The company could have spent its excess cash flows on safety enhancements but instead initiated a large share repurchase. We think that decision is likely to come back to haunt the company in coming quarters.