Snapchat Has Become A ‘Haven For Child Abuse’ With Its ‘Self-Destructing Messages’

May 26, 2019, 05:13am EDT


Zak Doffman, Contributor, Cybersecurity

Not a good week for Snapchat. On Thursday, Motherboard reported that “several departments inside social media giant Snap have dedicated tools for accessing user data, and multiple employees have abused their privileged access to spy on Snapchat users.” And now the Sunday Times has published an investigation into allegations that predators are “flocking” to the social media platform, which has become a “haven for child abuse.”

Motherboard’s article cited two former employees who claimed that “multiple Snap employees abused their access to Snapchat user data several years ago.” This included the use of “internal tools that allowed Snap employees to access user data, including in some cases location information, their own saved Snaps and personal information such as phone numbers and email addresses.”

SnapLion, one of the tools referenced in the Motherboard article, was designed to gather information for “valid law enforcement requests.” Claims that this tool was involved in the alleged misuse have not been verified.

A Snap spokesperson told me that “any perception that employees might be spying on our community is highly troubling and wholly inaccurate. Protecting privacy is paramount at Snap. We keep very little user data, and we have robust policies and controls to limit internal access to the data we do have, including data within tools designed to support law enforcement. Unauthorized access of any kind is a clear violation of the company’s standards of business conduct and, if detected, results in immediate termination.”

Ironically, it is this limited user data that is central to the Sunday Times investigation. The newspaper’s investigation has uncovered “thousands of reported cases that have involved Snapchat since 2014,” including “pedophiles using the app to elicit indecent images from children and to groom teenagers,” as well as “under-18s spreading child pornography themselves.” This has now resulted in U.K. police “investigating three cases of child exploitation a day linked to the app, [with] messages that self-destruct allowing groomers to avoid detection.”

The Sunday Times quotes Adam Scott Wandt from John Jay College of Criminal Justice in New York calling Snapchat a “haven” for abusers, arguing that the “self-destruct” nature of Snapchat’s messages “makes it difficult for the police to collect evidence.”

Wandt claims that in this way “Snapchat has distinguished itself as the platform where abuse of children happens… The problem was that adults realized you could do a simple Google search and find out that most Snapchat messages are unrecoverable after 24 hours, even by law enforcement with a warrant.”

The U.K. children’s charity, the NSPCC, rates Snapchat as a high risk, with a spokesperson for the charity explaining that predators intent on grooming children “cast the net wide in the expectation that a small number of children will respond.”

The charity has also warned on self-generated images taken and shared by children themselves. “As soon as that image is shared or screenshotted, the child loses control over it… those images may start on a site like Snapchat, but they could very easily end up circulating among technologically sophisticated offenders, making their way onto the dark web.”

Snap told me that “we care deeply about protecting our community and are sickened by any behavior which involves the abuse of a minor. We work hard to detect, prevent and stop abuse on our platform and encourage everyone – young people, parents and caregivers – to have open conversations about what they’re doing online. We will continue to proactively work with governments, law enforcement and other safety organizations to ensure that Snapchat continues to be a positive and safe environment.”

A similar investigation in March focused on Instagram, with the NSPCC claiming that Facebook’s photo-sharing app has become the leading platform for child grooming in the country. During an 18-month period to September last year, there were more than 5,000 recorded crimes “of sexual communication with a child,” and “a 200% rise in recorded instances in the use of Instagram to target and abuse children.” The charity’s CEO described the figures as “overwhelming evidence that keeping children safe cannot be left to social networks. We cannot wait for the next tragedy before tech companies are made to act.”

This latest investigation makes the same point and comes a little over a month after the U.K. Government published proposals for “tough new measures to ensure the U.K. is the safest place in the world to be online,” claiming these to be the world’s “first online safety laws.” The proposals include an independent regulator with the “powers to take effective enforcement action against companies that have breached their statutory duty of care.” Such enforcement will include “substantial fines” as well as, potentially, the powers “to disrupt the business activities of a non-compliant company… to impose liability on individual members of senior management… and to block non-compliant services.”

The regulation of social media has been in and out of the headlines for most of this year. The prevalence of social media use by under-age children, and the risky interactions those children expose themselves to, has been one of the most disturbing aspects disclosed thus far. Regulation is coming. But the open question is how the platforms can prevent users from deliberately circumventing their security controls with little understanding of the risks they might then face.



Zak Doffman

I am the Founder/CEO of Digital Barriers—developing advanced surveillance solutions for defence, national security and counter-terrorism. I write about the intersection of geopolitics and cybersecurity, and analyze breaking security and surveillance stories. Contact me at zakd@me.com .


RETRIEVED https://www.forbes.com/sites/zakdoffman/2019/05/26/snapchats-self-destructing-messages-have-created-a-haven-for-child-abuse/#595142c2399a

4 WAYS PEDOPHILES EXPLOIT INSTAGRAM TO GROOM KIDS

April 19, 2019 / Chris McKenna

Pedophiles trade Child Porn through Dropbox Links on Instagram

The Atlantic first reported that teenagers stumbled upon a network of Instagram accounts that were sharing Dropbox links of child porn (Atlantic article). The way it worked is that pedophiles were using certain hashtags on images that advertised how to get in touch. Teens discovered this and proceeded to spam the offending hashtags with hundreds of memes, making it difficult for pedophiles to find each other and trade illegal content.

Brilliant. Kids defending other kids! 

And although it was an admirable diversion, unfortunately these criminals are resourceful. And with over a billion monthly users, it’s impossible for Instagram to keep pace with nefarious activity.

Maybe your kid already uses Instagram. Great! I’m not saying you need to rip it away. In fact, that is often counterproductive. Instead, we hope this post will help you better understand that the way the app is designed creates risks.

Because remember, not all kids using Instagram end up being groomed and abused.

But if grooming and child exploitation are easy on the app, my guess is you would want to know. Even CNN recently reported that Instagram is the #1 app for child grooming.

If your son or daughter receives a private DM (direct message) from a stranger, does he/she know how to respond? It happens more easily than you think. Remember, wherever the kids are is where the predators are.

Instagram Direct Message

We simply want this post to flash a light in dark places. Since Apple’s App Store description doesn’t say anything about predatory activity, it’s our job to tell the truth.

Warning: Some of the screenshots you will see in this post are not safe for work (NSFW) and include some of the most disturbing content we’ve ever encountered in over four years of researching social media. Nothing has been censored.

Four Grooming Paths on Instagram – Comments, Hashtags, Likes, and DMs

If Instagram leadership reads this post, they’ll try really hard to point to their community guidelines and their reporting channels, saying that they don’t allow predatory activity. But we would argue that the very way in which Instagram is designed creates grooming pathways. In other words – no amount of moderation or guidelines can change Instagram’s features. Allow us to explain.

Oh, and one more thing. Many parents who read this might think, “my child has a private account, so they’re fine.” That’s a common but incorrect conclusion. None of the four feature issues we discuss below are impacted in any way by the privacy of an account. Anyone, whether their account is private or not, can post comments and search hashtags, and anyone can be seen through the like count and sent a message via DM.

Pedophiles exploit Instagram’s comments to network with each other and fish for victims.

Instagram Comments Header

Within the comments, pedophiles find other pedophiles and peddle their illegal and disgusting content with each other. Here are a few samples from an endless number of comments (warning: these comments are extremely disturbing):

Pedophiles Exploit Instagram - Comments

You also see comments aimed directly at young people as a form of “fishing” for victims, waiting for a kid to bite.

Pedophiles Exploit Instagram - Comments

Pedophiles exploit Instagram’s hashtags to drop horrible content into good, clean places.

Instagram Hashtags Header

Almost all social media platforms use #hashtags. Think of them as a card catalogue for social media content – a way to categorize millions and millions of images into groups so that I can find exactly what I’m looking for. We love them! Some people use them as a sort of witty, second language.

But the problem is that they can be used by anyone.

Let’s say for a minute that I’m a teen girl who’s interested in modeling. Or cheerleading. And my mom even made me have a private Instagram account (good job, mom!).

I take a photo at the beach with my friends, and I attach the hashtags #teen #teengirl #teenmodel #snapchat. Fabulous. Later on, with my girlfriends, I’m thumbing through the #teenmodel and #snapchat hashtags, and I see this:

Instagram Hashtags Grooming (for blog)

See, any predator can attach #teenmodel and #snapchat to their photo. This allows that photo to show up in front of millions of teen girls, thumbing through #snapchat photos, hoping one will “bite.”

Notice in the one photo how part of the “sell” is to convince a girl to join him on Snapchat, which is a very secure environment for secretive activity. After all, >75% of teens have Instagram and >76% of teens have Snapchat (AP Article), so there’s a good chance that if a kid has one, they probably have the other.

In other words, #hashtags allow predators to hover over good places like a drone and drop their smut whenever they want. Pay attention to those screenshots – there’s nothing pornographic about them. There are no swear words. No use of “sex.” But the very nature of #hashtags as a feature creates this grooming path.

Instagram Hashtags - Grooming

And if someone reports the “daddy” posts you see above and Instagram takes them down, no problem. Since Instagram doesn’t require any identity verification, including birthday, real email, credit card, NOTHING, a predator can create another fake account in seconds. This is yet another huge design flaw that creates a situation where pedophiles don’t mind taking great risks and getting shut down – their attitude is, “I’ll just start over.”

[Note: we experienced this with “daddy,” who we reported multiple times. His account would be shut down, and then he popped up with a slightly different username seconds later, posting the same horrifying images of him masturbating and asking kids to connect with him “live.”]

Related post: We Tested Instagram’s “No Nudity” Rule. We Can’t Show You the Results

Predators exploit Instagram’s likes (the heart) to identify potential victims.

Instagram Likes Header

Going back to our #teenmodel example, if you click on one photo, you might find that it has hundreds of likes (hearts) similar to the photo of the young boy below (sorry, but if you don’t want your photo in blog posts, then keep your account private).

Predators can click on the likes and see everyone who has liked this photo. Everyone. Even if they have a private account. From that list, a predator can identify someone young who looks interesting and send him/her a direct message (DM) – we’ll explain the whole DM feature in more detail next. But, note how the “likes” feature creates a target audience for sexual predators. This is shown in the image below.

Pedophiles Exploit Instagram - Likes

Again, it’s a design flaw. The very nature of the likes feature creates a pool of young people for predators to target (to Instagram’s credit, the company is considering dropping the “like” count attached to photos, but so far, this is only speculation).

Which leads us to DMs. Direct Messages.

Pedophiles exploit Instagram DMs (direct messages) to groom kids. And they’re doing it very successfully.

Instagram DMs

Two weeks ago, PYE created a test Instagram account. This account was clearly for a young girl, who posted two selfies on the first day of its existence. Tagged on these photos were the hashtags #teen, #teengirl, #teenmodel. This account went out and “liked” a few photos with similar hashtags and followed accounts that were like ours.

Not much happened for the first six days of the account.

Then, one week later, something in Instagram’s algorithm triggered. It was as if some combination of the test account’s activity unleashed a tsunami of DM activity that hasn’t let up over the past four days, averaging over 10 DMs per day. The screenshots below show some of the activity, including a very creative porn link. Note – PYE is the one who scribbled out the man masturbating in the image below. The photo was sent to our test account as a DM, completely exposed.

Can Instagram Fix their Predator Problem?

Maybe. In order to clean up the issues above, Instagram would have to significantly alter numerous core features. If Instagram were to create a “Safe Mode,” it might have to:

  1. Remove the ability to DM to or with anyone who isn’t an approved follower.
  2. Allow parents to create a whitelisted set of contacts. That means the child can ONLY like, comment, and DM with people who are on the whitelist.
  3. Remove the ability to add hashtags.

I just don’t foresee Instagram making those changes.

What Can Parents do About the Instagram Pedophile Problem?

1. If your kid uses social media, including Instagram, be curious and involved. Remember, not every kid misuses these platforms. But, if you know the risks, then get involved and talk openly with your children about how they’re using the app.

2. Use monitoring tools like Bark (7-days free!) and Covenant Eyes (30-days free!) to monitor their smartphone social media and texting activity. Bark actually monitors images within the app for appropriateness and alerts parents when kids venture into inappropriate images.

Bark Parental Controls

3. Talk to your kids specifically about direct messages and give them guidance for what to do if someone tricky reaches out to them.

4. Visit our FixAppRatings.com campaign and push for change. Let’s embrace the reality that, given Instagram’s current feature set, it’s a 17+ app. It’s an app created by adults and for adults. Will you visit your state representative this month to share your concerns? Show him/her our draft #fixappratings resolution.

The only way anything will change with big tech companies is if the government does something. We’re convinced of it.


Parents, we love BARK and how it helps parents AND kids. Here’s a real story…

“We knew our son was having some issues with school and in his social circle but he doesn’t talk to us about anything…he googled “What is it called when there’s a war going on inside your brain?”…The fact that he used the word “war” prompted BARK to mark it as violence…Call it depression or anxiety or regular mood swings teens experience, he wasn’t opening up to anyone about this and never mentioned it…I have a psych evaluation setup for him in a few days and I just have to say how grateful I am that BARK caught this. I would otherwise have no idea that this was even an issue for him and we can now get some professional help to ensure that it doesn’t become a true problem.”


Parents, do you want a better idea of what your kids are doing on social media? What about the comments on your daughter’s Instagram photos? Or, iMessage activity on your son’s iPhone? Then, look no further than Bark. You can start a 7-day free trial today.


*Note – links in this post might connect to affiliates who we know and trust. We might earn a small commission if you decide to purchase their services. This costs you nothing! We only recommend what we’ve tested on our own families. Enjoy!


Chris McKenna

I love life. Seriously! Each. Day. A. Gift. Former CPA, business advisor, youth pastor, development director. Manage marketing efforts for Covenant Eyes and CEO of PYE. God shares wild ideas with me about life while I run. I have a relentless drive to help families use technology well.


RETRIEVED https://protectyoungeyes.com/4-ways-pedophiles-exploit-instagram-groom-kids/