Grooming



Content warning: This page contains information that readers may find confronting or distressing.

Help is available if you or someone you know has experienced or is at risk of child sexual abuse. Our Get support page has a list of dedicated services if you need help or support. For information on reporting child safety concerns, visit our Make a report page.

If you or a child are in immediate danger, call Triple Zero (000).

In order to keep children and young people safe, it is important to understand what grooming is and how to prevent it. The term ‘grooming’ refers to behaviours that manipulate and control a child, as well as their family, kin and carers, other support networks, or organisations in order to perpetrate child sexual abuse.

The intent of grooming is to:

  • gain access to the child or young person to perpetrate child sexual abuse
  • obtain sexual material of the child or young person
  • obtain the child or young person’s trust and/or compliance
  • maintain the child or young person’s silence, and/or
  • avoid discovery of sexual abuse.1

Grooming can occur online or in-person. Online child grooming is the process of establishing and building a relationship with a child or young person while online, to facilitate sexual abuse that is either physical (in person) or online.2 This is achieved through the internet or other technologies such as phones, social media, gaming, chat and messaging apps.

Online grooming may involve perpetrators encouraging children and young people to engage in sexual activity or to send the perpetrator sexually explicit material. It may lead to perpetrators meeting the child or young person in person or blackmailing them to self-produce explicit materials. To evade detection while grooming children and young people, perpetrators may also convince them to use different online platforms, including those using encrypted technologies.3 Encrypted technologies are used to protect data from being stolen, changed, or compromised by scrambling data into a secret code that hides the information’s true meaning. Only a unique digital key can unlock the secret code.
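
As a concrete illustration of the 'secret code' and 'unique digital key' idea, here is a minimal sketch of symmetric encryption in Python. It assumes the widely used third-party cryptography package and illustrates the concept only; it is not the scheme any particular app or platform uses.

```python
# Minimal sketch of symmetric encryption (assumes: pip install cryptography).
# Illustrates the concept described above; not any specific app's protocol.
from cryptography.fernet import Fernet

key = Fernet.generate_key()            # the "unique digital key"
cipher = Fernet(key)

scrambled = cipher.encrypt(b"hello")   # data scrambled into a secret code
print(scrambled)                       # unreadable without the key
print(cipher.decrypt(scrambled))       # only the key unlocks it -> b'hello'
```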

Socialising online is a great way for children and young people to build friendships and have fun, but it is important to ensure online technologies are being used in a way that keeps children and young people safe. You can find resources about how to stay safe online on the eSafety website.

How grooming occurs

Child sexual abuse and grooming can occur within families, by other people the child or young person knows or does not know, in organisations, and online. Behaviours related to grooming are not necessarily explicitly sexual, directly abusive or criminal, and may be consistent with behaviours or activities in non-abusive relationships. They can often be difficult to identify and may only be recognised in hindsight. In these cases, the main difference between acceptable behaviours and grooming behaviours is the motivation behind them.4

Grooming of a child or young person, online or in-person, may include:

  • building their trust, sometimes through special attention or gifts
  • treating them like an adult to make them feel different and special
  • gaining the trust of their parents, family or carers
  • isolating them from supportive and protective family and friends
  • coercing them, including through threats, stalking and asking them to keep secrets
  • manipulating them to blame themselves for the situation
  • encouraging them to produce child sexual abuse imagery or enticing them to participate in sexualised virtual chats
  • non-sexual touching of the child or young person that develops into sexual behaviour over time.

Signs of grooming

Being aware of the signs of grooming can help protect children and young people from child sexual abuse. A child or young person may show signs of being a victim of grooming in different ways. They may show all or some of the following signs:

  • developing an unusually close connection with an older person
  • having gifts or money from new friends that they cannot account for
  • being very secretive about their phone, internet or social media use
  • going missing for long periods of time
  • appearing extremely tired, including at school
  • being dishonest about who they have been with and where they have been
  • substance misuse
  • assuming a new name, having false identification, a stolen passport or driver licence, or a new phone
  • being collected from school by an older or new friend.5

How to prevent grooming

Teaching children and young people what is appropriate and inappropriate contact (both online and offline), and encouraging open and honest communication, without shame or stigma, will help to better protect them. This includes supporting children and young people to:

  • understand safe and unsafe behaviours and situations, including being able to identify early warning signs and their body’s natural reactions when they feel unsafe, worried, or scared. These may include feeling butterflies, and having sweaty palms and a racing pulse
  • practise safe online behaviour, including deleting and blocking requests and messages from people they don't know, and reviewing and updating privacy settings
  • know what to do and who to talk to if something feels uncomfortable, as well as what support services are available if they are unsure or if something has happened
  • say no to requests to engage in unsafe behaviours or sexual advances
  • block unsafe users, make a complaint to social media companies and report online grooming
  • understand body boundaries, respectful relationships and consent
  • feel safe and protected when disclosing what is happening to them.

What to do about suspected grooming

Your child may not understand they are being groomed, and may not tell you that they are being groomed directly. It is important to understand the signs of grooming and talk to your child if you notice changes in their behaviour and suspect something isn’t right.

If you suspect a child or young person is being groomed or is at risk of being groomed, contact your relevant state or territory child protection agency. Visit our Make a report page to find out more.

You can also report online grooming or inappropriate contact to the Australian Centre to Counter Child Exploitation.

For information about what to do if something goes wrong online, visit the eSafety website.

Our Get support page provides a list of dedicated support and assistance services.

Helpful resources

eSafety is Australia's national independent regulator and educator for online safety. It provides tools and resources for parents and carers to help keep children safe online, including access to free webinars covering a range of online safety issues.

For young people (secondary school age), eSafety's page about unsafe or unwanted contact has specially tailored advice. eSafety also has resources for kids (primary school age).

Educators can also use the unwanted contact and grooming scenarios with students – these are designed to start conversations that help build online safety skills.

You can also find out more about grooming on other dedicated websites.


1 Royal Commission into Institutional Responses to Child Sexual Abuse 2017, Final Report: Our Inquiry – Royal Commission into Institutional Responses to Child Sexual Abuse Volume 1, page 323.

2 ECPAT International 2016, Terminology Guidelines for the Protection of Children from Sexual Exploitation and Sexual Abuse, Interagency Working Group on Sexual Exploitation of Children. Accessed November 2020 from: https://www.ohchr.org/Documents/Issues/Children/SR/TerminologyGuidelines_en.pdf.

3 Five Country Ministerial 2020, Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse, page 4. Accessed November 2020 from: https://www.weprotect.org/wp-content/uploads/11-Voluntary-principles-detailed.pdf.

4 Royal Commission into Institutional Responses to Child Sexual Abuse 2017, Final Report: Our Inquiry – Royal Commission into Institutional Responses to Child Sexual Abuse Volume 1, page 323.

5 Victorian Department of Education and Training, Child Sexual Exploitation and Grooming. Accessed April 2021 from: https://www.education.vic.gov.au/school/teachers/health/childprotection/Pages/expolitationgrooming.aspx


RETRIEVED https://www.childsafety.gov.au/about-child-sexual-abuse/grooming

THE LIGHT OF THE WORLD CHURCH: 100 YEARS OF SYSTEMATIC ABUSE


The Light of the World Church. Photo: Google

By JESSICA GUERRERO

MORELIA, Michoacán — June 9, 2022, marked a milestone in the history of the Church of La Luz del Mundo (Light of the World), the most prolific Mexican cult that for almost 100 years has ruled the life and customs of at least 5 million people around the world (1.5 million in Mexico), all of whom witnessed the arrest and imprisonment of their leader Naasón Joaquín García.

The self-proclaimed "Apostle of Jesus Christ," Joaquín García, finally fell. The hegemony of three messianic generations of alleged abuse, unlimited power and every manner of sexual aberration within this cult, founded by Joaquín García's grandfather in the 1920s in Central Mexico, reached its most vulnerable moment after its current leader's detention in 2019.

To the faithful believers of La Luz del Mundo, Joaquín García is more than the cult's leader. He is seen as the Messiah, the envoy of God and the last apostle of Jesus Christ on Earth. However, to the long list of former cult members and now-victims who are crying out for justice, Joaquín García is a twisted mind capable of perpetrating the most terrible abuses against innocent adolescents and children.

These alleged sexual crimes against minors took place within a power relationship that Joaquín García exercised over young victims who saw in him a supreme being to whom they owed not only their spiritual fidelity but their bodies as well, to be disposed of at his discretion regardless of their young age.

Sadly, according to the victims, most of these abuses occurred before the eyes of the community and the families of the victims themselves, who agreed to have their children accompany and serve Joaquín García at all times, encouraging them to do whatever was needed to please him, as he was basically seen as God's representation on Earth.

According to allegations made by some of the victims and former members of the Light of the World Church, the administrative structure of the church even had a group dedicated exclusively to the recruitment and grooming of young parishioners, who would later be prepared to serve, entertain and intimately pleasure Joaquín García.

The Unconditionals, as the women in the circle close to Joaquín García were known, in addition to serving as escorts, recruiters and personal assistants to the so-called Apostle of Jesus Christ, enjoyed a privileged status within the church and carried out important activities, such as managing the cult's internal media content and organizing mass events, which required them to travel between the United States and Mexico on a regular basis.

However, these women were also considered exclusive to Joaquín García, and he was the one who had to give his approval so that they could marry another member of the community.

It was not until 2019 that one of Joaquín García's closest Unconditionals, Sochil Martín, gave him the kiss of Judas and exposed the cult's recurring system of child sexual exploitation to U.S. authorities, resulting in the arrest of the Apostle of Jesus Christ and two of his closest Unconditional servants, Alondra Ocampo and Susana Medina Oaxaca, both of whom were accused of complicity.

Although it was not the first sexual abuse scandal to come to light within the Church of the Light of the World since its foundation, this time the allegations had a great impact and serious consequences due to the first-hand evidence that the plaintiffs presented, as well as the extensive evidence that U.S. authorities found against Joaquín García on his personal electronic devices.

Although Joaquín García's defense has been led by Alan Jackson, one of the most sought-after lawyers in the United States, the biggest turning point came when Ocampo pleaded guilty to charges of sexual crimes against three teenagers.

According to the lawsuit filed by the victims, Ocampo took three young women to Joaquín García, inducing them, through biblical texts, to dance naked and then participate in sexual acts with Joaquín García.

Ocampo was sentenced to four years in prison for these crimes and was released in early December 2022 for having shown repentance for her actions and for having collaborated with the California prosecutors in the investigation to prosecute the cult's leader, Joaquín García.

On the other hand, Medina Oaxaca avoided jail after reaching a plea bargain with the California prosecutor; she was sentenced to one year of probation after paying $150,000 bail.

Finally, Joaquín García was accused of 19 crimes, among them sexual abuse of minors, rape, possession of child pornography and human trafficking. However, on the advice of his lawyer, the defendant pleaded guilty to three of those crimes, for lewd acts committed with a 15-year-old girl and for doing the same with another 18-year-old girl.

This strategy allowed the expected sentence for Joaquín García to be reduced to 16 years and eight months, which generated great discontent among the victims, who had expected a maximum sentence for the defendant given the seriousness of the crimes of which he was accused and the impact they have had on their lives.

However, the three accused, as well as the wife and children of Joaquín García, still face a civil lawsuit filed by the victims before the Superior Court of Los Angeles.

According to this lawsuit, Joaquín García and other members of the church systematically abused their victims, who were mostly young women, and used religion as a weapon against them. This lawsuit, filed last September, seeks payment for damages caused to the victims.

Nonetheless, in the meantime, the millions of members of this cult are demanding the immediate release of Joaquín García, who remains their leader. They insist that he is an innocent victim who has been framed and unfairly accused by the California Attorney General's Office with false evidence.


/ / /

RETRIEVED https://pulsenewsmexico.com/2023/01/27/church-of-la-luz-del-mundo-100-years-of-systematic-abuse/

Snapchat Has Become A ‘Haven For Child Abuse’ With Its ‘Self-Destructing Messages’

May 26, 2019, 05:13am EDT


Zak Doffman, Contributor, Cybersecurity

Not a good week for Snapchat. On Thursday, Motherboard reported that “several departments inside social media giant Snap have dedicated tools for accessing user data, and multiple employees have abused their privileged access to spy on Snapchat users.” And now the Sunday Times has published an investigation into allegations that predators are “flocking” to the social media platform, which has become a “haven for child abuse.”

Motherboard’s article cited two former employees who claimed that “multiple Snap employees abused their access to Snapchat user data several years ago.” This included the use of “internal tools that allowed Snap employees to access user data, including in some cases location information, their own saved Snaps and personal information such as phone numbers and email addresses.”

SnapLion, one of the tools referenced in the Motherboard article, was designed to gather information for "valid law enforcement requests." Claims that this tool was involved in the alleged misuse have not been verified.

A Snap spokesperson told me that “any perception that employees might be spying on our community is highly troubling and wholly inaccurate. Protecting privacy is paramount at Snap. We keep very little user data, and we have robust policies and controls to limit internal access to the data we do have, including data within tools designed to support law enforcement. Unauthorized access of any kind is a clear violation of the company’s standards of business conduct and, if detected, results in immediate termination.”

Ironically, it is this limited user data that is central to the Sunday Times investigation. The newspaper's investigation has uncovered "thousands of reported cases that have involved Snapchat since 2014," including "pedophiles using the app to elicit indecent images from children and to groom teenagers," as well as "under-18s spreading child pornography themselves." This has now resulted in U.K. police "investigating three cases of child exploitation a day linked to the app, [with] messages that self-destruct allowing groomers to avoid detection."

The Sunday Times quotes Adam Scott Wandt from John Jay College of Criminal Justice in New York calling Snapchat a “haven” for abusers, arguing that the “self-destruct” nature of Snapchat’s messages “makes it difficult for the police to collect evidence.”

Wandt claims that in this way “Snapchat has distinguished itself as the platform where abuse of children happens… The problem was that adults realized you could do a simple Google search and find out that most Snapchat messages are unrecoverable after 24 hours, even by law enforcement with a warrant.”
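
To see why that frustrates evidence collection, consider a toy model of a server that only keeps messages until a time-to-live expires. This is our own simplification for illustration, not Snapchat's actual architecture, and the names and values used are invented.

```python
# Toy model of "self-destructing" messages. Purely illustrative; the TTL is
# shortened to 1 second here to stand in for the roughly 24-hour window
# described above.
import time

class EphemeralStore:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.messages = {}  # message_id -> (sent_at, content)

    def send(self, message_id, content):
        self.messages[message_id] = (time.time(), content)

    def purge_expired(self):
        now = time.time()
        self.messages = {m: (t, c) for m, (t, c) in self.messages.items()
                         if now - t < self.ttl}

    def fetch_for_warrant(self, message_id):
        # A lawful request can only return what the server still holds.
        self.purge_expired()
        return self.messages.get(message_id)

store = EphemeralStore(ttl_seconds=1)
store.send("m1", "groomer's message")
time.sleep(2)                           # the retention window passes
print(store.fetch_for_warrant("m1"))    # -> None: nothing left to recover
```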

The U.K. children’s charity, the NSPCC, rates Snapchat as a high risk, with a spokesperson for the charity explaining that predators intent on grooming children “cast the net wide in the expectation that a small number of children will respond.”

The charity has also warned on self-generated images taken and shared by children themselves. “As soon as that image is shared or screenshotted, the child loses control over it… those images may start on a site like Snapchat, but they could very easily end up circulating among technologically sophisticated offenders, making their way onto the dark web.”

Snap told me that “we care deeply about protecting our community and are sickened by any behavior which involves the abuse of a minor. We work hard to detect, prevent and stop abuse on our platform and encourage everyone – young people, parents and caregivers – to have open conversations about what they’re doing online. We will continue to proactively work with governments, law enforcement and other safety organizations to ensure that Snapchat continues to be a positive and safe environment.”

A similar investigation in March focused on Instagram, with the NSPCC claiming that Facebook’s photo-sharing app has become the leading platform for child grooming in the country. During an 18-month period to September last year, there were more than 5,000 recorded crimes “of sexual communication with a child,” and “a 200% rise in recorded instances in the use of Instagram to target and abuse children.” The charity’s CEO described the figures as “overwhelming evidence that keeping children safe cannot be left to social networks. We cannot wait for the next tragedy before tech companies are made to act.”

This latest investigation makes the same point and comes a little over a month after the U.K. Government published proposals for “tough new measures to ensure the U.K. is the safest place in the world to be online,” claiming these to be the world’s “first online safety laws.” The proposals include an independent regulator with the “powers to take effective enforcement action against companies that have breached their statutory duty of care.” Such enforcement will include “substantial fines” as well as, potentially, the powers “to disrupt the business activities of a non-compliant company… to impose liability on individual members of senior management… and to block non-compliant services.”

The regulation of social media has been in and out of the headlines for most of this year. The prevalence of social media use by under-age children, and the risky interactions those children expose themselves to, has been one of the most disturbing aspects disclosed thus far. Regulation is coming. But the open question is how the platforms can prevent users from deliberately circumventing their security controls with little understanding of the risks they might then face.



Zak Doffman

I am the Founder/CEO of Digital Barriers—developing advanced surveillance solutions for defence, national security and counter-terrorism. I write about the intersection of geopolitics and cybersecurity, and analyze breaking security and surveillance stories. Contact me at zakd@me.com .


RETRIEVED https://www.forbes.com/sites/zakdoffman/2019/05/26/snapchats-self-destructing-messages-have-created-a-haven-for-child-abuse/#595142c2399a

4 WAYS PEDOPHILES EXPLOIT INSTAGRAM TO GROOM KIDS

April 19, 2019 / Chris McKenna

Pedophiles trade Child Porn through Dropbox Links on Instagram

The Atlantic first reported that teenagers stumbled upon a network of Instagram accounts that were sharing Dropbox links to child porn (Atlantic article). The way it worked is that pedophiles were using certain hashtags on images that advertised how to get in touch. Teens discovered this and proceeded to spam the offending hashtags with hundreds of memes, making it difficult for pedophiles to find each other and trade illegal content.

Brilliant. Kids defending other kids! 

And, although it was an admirable diversion, unfortunately these criminals are resourceful. And, with over a billion monthly users, it’s impossible for Instagram to keep pace with nefarious activity.

Maybe your kid already uses Instagram. Great! I’m not saying you need to rip it away. In fact, that is often counterproductive. Instead, we hope this post will help you better understand that the way the app is designed creates risks.

Because remember, not all kids using Instagram end up being groomed and abused.

But, if grooming and child exploitation are easy on the app, my guess is you would want to know. Even CNN recently reported that Instagram is the #1 app for child grooming.

If your son or daughter receives a private DM (direct message) from a stranger, does he/she know how to respond? It's easier to do than you think. Remember, wherever the kids are is where the predators are.


We simply want this post to flash a light in dark places. Since Apple's App Store Description doesn't say anything about predatory activity, it's our job to tell the truth.

Warning: Some of the screenshots you will see in this post are not safe for work (NSFW) and include some of the most disturbing content we've ever encountered during over four years of researching social media. Nothing has been censored.

Four Grooming Paths on Instagram – Comments, Hashtags, Likes, and DMs

If Instagram leadership reads this post, they’ll try really hard to point to their community guidelines and their reporting channels, saying that they don’t allow predatory activity. But we would argue that the very way in which Instagram is designed creates grooming pathways. In other words – no amount of moderation or guidelines can change Instagram’s features. Allow us to explain.

Oh, and one more thing. Many parents who read this might think, “my child has a private account, so they’re fine.” That’s a common, but incorrect conclusion. None of the four feature issues we discuss below are impacted in any way by the privacy of an account. Anyone, whether private or not, can post comments and search hashtags, and anyone can be seen through the like count and sent a message via DM.

Pedophiles exploit Instagram’s comments to network with each other and fish for victims.


Within the comments, pedophiles find other pedophiles and peddle their illegal and disgusting content with each other. Here are a few samples from an endless number of comments (warning – these comments are extremely disturbing).

Pedophiles Exploit Instagram - Comments

You also see comments that go directly at young people as a form of “fishing” for victims, waiting for a kid to bite.

Pedophiles Exploit Instagram - Comments

Pedophiles exploit Instagram’s hashtags to drop horrible content into good, clean places.


Almost all social media platforms use #hashtags. Think of them as a card catalogue for social media content – a way to categorize millions and millions of images into groups so that I can find exactly what I'm looking for. We love them! Some people use them as a sort of witty second language.

But the problem is that they can be used by anyone.
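
To picture why, think of that card catalogue as a simple mapping from each hashtag to every post that used it, with no gatekeeper on who may add entries. The sketch below is our own simplification, not Instagram's actual system, and the account names are invented.

```python
# Toy hashtag index: each tag maps to the posts that used it.
# Conceptual only; account names and posts are made up.
from collections import defaultdict

index = defaultdict(list)  # hashtag -> list of (author, post)

def publish(author, post, hashtags):
    # Nothing here checks who the author is or what the post contains.
    for tag in hashtags:
        index[tag].append((author, post))

publish("teen_girl_14", "Beach day with friends!", ["#teen", "#teenmodel", "#snapchat"])
publish("stranger_account", "Add me on Snapchat...", ["#teenmodel", "#snapchat"])

# Anyone browsing #teenmodel sees both posts side by side.
for author, post in index["#teenmodel"]:
    print(author, ":", post)
```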

Let’s say for a minute that I’m a teen girl who’s interested in modeling. Or cheerleading. And my mom even made me have a private Instagram account (good job, mom!).

I take a photo at the beach with my friends, and I attach the hashtags #teen #teengirl #teenmodel #snapchat. Fabulous. Later on, with my girlfriends, I’m thumbing through the #teenmodel and #snapchat hashtags, and I see this:

Instagram Hashtags Grooming

See, any predator can attach #teenmodel and #snapchat to their photo. This allows that photo to show up in front of millions of teen girls thumbing through #snapchat photos, as the predator hopes one will "bite."

Notice in one photo how part of the "sell" is to convince a girl to join him on Snapchat, which is a very secure environment for secretive activity. After all, >75% of teens have Instagram and >76% (AP Article) of teens have Snapchat, so there's a good chance that if a kid has one, they probably have the other.

In other words, #hashtags allow predators to hover over good places like a drone and drop their smut whenever they want. Pay attention to those screenshots – there's nothing pornographic about them. There are no swear words. No use of "sex." But the very nature of #hashtags as a feature creates this grooming path.

Instagram Hashtags - Grooming

And if someone reports the “daddy” posts you see above and Instagram takes them down, no problem. Since Instagram doesn’t require any identity verification, including birthday, real email, credit card, NOTHING, a predator can create another fake account in seconds. This is yet another huge design flaw that creates a situation where pedophiles don’t mind taking great risks and getting shut down – their attitude is, “I’ll just start over.”

[Note: we experienced this with “daddy,” who we reported multiple times. His account would be shut down, and then he popped up with a slightly different username seconds later, posting the same horrifying images of him masturbating and asking kids to connect with him “live.”]

Related post: We Tested Instagram’s “No Nudity” Rule. We Can’t Show You the Results

Predators exploit Instagram’s likes (the heart) to identify potential victims.


Going back to our #teenmodel example, if you click on one photo, you might find that it has hundreds of likes (hearts) similar to the photo of the young boy below (sorry, but if you don’t want your photo in blog posts, then keep your account private).

Predators can click on the likes and see everyone who has liked this photo. Everyone. Even if they have a private account. From that list, a predator can identify someone young who looks interesting and send him/her a direct message (DM) – we’ll explain the whole DM feature in more detail next. But, note how the “likes” feature creates a target audience for sexual predators. This is shown in the image below.

Pedophiles Exploit Instagram - Likes

Again, it's a design flaw. The very nature of the likes feature creates a pool of young people for predators to target (to Instagram's credit, they are considering dropping the "like" count attached to photos, but so far this remains speculation).

Which leads us to DMs. Direct Messages.

Pedophiles exploit Instagram DMs (direct messages) to groom kids. And they’re doing it very successfully.


Two weeks ago, PYE created a test Instagram account. This account was clearly for a young girl, who posted two selfies on the first day of its existence. Tagged on these photos were the hashtags #teen, #teengirl, #teenmodel. This account went out and "liked" a few photos with similar hashtags and followed accounts similar to ours.

Not much happened for the first six days of the account.

Then, one week later, something in Instagram’s algorithm triggered. It was as if some combination of the test account’s activity unleashed a tsunami of DM activity that hasn’t let up over the past four days, averaging over 10 DMs per day. The screenshots below show some of the activity, including a very creative porn link. Note – PYE is the one who scribbled out the man masturbating in the image below. The photo was sent to our test account as a DM, completely exposed.

Can Instagram Fix their Predator Problem?

Maybe. In order to clean up the issues above, Instagram would have to significantly alter numerous core features. If Instagram were to create a "Safe Mode," it might have to:

  1. Remove the ability to DM to or with anyone who isn’t an approved follower.
  2. Allow parents to create a whitelisted set of contacts. That means the child can ONLY like, comment, and DM with people who are on the whitelist (see the sketch after this list).
  3. Remove the ability to add hashtags.
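
The whitelist idea in point 2 boils down to a simple membership check before any interaction is delivered. The sketch below is hypothetical: no such "Safe Mode" exists on Instagram today, and the account names are made up.

```python
# Hypothetical "Safe Mode" contact whitelist check. Not an existing
# Instagram feature; purely a sketch of the idea in point 2 above.
APPROVED_CONTACTS = {"grandma_sue", "best_friend_emma", "cousin_jake"}

def allow_interaction(sender, interaction_type):
    """Allow likes, comments, or DMs only from parent-approved accounts."""
    if sender in APPROVED_CONTACTS:
        return True
    print(f"Blocked {interaction_type} from unapproved account: {sender}")
    return False

allow_interaction("best_friend_emma", "DM")   # allowed
allow_interaction("new_stranger_01", "DM")    # blocked
```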

I just don’t foresee Instagram making those changes.

What Can Parents do About the Instagram Pedophile Problem?

1. If your kid uses social media, including Instagram, be curious and involved. Remember, not every kid misuses these platforms. But, if you know the risks, then get involved and talk openly with your children about how they’re using the app.

2. Use monitoring tools like Bark (7-days free!) and Covenant Eyes (30-days free!) to monitor their smartphone social media and texting activity. Bark actually monitors images within the app for appropriateness and alerts parents when kids venture into inappropriate images.


3. Talk to your kids specifically about direct messages and give them guidance for what to do if someone tricky reaches out to them.

4. Visit our FixAppRatings.com campaign and push for change. Let's embrace the reality that, given Instagram's current feature set, it's a 17+ app. It's an app created by adults and for adults. Will you visit your state representative this month to share your concerns? Show him/her our draft #fixappratings resolution.

The only way anything will change with big tech companies is if the government does something. We’re convinced of it.


Parents, we love BARK and how it helps parents AND kids. Here’s a real story…

“We knew our son was having some issues with school and in his social circle but he doesn’t talk to us about anything…he googled “What is it called when there’s a war going on inside your brain?”…The fact that he used the word “war” prompted BARK to mark it as violence…Call it depression or anxiety or regular mood swings teens experience, he wasn’t opening up to anyone about this and never mentioned it…I have a psych evaluation setup for him in a few days and I just have to say how grateful I am that BARK caught this. I would otherwise have no idea that this was even an issue for him and we can now get some professional help to ensure that it doesn’t become a true problem.”


Parents, do you want a better idea of what your kids are doing on social media? What about the comments on your daughter’s Instagram photos? Or, iMessage activity on your son’s iPhone? Then, look no further than Bark. You can start a 7-day free trial today.


*Note – links in this post might connect to affiliates who we know and trust. We might earn a small commission if you decide to purchase their services. This costs you nothing! We only recommend what we’ve tested on our own families. Enjoy!


Chris McKenna

I love life. Seriously! Each. Day. A. Gift. Former CPA, business advisor, youth pastor, development director. Manage marketing efforts for Covenant Eyes and CEO of PYE. God shares wild ideas with me about life while I run. I have a relentless drive to help families use technology well.


RETRIEVED https://protectyoungeyes.com/4-ways-pedophiles-exploit-instagram-groom-kids/