When Jacob Tremblay, a 12-year-old Canadian actor, posted that sharing spoilers about Avengers: Endgame is a form of bullying, a typical Twitter argument ensued. Several users started to debate the merits of Tremblay's point, and anonymous accounts (those without a "real" photo or name) jumped in to troll, mock, and name-call.
Tremblay often tweets about his work as an anti-bullying advocate, but the irony is that his call for kindness led to cruel comments from adults — which is, unfortunately, pretty typical on social media today. And while being bullied by adults is an obvious concern, it’s not the only one parents have to face when their young kids spend time on social media.
On average, children join social media when they're 12 years old, the same age as Tremblay. But virtually every major social media platform (Twitter, Instagram, Facebook, YouTube, TikTok, WhatsApp, Snapchat) is intended for users ages 13 and older; each of these apps' terms of service says so. Still, many parents (like Tremblay's, who monitor his tweets) allow their kids to use social media before this recommended age.
Before we decide whether our kids can start using social media at age 8 or 18, let's dig into why the 13+ rule is in place.
Keeping up with online privacy regulations
With the rise of technology, countries around the world have worked hard to protect the privacy of their young citizens: Canada’s PIPEDA and the E.U.’s GDPR consider the online safety of both kids and adults. And in the U.S., the Children’s Online Privacy Protection Act (COPPA), passed by Congress in 1998, was among the first pieces of privacy legislation designed specifically to address the rise in digital marketing.
According to the Federal Trade Commission (FTC), "the primary goal of COPPA is to place parents in control over what information is collected from their young children online" (specifically, children under the age of 13). In short, it prohibits digital platforms and services from tracking children's online behavior or selling their data to third parties without verifiable parental consent. And if a recent proposal to update the law is approved, it would also ban targeted advertising to children and introduce more safety controls, including an "Eraser Button" that would allow parents to remove their children's data from an app.
Understanding social media risks
In general terms, an app that rates itself as 13+ doesn't have to comply with COPPA. However, the fact that children are using technology at ever-younger ages has created a gray area: how can apps that claim to be intended for older users include features that appeal to young kids? Even though Snapchat, Instagram, and YouTube haven't faced FTC interventions because of the 13+ legalese in their terms of service, recent enforcement actions show that regulators are beginning to address children's use of these platforms.
The FTC recently issued a $5.7 million fine against TikTok for its data collection practices (which resulted in the introduction of a separate app experience for young users, in addition to more robust safety controls); Playdom, a multiplayer game developer, was fined for similar reasons in 2011.
Google is also currently under scrutiny for "tracking and targeting" ads at children, and for violating COPPA in the "Family" section of its Play Store; 22 consumer advocacy groups have accused the tech giant of hosting apps that "engage in bad behavior," such as aggressively monetizing children's games, showing ads for alcohol and graphic images, and transmitting data to third parties.
All of this goes to show that the laws the FTC enforces are both broad and actionable, a positive sign for the many parents who have started to distrust and question some of the biggest names in tech.
Finding safe alternatives
As parents, it’s important to look for apps that are COPPA-compliant and designed with kids in mind. These platforms take personal data — how it’s collected, how it’s distributed, how it’s stored — seriously. They also give parents a higher degree of control when it comes to privacy settings.
Many COPPA-compliant apps have also been approved by kidSAFE, an independent seal-of-approval program for game sites, educational platforms, mobile apps, and other digital services designed with children in mind.
There are three kidSAFE seals of safety to familiarize yourself with:
- kidSAFE Listed: designed and intended for children and families.
- kidSAFE Certified: follows kidSAFE's basic rules for parental controls, safety procedures, and advertising.
- kidSAFE+ COPPA Certified: follows the same rules as above, plus additional COPPA guidelines on data collection and privacy.
However, keeping children safe online goes beyond kidSAFE certification and COPPA compliance. Tech companies need to take real ownership when designing apps and platforms for kids, which includes thinking about risks that aren't yet accounted for in our laws. And as parents, it's important to seek out and support the brands that do make privacy a priority.
Everyone has a role to play when it comes to online safety
In addition to protecting children's privacy, parents also need to consider how social media affects children's mental and physical well-being, and the potential for exposure to online bullying, among other risks. Taking the time to understand an app's terms of service and talking to your kids about their internet activity (even spending time with them online) are the best ways to know that they're using platforms that enhance their education and connection to others.
Keeping children safe online is a responsibility shared by parents, developers, advertisers, and tech leaders. If we all put the safety of our children first, we can continue to let them experience the best of what technology has to offer, while protecting them from the worst of it.