In February, The Seattle Times described the sex trafficking of an 11-year-old and a 15-year-old, lured from Oregon to Washington by people they met on social media. We were struck by the story’s eerie familiarity. The details mirror the experience of a child we recently cared for in the emergency room – a girl who was trafficked via Instagram.
As pediatricians, we find that certain moments crystallize in our memories. In an emergency room full of worried parents, she was escorted only by the police. The girl sat quietly, expressing sadness and regret for her own actions. Never once did she blame the people who had sold her body or the apps that allowed them to do so.
While shocking, her story is far from unique. According to one 2021 study, nearly two out of every 100 children will be commercially exploited online; 15 out of 100 will suffer some form of online sexual abuse. And sexual exploitation is just one of the many ways that poorly regulated technology harms children. We have witnessed, time and time again, the toll it takes on our patients’ mental, physical and emotional well-being. Without accountability from tech companies, this will not stop.
Why are children at risk?
Children, like many adults, now spend a significant part of their lives online. Unlike adults, children often cannot distinguish safe from unsafe digital content.
Research shows that the human brain does not fully mature until about age 25. A less developed prefrontal cortex means that children and adolescents struggle to control impulses and are more likely to engage in risky behavior, such as meeting strangers or sharing personal information online.
Despite the risks, social media can also strengthen communities; for example, one of the girls described above was rescued by the police only after someone located her through Snapchat. The solution isn’t as simple as banning kids from all platforms outright. But we need to be proactive about creating a safe online environment.
While the American Academy of Pediatrics provides useful tools for families to limit media consumption, we believe it is the responsibility of technology companies to make their platforms safer for children. Like childproof caps on pill bottles or seat belts in cars, social media safety features can and should be part of product design.
There is significant momentum in Congress to encourage these practices and hold tech companies and data brokers accountable. Three active pieces of legislation have recently been discussed in Congress: the American Privacy Rights Act, the Children’s Online Privacy Protection Act and the Kids Online Safety Act. KOSA, in particular, calls on platforms to take reasonable steps in the design and operation of their products to prevent and mitigate sexual exploitation.
Some tech companies are taking voluntary steps to make their products safer for children, but bills establishing mandatory safety standards are sure to face hurdles as they move through Congress. The industry spent $68 million on lobbying in 2023 alone. And our state’s identity is more tied to technology than most: the sector makes up more than 20% of Washington’s economy, the highest share of any state.
What now?
With the general election in November approaching, Washingtonians should call on the Legislature not only to support but to lead efforts to keep children safe online. Washington recently joined at least 10 other states in making AI-generated child sexual abuse material illegal. The law takes effect this year, but not before many children were exploited by this practice.
Contact your representatives to voice support for KOSA and other bills. Share stories about social media’s impact on your children. Ask your pediatrician about safe strategies for media consumption.
Existing regulations were not strong enough to protect the girls described in these stories. If we don’t act now, more children will be put at risk.