Can Google be family friendly? That’s a value judgment Google can’t even seem to agree on within its own ranks. In 2019, a Google vice president had to address employee concerns over the use of the phrase “family friendly” when discussing a product for children at a company-wide meeting. Said one employee who stormed out in disgust, using the word “family” to mean “household with children” was “offensive, inappropriate, homophobic, and wrong.” Okay then.
With that as the backdrop, it’s unsurprising that Google and its host of platforms, which include YouTube, aren’t necessarily aligned with the priorities of most families, particularly parents: keeping kids safe online, maintaining their privacy, and going to lengths not to expose them to smutty content.
Google, like most Big Tech platforms, views children first as future clients, and only second as a class to be protected from harm. And this approach informs how it tailors its products toward kids. More than half the nation’s public schools use free Google education apps like Gmail and Google Docs. More than 25 million students and teachers also use Chromebooks.
What you see as tech-centered learning, Google sees as a profit behemoth. While federal law makes it illegal for corporations to track children under 13 without parental consent, Google has taken advantage of a loophole in the law that allows the school system to consent on behalf of the parent.
And, according to the state of New Mexico, which sued Google for violating child privacy, Google has a feature that syncs its Chrome browser with other devices used by a student on that same account — effectively blending a student’s school and personal web activities into a single pool of data for Google to harvest. And that is exactly what they’ve been busted doing. In 2017, the company admitted they track students in schools, including “scanning and indexing” student email messages sent on Google’s platforms and tracking the activities of student users outside of Google’s education suite.
Google’s commoditization of children for cash is hardly its only problem. Google’s platforms, particularly YouTube, are consistently on the hook for mainlining disturbing, sexualized content to kids.
In 2017, YouTube updated its policies to address “ElsaGate,” where creators were taking some of the most popular children’s characters and drawing them in bizarre and inappropriate situations — and YouTube’s algorithm was recommending them to kids. That same year, YouTube closed comment sections on videos with children where pedophiles were engaging in predatory behavior. In years past, Google has had to change its search algorithm to prevent exploitative content from appearing in searches on both Google and YouTube.
At the heart of this problem is YouTube’s powerful recommendation algorithm, which feeds users more of what they want. YouTube is estimated to be second only to its parent platform, Google, in web traffic, and 70 percent of what users watch is fed to them through recommendations. YouTube, rather than address the exploitative and addictive nature of its algorithm, is reportedly trying to capitalize on it — that is, make it even more addictive.
But this shouldn’t come as a surprise. Google is not a family-friendly company first; it is a for-profit company first. And as its platforms grow in ubiquity and access to kids, that presents a danger to America’s children that demands a public policy response.