
Optimize algorithms to support kids online, not exploit them


By Joi Ito

This column is the second in a series about young people and screens. Read the first post, about connected parenting, here.

When I was in high school, I emailed the authors of the textbooks we used so I could better question my teachers; I spent endless hours chatting with the sysadmins of university computer systems about networks; and I started threads online for many of my classes where we had much more robust conversations than in the classroom. The first conferences I attended as a teenager were gatherings of mostly adult communities of online networkers who eventually became my mentors and colleagues.

I cannot imagine how I would have learned what I have learned or met the many, many people who’ve enriched my life and work without the internet. So I know first-hand how, today, the internet, online games, and a variety of emerging technologies can significantly benefit children and their experiences.

That said, I also know that, in general, the internet has become a more menacing place than when I was in school. To take just one example, parents and other industry observers share a growing concern about the content that YouTube serves up to young people. A Sesame Street sing-along with Elmo leads to one of those weird color ball videos, which leads to a string of clips that keeps them glued to screens, with increasingly strange but engaging content of questionable social or educational value, interspersed with stuff that looks like content but might be some sort of sponsored content for Play-Doh. The rise of commercial content for young people is exemplified by the YouTube kidfluencer industry, which markets itself as a tool that gives brands using YouTube “an added layer of kid safety,” and whose rampant marketing has many parents up in arms.

In response, Senator Ed Markey, a longtime proponent of children’s online privacy protections, is cosponsoring a new bill to expand the Children’s Online Privacy Protection Act (COPPA). It would, among other things, extend protections from children age 12 and under to those 15 and under, and ban online marketing videos targeted at them. The hope is that this will compel sites like YouTube and Facebook to manage their algorithms so that they do not serve up endless streams of content promoting commercial products to children. It gets a little complicated, though, because in today’s world the kids themselves are brands, and they have product lines of their own. So the line between self-expression and endorsements is very blurry and confounds traditional regulations and delineations.

The proposed bill is well-intentioned and may limit exposure to promotional content, but it may also have unintended consequences. Take the existing version of COPPA, passed in 1998, which introduced a parental-permission requirement for children under 13 to participate in commercial online platforms. Most open platforms responded by excluding those under 13 rather than taking on the onerous parental-permission process and the challenges of serving children. This drove young people’s participation underground on these sites, since they could easily misrepresent their age or use the account of a friend or caregiver. Research and everyday experience indicate that young people under 13 are all over YouTube and Facebook, and busy caregivers, including parents, are often complicit in letting this happen.

That doesn’t mean, of course, that parents aren’t concerned about the time their young people are spending on screens, and Google and Facebook have responded, respectively, with the kid-only “spaces” on YouTube and Messenger.

But these policy and tech solutions ignore the underlying reality that young people crave contact with older kids and grown-up expertise, and that mixed-age interaction is essential to their learning and development.

Not only is banning young people from open platforms an iffy, hard-to-enforce proposition, it’s unclear whether it is even the best thing for them. It’s possible that this new bill could damage the system, as other well-intentioned efforts have in the past. I can’t forget the overly stringent Computer Fraud and Abuse Act. Written a year after the movie WarGames, the law made it a felony to break the terms of service of an online service, so that, say, an investigative journalist couldn’t run a script on Facebook to check whether the algorithm was doing what the company said it was. Regulating these technologies requires an interdisciplinary approach involving legal, policy, social, and technical experts working closely with industry, government, and consumers to get them to work the way we want them to.

Given the complexity of the issue, is the only way to protect young people to exclude them from the grown-up internet? Can algorithms be optimized for learning, high-quality content, and positive intergenerational communication for young people? What gets far less attention than outright restriction is how we might optimize these platforms to provide joy, positive engagement, learning, and healthy communities for young people and families.

Children are exposed to risks at churches, schools, malls, parks, and anywhere adults and children interact. Even when harms and abuses happen, we don’t talk about shutting down parks and churches, and we don’t exclude young people from these intergenerational spaces. We also don’t ask parents to evaluate the risks and give written permission every time their kid walks into an open commercial space like a mall or grocery store. We hold the leadership of these institutions accountable, pushing them to establish positive norms and punish abuse. As a society, we know the benefits of these institutions outweigh the harms.

Based on a massive EU-wide study of children online, communication researcher Sonia Livingstone argues that internet access should be considered a fundamental right of children. She notes that risks and opportunities go hand in hand: “The more often children use the internet, the more digital skills and literacies they generally gain, the more online opportunities they enjoy and—the tricky part for policymakers—the more risks they encounter.” Shutting down children’s access to open online resources often most harms vulnerable young people, such as those with special needs or those lacking financial resources. Consider, for example, the case of a home- and wheelchair-bound child whose parents only discovered his rich online gaming community and empowered online identity after his death. Or Autcraft, a Minecraft server community where young people with autism can foster friendships via a medium that often serves them better than face-to-face interactions.

As I was working on my last column about young people and screen time, I spent some time talking to my sister, Mimi Ito, who directs the Connected Learning Lab at UC Irvine. We discussed how these problems and the negative publicity around screens were causing caregivers to develop unhealthy relationships with their children while trying to regulate their exposure to screens and the content they delivered. The messages caregivers are getting about the need to regulate and monitor screen time are much louder than messages about how they can actively engage with young people’s online interests. Mimi’s recent book, Affinity Online: How Connection and Shared Interest Fuel Learning, features a range of mixed-age online communities that demonstrate how young people can learn from other young people and adult experts online. Often it’s the young people themselves who create the communities, enforce norms, and insist on high-quality content. One of the cases, investigated by Rachel Cody Pfister as part of her PhD work at the University of California, San Diego, is Hogwarts at Ravelry, a community of Harry Potter fans who knit together on Ravelry, an online platform for fiber arts. A 10-year-old girl founded the community, and members ranged from 11 to 70-plus at the time of Rachel’s study.

Hogwarts at Ravelry is just one of a multitude of free and open intergenerational online learning communities of different shapes and sizes. The MIT Media Lab, where I work, is home to Scratch, a project created in the Lifelong Kindergarten group; millions of young people around the world take part in its safe and healthy space for creative coding. Some Reddit groups, like /r/aww for cute animal content or a range of subreddits on Pokémon Go, are lively spaces of intergenerational communication. As with Scratch, these massive communities thrive because of strict content and community guidelines, algorithms optimized to support those norms, and dedicated human moderation.

YouTube is also an excellent source of content for learning and discovering new interests. One now-famous 12-year-old learned to dubstep just by watching YouTube videos, for example. The challenge is squaring the incentives of free-for-all commercial platforms like YouTube with the needs of special populations like young people and of intergenerational sub-communities with specific norms and standards. We need to recognize that young people will make contact with commercial content and grown-ups online, and we need to figure out better ways to regulate and optimize platforms to serve participants of mixed ages. This means bringing young people’s interests, needs, and voices to the table, not shutting them out or making them invisible to online platforms and algorithms.

That is why I’ve issued a call for research papers about algorithmic rights and protections for children, together with my sister and our colleague, the developmental psychologist Candice Odgers. We hope to spark an interdisciplinary discussion of these issues among a wide range of stakeholders to find answers to questions like: How can we create interfaces between civil society and the new algorithmically governed platforms and their designers? How might we nudge YouTube and other platforms to be more like Scratch, designed for the benefit of young people and optimized not for engagement and revenue but instead for learning, exploration, and high-quality content? Can the internet support an ecosystem of platforms tailored to young people and mixed-age communities, where children can safely learn from each other, together with and from adults?

I know how important it is for young people to have connections to a world bigger and more diverse than their own. And I think that developers of these technologies (myself included) have a responsibility to design them based on scientific evidence and the participation of the public. We can’t leave it to commercial entities to develop and guide today’s learning platforms and internet communities—but we can’t shut these platforms down or prevent children from having access to meaningful online relationships and knowledge, either.
