Hired Moderators Keep the Peace and Encourage Safe Play in Kids’ Virtual Spaces
3.2.11 | With the influx of social networking sites and virtual worlds aimed at kids, a new industry of online moderators has developed to keep young users happy and safe.
The New York Times recently looked at the role of Metaverse Mod Squad, a California-based company founded in 2007 by Amy Pritchard, a savvy Second Life user and lawyer by training. Employees work mainly from home as moderators on a variety of client sites, using an approach that is “more camp counselor than cop.”
“Anybody can buy a profanity filter, but kids have all kinds of work-arounds,” Anne Collier, co-director of connectsafely.org, told the NYT. “There really is no substitute for human moderation.”
In addition to focusing on behavior issues and safety, the moderators sometimes serve as the kids’ proxy, intervening, for example, when the owner of Webosaurs, an online virtual world for children ages 5 to 12, wanted to make certain costumes available only to paying members. The moderators recommended against the idea, noting that it might provoke jealousy and drive free users away from Webosaurs.
The NYT’s Christine Larson writes:
Metaverse has a client list that includes the Cartoon Network, the National Football League, Nickelodeon and the State Department. It employs an army of workers — often stay-at-home moms — to monitor and moderate Web sites where children create their own characters, or avatars, and can interact with thousands of other users. Metaverse’s employees frequently create their own avatars to help maintain the peace.
Ms. Pritchard says the stakes are higher in online worlds intended for children, like Webosaurs. In more adult-oriented sites like Second Life, users must be at least 16 and are presumably more equipped to deal with the threats of online interaction.
She has found that keeping children safe has a lot to do with keeping them entertained. “If you just release kids into these online playgrounds with no one to monitor them and no rules, it’s ‘Lord of the Flies,’ ” she says. “But if you can balance safety with fun and engage the kids, I guarantee you’ll have a site with a great group of kids and no cyberbullying.”
In three and a half years, Metaverse has grown from a whimsical idea hatched in a Second Life virtual bar into an agency that has been profitable since 2009 and had revenue “in the millions” last year, she says, declining to be more specific. The company is private.
And over at CNET.com, Collier’s co-director at ConnectSafely, Larry Magid, is advocating for taking the “cyber” out of “cyberbullying.” Bullying is “not about technology,” he argues, “it’s about the way that people behave toward one another.”
“I don’t think we need special cyberbullying laws to protect people from threats that have long been illegal,” continues Magid. “In addition to the law, schools have the right to intervene if off-campus behavior affects life at school.”