
TikTok brings in outside experts to help it craft moderation and content policies

In October, TikTok tapped corporate law firm K&L Gates (https://techcrunch.com/2019/10/15/tiktok-taps-corporate-law-firm-kl-gates-to-advise-on-its-u-s-content-moderation-policies/) to advise the company on its moderation policies and other topics afflicting social media platforms. As part of those efforts, TikTok said it would form a new committee of experts to advise the business on topics like child safety, hate speech, misinformation, bullying and other potential problems. Today, TikTok is announcing the technology and safety experts who will be the company’s first committee members.
The committee, known as the TikTok Content Advisory Council, will be chaired by Dawn Nunziato, a professor at George Washington University Law School and co-director of the Global Internet Freedom Project. Nunziato specializes in free speech issues and content regulation — areas where TikTok has fallen short.
“A company willing to open its doors to outside experts to help shape upcoming policy shows organizational maturity and humility,” Nunziato said of her decision to join. “I am working with TikTok because they’ve shown that they take content moderation seriously, are open to feedback and understand the importance of this area both for their community and for the future of healthy public discourse,” she added.
TikTok says it plans to grow the committee to around a dozen experts in time.
According to the company, other committee members include:
Rob Atkinson, Information Technology and Innovation Foundation, brings academic, private sector and government experience, as well as knowledge of technology policy that can inform our approach to innovation
Hany Farid, University of California, Berkeley Electrical Engineering & Computer Sciences and School of Information, is a renowned expert on digital image and video forensics, computer vision, deep fakes, and robust hashing
Mary Anne Franks, University of Miami Law School, focuses on the intersection of law and technology and will provide valuable insight into industry challenges including discrimination, safety, and online identity
Vicki Harrison, Stanford Psychiatry Center for Youth Mental Health and Wellbeing, is a social worker at the intersection of social media and mental health who understands child safety issues and holistic youth needs
Dawn Nunziato, chair, George Washington University Law School, is an internationally recognized expert in free speech and content regulation
David Ryan Polgar, All Tech Is Human, is a leading voice in tech ethics, digital citizenship, and navigating the complex challenge of aligning societal interests with technological priorities
Dan Schnur, USC Annenberg Center on Communication and UC Berkeley Institute of Governmental Studies, brings valuable experience and insight on political communications and voter information
Nunziato’s view of TikTok — of a company being open and willing to change — is a charitable one, it should be said.
The company is in dangerous territory here in the U.S., despite its popularity among Gen Z and millennial users. TikTok today is facing a national security review and a potential ban on all government workers’ phones. In addition, the Dept. of Defense suggested the app should be blocked on phones belonging to U.S. military personnel. Its 2017 acquisition of U.S.-based Musical.ly may even come under review.
Though known for its lighthearted content — like short videos of dances, comedy and other creative endeavors — TikTok has also been accused of censoring content related to the Hong Kong protests, among other things, which contributed to U.S. lawmakers’ fears that the Chinese-owned company may have to comply with “state intelligence work.”
TikTok has also been accused of having censored content from unattractive, poor or disabled persons, as well as videos from users identified as LGBTQ+. The company explained in December that these guidelines are no longer used, calling them an early and misguided attempt to protect users from online bullying: TikTok had limited the reach of videos where such harassment could occur. But this suppression was done in the dark, unasked for by the “protected” parties, and it wasn’t until German site NetzPolitik exposed the rules that anyone knew they had existed.
In light of the increased scrutiny of its platform and its ties to China, TikTok has been taking a number of steps in an attempt to change its perception. The company released new Community Guidelines and published its first Transparency Report a few months ago. It also hired a global General Counsel and expanded its Trust & Safety hubs in the U.S., Ireland and Singapore. And it just announced a Transparency Center open to outside experts who want to review its moderation practices.
TikTok’s new Content Advisory Council will begin meeting with the company’s U.S. leadership at the end of the month, with an early focus on creating policies around misinformation and election interference.

“All of our actions, including the creation of this Council, help advance our focus on creating an entertaining, genuine experience for our community by staying true to why users uniquely love the TikTok platform. As our company grows, we are focused on reflection and learning as a part of company culture and committed to transparently sharing our progress with our users and stakeholders,” said TikTok’s U.S. general manager, Vanessa Pappas. “Our hope is that through thought-provoking conversations and candid feedback, we will find productive ways to support platform integrity, counter potential misuse, and protect the interests of all those who use our platform,” she added. 