Apps like Signal are proving invaluable in these days of unrest, and anything we can do to simplify and secure the way we share sensitive information is welcome. To that end Signal has added the ability to blur faces in photos sent via the app, making it easy to protect someone’s identity without leaving any trace on other, less secure apps.
After noting Signal’s support of the protests against police brutality taking place all over the world right now, the company’s founder Moxie Marlinspike writes in a blog post that “We’ve also been working to figure out additional ways we can support everyone in the street right now. One immediate thing seems clear: 2020 is a pretty good year to cover your face.”
Fortunately there are perfectly good tools out there both to find faces in photographs and to blur imagery (presumably irreversibly, given Signal’s past attention to detail in these matters, but the company has not returned a request for comment). Put them together and boom, a new feature that lets you blur all the faces in a photo with a single tap.
Image Credits: Signal
This is helpful for the many users of Signal who use it to send sensitive information, including photos where someone might rather not be identifiable. Normally one would blur the face in another photo editor app, which is simple enough but not necessarily secure. Some editing apps, for instance, host computation-intensive processes on cloud infrastructure and may retain a copy of a photo being edited there — and who knows what their privacy or law enforcement policy may be?
If it’s sensitive at all, it’s better to keep everything on your phone and in apps you trust. And Signal is among the few apps trusted by the justifiably paranoid.
All face detection and blurring takes place on your phone, Marlinspike wrote. But he warned that the face detection isn’t 100% reliable, so be ready to manually draw or expand blur regions in case someone isn’t detected.
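To give a rough sense of how a “find the faces, then blur them” pipeline like the one described above can be assembled, here is a minimal sketch in Python using OpenCV’s bundled Haar cascade face detector and a pixelation pass. This is not Signal’s code; the detector choice, file names and block-size parameter are illustrative assumptions. It does show why the combination is straightforward to ship, and why pixelation (downscaling a region and scaling it back up) discards the original pixels rather than merely hiding them.

```python
# Hypothetical sketch, not Signal's implementation: detect faces on-device,
# then pixelate each detected region so the original pixels are discarded.
import cv2


def blur_faces(input_path: str, output_path: str, block: int = 16) -> int:
    """Detect faces in an image and pixelate each detected region.

    Returns the number of faces blurred. Detection is not 100% reliable,
    so a real tool would also let the user blur extra regions by hand.
    """
    image = cv2.imread(input_path)
    if image is None:
        raise FileNotFoundError(input_path)

    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    for (x, y, w, h) in faces:
        roi = image[y:y + h, x:x + w]
        # Pixelate: shrink the region to a coarse grid, then scale it back up.
        small = cv2.resize(roi, (block, block), interpolation=cv2.INTER_LINEAR)
        image[y:y + h, x:x + w] = cv2.resize(
            small, (w, h), interpolation=cv2.INTER_NEAREST
        )

    cv2.imwrite(output_path, image)
    return len(faces)


if __name__ == "__main__":
    count = blur_faces("photo.jpg", "photo_blurred.jpg")
    print(f"Blurred {count} face(s)")
```

As Marlinspike notes, any detector will miss some faces, which is why the in-app tool also lets you draw or expand blur regions manually.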
The new feature should appear in the latest versions of the app as soon as those are approved by Google and Apple.
Lastly, Marlinspike wrote that the company is planning on “distributing versatile face coverings to the community free of charge.” The picture shows a neck gaiter like those sold for warmth and face protection. Something to look forward to, then.
TikTok brings in outside experts to help it craft moderation and content policies
In October, TikTok tapped corporate law firm K&L Gates to advise the company on its moderation policies and other topics afflicting social media platforms. As a part of those efforts, TikTok said it would form a new committee of experts to advise the business on topics like child safety, hate speech, misinformation, bullying and other potential problems. Today, TikTok is announcing the technology and safety experts who will be the company’s first committee members.
The committee, known as the TikTok Content Advisory Council, will be chaired by Dawn Nunziato, a professor at George Washington University Law School and co-director of the Global Internet Freedom Project. Nunziato specializes in free speech issues and content regulation — areas where TikTok has fallen short.
“A company willing to open its doors to outside experts to help shape upcoming policy shows organizational maturity and humility,” said Nunziato, of her joining. “I am working with TikTok because they’ve shown that they take content moderation seriously, are open to feedback and understand the importance of this area both for their community and for the future of healthy public discourse,” she added.
TikTok says it plans to grow the committee to around a dozen experts in time.
According to the company, other committee members include:
Rob Atkinson, Information Technology and Innovation Foundation, brings academic, private sector, and government experience as well as knowledge of technology policy that can advise our approach to innovation
Hany Farid, University of California, Berkeley Electrical Engineering & Computer Sciences and School of Information, is a renowned expert on digital image and video forensics, computer vision, deep fakes, and robust hashing
Mary Anne Franks, University of Miami Law School, focuses on the intersection of law and technology and will provide valuable insight into industry challenges including discrimination, safety, and online identity
Vicki Harrison, Stanford Psychiatry Center for Youth Mental Health and Wellbeing, is a social worker at the intersection of social media and mental health who understands child safety issues and holistic youth needs
Dawn Nunziato, chair, George Washington University Law School, is an internationally recognized expert in free speech and content regulation
David Ryan Polgar, All Tech Is Human, is a leading voice in tech ethics, digital citizenship, and navigating the complex challenge of aligning societal interests with technological priorities
Dan Schnur, USC Annenberg Center on Communication and UC Berkeley Institute of Governmental Studies, brings valuable experience and insight on political communications and voter information
Nunziato’s view of TikTok — of a company being open and willing to change — is a charitable one, it should be said.
The company is in dangerous territory here in the U.S., despite its popularity among Gen Z and millennial users. TikTok today is facing a national security review and a potential ban on all government workers’ phones. In addition, the Dept. of Defense suggested the app should be blocked on phones belonging to U.S. military personnel. Its 2017 acquisition of U.S.-based Musical.ly may even come under review.
Though known for its lighthearted content — like short videos of dances, comedy and various other creative endeavors — TikTok has also been accused of things like censoring the Hong Kong protests and more, which contributed to U.S. lawmakers’ fears that the Chinese-owned company may have to comply with “state intelligence work.”
TikTok has also been accused of having censored content from unattractive, poor or disabled persons, as well as videos from users identified as LGBTQ+. The company explained in December these guidelines are no longer used, as they were an early and misguided attempt to protect users from online bullying. TikTok had limited the reach of videos where such harassment could occur. But this suppression was done in the dark, unasked for by the “protected” parties — and it wasn’t until exposed by German site NetzPolitik that anyone knew these rules had existed.
In light of the increased scrutiny of its platform and its ties to China, TikTok has been taking a number of steps in an attempt to change its perception. The company released new Community Guidelines and published its first Transparency Report a few months ago. It also hired a global General Counsel and expanded its Trust & Safety hubs in the U.S., Ireland and Singapore. And it just announced a Transparency Center open to outside experts who want to review its moderation practices.
TikTok’s new Advisory Council will begin meeting with the company’s U.S. leadership at the end of the month, with an early focus on creating policies around misinformation and election interference.
“All of our actions, including the creation of this Council, help advance our focus on creating an entertaining, genuine experience for our community by staying true to why users uniquely love the TikTok platform. As our company grows, we are focused on reflection and learning as a part of company culture and committed to transparently sharing our progress with our users and stakeholders,” said TikTok’s U.S. general manager, Vanessa Pappas. “Our hope is that through thought-provoking conversations and candid feedback, we will find productive ways to support platform integrity, counter potential misuse, and protect the interests of all those who use our platform,” she added.
We need startups to build democracy tech
It’s time to actually make the world a better place.
Silicon Valley was birthed from an existential threat to the world. Nazi radar defense technology was decimating the Allied air forces. But American engineers heeded the call and, in a Harvard lab led by Stanford professor Frederick Terman, invented radar jammers that helped win the war.
Terman brought the engineering talent back to…
Copy and paste trick could unlock iOS 10 devices in Lost Mode
Lost and stolen iOS devices could be at risk if ne’er-do-wells learn of this blunt-force method of getting past Activation Lock. No special equipment or technical know-how is required, which means any geek off the streets can do it. Fortunately, it’s easily fixed — but until that happens, you might want to be a little extra careful about leaving your phone unattended.