Nextdoor, the neighborhood app intended to operate like a coffee shop bulletin board, has fostered a community that can occasionally be less than neighborly.
During last summer's nationwide George Floyd protests, user complaints began to surface that content moderators were deleting posts discussing racial injustice and support for the Black Lives Matter movement.
In response, Nextdoor improved its AI systems to identify racist content, removed certain features, and offered new unconscious bias training for leads, the unpaid content moderators who live in the communities registered on the app. Despite these efforts, the surge in daily active users spurred by the COVID-19 pandemic has forced Nextdoor to contend with the racism, discrimination, and misinformation that often riddle its platform.
With vaccination rollouts in full swing, many users rely on Nextdoor for information on scheduling appointments and other pandemic-related matters. Yet many worry that the app is not equipped to handle these issues at the hyperlocal level. Today, we discuss the changes Nextdoor has made to address these problems and where they still persist.
GUESTS:
Arielle Pardes, senior writer at WIRED; she tweets @pardesoteric
Will Payne, assistant professor in geographic information science at Rutgers University who researches spatial data and urban inequality; he tweets @willbpayne
Ralinda Harvey Smith, freelance writer based in Santa Monica and Nextdoor user; last summer she wrote the LA Times Op-Ed “I’m the Black person Nextdoor, trying to sort the site’s value from its ugliness”; she tweets @ralinda