
Partnering To Combat the Spread of Violent Content Online

At Spotify, our Platform Rules strictly prohibit content that promotes terrorism or violent extremism. We take action on content that violates our Platform Rules, and we work closely with third-party experts on extremism to ensure we are making the most informed decisions throughout these processes, taking local, regional, and cultural context into account.

Today, to advance these efforts, we are announcing the addition of Moonshot to our Spotify Safety Advisory Council (SSAC). Moonshot’s work on understanding the motivations for online hate and extremism—and how to reduce the threat of violence both online and offline—has been critical in building safeguards on our platform. 

Growing our Spotify Safety Advisory Council

At Spotify, we design products and develop policies with safety in mind, and our longstanding partnership with Moonshot has provided critical insights to get ahead of emerging online trends.

“Since 2021, we have partnered with Spotify to produce innovative product solutions and meaningful action in response to the threat of online hate and extremism,” says Clark Hogan-Taylor, Director of Tech Partnerships at Moonshot. “From protections against racial violence to fighting organized crime, our team has worked tirelessly to ensure Spotify is a safe platform for all. We are proud to be joining the Safety Advisory Council to further contribute to this important mission.”

Moonshot joins other members of the SSAC, such as the Institute for Strategic Dialogue (ISD), to help Spotify better understand the global landscape of hate and extremism as harmful trends spread across borders and platforms.

These partnerships and ongoing engagement with our SSAC reflect our commitment to constantly evolve our approach as we combine the expertise of our internal safety teams, the scale of our technology, and the insights of our trusted external partners.

Violent extremist content is prohibited on Spotify

Spotify addresses potential violent extremist content through multiple policies, which include, but are not limited to:

  • Our hate policies, which prohibit content that explicitly incites violence or hatred toward people based on protected characteristics, including race, sex, ethnicity, or sexual orientation; and
  • Our dangerous content policies, which clearly outline that content promoting or supporting terrorism or violent extremism is strictly not allowed on the Spotify platform.

We identify potentially violative content by using proactive monitoring methods, leveraging human expertise, and reviewing and responding to user reports. When it comes to enforcement, we may take various actions, including removing content or the creator’s account, reducing distribution, and/or demonetization.
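Spotify doesn’t publish the implementation details of this flow, but as a rough illustration of the detect-route-enforce shape described above, here is a minimal Python sketch. All names, types, and the threshold are hypothetical, not Spotify’s actual system:

```python
from dataclasses import dataclass
from enum import Enum, auto

# Hypothetical labels for how an item came to moderators' attention.
class FlagSource(Enum):
    PROACTIVE_MONITORING = auto()  # automated detection signals
    USER_REPORT = auto()           # reports from listeners or creators

# Hypothetical enforcement outcomes mirroring those named in the post.
class Action(Enum):
    NO_ACTION = auto()
    REDUCE_DISTRIBUTION = auto()
    DEMONETIZE = auto()
    REMOVE_CONTENT = auto()
    REMOVE_ACCOUNT = auto()

@dataclass
class Flag:
    content_id: str
    source: FlagSource
    policy_area: str    # e.g. "hate", "dangerous_content"
    model_score: float  # confidence from automated detection, 0..1

def route_flag(flag: Flag) -> str:
    """Route a flag for handling: user reports and high-confidence
    automated hits go to human review; weak signals are only logged.
    The 0.8 cutoff is an arbitrary placeholder."""
    if flag.source is FlagSource.USER_REPORT or flag.model_score >= 0.8:
        return "human_review_queue"
    return "monitoring_log"

print(route_flag(Flag("ep123", FlagSource.PROACTIVE_MONITORING,
                      "dangerous_content", 0.91)))  # human_review_queue
```

In a real system the human reviewer, not the router, would choose among the enforcement actions, which is why `Action` is modeled separately from `route_flag`.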

This space is nuanced, complex, and always evolving, and we are committed to continually improving our processes. You can read more about our approach to violent extremism and our safety work here.

Partnering With Industry To Create a Safer Online Environment for Young People

By Marcelina Slota – Head of Platform Integrity, Spotify

Updated September 13, 2024

To continue our work providing positive experiences for young people, we’re proud to announce that PROJECT ROCKIT is joining our Spotify Safety Advisory Council (SSAC). Through their global advocacy, PROJECT ROCKIT empowers young people to combat bullying at school, online, and beyond, and their expertise will help inform how Spotify identifies and manages bullying and harassment, however it may arise on our platform. Our work with PROJECT ROCKIT builds on our long-standing partnerships with experts worldwide who advise us on how we continue building Spotify with safety by design at every step.

“It’s a privilege for PROJECT ROCKIT to join Spotify’s Safety Advisory Council,” says Lucy Thomas, Cofounder and CEO of PROJECT ROCKIT. “For years, we’ve worked tirelessly to create safer spaces for young people, and this collaboration represents an incredible opportunity to extend that impact globally. By bringing youth-driven perspectives to Spotify, we can help shape a platform that champions both safety and authentic expression. Together, we’re paving the way for a digital world where young people are empowered, respected, and safe.” 

Original Post

Safety is a top priority for Spotify, which is why we want to make it easier for young people and parents to understand and navigate the digital world. Today, we’re announcing that we have a new Parental Guide to help do just that, and that Spotify has joined the Tech Coalition to share best practices with industry on upholding youth safety.

The Tech Coalition is a trusted organization that unites the global tech industry to foster a safer online environment for young people by preventing and combating online child sexual exploitation and abuse. This partnership expands our network of trusted third-party experts who advise our teams on launching policies and products with safety by design. Some of the key stakeholders who’ve been helping guide our work in this space include Thorn, the WeProtect Global Alliance, and Safety Advisory Council member Alex Holmes (Deputy CEO of the nonprofit The Diana Award and founder of Anti-Bullying Ambassadors).

In addition to partnering with experts across the globe, Spotify works to craft an experience that is both safe and enjoyable for young people in a number of ways, including by:

  • Establishing a zero-tolerance policy against content that exploits children and Platform Rules that ban illegal and/or abusive behaviors that could harm children;
  • Leveraging machine learning signals and establishing user reporting mechanisms to detect potential policy and/or legal violations;
  • Staffing teams around the clock to review and promptly remove potentially violating or explicit content; and
  • Connecting potentially vulnerable users to mental health resources when they search for content related to suicide, self-harm, and disordered eating.

We have also launched our Parental Guide to help explain how parents can curate the experience that’s right for their families. Our work in this space is ongoing, and we’ll continue updating this resource (and others) to reflect the best and most up-to-date information.

How Spotify Approaches Safety With Sarah Hoyle, Global Head of Trust and Safety

Spotify’s mission is to unlock the potential of human creativity by giving a million artists the opportunity to live off their art and billions of fans the opportunity to enjoy and be inspired by it. In support of that endeavor, our global teams work around the clock to ensure that the experience along the way is safe and enjoyable for creators, listeners, and advertisers. 

While there is some user-generated content on Spotify, the vast majority of listening time is spent on licensed content. Regardless of who created the content, our top priority is to allow our community to connect directly with the music, podcasts, and audiobooks they love. When we think about the safety aspect of this, it can be helpful to do so in the context of seeing a show at a performance venue.

Like a performance venue, Spotify hosts different types of shows across a variety of genres. Not every show may be suitable for all audiences or in line with everyone’s unique tastes. Just as people select which shows they want to see, Spotify provides opportunities for users to seek out and curate content that they like and that is appropriate for their preferences. For example, users can skip music tagged by creators or rights holders as “explicit” by using our content toggle. Mobile users can block artists or songs they wish to hide, exclude playlists from their taste profiles, or use the “not interested” button to better control their experiences.
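As a minimal sketch of how controls like the explicit-content toggle and artist blocking might work, consider the Python below. The `Track` and `Preferences` records and the `playable` function are hypothetical illustrations, not Spotify’s implementation:

```python
from dataclasses import dataclass

@dataclass
class Track:
    title: str
    artist: str
    explicit: bool  # tag supplied by the creator or rights holder

# Hypothetical per-user settings mirroring the controls described above.
@dataclass
class Preferences:
    allow_explicit: bool
    blocked_artists: set[str]

def playable(track: Track, prefs: Preferences) -> bool:
    """Apply the explicit-content toggle, then the user's artist blocks."""
    if track.explicit and not prefs.allow_explicit:
        return False
    return track.artist not in prefs.blocked_artists

tracks = [Track("Song A", "Artist X", explicit=True),
          Track("Song B", "Artist Y", explicit=False)]
prefs = Preferences(allow_explicit=False, blocked_artists={"Artist Z"})
print([t.title for t in tracks if playable(t, prefs)])  # ['Song B']
```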

While Spotify strongly supports enabling creative expression and listener choice, this does not mean that anything goes. In the same way that a venue has rules to ensure that shows run smoothly and are safe, Spotify has Platform Rules to guide what’s acceptable content and behavior on our platform. Bad behavior at a concert can lead to things like backstage access being revoked or, in egregious situations, someone being kicked out of the venue. Breaking Spotify’s rules can have consequences like removal, reduced distribution, and/or demonetization. We will also remove content that violates the law and/or our Terms of Service. Creators or rights holders may also choose to remove content themselves.

Measures we continue to take around responsible content recommendations and search also play key roles in creating a safe and enjoyable experience. For example, product and engineering teams across the company work with Trust & Safety to conduct impact assessments that allow us to evaluate and better mitigate potential algorithmic harms and inequities. We’ve also been introducing search warnings and in-app messaging for users searching for content related to suicide, self-harm, and disordered eating, which link to Spotify’s Mental Health Resources page. This work is being done in partnership with experts like Ditch the Label and the Jed Foundation with the goal of connecting potentially at-risk users with trusted help resources.
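The post doesn’t describe how these search warnings are triggered; as a purely illustrative sketch, a simple version might match queries against a sensitive-term list and attach a resource banner to the response (real systems would use multilingual term lists and classifiers rather than substring checks, and the term list and URL below are placeholders):

```python
# Placeholder term list; production systems maintain far richer,
# expert-curated, multilingual lists.
SENSITIVE_TERMS = {"suicide", "self-harm", "disordered eating"}

RESOURCE_BANNER = {
    "message": "Help is available. You are not alone.",
    "link": "https://example.com/mental-health-resources",  # placeholder URL
}

def search(query, index_lookup):
    """Return search results, prepending a help-resource banner when the
    query contains any sensitive term."""
    q = query.lower()
    response = {"results": index_lookup(query)}
    if any(term in q for term in SENSITIVE_TERMS):
        response["banner"] = RESOURCE_BANNER
    return response

# index_lookup stands in for the real search backend.
print(search("playlists about self-harm", lambda q: []))
```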

Keeping our platform safe is a challenging job and, as the landscape evolves, we’re committed to evolving along with it. Safety is a company-wide responsibility and our efforts involve ongoing coordination between engineers, product managers, data scientists, researchers, lawyers, and social impact experts, as well as the policy and enforcement experts in Trust & Safety. Many of the folks on these teams have long careers in online safety, as well as in fields like human rights, social work, academia, health care, and consulting. We have also established an internal Safety Leadership group that regularly brings executives from different departments together to help ensure awareness of safety needs and monitor progress on our efforts. 

To complement our in-house expertise, we also seek counsel and feedback from third-party experts around the world, including our Safety Advisory Council, to ensure we’re considering multiple points of view when shaping our safety approach. In 2022, we invested in the local and linguistic expertise of start-up Kinzen, now known as the Content Safety Analysis team within Spotify, which has a nuanced understanding of the global safety landscape and works proactively to deliver a safe and enjoyable experience across our content offerings. 

Click here to learn more about Spotify’s approach to safety.

Thorn Joins the Spotify Safety Advisory Council

At Spotify, we’ve had long-standing policies against content that promotes, solicits, or facilitates harm, including child sexual abuse or exploitation. It is with that commitment in mind that we are announcing the addition of Thorn to our Safety Advisory Council. Thorn is a nonprofit dedicated to building technology to defend children from sexual abuse and a long-standing safety partner to Spotify.

Since 2020, Spotify has integrated Thorn’s feedback to better understand the child safety space, leveraging their product Safer, which is designed to detect, identify, and report child sexual abuse material (CSAM) at scale. Thorn has also provided key insights into emerging abuse trends that have helped shape our approach to policy enforcement.
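Safer’s internals aren’t described here, but detection tools in this space commonly work by matching hashes of uploaded media against vetted databases of hashes of known abusive material. The Python below is a minimal, purely illustrative sketch of that general hash-matching idea, not Thorn’s API; the hash set is an empty placeholder:

```python
import hashlib

# Placeholder: in practice this would be a vetted database of hashes of
# known material, maintained with organizations like NCMEC.
KNOWN_HASHES: set[str] = set()

def matches_known_material(file_bytes: bytes) -> bool:
    """Exact-match check via cryptographic hash. Production systems also
    use perceptual hashing to catch re-encoded or altered copies."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES

def scan_upload(content_id: str, file_bytes: bytes) -> str:
    # Matched content would be escalated for review and reporting,
    # never surfaced back to users.
    if matches_known_material(file_bytes):
        return f"{content_id}: escalate_and_report"
    return f"{content_id}: no_match"

print(scan_upload("upload-1", b"example bytes"))  # upload-1: no_match
```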

“We are thrilled to join the Spotify Safety Advisory Council and continue our work with Spotify to defend children from sexual abuse,” says John Starr, Vice President of Business Operations and Strategic Impact at Thorn. “Spotify’s commitment to child safety is evident in their long-standing policies, strong emphasis on continuous improvement, and partnerships with organizations like Thorn. We look forward to collaborating with the other council members to further advance these efforts.”

The Safety Advisory Council’s mission is to help Spotify evolve its policies and products to ensure a safe experience while respecting creator expression. Council members do not make enforcement decisions. During its first year, the SSAC supported us in an advisory capacity and provided ongoing feedback in key issue areas.

“This partnership is another example of Spotify’s investment in advancing our child safety efforts, which have included hiring child safety experts with experience at organizations like NCMEC, as well as our membership as a WeProtect Global Alliance partner,” says Sarah Hoyle, Head of Trust and Safety at Spotify.

Looking ahead, we are excited to partner with Thorn and the rest of the council members to continue evolving our approach to keeping children safe on our platform, tackle emerging issues related to the responsible management of AI-generated content, and ensure we’re creating a safe and enjoyable experience for everyone who uses Spotify.

Spotify Continues to Ramp Up Platform Safety Efforts with Acquisition of Kinzen

Today, Spotify is excited to share that we have acquired Dublin, Ireland-based Kinzen, a global leader in protecting online communities from harmful content. Kinzen’s advanced technology and deep expertise will help us more effectively deliver a safe, enjoyable experience on our platform around the world.
 

Spotify’s partnership with Kinzen, which began in 2020, has been critical to enhancing our approach to platform safety. The company’s technology is particularly suited to podcasting and audio formats, making its value to Spotify clear. The technology the Kinzen team brings to Spotify combines machine learning and human expertise—backed by analysis from leading local academics and journalists—to analyze potentially harmful content and hate speech in multiple languages and countries.

“We’ve long had an impactful and collaborative partnership with Kinzen and its exceptional team. Now, working together as one, we’ll be able to even further improve our ability to detect and address harmful content, and importantly, in a way that better considers local context,” said Dustee Jenkins, Spotify’s Global Head of Public Affairs. “This investment expands Spotify’s approach to platform safety, and underscores how seriously we take our commitment to creating a safe and enjoyable experience for creators and users.”

Given the complexity of analyzing audio content in hundreds of languages and dialects, and the challenges in effectively evaluating the nuance and intent of that content, the acquisition of Kinzen will help Spotify better understand the abuse landscape and identify emerging threats on the platform.
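Neither Spotify nor Kinzen publishes this pipeline, but the transcribe-then-classify, human-in-the-loop shape described above might look roughly like the following sketch, with every component stubbed out; the names, stubs, and threshold are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class Episode:
    episode_id: str
    audio: bytes
    language: str  # e.g. "en", "pt", "sv"

def transcribe(audio: bytes, language: str) -> str:
    """Stub for a speech-to-text step; real systems need coverage
    across hundreds of languages and dialects."""
    return "..."

def risk_score(transcript: str, language: str) -> float:
    """Stub for a language-specific harmful-content classifier,
    tuned with input from local academics and journalists."""
    return 0.0

def triage(episode: Episode, threshold: float = 0.7) -> str:
    transcript = transcribe(episode.audio, episode.language)
    score = risk_score(transcript, episode.language)
    # Machine learning narrows the candidate set; human experts with
    # local and linguistic context make the final call.
    return "human_review" if score >= threshold else "clear"

print(triage(Episode("ep-1", b"raw audio", "pt")))  # clear
```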

“The combination of tools and expert insights is Kinzen’s unique strength that we see as essential to identifying emerging abuse trends in markets and moderating potentially dangerous content at scale,” said Sarah Hoyle, Spotify’s Head of Trust and Safety. “This expansion of our team, combined with the launch of our Safety Advisory Council, demonstrates the proactive approach we’re taking in this important space.”

Introducing the Spotify Safety Advisory Council

Over the past several months, Spotify has taken steps to be more transparent about our safety efforts. In January, we published our Platform Rules and took measures to ensure that creators on our platform view and adhere to them. Those were first steps, and today we unveil another: our newly formed Spotify Safety Advisory Council—the first safety-focused council of its kind at any major audio company.

The founding members of Spotify’s Safety Advisory Council are individuals and organizations around the world with deep expertise in areas that are key to navigating the online safety space. At a high level, the council’s mission is to help Spotify evolve its policies and products in a safe way while making sure we respect creator expression. Our council members will advise our teams in key areas like policy and safety-feature development as well as guide our approach to equity, impact, and academic research. Council members will not make enforcement decisions about specific content or creators. However, their feedback will inform how we shape our high-level policies and the internal processes our teams follow to ensure that policies are applied consistently and at scale around the world.

While Spotify has been seeking feedback from many of these founding members for years, we’re excited to further expand and be more transparent about our safety partnerships. As our product continues to grow and evolve, council membership will grow and evolve along with it. In the months ahead, we will work closely with founding members to expand the council, with the goal of broadening regional and linguistic representation as well as adding additional experts in the equity and impact space.

The founding members and partner organizations include the following:

Dangerous Speech Project, represented by Professor Susan Benesch and Tonei Glavinic

Center for Democracy and Technology, represented by Emma Llansó

Professor Danielle Citron

Dr. Mary Anne Franks

Alex Holmes

Institute for Strategic Dialogue (ISD), represented by Henry Tuck and Milo Comerford

Dr. Jonas Kaiser

Kinzen, represented by Founders Mark Little and Áine Kerr

Dr. Ronaldo Lemos

Dr. Christer Mattsson

Dr. Tanu Mitra

Desmond Upton Patton, PhD, MSW

Megan Phelps-Roper 

USC Annenberg Inclusion Initiative, represented by Dr. Katherine Pieper and Dr. Stacy L. Smith

 

You can read the full bios for the organizations and individuals here.