Tag: Safety Landing Page

Partnering To Combat the Spread of Violent Content Online

At Spotify, our Platform Rules strictly prohibit content that promotes terrorism or violent extremism. We take action on content that violates our Platform Rules, and we work closely with third parties with extremism expertise to ensure we are making the most informed decisions throughout these processes, taking local, regional, and cultural context into account.

Today, to advance these efforts, we are announcing the addition of Moonshot to our Spotify Safety Advisory Council (SSAC). Moonshot’s work on understanding the motivations for online hate and extremism—and how to reduce the threat of violence both online and offline—has been critical in building safeguards on our platform. 

Growing our Spotify Safety Advisory Council

At Spotify, we design products and develop policies with safety in mind, and our longstanding partnership with Moonshot has provided critical insights to get ahead of emerging online trends.

“Since 2021, we have partnered with Spotify to produce innovative product solutions and meaningful action in response to the threat of online hate and extremism,” says Clark Hogan-Taylor, Director of Tech Partnerships at Moonshot. “From protections against racial violence to fighting organized crime, our team has worked tirelessly to ensure Spotify is a safe platform for all. We are proud to be joining the Safety Advisory Council to further contribute to this important mission.”

Moonshot joins other members of the SSAC, such as the Institute for Strategic Dialogue (ISD), to help Spotify better understand the global landscape of hate and extremism as harmful trends spread across borders and platforms. 

These partnerships and ongoing engagement with our SSAC reflect our commitment to constantly evolve our approach as we combine the expertise of our internal safety teams, the scale of our technology, and the insights of our trusted external partners.

Violent extremist content is prohibited on Spotify

Spotify addresses potential violent extremist content through multiple policies, which include, but are not limited to:

  • Our hate policies, which prohibit content that explicitly incites violence or hatred toward people based on protected characteristics, including race, sex, ethnicity, or sexual orientation; and
  • Our dangerous content policies, which clearly outline that content promoting or supporting terrorism or violent extremism is strictly not allowed on the Spotify platform.

We identify potentially violative content by using proactive monitoring methods, leveraging human expertise, and reviewing and responding to user reports. When it comes to enforcement, we may take various actions, including removing content or the creator’s account, reducing distribution, and/or demonetization.

This space is nuanced, complex, and always evolving, and we are committed to continually improving our processes. You can read more about our approach to violent extremism and our safety work here.

Partnering With Industry To Create a Safer Online Environment for Young People

By Marcelina Slota – Head of Platform Integrity, Spotify

Updated September 13, 2024

To continue our work providing positive experiences for young people, we’re proud to announce that PROJECT ROCKIT is joining our Spotify Safety Advisory Council (SSAC). Through their global advocacy, PROJECT ROCKIT empowers young people to combat bullying at school, online, and beyond, and their expertise will help inform how Spotify identifies and manages bullying and harassment however it may arise on our platform. Our work with PROJECT ROCKIT will build upon our long-standing partnerships with experts worldwide who advise us as we continue building Spotify with safety by design at every step. 

“It’s a privilege for PROJECT ROCKIT to join Spotify’s Safety Advisory Council,” says Lucy Thomas, Cofounder and CEO of PROJECT ROCKIT. “For years, we’ve worked tirelessly to create safer spaces for young people, and this collaboration represents an incredible opportunity to extend that impact globally. By bringing youth-driven perspectives to Spotify, we can help shape a platform that champions both safety and authentic expression. Together, we’re paving the way for a digital world where young people are empowered, respected, and safe.” 

Original Post

Safety is a top priority for Spotify, which is why we want to make it easier for young people and parents to understand and navigate the digital world. Today, we’re announcing that we have a new Parental Guide to help do just that, and that Spotify has joined the Tech Coalition to share best practices with industry on upholding youth safety.

The Tech Coalition is a trusted organization that unites the global tech industry to foster a safer online environment for young people by preventing and combating online child sexual exploitation and abuse. This partnership expands our network of trusted third-party experts who help advise our teams on how we launch policies and products with safety by design. Some of the key stakeholders who’ve been helping guide our work in this space include Thorn, the WeProtect Global Alliance, and Safety Advisory Council member Alex Holmes (Deputy CEO of the nonprofit The Diana Award and founder of Anti-Bullying Ambassadors). 

In addition to partnering with experts across the globe, Spotify works to craft an experience that is both safe and enjoyable for young people in a number of ways, including by:

  • Establishing a zero-tolerance policy against content that exploits children and Platform Rules that ban illegal and/or abusive behaviors that could harm children;
  • Leveraging machine learning signals and establishing user reporting mechanisms to detect potential policy and/or legal violations;
  • Staffing teams around the clock to review and promptly remove potentially violating or explicit content; and
  • Connecting potentially vulnerable users to mental health resources when they search for content related to suicide, self-harm, and disordered eating.

We have also launched our Parental Guide to help explain how parents can curate the experience that’s right for their families. Our work in this space is ongoing, and we’ll continue updating this resource (and others) to reflect the best and most up-to-date information. 

How Spotify Is Protecting Election Integrity in 2024

Dustee Jenkins, Chief Public Affairs Officer, Spotify

With billions of people from over 50 countries heading to the polls to cast their vote, 2024 is shaping up to be the largest election year in history. Safeguarding our platform during critically important global events is a top priority for our teams, and we’ve spent years developing and refining our approach. Knowing that election safety is top of mind for many of our creators and users, we wanted to provide further insights into our approach at Spotify.

Our Approach 

To help creators understand what they can and cannot do on Spotify, we have Platform Rules that apply to everyone, partnerships with trusted experts who help us identify evolving forms of online abuse, and product interventions to elevate timely and trusted resources and help restrict the distribution of potentially harmful content. Our Platform Rules clearly state that content that attempts to manipulate or interfere with election-related processes is prohibited. When we identify violative content, we take the appropriate action.

When it comes to elections, risks often vary from country to country, and the types of abuse that manifest can be hyper-localized and nuanced. Rather than use the same one-size-fits-all approach to every market, our teams conduct individual risk assessments that consider a variety of indicators, including Spotify’s availability and specific product offerings in a country, historical precedents for online and offline harm during voting periods, and emerging geopolitical factors that may increase on-platform risks. 

We monitor these factors on an ongoing basis and use our learnings to inform policy and enforcement guidelines, customize in-product interventions, and determine where we may benefit from additional resourcing and/or third-party inputs. Ultimately, our primary focus is always to reduce risk, allowing our listeners, creators, and advertisers to enjoy our products. 

Expert Partnerships

To help us understand the nuance of potential harms around the world, Spotify acquired Kinzen in 2022. This has allowed us to conduct ongoing research in multiple languages and in critical areas like misinformation and hate speech. This research is supported by a pioneering tool called Spotlight, designed to quickly identify potential risks within long-form audio content like podcasts.

Additionally, we partner closely with third-party experts to ensure we represent multiple viewpoints when discussing sensitive policy areas. These include our global Spotify Safety Advisory Council and the Institute for Strategic Dialogue.

In-Product Resources

During key elections, we encourage nonpartisan community engagement by connecting listeners with reliable, local information. These campaigns have driven millions of visits to civic engagement resources, helping users check their voter status, register to vote, or learn more about their local elections, whatever their political affiliation. 

We also leverage a combination of algorithmic and human curation to identify content that violates our guidelines and may update our recommendations to curb potentially manipulated or dangerous information.

Political Ads

Spotify accepts political advertisements across music, the Spotify Audience Network, and Originals & Licensed podcasts in certain markets.

Political ads may be placed in the Spotify Audience Network and Spotify’s free, ad-supported service inventory. An account must be eligible for political ads, and the account holder must complete an advertiser identity verification process to proceed. Political ads are unavailable for purchase via our self-serve tool, Spotify Ad Studio. 

Additionally, we require that political advertisements clearly disclose the use of any synthetic or manipulated media, including media created or edited with the use of artificial intelligence tools that depict real or realistic-looking people or events. This disclosure must be included in the advertisement and must be clear and conspicuous.

To read more about political ads in the markets where they are offered, and to learn about how to report an ad you believe violates our policies, please review Spotify’s political advertising editorial policies.

Thorn Joins the Spotify Safety Advisory Council

At Spotify, we’ve had long-standing policies against content that promotes, solicits, or facilitates harm, including child sexual abuse or exploitation. It is with that commitment in mind that we are announcing the addition of Thorn to our Safety Advisory Council. Thorn is a nonprofit dedicated to building technology to defend children from sexual abuse and is a longstanding safety partner to Spotify.

Since 2020, Spotify has integrated Thorn’s feedback to better understand the child safety space and identify emerging trends, leveraging their product Safer, which is designed to detect, identify, and report child sexual abuse material (CSAM) at scale. In addition, Thorn provided key insights into emerging abuse trends in the child safety space that have helped shape our approach to policy enforcement.

“We are thrilled to join the Spotify Safety Advisory Council and continue our work with Spotify to defend children from sexual abuse,” says John Starr, Vice President of Business Operations and Strategic Impact at Thorn. “Spotify’s commitment to child safety is evident in their long-standing policies, strong emphasis on continuous improvement, and partnerships with organizations like Thorn. We look forward to collaborating with the other council members to further advance these efforts.”

The Safety Advisory Council’s mission is to help Spotify evolve its policies and products to help ensure a safe experience, while making sure we respect creator expression. Council members do not make enforcement decisions. During this first year, the SSAC continued to support us in an advisory capacity and provided ongoing feedback in key issue areas. 

“This partnership is another example of Spotify’s investment in advancing our child safety efforts, which have included hiring child safety experts who have experience working at organizations like NCMEC and our membership as a WeProtect Alliance partner,” says Sarah Hoyle, Head of Trust and Safety at Spotify.

Looking ahead, we are excited to partner with Thorn and the rest of the council members to continue evolving our approach to keeping children safe on our platform, tackle emerging issues related to the responsible management of AI-generated content, and ensure we’re creating a safe and enjoyable experience for everyone who uses Spotify.

Dr. Stacy Smith of USC Annenberg Calls on All of Us To Address the Gender Gap in Music


Each year, the team at the USC Annenberg Inclusion Initiative (AII), led by Dr. Stacy Smith, takes a look at the numbers of women in music—both behind the scenes and on the charts. The result is an annual study we are proud to underwrite. Together, we recognize there is so much more to be done when it comes to the inclusion of women and nonbinary creators within the music industry. 

Amplifying underrepresented voices is at the core of our work at Spotify. Over the past few years, we’ve launched several initiatives like Frequency, NextGen, SoundUp, and GLOW, each of which promotes a diverse roster of artists, songwriters, and podcasters on our platform. Our global EQUAL music program, which is dedicated to promoting and elevating women artists around the world, has enabled us to support over 700 women in 35 countries since March 2021.

Our work is informed by our partners at the USC AII, and particularly, Dr. Stacy Smith. As the founder of the USC AII—the leading global think tank studying issues of inequality in entertainment—Dr. Smith is the foremost disrupter of inequality in the entertainment industry. She’s also a founding member of our Safety Advisory Council.

The report outlines why women need to help and support other women through mentorship programs, amplification opportunities, and other confidence-building activities. This is the fourth consecutive year Spotify has funded the study, and we’re committed to continuing to learn and understand, and to work toward a more equitable industry. But don’t just take it from us—read on for Dr. Smith’s observations and recommendations.

How would you define representation?

In light of the research we do, representation focuses on prevalence as well as the nature of how groups are presented in the media. For music, specifically, we are examining who receives access and opportunity to specific key positions.

Your research examines inclusion of gender, race/ethnicity, the LGBTQIA+ community, people with disabilities, and mental health in storytelling across film, TV, and digital platforms. What do you see across the board when these groups are not represented, or are underrepresented? 

We see storytelling that fails to depict the reality of the world where we all live. We are missing critical stories and points of view from dynamic and vibrant communities. A lot of our work has shown negative tropes and stereotypes still occur far too frequently when it comes to gender, race/ethnicity, the LGBTQIA+ community, people with disabilities, and mental health.

The Annenberg Inclusion Study, which Spotify partners on, relates to women in the music industry. What are the encouraging trends you’re seeing? What more needs to be done? 

There is only one encouraging trend: The percentage of women artists increased in 2022 in comparison to 2021. That said, it is still abysmally low.  

People need to hire women songwriters, producers, and engineers. That’s it. Until that happens, the numbers will not change. Ultimately, what is needed to create change is for labels to sign, promote, market, and hire women and gender nonconforming people from all backgrounds as artists, songwriters, and producers.

Is there anything notable in the latest gender in music report that you’d like to call out?

The Recording Academy’s efforts on women in the mix have made absolutely no difference in the lives of women producers or engineers. The solution isn’t gimmicks or publicity grabs. It is people understanding that women songwriters and producers have talent but they are not given the same access and opportunity as their male peers.

What would you like to see Spotify doing more of? Less of?

Spotify, along with the rest of the industry, can showcase the work of talented women songwriters and producers to facilitate opportunities—making sure that listeners can experience songs written and produced by women, and performed by women, too.

Listen to women at full volume on our global EQUAL playlist.

Spotify Continues to Ramp Up Platform Safety Efforts with Acquisition of Kinzen

Today, Spotify is excited to share that we have acquired Dublin, Ireland-based Kinzen, a global leader in protecting online communities from harmful content. Kinzen’s advanced technology and deep expertise will help us more effectively deliver a safe, enjoyable experience on our platform around the world.
Spotify’s partnership with Kinzen, which began in 2020, has been critical to enhancing our approach to platform safety. The company’s technology is particularly suited for podcasting and other audio formats, making its value to Spotify clear. The technology the Kinzen team brings to Spotify combines machine learning and human expertise—backed by analysis from leading local academics and journalists—to analyze potentially harmful content and hate speech in multiple languages and countries. 

“We’ve long had an impactful and collaborative partnership with Kinzen and its exceptional team. Now, working together as one, we’ll be able to even further improve our ability to detect and address harmful content, and importantly, in a way that better considers local context,” said Dustee Jenkins, Spotify’s Global Head of Public Affairs. “This investment expands Spotify’s approach to platform safety, and underscores how seriously we take our commitment to creating a safe and enjoyable experience for creators and users.”

Given the complexity of analyzing audio content in hundreds of languages and dialects, and the challenges in effectively evaluating the nuance and intent of that content, the acquisition of Kinzen will help Spotify better understand the abuse landscape and identify emerging threats on the platform.

“The combination of tools and expert insights is Kinzen’s unique strength that we see as essential to identifying emerging abuse trends in markets and moderating potentially dangerous content at scale,” said Sarah Hoyle, Spotify’s Head of Trust and Safety. “This expansion of our team, combined with the launch of our Safety Advisory Council, demonstrates the proactive approach we’re taking in this important space.”

Introducing the Spotify Safety Advisory Council

Over the past several months, Spotify has taken steps to be more transparent about our safety efforts. In January, we published our Platform Rules and took measures to ensure that creators on our platform view and adhere to them. These were first steps forward, and today we unveil another: our newly formed Spotify Safety Advisory Council—the first safety-focused council of its type at any major audio company.

The founding members of Spotify’s Safety Advisory Council are individuals and organizations around the world with deep expertise in areas that are key to navigating the online safety space. At a high level, the council’s mission is to help Spotify evolve its policies and products in a safe way while making sure we respect creator expression. Our council members will advise our teams in key areas like policy and safety-feature development as well as guide our approach to equity, impact, and academic research. Council members will not make enforcement decisions about specific content or creators. However, their feedback will inform how we shape our high-level policies and the internal processes our teams follow to ensure that policies are applied consistently and at scale around the world.

While Spotify has been seeking feedback from many of these founding members for years, we’re excited to further expand and be more transparent about our safety partnerships. As our product continues to grow and evolve, council membership will grow and evolve along with it. In the months ahead, we will work closely with founding members to expand the council, with the goal of broadening regional and linguistic representation as well as adding additional experts in the equity and impact space.

The founding members and partner organizations include the following:

Dangerous Speech Project, represented by Professor Susan Benesch and Tonei Glavinic

Center for Democracy and Technology, represented by Emma Llansó

Professor Danielle Citron

Dr. Mary Anne Franks

Alex Holmes

Institute for Strategic Dialogue (ISD), represented by Henry Tuck and Milo Comerford

Dr. Jonas Kaiser

Kinzen, represented by Founders Mark Little and Áine Kerr

Dr. Ronaldo Lemos

Dr. Christer Mattsson

Dr. Tanu Mitra

Desmond Upton Patton, PhD, MSW

Megan Phelps-Roper 

USC Annenberg Inclusion Initiative, represented by Dr. Katherine Pieper and Dr. Stacy L. Smith


You can read the full bios for the organizations and individuals here.

Spotify Policy Update

Spotify recently shared a new policy around hate content and conduct. While our intentions were good, the language was too vague: we created confusion and concern, and we didn’t spend enough time getting input from our own team and key partners before sharing the new guidelines.

It’s important to note that our policy had two parts. The first was related to promotional decisions in the rare cases of the most extreme artist controversies. As some have pointed out, this language was vague and left too many elements open to interpretation. We created concern that an allegation might affect artists’ chances of landing on a Spotify playlist and negatively impact their future. Some artists even worried that mistakes made in their youth would be used against them.

That’s not what Spotify is about. We don’t aim to play judge and jury. We aim to connect artists and fans – and Spotify playlists are a big part of how we do that. Our playlist editors are deeply rooted in their respective cultures, and their decisions focus on what music will positively resonate with their listeners. That can vary greatly from culture to culture, and playlist to playlist. Across all genres, our role is not to regulate artists. Therefore, we are moving away from implementing a policy around artist conduct.

The second part of our policy addressed hate content. Spotify does not permit content whose principal purpose is to incite hatred or violence against people because of their race, religion, disability, gender identity, or sexual orientation. As we’ve done before, we will remove content that violates that standard. We’re not talking about offensive, explicit, or vulgar content – we’re talking about hate speech.

We will continue to seek ways to impact the greater good and further the industry we all care so much about. We believe Spotify has an opportunity to help push the broader music community forward through conversation, collaboration, and action. We’re committed to working across the artist and advocacy communities to help achieve that.