Partnering With Industry To Create a Safer Online Environment for Young People

By Marcelina Slota – Head of Platform Integrity, Spotify

Updated September 13, 2024

To continue our work providing positive experiences for young people, we’re proud to announce that PROJECT ROCKIT is joining our Spotify Safety Advisory Council (SSAC). Through their global advocacy, PROJECT ROCKIT empowers young people to combat bullying at school, online, and beyond, and their expertise will help inform how Spotify identifies and manages bullying and harassment however it may arise on our platform. Our work with PROJECT ROCKIT will build upon our long-standing partnerships with experts worldwide to advise how we continue building Spotify with safety by design at every step. 

“It’s a privilege for PROJECT ROCKIT to join Spotify’s Safety Advisory Council,” says Lucy Thomas, Cofounder and CEO of PROJECT ROCKIT. “For years, we’ve worked tirelessly to create safer spaces for young people, and this collaboration represents an incredible opportunity to extend that impact globally. By bringing youth-driven perspectives to Spotify, we can help shape a platform that champions both safety and authentic expression. Together, we’re paving the way for a digital world where young people are empowered, respected, and safe.” 

Original Post

Safety is a top priority for Spotify, which is why we want to make it easier for young people and parents to understand and navigate the digital world. Today, we’re announcing that we have a new Parental Guide to help do just that, and that Spotify has joined the Tech Coalition to share best practices with industry on upholding youth safety.

The Tech Coalition is a trusted organization that unites the global tech industry to foster a safer online environment for young people by preventing and combating online child sexual exploitation and abuse. This partnership expands our network of trusted third-party experts who help advise our teams on how we launch policies and products with safety by design. Some of the key stakeholders who’ve been helping guide our work in this space include Thorn, the WeProtect Global Alliance, and Safety Advisory Council member Alex Holmes (Deputy CEO of the nonprofit The Diana Award and founder of Anti-Bullying Ambassadors). 

In addition to partnering with experts across the globe, Spotify works to craft an experience that is both safe and enjoyable for young people in a number of ways, including by:

  • Establishing a zero-tolerance policy against content that exploits children and Platform Rules that ban illegal and/or abusive behaviors that could harm children;
  • Leveraging machine learning signals and establishing user reporting mechanisms to detect potential policy and/or legal violations;
  • Staffing teams around the clock to review and promptly remove potentially violating or explicit content; and
  • Connecting potentially vulnerable users to mental health resources when they search for content related to suicide, self-harm, and disordered eating.

We have also launched our Parental Guide to help explain how parents can curate the experience that’s right for their families. Our work in this space is ongoing and we’ll continue updating this resource (and others) to reflect the best and most up-to-date information. 

How Spotify Is Protecting Election Integrity in 2024

Dustee Jenkins, Chief Public Affairs Officer, Spotify

With billions of people from over 50 countries heading to the polls to cast their vote, 2024 is shaping up to be the largest election year in history. Safeguarding our platform during critically important global events is a top priority for our teams, and we’ve spent years developing and refining our approach. Knowing that election safety is top of mind for many of our creators and users, we wanted to provide further insights into our approach at Spotify.

Our Approach 

To help creators understand what they can and cannot do on Spotify, we have Platform Rules that apply to everyone, partnerships with trusted experts who help us identify evolving forms of online abuse, and product interventions to elevate timely and trusted resources and help restrict the distribution of potentially harmful content. Our Platform Rules clearly state that content that attempts to manipulate or interfere with election-related processes is prohibited. When we identify violative content, we take the appropriate action.

When it comes to elections, risks often vary from country to country, and the types of abuse that manifest can be hyper-localized and nuanced. Rather than apply a one-size-fits-all approach to every market, our teams conduct individual risk assessments that consider a variety of indicators, including Spotify’s availability and specific product offerings in a country, historical precedents for online and offline harm during voting periods, and emerging geopolitical factors that may increase on-platform risks. 

We monitor these factors on an ongoing basis and use our learnings to inform policy and enforcement guidelines, customize in-product interventions, and determine where we may benefit from additional resourcing and/or third-party inputs. Ultimately, our primary focus is always to reduce risk, allowing our listeners, creators, and advertisers to enjoy our products. 

Expert Partnerships

To help us understand the nuances of potential harms around the world, Spotify acquired Kinzen in 2022. This has allowed us to conduct ongoing research in multiple languages and in critical areas like misinformation and hate speech. This research is supported by a pioneering tool called Spotlight, designed to quickly identify potential risks within long-form audio content like podcasts.

Additionally, we partner closely with third-party experts to ensure we represent multiple viewpoints when discussing sensitive policy areas. These include our global Spotify Safety Advisory Council and the Institute for Strategic Dialogue.

In-Product Resources

During key elections, we encourage nonpartisan community engagement by connecting listeners with reliable, local information. These campaigns have driven millions of visits to civic engagement resources, helping users check their voter status, register to vote, or learn more about their local elections, whatever their political affiliation. 

We also leverage a combination of algorithmic and human curation to identify content that violates our guidelines and may update our recommendations to curb potentially manipulated or dangerous information.

Political Ads

Spotify accepts political advertisements across music, the Spotify Audience Network, and Originals & Licensed podcasts in certain markets.

Political ads may be placed in the Spotify Audience Network and Spotify’s free, ad-supported service inventory. An account must be eligible for political ads, and the account holder must complete an advertiser identity verification process to proceed. Political ads are unavailable for purchase via our self-serve tool, Spotify Ad Studio. 

Additionally, we require that political advertisements clearly disclose the use of any synthetic or manipulated media, including media created or edited with the use of artificial intelligence tools that depict real or realistic-looking people or events. This disclosure must be included in the advertisement and must be clear and conspicuous.

To read more about political ads in the markets where they are offered, and to learn about how to report an ad you believe violates our policies, please review Spotify’s political advertising editorial policies.

How Spotify Approaches Safety With Sarah Hoyle, Global Head of Trust and Safety

Spotify’s mission is to unlock the potential of human creativity by giving a million artists the opportunity to live off their art and billions of fans the opportunity to enjoy and be inspired by it. In support of that endeavor, our global teams work around the clock to ensure that the experience along the way is safe and enjoyable for creators, listeners, and advertisers. 

While there is some user-generated content on Spotify, the vast majority of listening time is spent on licensed content. Regardless of who created the content, our top priority is to allow our community to connect directly with the music, podcasts, and audiobooks they love. When we think about the safety aspect of this, it can be helpful to do so in the context of seeing a show at a performance venue.

Like a performance venue, Spotify hosts different types of shows across a variety of genres. Not every show may be suitable for all audiences or in line with everyone’s unique tastes. Just like people select which shows they want to see, Spotify provides opportunities for users to seek out and curate content that they like and that is appropriate for their preferences. For example, users can skip music tagged by creators or rights holders as “explicit” by using our content toggle. Mobile users can block artists or songs they wish to hide and exclude playlists from their taste profiles or use the “not interested” button to better control their experiences.

While Spotify strongly supports enabling creative expression and listener choice, this does not mean that anything goes. In the same way that a venue has rules to ensure that shows run smoothly and are safe, Spotify has Platform Rules to guide what’s acceptable content and behavior on our platform. Bad behavior at a concert can lead to things like backstage access being revoked or, in egregious situations, someone being kicked out of the venue. Breaking Spotify’s rules can have consequences like removal, reduced distribution, and/or demonetization. We will also remove content that violates the law and/or our Terms of Service. Creators or rights holders may also choose to remove content themselves.

Measures we continue to take around responsible content recommendations and search also play key roles in creating a safe and enjoyable experience. For example, product and engineering teams across the company work with Trust & Safety to conduct impact assessments that allow us to evaluate and better mitigate potential algorithmic harms and inequities. We’ve also been introducing search warnings and in-app messaging for users searching for content related to suicide, self-harm, and disordered eating, which link to Spotify’s Mental Health Resources page. This work is being done in partnership with experts like Ditch the Label and the Jed Foundation with the goal of connecting potentially at-risk users with trusted help resources. 

Keeping our platform safe is a challenging job and, as the landscape evolves, we’re committed to evolving along with it. Safety is a company-wide responsibility and our efforts involve ongoing coordination between engineers, product managers, data scientists, researchers, lawyers, and social impact experts, as well as the policy and enforcement experts in Trust & Safety. Many of the folks on these teams have long careers in online safety, as well as in fields like human rights, social work, academia, health care, and consulting. We have also established an internal Safety Leadership group that regularly brings executives from different departments together to help ensure awareness of safety needs and monitor progress on our efforts. 

To complement our in-house expertise, we also seek counsel and feedback from third-party experts around the world, including our Safety Advisory Council, to ensure we’re considering multiple points of view when shaping our safety approach. In 2022, we invested in the local and linguistic expertise of start-up Kinzen, now known as the Content Safety Analysis team within Spotify, which has a nuanced understanding of the global safety landscape and works proactively to deliver a safe and enjoyable experience across our content offerings. 

Click here to learn more about Spotify’s approach to safety.