How Eventbrite Moderates Content
Last updated: February 19, 2025
Eventbrite’s mission is to bring people together through live experiences. We are deeply committed to creating a welcoming and open marketplace that creators and consumers cherish and trust, supporting a wide spectrum of live experiences. In this article, we share more about how we moderate content on our platform.
Content moderation is the process by which Eventbrite reviews and takes action on content that can harm the integrity of its marketplace, as defined in our Community Guidelines. To combat platform abuse and maintain the integrity of our marketplace, we rely on a combination of tools and processes, including proactive detection via machine learning technology and rules-based systems, reactive detection via reports from our community, and human reviews.
Proactive Detection
We detect a significant portion of content that violates our Community Guidelines proactively via two primary methods: (1) rules-based systems (i.e., hand-crafted rules to identify content that meets specific criteria), and (2) machine learning technology (i.e., a statistical process in which a trained model recognizes certain types of patterns based on previous examples and makes predictions about new data).
Rules-based systems are effective when there are a few simple criteria indicating elevated risk. Machine learning technology leverages multiple signals and is thus more effective when there are complex patterns indicating elevated risk. Both detection strategies are validated by offline analysis and monitored for continuous improvement opportunities.
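To make the distinction concrete, here is a minimal, purely illustrative sketch of how a rules-based check might be combined with a model score. Eventbrite's actual rules, signals, and models are not public; every rule, field name, and threshold below is a hypothetical example.

```python
# Illustrative sketch only: Eventbrite's real rules and models are not public.
# Hand-crafted rules capture simple, high-precision criteria; a trained model's
# risk score captures more complex patterns. Either can flag content for review.

# Hypothetical rules: each returns True when a listing meets a simple
# criterion indicating elevated risk.
RULES = [
    lambda listing: listing["account_age_days"] < 1 and listing["ticket_price"] > 500,
    lambda listing: "guaranteed refund" in listing["description"].lower(),
]

def needs_review(listing, model_score, threshold=0.8):
    """Flag a listing for human review if any rule fires or if a
    (hypothetical) machine learning risk score exceeds a threshold."""
    if any(rule(listing) for rule in RULES):
        return True
    return model_score >= threshold

listing = {"account_age_days": 0, "ticket_price": 900, "description": "VIP gala"}
print(needs_review(listing, model_score=0.2))  # first rule fires, so True
```

In practice, content flagged by either path would feed into the human review process described below rather than being actioned automatically in every case.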
Reactive Detection
In addition to the proactive methods described above, our community plays an important role in reporting content they feel violates our Community Guidelines, either by Contacting Us or using the Report This Event link that exists in each event’s footer. Although we aspire to proactively remove violating content before anyone sees it, this open line of communication with our community is a crucial part of our content moderation program.
How We Review Content
While automation is essential to scaling our content moderation program, human review is required for most cases involving potential violations that might result in an event being unpublished. When our team reviews content for potential violations, here is what happens:
Review: Our team reviews the content, as well as off-platform information when appropriate, and determines whether the event listing violates our policies or is illegal.
Action: If the content is determined to violate our Community Guidelines or the law, the team will take appropriate action, including delisting content from search results, removing content from event listings, and unpublishing event listings from the marketplace. If the abuse of our Services is severe (at our sole discretion), which can include repeatedly posting illegal content or repeatedly submitting unfounded complaints, we may also suspend or terminate the associated Eventbrite account.
Involve authorities: If we believe there is a legitimate risk of physical harm to someone or a group of people, or direct threats to public safety, we not only take action on our site but will also work with law enforcement, as appropriate and at our discretion. This includes issuing reports of child sexual abuse material to authorities such as the National Center for Missing and Exploited Children (NCMEC) in the US, and/or the relevant authorities as required by law. We encourage anyone who is – or knows people who are – in immediate danger as a result of content posted on Eventbrite to first contact local law enforcement. Once the issue is reported to local law enforcement, it can then be reported to Eventbrite here. If requested to do so by law enforcement authorities, we may also, depending on the nature of the content and our obligations under applicable laws, remove content or terminate accounts.
Communicate: Once an event is unpublished, or other appropriate action is taken, the team will inform the relevant parties of the action taken and why.
Appeal: If users believe that we've taken incorrect action on their content, they can let us know by replying to our email within six months of the decision. In addition, per the EU Digital Services Act and the UK Online Safety Act, EU and UK residents can appeal our decision, whether related to their content or content they have reported, using the linked form in Eventbrite’s original decision notification. They will receive an email confirming receipt of their appeal request and another email when a decision has been made.
Eventbrite Ads
If an ad is determined to violate our Ad Content Guidelines, we may reject or remove only the ad, or take down the entire event listing if it also violates our Community Guidelines. We may also take alternative action that we determine to be appropriate given the circumstances surrounding the content.
Transparency Report
As part of our commitment to online safety and providing transparency and accountability to our customers, we publish an annual transparency report in response to our obligations under the EU Digital Services Act (DSA). You can view Eventbrite's 2025 DSA Transparency Report here.
Our priority is to moderate violating content quickly and with the least disruption to our community. By using a combination of proactive and reactive detection, and layering it with human review of escalated content where warranted, we can quickly detect and take action on violating content.