What can we learn about the GIPHY content moderation slip-up?

James Glenn Gomez Published on March 16, 2018 Last updated on October 23, 2023

A lot of people understand how vital content moderation is. Unfortunately, blunders like the GIPHY content moderation slip-up still happen.

In March 2018, GIPHY, a popular online platform for sharing GIFs, faced a controversy related to a content moderation slip-up.

The incident involved the discovery of racist and offensive GIFs within GIPHY’s library, which were available for users to search and share. These offensive GIFs came to light when a user discovered a racial slur in the search results and raised concerns about the inappropriate content.

What GIPHY Did Next

The discovery prompted a swift response from GIPHY, which took immediate action to remove the offensive content and investigate how it had made its way into the library. GIPHY acknowledged the seriousness of the issue, issued an apology, and expressed its commitment to ensuring a safe and inclusive platform for users.

Following the incident, GIPHY implemented stricter content moderation measures and reviewed their content approval processes.

They also introduced new policies and guidelines for content providers and users to maintain a higher level of quality and appropriateness in their GIF library. GIPHY emphasized the importance of community feedback and encouraged users to report any offensive content they come across.

Lessons from the GIPHY Content Moderation Slip-Up

The controversy surrounding the GIPHY content moderation slip-up served as a wake-up call for the company and the broader industry.

  • Effective and Real-time Moderation

    It highlighted the challenges of effectively moderating user-generated content in real-time and the importance of having robust systems and processes in place to prevent offensive or inappropriate material from circulating on the platform.

  • Continuous Monitoring and Improvement

    The incident also raised awareness about the need for continuous monitoring and improvement of content moderation practices to ensure that platforms like GIPHY uphold their responsibility to provide a safe and inclusive environment for users.

    It prompted discussions around the impact of offensive content on users and the role of technology companies in maintaining a high standard of content quality and appropriateness.

  • GIPHY’s Commitment

    In response to the controversy, GIPHY took steps to rectify the situation, improve their content moderation processes, and strengthen their commitment to fostering a positive user experience. As a result, Instagram and Snapchat allowed GIPHY stickers back on their platforms.

    While the GIPHY content moderation slip-up was a misstep, it also served as an opportunity for growth and learning, highlighting the importance of proactive content moderation practices in the online ecosystem.

    The GIPHY Content Moderation Approach Post-2018

    Since the GIPHY content moderation blunder, the platform has adopted several improvements:

    • Image referencing via machine learning helps detect whether a GIF contains known prohibited content.
    • Human moderators and users are also involved in the GIPHY content moderation process; moderators sometimes assess content without the help of machine learning.
    • The GIPHY content moderation approach also includes community guidelines that clearly list the categories of content not allowed on the platform, including violence, hate, and self-harm.
    • GIPHY also keeps a list of keywords that will not yield any results in its search bar.
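    To make the last mechanism concrete, here is a minimal sketch of how a search-keyword blocklist might work. The blocklist entries, function names, and matching rule are illustrative assumptions, not GIPHY's actual implementation.

```python
# Hypothetical search-keyword blocklist: blocked queries yield no results.
BLOCKED_KEYWORDS = {"slur-example", "banned-term"}  # illustrative entries

def search_allowed(query):
    """Return False if any token in the query is on the blocklist."""
    return not any(token in BLOCKED_KEYWORDS for token in query.lower().split())

def search_gifs(query, index):
    """Return matching GIF ids from a simple query->ids index,
    or nothing at all when the query contains a blocked keyword."""
    if not search_allowed(query):
        return []
    return index.get(query.lower(), [])
```

    A production system would also normalize misspellings and multi-word phrases, but the principle, suppressing all results for flagged queries, is the same.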

    Valuable Insights

    The GIPHY content moderation incident offers valuable lessons for content moderators in various areas:

    1. Vigilance in Content Moderation: Content moderators must remain continuously vigilant, actively reviewing and monitoring user-generated content to identify and remove any offensive or inappropriate material promptly.
    2. Regular Monitoring: Regular and thorough content checks are necessary to prevent slip-ups that can harm users and tarnish a platform’s reputation.
    3. Stronger Moderation Processes: Strengthening moderation processes is another crucial lesson learned. Content approval mechanisms should be robust, efficient, and supported by clear guidelines and policies.
    4. Audits and Reviews: Regular audits and reviews of moderation practices can help identify potential weaknesses and areas for improvement. Through effective processes, platforms can ensure that content moderators have the necessary tools and support to carry out their responsibilities effectively.
    5. User Feedback and Reporting: User feedback and reporting play a vital role in content moderation. Platforms should encourage users to actively report any inappropriate or offensive content they encounter.
    6. Moderator Training: Regular training and education for content moderators are essential to keep them updated on emerging trends and challenges. Training programs should cover topics such as identifying misinformation, understanding cultural sensitivities, and staying informed about evolving content moderation practices.
    7. Shared Responsibility: Platforms must collaborate closely with content moderators, providing them with the necessary resources, support, and feedback channels to effectively carry out their duties.

    Other Instances of Content Moderation Blunders

    While the GIPHY content moderation controversy of 2018 was unique to that platform, there have been other instances that highlight the importance of robust content moderation practices and the potential consequences of content moderation slip-ups. Here are a few notable examples:

    • Facebook’s Content Moderation Challenges

      Facebook, as one of the largest social media platforms, has faced numerous content moderation controversies over the years. These include issues related to hate speech, misinformation, and the spread of harmful content.

      In several instances, offensive or inappropriate content managed to evade Facebook’s content moderation systems, leading to public backlash and calls for improved moderation practices.

    • YouTube’s Advertiser Boycott

      In 2017, major advertisers boycotted YouTube due to ads being displayed alongside extremist and offensive content. The incident highlighted the challenges of content moderation on a platform with billions of user-uploaded videos.

      YouTube responded by strengthening its content policies, improving its algorithms, and increasing human review to prevent ads from appearing alongside inappropriate content.

    • Reddit’s Missteps with Moderation

      Reddit, a popular online community platform, has faced criticism for its handling of content moderation in various subreddits.

      Some subreddits were found to host explicit or offensive content that violated the platform’s guidelines. These incidents sparked discussions around the challenges of maintaining consistent and effective content moderation across diverse communities on the platform.

    • Twitter’s Struggle with Harassment

      Twitter has grappled with issues related to harassment and abuse on its platform. Users have raised concerns about the platform’s ability to effectively moderate and address instances of targeted harassment, hate speech, and threats.

      Twitter has made efforts to improve its content moderation practices, introducing new policies and enhancing reporting mechanisms to combat abusive behavior.

    Prevention Measures

    To avoid controversies similar to the GIPHY content moderation incident, brands should prioritize the implementation of robust content moderation practices and guidelines. Here are some key steps they can take:

    1. Establish Clear Content Moderation Guidelines

      Brands should develop comprehensive content moderation guidelines that clearly outline what is acceptable and unacceptable content on their platforms.

      These guidelines should cover various categories, such as hate speech, offensive imagery, discrimination, harassment, and explicit content. It is crucial to provide specific examples and explanations to ensure clarity and consistency in moderation decisions.

    2. Regularly Review and Update Guidelines

      Content moderation guidelines should be periodically reviewed and updated to adapt to evolving societal norms, legal requirements, and emerging content trends. Brands should stay informed about new forms of offensive or harmful content and update their guidelines accordingly to address potential loopholes.

    3. Implement Stringent Approval Processes

      Brands should establish robust approval processes for user-generated content before it is published or shared on their platforms. This may involve manual review by content moderators or the use of automated tools to scan and filter content for potential violations. Stricter approval processes can help prevent inappropriate or offensive content from reaching the public.
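      As a sketch of how such an approval process might route content, the snippet below combines an automated classifier score with a human review queue. The thresholds, status names, and `ApprovalPipeline` class are hypothetical illustrations, not any specific platform's system.

```python
from dataclasses import dataclass, field

@dataclass
class ApprovalPipeline:
    """Illustrative pre-publication routing based on an automated
    classifier's violation score (0.0 = clean, 1.0 = certain violation)."""
    auto_reject_threshold: float = 0.9   # near-certain violations are blocked
    human_review_threshold: float = 0.5  # uncertain cases go to a moderator
    review_queue: list = field(default_factory=list)

    def submit(self, content_id, violation_score):
        if violation_score >= self.auto_reject_threshold:
            return "rejected"
        if violation_score >= self.human_review_threshold:
            self.review_queue.append(content_id)  # queued for manual review
            return "pending_review"
        return "published"
```

      The key design choice is the middle band: rather than forcing the classifier to make every call, uncertain content is held back for a human decision.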

    4. Provide Clear Reporting Mechanisms

      Brands should encourage users to report any content that violates the moderation guidelines. Implementing user-friendly reporting mechanisms, such as reporting buttons or forms, enables users to easily flag problematic content. Promptly address and investigate user reports to ensure swift action against violations.
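      One way user reports can feed moderation is sketched below: repeat reports from the same user are deduplicated, and content crossing a report threshold is escalated for priority human review. The class name, default threshold, and escalation rule are illustrative assumptions.

```python
from collections import defaultdict

class ReportTracker:
    """Hypothetical tracker that escalates content once enough
    distinct users have reported it."""
    def __init__(self, escalation_threshold=3):
        self.escalation_threshold = escalation_threshold
        self.reports = defaultdict(set)  # content_id -> set of reporter ids

    def report(self, content_id, user_id):
        """Record a report; return True when the content should be
        escalated for priority human review."""
        self.reports[content_id].add(user_id)  # repeats don't double-count
        return len(self.reports[content_id]) >= self.escalation_threshold
```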

    5. Invest in Moderation Resources

      Adequate resources, including trained content moderation teams and appropriate technological tools, are essential for effective moderation.

      Just like the GIPHY content moderation approach, brands should invest in building or outsourcing moderation capabilities to ensure that content is reviewed thoroughly and consistently. This includes providing ongoing training and support to content moderators to enhance their understanding of guidelines, context, and emerging trends.

    6. Foster Transparency and Communication

      Brands should maintain open and transparent communication with their user community regarding content moderation practices.

      Clearly communicate the platform’s moderation policies, how user reports are handled, and the actions taken against content violations. Engage with users through public statements, FAQs, and regular updates to address concerns and build trust.

    7. Proactively Monitor and Audit

      Brands should proactively monitor user-generated content for potential violations and conduct regular audits of their moderation processes. This helps identify any weaknesses, gaps, or patterns in content moderation and allows for continuous improvement.

    8. Respond Swiftly to Issues

      In the event of a content moderation controversy, brands should respond swiftly, transparently, and empathetically.

      Acknowledge the issue, apologize if necessary, and communicate the steps being taken to rectify the situation and prevent similar incidents in the future. Engage with the user community, address concerns, and demonstrate a commitment to maintaining a safe and inclusive environment.

    9. Learn from Mistakes

      Brands should view any content moderation missteps as opportunities for learning and improvement.

      Conduct thorough post-incident analyses to understand the root causes, identify areas for enhancement, and implement corrective measures. This includes revisiting moderation processes, guidelines, and training programs to prevent similar controversies in the future.

    Following these steps and continuously evolving their content moderation practices allows brands to minimize the risk of controversies such as the GIPHY content moderation slip-up and foster a safe and positive online environment for their users.

    Outsourcing to Avoid Moderation Blunders

    Outsourcing content moderation can bring several benefits to online platforms like GIPHY, helping them maintain a good reputation. Here are some of the key advantages:

    1. Expertise and Specialization

      Outsourcing content moderation allows platforms to tap into the expertise and specialization of professional moderation teams. Dedicated outsourcing partners have extensive experience in content moderation across various platforms and industries.

      They possess in-depth knowledge of industry best practices, emerging trends, and the latest technologies. Leveraging their expertise enables platforms to ensure more accurate and efficient moderation, leading to better overall user experiences and a stronger reputation.

    2. Scalability and Flexibility

      Outsourcing content moderation provides platforms with the scalability and flexibility needed to handle fluctuating content volumes. As user-generated content continues to grow rapidly, outsourcing partners can quickly scale their resources to meet increasing moderation demands.

      This scalability ensures that platforms can maintain swift response times, handle peak periods effectively, and adapt to evolving content trends without compromising the quality of moderation.

    3. 24/7 Coverage

      Online platforms operate globally, and user activity occurs around the clock. Outsourcing content moderation allows platforms to benefit from 24/7 coverage, ensuring that content is reviewed and addressed in a timely manner, regardless of time zones or geographic locations.

      This round-the-clock moderation helps prevent the spread of harmful or inappropriate content, enhances user safety, and builds trust among the platform’s diverse user base.

    4. Cost-effectiveness

      Outsourcing content moderation can be a cost-effective solution for online platforms. Establishing an in-house moderation team requires significant investments in recruitment, training, infrastructure, and ongoing management.

      Outsourcing allows platforms to avoid these upfront costs and ongoing operational expenses. Instead, they can leverage the outsourcing partner’s existing infrastructure, resources, and expertise, reducing the overall financial burden.

    5. Access to Advanced Technology

      Professional content moderation providers often have access to advanced technologies and tools specifically designed for content moderation tasks. These technologies, such as AI-powered content analysis and automated filtering systems, can enhance the efficiency and accuracy of moderation processes.

      Through outsourcing, platforms can leverage these cutting-edge technologies without the need for extensive in-house development or investment.

    6. Language and Cultural Expertise

      Online platforms with a global user base face the challenge of moderating content in multiple languages and cultural contexts.

      Outsourcing content moderation to providers with multilingual capabilities ensures that platforms can effectively moderate content across different languages and cultures. Language expertise helps in accurately interpreting and understanding the context of user-generated content, reducing the chances of misinterpretation or bias.

    7. Risk Mitigation

      Content moderation involves potential legal risks and regulatory compliance challenges. Outsourcing content moderation can help mitigate these risks by partnering with providers who are well-versed in local laws and regulations.

      They can ensure that content moderation practices align with legal requirements, preventing potential lawsuits, penalties, or reputational damage associated with non-compliance.

    8. Focus on Core Business Activities

      Outsourcing content moderation allows online platforms to focus on their core business activities, such as platform development, user experience enhancements, and strategic growth initiatives.

      Content moderation is a specialized function that requires dedicated resources and attention. Outsourcing allows platforms to delegate this task to experts, freeing up internal resources to concentrate on other critical aspects of the platform’s operations and growth.

    Key Takeaways

    The GIPHY content moderation slip-up of 2018 serves as a powerful reminder of the critical role content moderators play in maintaining a safe and inclusive online environment. Reflecting on this incident and the lessons learned helps content moderators gain valuable insights into the challenges and best practices of content moderation.

    Vigilance, strong moderation processes, user feedback, regular training, and collaboration between platforms and content moderators are essential for effective content moderation. The improvements GIPHY made to its own moderation can also guide moderators and brands in keeping their platforms clean.

    If you are concerned about maintaining a good brand image online (as you should), then don’t hesitate to partner with Open Access BPO.

    One of the packages we provide as an outsourcing firm is content moderation, and we have proven to be experts at it due to our multilingual capabilities. We at Open Access BPO take pride in our diverse team of well-trained agents who are always ready to manage content that will keep your brand reputation spotless.

    Contact us today to begin your partnership with Open Access BPO.

    When it comes to content moderation, consider the butterfly effect. In a sense, one mistake can make a ripple and affect everything. And sometimes, one mistake can cost a business relationship.

    In early 2018, image-sharing social media apps Instagram and Snapchat integrated the animated GIF aggregator GIPHY into their apps as stickers for snaps and stories. Just months later, both cut ties with the GIF database after a content moderation mishap. One racist GIF had not been scrubbed from GIPHY’s library and was therefore available for use in the apps. The GIF in question was a “death counter” of sorts for black people; hence the parting of ways with GIPHY.

    While the GIF aggregator blamed a glitch in their code that allowed for the tasteless moving image to be available as a sticker, it doesn’t change the fact that this has strained their online reputation. Such things happen when content moderation goes awry. In retrospect, you can learn many things from this mishap, including the following.

    1. Automation isn’t foolproof

      It’s not every day that a game-changing bug like this happens to a website. However, if you’re hosting user-generated content (images, videos, audio, or what have you), it’s important to note that artificial intelligence won’t catch objectionable content 100% of the time. GIPHY has a gigantic library of GIFs, and an offensive one still slipped past their radar. Content moderation needs a human moderator’s subjective eye to discern what is and isn’t acceptable.
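      One common safeguard against automated false negatives, sketched here with purely illustrative numbers, is to send a random sample of auto-approved content to human moderators for spot checks:

```python
import random

def sample_for_human_review(approved_ids, rate=0.05, seed=None):
    """Pick a random subset of auto-approved content for human spot checks.
    The 5% default rate is an arbitrary illustration, not a recommendation."""
    rng = random.Random(seed)  # seedable for reproducible audits
    return [cid for cid in approved_ids if rng.random() < rate]
```

      Spot checks like this surface offensive items the classifier missed and yield labeled examples for retraining it.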

    2. Implement a foolproof content moderation strategy

      Sometimes, it takes a reality check like this one to learn that something’s wrong with your moderation policies. Your content moderation strategy must be stringent enough to prevent objectionable content from spilling onto anyone’s dashboard, yet accommodating enough to encourage the creation of user-generated content (UGC). Whether you use AI, human moderators, or both, your content moderation strategy must be reliable.

    3. Be thorough and proactive

      Your content won’t clean itself. You must take a thorough, proactive approach to preventing objectionable content on your website. Take, for instance, the GIF database alternative Gfycat. To combat “deepfakes” (AI-assisted face-swap pornography), it uses two kinds of AI to identify fakery. Since AI isn’t foolproof, Gfycat also employs human moderators and reviews each GIF’s metadata, all to combat deepfakes. This preemptive approach allows Gfycat to maintain the safety of its database.

    4. Consider outsourcing your content moderation services

      If you feel that your company’s review efforts are lacking, it may be time to consider moving your content moderation services to a provider. There’s nothing wrong with an in-house team of reviewers, but outsourcing offers economic and scaling advantages without a drop in quality, letting you maintain a pristine website at a fraction of the cost.

    Sometimes, one mistake can be too little too late when it comes to content moderation. Such is what happened to GIPHY’s integration with Instagram and Snapchat. Prevent such mistakes by taking a holistic and proactive approach to reviewing content.
