Bumble and Match Take a Stand: Stop Advertising on Instagram Amid Content Controversy
Brands generate significant revenue by advertising on social media platforms, but those placements carry risk: ads displayed alongside inappropriate content can harm a brand's reputation and create negative associations, and ads served to the wrong audience can compound the damage. In a surprising development, the popular online dating platforms Bumble and Match have halted their advertising on Instagram. The decision follows a report by The Wall Street Journal revealing that their ads were being shown alongside explicit content and sexualized content involving children in Instagram's Reels feeds.
Why did Bumble and Match stop advertising on Instagram?
Dating apps Bumble and Match stopped advertising on Instagram after their ads appeared next to sexualized content involving children. The Wall Street Journal ran tests using accounts that followed young gymnasts, cheerleaders and influencers. It found that Instagram's algorithm served explicit and inappropriate content, including risqué footage of children and overtly sexual videos aimed at adults, alongside ads from major brands such as Bumble, Disney and Walmart. These findings prompted Bumble and Match to take immediate action.
The report described one instance in which Instagram's system placed a Bumble ad between a video of someone interacting with a life-size latex doll and a video of a young girl in a compromising position. We have not been able to independently verify this claim.
Some brands have said that Meta will pay for independent audits to determine whether ads placed near inappropriate content harm their brands.
The problem also affected other major brands, including Disney, Pizza Hut and Walmart. Meta said the tests conducted by The Wall Street Journal do not represent what billions of users see, and it did not comment on the specific placements. However, a Meta spokesperson told the WSJ that the company introduced new brand-safety tools in October to give advertisers more control over ad placement, and that Instagram removes or downgrades about four million videos every month that appear to violate Meta's standards.
This case highlights the urgent need for social media platforms to strengthen their content moderation systems and ensure a safer online environment for both users and advertisers.