Instagram is generally a great space to browse beautiful photos and discover entertaining videos, but as with most social media platforms, users will eventually come across a post or comment they find inappropriate for public consumption. Some may prefer to gloss over that type of content and keep scrolling, while others who are more concerned about the platform's safety may be driven to report what they saw to Instagram.

Instagram has a few safety features in place that are geared toward different types of viewers. For example, young users who sign up on the app are automatically set to a private account by default. There's also the option to unfollow accounts or hide specific posts that users no longer want to see. Finally, there's a report feature that lets users flag posts and accounts for infractions against Instagram's community guidelines.

Related: Instagram Introduces Account Safety Changes To Protect Its Teenage Users

There are several ways to draw attention to inappropriate posts, comments, or users on Instagram. To report a post seen on the Instagram main feed, tap the three-dot icon above the post in question and select 'Report.' Next, pick a reason for the report from the list. To report an Instagram user, visit their profile page, tap the three-dot icon in the top-right corner of their account, and hit 'Report.' Follow the onscreen prompts to complete the process. People who browse Instagram but don't have an account of their own can also submit a report about posts they believe violate the app's community guidelines. Do note that reporting an Instagram post or profile is anonymous unless the reason behind the flagging is intellectual property infringement.

What Types Of Posts Are Not Allowed On Instagram?

Instagram Report Post Pages

According to Instagram, any post containing spam, hate speech, credible threats of violence, praise of organized crime or hate groups, graphic violence, bullying or harassment, mentions of suicide or self-injury, eating disorders, scams, or fraudulent information may be reported. In addition, Instagram profiles that sell illegal or regulated goods, impersonate someone else, or are run by someone under the age of 13 can also be reported. Nudity is generally not allowed on Instagram, but exceptions are made for photos of nude paintings and sculptures, as well as photos depicting post-mastectomy scarring and active breastfeeding. Selling weapons, drugs, and sexual services is also forbidden on the platform.

Reports that include thorough information, such as links, usernames, captions, and a detailed description of the flagged post, are reviewed and processed much more quickly. A dedicated team of reviewers goes through every report submitted to Instagram and personally evaluates the content for guideline violations. If any content rules have been breached, Instagram may remove the post entirely, and the account that posted it may be temporarily disabled. Additionally, Instagram may use the report to limit similar content from being recommended or appearing in the reporter's main feed.

Instagram users who report content found on the app can monitor the status of their submission. First, on the user's profile page, tap the three-line icon in the top-right corner of the screen, then go to 'Settings.' Next, tap 'Help,' select 'Support requests,' then go to 'Reports.' Updates to complaints should appear on this page, although some reports may not be viewable in this section. Users can also use this portal to request a review of the report findings if they disagree with Instagram's decision regarding the flagged post or profile.

Source: Instagram 1, 2, 3, 4, 5

Next: How To Hide Instagram Stories From Specific Users