Apple has informed the social media network Parler that it has 24 hours to rid the platform of inappropriate content, or else face removal from the App Store. The news was first reported by Input Mag, which obtained a copy of the email Apple sent to Parler.

Apple wrote in the email that it disagrees with Parler’s hands-off approach to moderation, emphasizing that it will not distribute apps that include the types of content found on Parler. Apple says that Parler is responsible for all user-generated content, as per the App Store Review Guidelines.

Apple specifically cites the “illegal activities” that took place in Washington, D.C. on January 6 as an example. Apple writes that Parler was used to “plan, coordinate, and facilitate” what happened.

“We want to be clear that Parler is in fact responsible for all the user generated content present on your service and for ensuring that this content meets App Store requirements for the safety and protection of our users,” the company said. “We won’t distribute apps that present dangerous and harmful content.”

Apple shared links to multiple examples of Parler posts inciting violence in the email, writing:

“We have received numerous complaints regarding objectionable content in your Parler service, accusations that the Parler app was used to plan, coordinate, and facilitate the illegal activities in Washington D.C. on January 6, 2021 that led (among other things) to loss of life, numerous injuries, and the destruction of property. The app also appears to continue to be used to plan and facilitate yet further illegal and dangerous activities.”

Apple specifically cites guideline 1.2 of its App Store Review Guidelines, which says that apps with user-generated content must also have precautions in place to manage objectionable content:

Guideline 1.2 – Safety – User Generated Content

Your app enables the display of user-generated content but does not have sufficient precautions in place to effectively manage objectionable content present in your app.

Apple also cited comments by Parler CEO John Matze in which he said he does not “feel responsible for any of this and neither should the platform,” in reference to the riots at the Capitol in Washington, D.C. this week:

Your CEO was quoted recently saying “But I don’t feel responsible for any of this and neither should the platform, considering we’re a neutral town square that just adheres to the law.” We want to be clear that Parler is in fact responsible for all the user generated content present on your service and for ensuring that this content meets App Store requirements for the safety and protection of our users. We won’t distribute apps that present dangerous and harmful content.

Apple’s ultimatum

If the content cited by Apple, and all similar content, is not removed within the next 24 hours, then the Parler app will be kicked out of the App Store. Parler, which has publicly taken a hands-off approach to moderation, must also prove to Apple that it will adopt systems and practices to prevent this kind of content from appearing on the social network in the future:

“Please remove all objectionable content from your app and submit your revised binary for review. Such content includes any content similar to the examples attached to this message, as well as any content referring to harm to people or attacks on government facilities now or at any future date. In addition, you must respond to this message with detailed information about how you intend to moderate and filter this content from your app, and what you will do to improve moderation and content filtering your service for this kind of objectionable content going forward.”

In response to Apple’s threat to ban Parler from the App Store, Parler co-owner Dan Bongino posted on the platform that this is “clearly an ideological decision, not a principled one.” He went on to call on Parler users to “spread the word about this destructive war on civil liberties.”

Honest question for @AppStore and @GooglePlay.

If Parler continues to allow incitement and calls for violence, doesn’t that break your Terms of Service for apps? pic.twitter.com/CkXg99Trl7

— Sleeping Giants (@slpng_giants) January 7, 2021

Given that Parler has publicly vowed to maintain its hands-off approach to moderation, it remains to be seen whether it will cave to Apple’s pressure. With Apple having sent its email this morning, we should know the answer when the 24-hour clock expires sometime on Saturday morning.