Facebook has improved its policies and communications around deleted posts, and it now provides a facility for users to give feedback, ask questions, or request reconsideration of a removal in a further review.

https://www.pcmag.com/news/360624/why-did-facebook-remove-your-post-this-doc-might-help

Publishing Our Internal Enforcement Guidelines and Expanding Our Appeals Process

Ever wonder how Facebook decides what—and who—to remove from its platform? Wonder no more, because the social network just published the lengthy “Community Standards” its reviewers use to determine what is and isn’t allowed on Facebook. The standards are broken down into six categories: violence and criminal behavior, safety, objectionable content, integrity and authenticity, respecting intellectual property, and content-related requests. They outline how Facebook deals with everything from threats of violence to suicide, self-injury, child sexual exploitation, nudity, bullying, harassment, hate speech, and more.

The move to publish these once-internal guidelines comes after The Guardian last year obtained and posted snippets of the company’s exhaustive and sometimes contradictory rules. Facebook’s VP of Global Policy Management, Monika Bickert, said the company is now going public with this information to “help people understand where we draw the line on nuanced issues” and as a way to solicit feedback on how it can improve its guidelines. Next month, the company plans to launch a series of public events in the US, UK, Germany, France, India, and Singapore, called “Facebook Forums: Community Standards,” to gather people’s feedback in person.