If you’ve ever wondered exactly what sorts of things Facebook would like you not to do on its service, you’re in luck. For the first time, the social network is publishing 27 pages of detailed guidelines on what does and doesn’t belong on its platform.

So please don’t make credible violent threats or revel in sexual violence; promote terrorism or the poaching of endangered species; attempt to buy marijuana, sell firearms, or list prescription drug prices for sale; post instructions for self-injury; depict minors in a sexual context; or commit multiple homicides at different times or locations.

Facebook already banned most of these actions on its previous “community standards” page, which sketched out the company’s rules in broad strokes, but on Tuesday it will spell out the sometimes gory details.

The updated community standards will mirror the rules its 7,600 moderators use to review questionable posts, decide whether they should be pulled off Facebook, and determine whether to call in the authorities.

The standards themselves aren’t changing, but the details reveal some interesting tidbits. Photos of breasts are OK in some cases — such as breastfeeding or in a painting — but not in others. The document details what counts as sexual exploitation of adults or minors, but leaves room to ban more forms of abuse, should they arise.

Since Facebook doesn’t allow serial murderers on its service, its new standards even define the term. Anyone who has committed two or more murders over “multiple incidents or locations” qualifies. But you’re not banned if you’ve only committed a single homicide. It could have been self-defense, after all.

Moderators work in 40 languages. Facebook’s goal is to respond to reports of questionable content within 24 hours. But the company says it doesn’t impose quotas or time limits on the reviewers.

Facebook uses a combination of human reviewers and artificial intelligence to weed out content that violates its policies. But its AI tools aren’t close to being able to pinpoint subtle differences in context and history, not to mention shadings such as humor and satire, that would let them make judgments as accurate as those of humans.