What is Consent Culture?
Consent culture is a culture in which people understand that each person knows what is right for themselves, and everyone respects each other’s choices about their own bodies. In a consent culture, everyone’s right to bodily autonomy is protected and upheld. Asking for consent, for sex and other interactions, is normalized and promoted by popular culture, and no one is objectified against their will.
Consent culture is the opposite of rape culture, an environment in which sexual assault is prevalent, seen as "normal," and excused in media and pop culture. Rape culture is perpetuated through misogynistic language, the objectification of bodies, and the glamorization of sexual violence.
A consent culture can be achieved when people decide to stop crossing others’ boundaries, stop harassing others, practice asking for consent, respect people’s answers, and stop when asked.
Consent culture is created when we as individuals, communities, and institutions commit to stepping in when injustice or violence occurs. It means raising our standards and working to promote fulfilling, safe relationships for all.
Understanding consent is as simple as a cup of tea!
More Resources
- Scarleteen | Sex Education for the Real World