I realised the other day that, although I had sex education at school and ‘the talk’, I learned a lot about sex from friends and from television.
Not necessarily about the act of sex itself, but about terms and words that relate to sex and sexuality.
When I was in grade 6, a friend said ‘raise your hand if you’re a virgin’. I didn’t raise my hand because I had no idea what that meant at the time! I only learnt what it meant after people started making a fuss.
How did you learn about sex? Do you think it benefited you to learn about it that way?
What’s something about sex that you didn’t know about, that friends, family, or pop culture educated you about?