This question is from a Group Therapy post in our community. Add your advice in the comments!
I grew up in a very conservative environment; the first time I left my hometown was two years ago. I believe in waiting until marriage for sex, but lately I've become more open about it. However, every time I meet a guy and we start talking about getting intimate, I make sure to tell him that I'm a virgin and ready for sex, but somehow this news is a turn-off to them. They eventually lose interest and never go on to have sex with me. I think it means to them that I am "stupid."

Even when I went to two doctors for issues with my breast, they were a bit sarcastic about my virginity and said things like "What are you waiting for?" with a smirk, or "You don't want to die a virgin" with a laugh. I feel offended, and I feel like I've missed out on life. What I believed was right for a long time, the world is now telling me was silly. I don't know how to cope with this, and I'm too embarrassed to tell anyone anymore.