We live in a culture that doesn't value women's scars. In fact, a whole industry of creams, oils, and gels exists specifically to get rid of them. And while I understand the impulse to erase them, and have certainly used scar-fading products myself, I also think it's wrong to treat them as categorically bad. Scars mark your continued survival and tout your ability to heal, to surmount obstacles.

They're also universal: I don't know anyone who doesn't have at least one, whether it's from an accident, a wasp sting, chickenpox, or falling out of a tree. I have several myself, and I don't think they make me any less attractive. In some cases, people look a whole lot sexier with their scars. Even celebrities like Padma Lakshmi, who are held to a far higher standard of "perfection" than the rest of us, have them and continue to look amazing.

So don't be ashamed of your scars. Every one of them is, in its own way, a little personal victory, one you should feel free to celebrate.