I just finished reading an entertaining story about Bobbi Brown's training camp for makeup artists. It's unsurprising to learn that she's a perfectionist who teaches her students to get every detail just so, and yet she comes off as someone who genuinely wants other women to look and feel their best. "Girls with freckles don't need foundation," she explains. (See, I told you!)
The most revealing part of the story comes toward the end, when the reporter has a moment with Brown:
"I am in an industry that makes women feel bad about themselves, absolutely," she says, when I ask her what she thinks of the beauty industry. And yet within the beauty world Brown has gained a reputation for making women feel good about themselves. "I never in a million years thought I would be the person to go to for self-esteem, that was not my intent. But I happen to love beauty, I love the way people look, and I love making women look beautiful," she says.
It's striking that she's so blunt in her assessment of the beauty industry. I'm curious: Does it make you feel bad about yourself? And if it does, what would you change about it?