This summer it was reported that 66 percent of US adults are overweight, yet only 12 percent say a doctor, nurse, or other health care professional has ever told them they are obese. I feel like this is a really touchy subject. Nobody wants to admit they need to lose a few pounds, and it would be even worse to hear it from somebody else.
So I was wondering, has a doctor ever told you to lose weight? Do they have the right to say that since it's for your health, or do you think it's totally out of line and rude?