Doctors no longer address obesity. Women are more accepting of their bodies. Yay?
During my annual physical I noticed that my doctor's office no longer opens a dialogue about, or displays instructional posters and handouts on, obesity, overweight, underweight, or weight at all. Patients are asked to stand on a scale next to a poster offering tips along the lines of "Health At Every Size."
I think these recent news stories, taken together, point to a bigger theme at play:
Doctors now ask permission to discuss weight, if they bring it up at all.
"...social issues may take priority over discussing obesity, and social stigma may make providers hesitant to label patients, especially children, as obese."
"URMC Study Shows Obesity Diagnosis is Often Overlooked"
"Modest Progress or New War on Obesity?"
It's now been three years since I published my book American Women Didn't Get Fat in the 1950s. It seems little has changed since then. What will it take for a cultural shift in which "body acceptance" coincides with more compassionate, candid conversations with medical professionals about healthy body weight?