Body positivity is a belief (and a growing movement!) that you deserve to have a positive body image, regardless of what societal and cultural norms dictate.
We are often told, by family, society, the media, and culture, that our bodies are flawed and don't fit the perceived ideals set by a select few.
Why Do We Feel Bad About Our Bodies?
Feeling Bad About Perceived Flaws
These messages lead people to feel bad about their “flaws,” which may not be flaws at all!
Replace Negative Thoughts
Remember that what the media sells us isn't real, and that each of us is unique. Replace those negative thoughts with realistic ones and feel good in your own skin!