Celebrities like the Kardashian/Jenner sisters, Lady Gaga, Bella Hadid, and Ashley Graham, along with many Instagram models, have supported #bodypositivity and #bodypositive, and feel comfortable showing themselves bare.
I’ve met a plethora of people who believe that women should be conservative and mysterious with their bodies. Some men wouldn’t want their women’s bodies to be seen by their homies or other guys and become the hot topic of the crew, especially when said women are those they want to wed and build a family with.
I call that respect, value, and worth.
No need to show your body for validation, or simply for attention.
Others argue that it’s totally okay because it showcases acceptance and confidence. But must someone show nudity in order to prove that they’re confident or have self-love?
Confidence and self-love can be manifested in many other ways; a woman can be covered, classy, and inspirational.
Big or small, all women are beautiful and should feel as such. Don’t let anyone tell you otherwise, period. There really is no need to post nudes to prove that. But if that’s what makes you happy, oh well.
If you were violated online and had your nudes leaked, that’s a different story.
But hey, all perspectives matter.
What’s your perspective on this? Don’t be afraid to leave a comment.