The minute anyone says appearances matter, everyone jumps at the chance to argue for or against the statement. I am not going to get into that debate because I believe it is right and wrong at the same time. What I do want to talk about is why people look down on those who care about appearances, and I am not talking about people who judge others. I am talking about why people who choose to change something about themselves are looked down upon. I know we are living in the positive-vibe era, and there is nothing wrong with that. With that being said, however, many believe it is okay to make people feel bad if they decide to change anything about their bodies.
One weekend my husband Blake and I were out with a couple of our friends. As we were talking and having a good time, we got into a discussion about plastic surgery, lip injections, Botox, and other body-altering procedures. At one point our friend said that he would never date anyone who had work done, because it meant they had low self-esteem. I was completely shocked, because in my opinion that is a false statement for many reasons. For one, what makes a permanent change any different from a temporary one? Does that mean anyone who decides to change their hair, get a tattoo, or even get extensions is insecure? What about women who do something to their bodies and keep it a secret? Are they thought of more highly than someone who does the same but decides to be open about it?

As I began thinking about it, I realized it was not just him who thought this way. For many years, artists, wealthy women, and movie stars have gotten some sort of work done, and most of them keep it a secret. It got me wondering: do they keep it a secret because they are afraid of being judged, or because they are ashamed of getting work done? Why are all these women being called fake? Just because they decided to change something about themselves, does that put a label on who they are? And are they truly ashamed, even though they are open about the changes they decided to undergo?
When we think about it, society has always held women to a high standard of what they should look like. This new love-your-body positivity era has had its good impacts. We humans, however, have flipped the coin and now shame those who do choose to change something about themselves. So when are we going to get it right? I believe there is good in both opinions and that true beauty will always be about how you see yourself. Deciding to change something about yourself does not make you any less confident than fully embracing your natural beauty. This world was made for many different types of people with different ways of being beautiful. You know, I don't have answers to most of these questions, nor do I think there is one answer to any of them. But I do believe everyone is different, so truly go out there and just be you. Whether you want to change yourself or go all natural, this life, your life, is made for you!