What's the most important part of your health? What do you think? Is it eating a balanced, mostly plant-based diet, balancing your hormones, exercising daily, getting enough sleep, taking your vitamins, seeing your doctor for regular checkups? These might all seem like important, even critical, factors in living a healthy life, right? But what if I told you that taking care of your body was the least important part of your health? I'm a physician, so if you had told me that five years ago, I would have taken it as complete sacrilege.
I mean, I spent twelve years training, because the body is supposed to be the foundation of everything in life, right? But what if I told you that the medical profession had it all backwards? What if the body doesn't shape how we live our lives, but is actually a mirror of how we live our lives?
Think about it for a moment. Think about a time in your life when you were not living the life you were supposed to be living. Maybe you were in the wrong relationship, or in a hostile work environment, doing what you thought you should do...
See more here: https://www.youtube.com/watch?v=7tu9nJmr4Xs