A Bright Life Inspired by Dr. Raymond Peat
What is really starting to frustrate me is that when you read the science, REALLY read it in depth, it confirms what people knew intuitively for centuries. Yet our society is turning away from natural medicines, when in reality the science shows they work just as well.
It almost puts me off wanting to study science. I’m currently facing a big decision in my life: pursue education or not…
What is going wrong with our culture? No, seriously. Does anyone else notice this?