Home Remedies and Western Medicine: Know When You Need a Professional

I’ve noticed a growing number of people interested in natural health, making home remedies, and treating minor ailments at home instead of consulting a healthcare professional.

I’m all for empowerment, all about natural medicine and taking charge of your own health. I don’t think you should run to the doctor for everything, and I do believe much can be taken care of at home.

However, I’m also a little amazed at the attitude some people have toward western medicine and healthcare professionals, whether it’s a practitioner of medicine or of nutrition/dietetics. It’s as if they think a doctor is out to get them and just make money off prescriptions. Western medicine is flawed, certainly, but what system doesn’t have flaws? What human doesn’t have flaws? When did natural remedies turn into a contest over who can stay away from the doctor the longest by treating everything at home? Why do people insist on getting their health information from sponsored bloggers instead of someone who went to school? Since when is every person who went to college bought out by big pharma, their advice not to be trusted?

I believe in common sense. In using your best judgement. In real education, real empowerment, and making educated decisions. Western medicine saves lives; there’s no arguing that fact. There is no shame in seeking professional medical attention when needed; that’s what medicine and doctors are there for. Let’s continue to educate and empower one another, but also have the humility to seek professional help when needed, instead of arrogantly thinking we can do it all ourselves and that people who went to college don’t know anything about healing. I’m all about natural medicine - I make it - but I also have the grace to accept when I don’t have something on hand or don’t know what I’m dealing with, and I’m thankful I have doctors to turn to when needed.
