Western Medicine

The definition of Western Medicine:

A system in which medical doctors and other healthcare professionals (such as nurses, pharmacists, and therapists) treat symptoms and diseases using drugs, radiation, or surgery. Also called allopathic medicine, biomedicine, conventional medicine, mainstream medicine, and orthodox medicine.


When is it time to use Western Medicine?

Western medicine is probably the most common way of seeking help for illness and disease.
When you are unwell or have a health concern, you go to the doctor.
In Western medicine, a doctor is typically most interested in vital signs such as weight, height, body temperature, and blood pressure, as well as signs of disease.

To find out if using Western medicine is the answer to your sleep struggles, contact us today for a consultation.

"The Western approach clearly divides the health from the disease, yet the Eastern approach considers health as a balanced state versus disease as an unbalanced state."

Dr. Julia J. Tsuei for the National Center for Biotechnology Information