Western medicine - A system of healthcare based on scientifically tested treatments such as drugs, surgery, and other evidence-based therapies, practiced primarily in hospitals and clinics.