The way medicine is practiced in the Western world, especially the hierarchy of doctors, nurses, pharmacists, and so on, evolved fairly recently, with some major changes in the 18th and 19th centuries (see some of the text in this article). Do you think quality of life would benefit from changing the distinctions between doctors, nurses, and other health care professionals and how they are trained (for example, requiring someone to work as a nurse before becoming a doctor), or from changing the roles of the various subdisciplines (doctors, chiropractors, homeopaths, etc.)?
Please elaborate if you have ideas about this.