1. The theory of medical dominance holds that doctors occupy a position of authority over the other occupations in the health care system. Given that this idea has been in circulation since the early part of the twentieth century, do you believe it still holds true today? Discuss why you agree or disagree with this line of thinking.
2. What are the differences between the roles of mainstream and alternative health care providers? Do you feel that each type of provider receives the professional respect they deserve? Why or why not?