The dominant culture in a society is the one widely treated as the standard or superior to all others. In current American society, the dominant culture is English-speaking white Protestants. People who fit that mold tend to have better access to job opportunities, higher-paying work, and education. And because this is seen as "their country," everyone is expected to know English.
-- In the postwar era, the dominant culture in America was portrayed as heroes who would fight off a newfound evil. Even now, the dominant culture carries a great deal of pride and ego.
----This was one of my first replies; I really hope it helped. If you have any questions, contact me. I'm not quite sure how to use this website, so if this wasn't up to par, please let me know what I can improve on.
Thanks! And good luck :)