Dominionism
Dominionism is a term used to describe various sets of theological and political ideologies held among conservative Protestant Christians in the United States. While these schools of thought vary, each maintains that Christians have a duty to gain influence over, or control of, government and to enact change in keeping with what adherents hold to be Biblical principles and laws.