Dominionism/Definition

Theological and ideological systems holding that the United States of America should be an officially Christian nation.