Many believe the founders wanted a Christian America. Some want the government to declare one now

Large numbers of Americans believe the founders intended the U.S. to be a Christian nation. This belief is especially strong among Republicans and their white evangelical base.