Teaching Christianity

I agree that Christianity has played a central role in the history of the United States, because it most certainly has. And it should be taught in our schools and colleges.