Is America REALLY a Christian nation? Let’s discuss two things that are not supposed to go together: politics and religion. Today, learn why the Christian faith seems to be losing its leverage in American politics and policy. Have Christians focused on the wrong thing? Who is supposed to be responsible for the morality of America, us or the government?