
Is America A Christian Nation?

Earlier this year (2009), President Obama caused much brouhaha when he stated in a speech that America isn't a Christian nation. Christians all over went bonkers over his statement. I actually agree with President Obama on this (one of the few things on which we agree). The following article is a collection of thoughts from an e-mail exchange I had on this topic with a good friend.
(I understand that the name of our country is the United States of America, but "America" is the general way the country is referred to, so I'm doing the same here.)


First of all, the word "is" implies present tense. We could rephrase the question: Is America now a Christian nation? Clearly, America is not now a Christian nation in any sense of the phrase. While a large portion of Americans would claim to be Christian, our culture is clearly pagan, and the people we elect to our government are predominantly pagan. With abortion on demand being the law of the land, and homosexuality officially sanctioned, even to the extent that homosexuals can marry, we must assume that those creating the laws are pagan.

We know that on any given Sunday, fewer people go to church than stay home. We know that in our inner cities, we have illegitimacy rates exceeding 70%. We have a higher percentage of our population in prison than any other major country. If this is Christianity, God help us! Just watch an hour or two of prime-time network television; if you are a Christian, I'm pretty certain your moral sensibilities will be offended.

I think we cannot say that America is now a Christian nation, nor would any Christian in his right mind want to tell people that America is representative of Christianity. Christians first and foremost should say that America is definitely not a Christian nation.

Some might be inclined to say that America is a Christian nation because we were founded on Christian principles. Clearly, the men who founded the nation were largely Christians who had a devotion to God and a belief in Christ as their savior. They believed that God would protect the nation if they remained true to God. They believed that all men were created by God and were therefore due equal rights (unless you were black or a woman - even our Christian founders had their foibles, huh?). So the idea of individual freedom and equal rights at least in part stems from a belief in God as our creator.

While the fundamental belief in freedom came from their Christian beliefs, the political ideas on which our nation is based are not found in the Bible. The idea of a representative democracy is found nowhere in the Bible. The idea of a tripartite government, or balance of powers, is not found in the Bible. Indeed, one could make more of a case for socialism being a Christian form of government, since we are told over and over to take care of the poor, the widowed, and the downtrodden. God expects us, no, commands us, to do this, and so if our government simply put laws in place based on those Christian commands, couldn't we say that constituted a Christian government? Indeed we've heard this claimed very recently - that we have a moral (Christian?) responsibility to enact universal health care. I heard this stated from a Christian pulpit just this past summer. I don't think you can claim America is a Christian nation because of our form of government.

The process by which our nation was founded didn't embody Christian principles. The Bible tells us that we are to be in submission to our civil leaders. The Bible does not tell us to revolt, and yet that's how our nation was founded - hardly based on Christian principles.

But all of this discussion is pointless unless we get down to application. What should Christians today in the United States draw from this? How should we behave?

While our nation was founded by devoted Christians with a form of government that most likely cannot work outside of Christianity, our nation is not a Christian nation today. We see our culture decaying into depravity and our government decaying into self-serving corruption. But Christians should not lose heart, as we are citizens of another realm. We are citizens of the Kingdom of God and will soon enough be reigning with Him. Seeing such a great nation go into decline may make us sad, but it does not mean that all is lost, unless we are thinking purely in terms of our own rights and comforts. However, we still need to answer the question, "What should we do?"

Should we make it our objective to return America to being a Christian nation? Is that what Christ calls us to do? I say no. As Christians, we are called to glorify God, and the highest way to glorify God is to take up the ministry of Jesus Christ, which is to bring others into a right relationship with the Father. That is our calling - to lead others to Jesus. We are citizens of the United States, and so we have a responsibility to participate in our government. We must obey the laws, we must vote, and we certainly should participate in public debate over what is best for our nation. However, I don't believe we should make this the focus of our lives, and we should not engage in debate to the detriment of winning others to Christ. If we make enemies by fighting over civil matters, how will we lead those enemies to Christ? If Christendom is seen by the public as a political movement, how will we ever convince others that we are about eternal spiritual matters?

We can look at this time in the life of our nation as discouraging: our freedoms may be lost, our prosperity may be lost, our physical security may be compromised, and even our religious freedom may be lost. On the other hand, if our primary concern is proclaiming Christ, we should be excited. At no other time in our lives has Christianity stood in such stark contrast to our culture. At no time in our history have the differences between life with Christ and life without Christ been so apparent. That is, unless we choose to live as citizens of the declining United States of America and forget that we are citizens of the Kingdom of God.

I believe that America will continue in its decline (unless there is a great change in the hearts of men and women). I'm not happy about that, but when I ask myself if that decline impacts the reason I exist, I don't think it does. So should I be concerned about the decline of America? I think I should be more concerned about the hearts of people in America, and hearts aren't changed by political action. We won't win anyone to Christ by resisting Obama or by voting Republican.

Consider Jesus' ministry. Did he work to change the government, which was certainly oppressive toward the Jews? Did he work to outlaw heathen practices? He wasn't focused on changing culture; he was focused on changing hearts.

What can we conclude about the decline of this formerly "Christian nation"? Our government is based on majority rule. So long as the majority are Christians (all thinking pretty much the same about right and wrong, as was the case at the founding), everything is fine. When we have a myriad of beliefs, the resulting government may not be all that appealing to Christians. In fact, it may well (and soon) become dangerous for Christians. No matter. Our purpose for living can still be fulfilled. Glorify God! Praise his name forever.