United States no longer a “Christian” nation
According to Barack Obama, a reader and student of reputable history, the United States is not a "Christian nation." As he spoke those words, I imagined great sorrow being exhaled from coast to coast. Those saddened revisionists need to read reputable history books and review the founding documents of the United States.