Well, according to the way I was raised, and to some fundamentalist friends who still talk to me now (ha), it simply means returning the nation to a time when it was almost entirely Protestant Christian. A lot of folks still believe that the US was founded as a Christian nation. They want a time when "god's laws" and "the laws of the land" are the same. (They never were, but never mind that.) Mostly, though, they're angry because the church has lost power, control, and influence over the lives of everyday citizens, and they want it BACK.
"The family that prays together...is brainwashing their children."- Albert Einstein