Living in the post-Christian West will not save you. There is nothing magical about the values embraced by America’s founding fathers that confers grace on the human heart, makes men and women right with God, or causes them to be in any way preferable (from God’s perspective) to their fellow human beings steeped in paganism or blundering around in religious darkness.
Being born into a society where the Christian message still has a residual influence, however diminished, does not make us Christian. Recognizing and appreciating its benefits does not earn us brownie points for cleverness, though it is clear that those who do not value what they have been given are ignorant of history and poorly informed about the many drawbacks of living elsewhere.