PREVIEW: Was America Really Founded as a 'Christian Nation'?

Most Americans believe the U.S. was founded as a Christian nation. But is this merely a myth? Larry talks with the author of a new book who argues that 'Corporate America' invented 'Christian America,' and discusses how that idea has defined and divided our politics ever since.

User: PoliticKING

Views: 1

Uploaded: 2015-06-03

Duration: 00:46