The Atlantic (Sept. 26, 2019) ran an interesting article, "Three Decades Ago, America Lost Its Religion. Why?", on how this shift has been affecting our country, especially the young, who are replacing a spiritual life with one of workism. I would add another question: has this loss been better or worse for America's life and society? And if it has been worse, how can we get it back?
"Deep into the 20th century, more than nine in 10 Americans said they believed in God and belonged to an organized religion, with the great majority of them calling themselves Christians. That number held steady throughout the 80's. What the hell happened around 1990?"
According to Christian Smith, a professor of sociology and religion at the University of Notre Dame cited in the article, this change has mostly been the result of three historical events: the association of the Republican Party with the Christian Right, the end of the Cold War, and 9/11.
Do you agree with Smith? Do you have other ideas?
"Deep into the 20th century, more than nine in 10 Americans said they believed in God and belonged to an organized religion, with the great majority of them calling themselves Christians. That number held steady throughout the 80's. What the hell happened around 1990?"
According to Christian Smith, cited in the article, a professor of sociology and religion at University of Notre Dame, this change has mostly been the result of three historical events: the association of the Republican Party with the Christian Right, the end of the Cold War, and 9/11.
Do you agree with Smith? Do you have other ideas?