Anonymous
For much of my life I have seen religions as institutions that take away from society. After all, religion was responsible for the repression of science during what we call the Dark Ages. I now suspect I am not the only one who thinks this way, but I wonder whether the trend away from religion is making us worse people, and whether there is some non-zero amount of religion from which society benefits.
Anonymous
I believe that comparisons between the United States and other countries, in a variety of contexts, don't make sense. Pointing to solutions to problems like racism, gun violence, or COVID that worked elsewhere ignores the cultural differences between Americans and the citizens of other countries. If the cause of these problems is cultural, no amount of policy can fix them. How do you change the culture of a people? How do you change the way people think?