A large number of Americans claim that Democrats are gradually pushing ideologies that are making the country immoral. As evidence, they point to many radical ideas that would not have been accepted even a decade ago but are now considered normal. What do you think?
Democrats are making America immoral.
America is not becoming immoral.