Politics

Are Democrats Making America Immoral?

Many Americans claim that Democrats are slowly pushing ideologies that are making the country immoral. As evidence, they point to radical ideas that would not have been accepted less than a decade ago but are now treated as normal. What do you think?


YES

Dems are making America immoral.


NO

America is not becoming immoral.
