Politics

Is America Worse When Democrats Are In Charge?

Given the current state of the U.S. economy, a growing number of Americans have begun to believe that the United States is generally worse when Democrats are in charge. What do you think?


Yes

America is worse when Democrats are in charge.


No

America is not worse.
