Is America Worse When Democrats Are In Charge?

Photo by Robert Linder on Unsplash

Given the current state of the U.S. economy, many Americans are starting to believe that the United States is generally worse when Democrats are in charge. What do you think?

Yes

America is worse when Democrats are in charge.

No

America is not worse when Democrats are in charge.