Have Things Gotten Worse Since Biden Became President?

Photo: Gage Skidmore from Peoria, AZ, United States of America, CC BY-SA 2.0, via Wikimedia Commons

Since President Biden took office, there has been ongoing debate about the direction of the country. Your response will help us gauge public perceptions of how America has changed under his administration.


Yes

They have.


No

They haven’t.
