Has the US Always Been at War?
Given the role the United States has played in wars and militarized conflicts of every sort, when has the country ever not been at war?
In America, much has been said about willful ignorance, but not enough about the fountains of falsehood that feed those unwilling to learn anything requiring factual focus.
America’s present government is draining the life from the land.
America has always been good at forging a strong consensus on national priorities — from keeping the union intact to putting a person on the moon.
Those who say they are willing to fight for good governance have to stop merely talking about it and show up at the polls this time. It is far too late to wait for next time.