As wars become less about states and more about societies, women can play a greater role in shaping or ending conflicts. So why do we still think of war as inherently male?
There's a big difference between individuals' free speech rights under the First Amendment and corporations' socially destructive marketing of products that kill.
With the help of foreign aid, the public healthcare system has vastly improved life expectancy in Afghanistan.
The filibuster has never been a friend of civil rights.
"Love the sinner, hate the sin," as rendered into Japanese.
Providers of physical and spiritual care are just as indispensable to our society as providers of income. So why don't we treat them that way?