The War on Women Is Real
The “War on Women” is a term commonly used to describe Republican policies and their effects on the fairer sex. The right denies any such thing exists, but let’s stop kidding ourselves. Looking back over the past half century, it’s impossible not to see a vast right-wing conspiracy to strip...