When politicians and activists warn that "the West is at risk, that a clash of civilizations threatens Western culture," what exactly do they mean? What are we talking about when we talk about "the West"? A geographic region? A Judeo-Christian heritage? Liberal democracy and free-market capitalism? And who do we think is coming for it?