Talking About "the West": What's "Western"?
A few days ago, I watched a YouTube video in which the commentator made a statement about Mexico: he said that in recent decades Mexican families had been adopting a more "Western" diet. It was only an off-hand comment, but it annoyed me quite a bit. What do people mean when they talk about "the West," and what is "Western"? Is this just a euphemism, or does it make sense? Should we use different words instead? Let's investigate.
Let's begin with the names of the continents: I believe all of the continents' names are relatively arbitrary and innocent. For example, the terms "Europe," "Africa," and "Asia" all seem to originate from names for smaller bits of land that were then applied in a Greek "three continent" vision of the world that carried on for centuries. "Australia" and "Antarctica" are merely Latinized versions of "southern" and "opposite to the north," respectively, if you can get your head around that. "America" is a bit different, since it comes from the given name of a European explorer, but given that he is best known today for giving his name to America (sort of), that still seems to be a pretty benign etymology.
So, if all the continents' names seem appropriate enough, what could be wrong with talking about the "West"? It's just a direction, after all, so what could be harmful about that? Well, I don't think it's necessarily harmful to talk about the "West"; it simply doesn't make sense. The West is supposed to be a sort of cultural sphere, so why would one use a geographic term to describe something cultural? Even within Afro-Eurasia, Europe is not "the West." Cap-Vert, Senegal (the westernmost point of the African mainland) is significantly farther west than Cabo da Roca, Portugal (the westernmost point of the European mainland). In fact, there's already a part of Afro-Eurasia called "the West"—it's the Maghreb, the "West" as perceived by Arabs (and it covers the same lines of longitude as Western Europe).
[Image courtesy of Wikipedia]
Just looking at the Wikipedia page for "Western world" shows you what a confused term this is. Referencing "the West" in one's writing or speech, in my opinion, not only invites inaccuracy but is fundamentally a crutch. The term is used to prop up non-specific ideas, whether in informal remarks or in analyses built on stereotypes. Here is an incomplete list of alternative terms that will clarify ideas, increasing their accuracy and thoughtfulness:
European, Euroamerican (or Euro-American), Europeanized, Anglo-American, "New Europes," high HDI countries, high GDP countries, First World countries (as a Cold War term), or, if you prefer, "countries with lots of white people."
Instead of using a single-word label, you can always use a longer phrase—even a sentence or two!—to describe what you're really talking about. Mexico, for example, certainly has strong ties to Europe in its history, language, institutions, and even its cuisine (although Mexican food gave the world a lot more in return). However, the country retains an immense indigenous heritage present in its people's genes and culture—much, much more than one sees in most of the United States and Canada. Mexico is thus "Western" in terms of its longstanding ties to Europe, but those ties should be carefully qualified. If more Mexicans are adopting a "Western" diet these days, that likely means more Mexicans are eating processed or imported foods, or foods sold by large corporations, rather than foods grown and processed locally. That's a model pioneered by the United States, not "the West," so the word "Americanized" works a whole hell of a lot better.
To make a long story short, rectify your names, think before you speak, say what you mean, and understand what you say. "The West and the Rest" sounds catchy, but ultimately it's meaningless.