Is your country part of the WEST? Would you consider it a WESTERN COUNTRY?
My opinion: Culture-wise? Yes. Geographically? Yes. Religion-wise? Yes. Politically? Maybe not, considering the rise of post-colonial perceptions and a certain attempt to push away the Imperialist Agenda (?). Socially? Maybe not, considering the Northern Hemisphere perspective of "you're not rich and developed enough, you're not even white enough" (?).
I started wondering about this after another thread, and I'd like to know how you guys feel about it.