The End of White America?


The election of Barack Obama is just the most startling manifestation of a larger trend: the gradual erosion of “whiteness” as the touchstone of what it means to be American. If the end of white America is a cultural and demographic inevitability, what will the new mainstream look like—and how will white Americans fit into it? What will it mean to be white when whiteness is no longer the norm? And will a post-white America be less racially divided—or more so?

The Atlantic


Worthwhile read.
