The End of White America?

The election of Barack Obama is just the most startling manifestation of a larger trend: the gradual erosion of “whiteness” as the touchstone of what it means to be American. If the end of white America is a cultural and demographic inevitability, what will the new mainstream look like—and how will white Americans fit into it? What will it mean to be white when whiteness is no longer the norm? And will a post-white America be less racially divided—or more so?
