Ok, what exactly does immigration have to do with the decline of “white America”?
I happen to be an immigrant. I also happen to be white. Most people probably wouldn’t know it to look at me, but I was not born in America, and in fact the only citizenship I currently claim is that of a foreign country (I’m a U.S. resident).
I can’t speak for others, but I came to America to marry my wife and build a life here. I’ve had a job practically the entire time I’ve been here. I’ve worked my ass off, I’ve paid my taxes, paid my mortgage, and I’m here 100% legally…just like many other NON-white immigrants out there.
If decent, hard-working people like me are “destroying America,” then America has serious problems. I just get so pissed off when people say that immigration is causing all these social problems in America. Newsflash: plenty of America’s government programs and social systems are seriously flawed and inefficient regardless of immigration. You think that if all the immigrants magically disappeared tomorrow, America would be a jolly, happy place, or that the government would magically work properly? I seriously doubt it.
When I first made the decision to come to America I was nervous and excited. I felt like there was a lot of opportunity here for me. Now, several years later, I find myself extremely put off by the increasingly mainstream dislike of immigrants, regardless of legal status. I have considered returning to my home country (Canada) but for now that option is off the table.
My green card means something to me. It means something to them too. It means I’m a foreigner. It means I’m not a “real” American and apparently, it means I’m the enemy.
Now is this really the America the founders wanted?