It’s disgusting just how many people in this country are more than happy with a fascist state.
My question… Has this country always been this way? Was I taught bullshit my whole life? Did I just have blinders on because of white middle class privilege?
tRump's election has me doubting everything I thought I knew about this country. And I'm a fucking historian.