Why does the myth persist that America is a "center-right" country? If that were so, Republicans would behave very differently. Somehow the myth keeps being perpetuated by our "liberal media".