An apology was posted Saturday by a Christian journal that had published and republished an anonymous essay on its blog saying that Jews killed Jesus and deserve God’s punishment.
“Firstly, we apologize for inadequate editorial oversight in the publishing and re-publishing of this blog post,” wrote Aaron Gyde, editor-in-chief of the Harvard Ichthus, which is run by Harvard College undergraduates.
The publication’s apology took the place of the essay “Why Us?,” which was written by an anonymous Jewish convert to Christianity and posted on the Ichthus website Wednesday. The author, who remained anonymous out of concern about personal attacks, wrote, “We, the Jews, collectively rejected God and hung Him up on a cross to die, and thus we deserved the punishments that were heaped on our heads over the last 2000 years.”
Gyde wrote the apology on behalf of the Ichthus editorial board and incorporated thoughts from the author of the controversial essay.
“While this does not excuse the post of responsibility, it was not the intent of the writer, nor the Ichthus, to present a piece that is anti-Semitic in nature or in interpretation,” the apology stated. “The writer holds nothing but love for his heritage and feels very deeply for the welfare of the Jewish people. The blog was not intended to communicate animosity, but concern and a sincere desire to communicate the necessity of salvation through Jesus Christ alone.”
The essay was initially removed from the website, then edited and reposted Friday morning; in the revised version, the author wrote that he or she was looking “to warn my beloved Jewish friends and family of the judgment that lies ahead.”
The essay was removed again Friday, this time permanently.
The always-on, simultaneous society in which we have found ourselves has altered our relationship to culture, media, news, politics, economics, and power. We are living in a digital temporal landscape, but instead of exploiting its asynchronous biases, we are misguidedly attempting to extend the time-is-money agenda of the Industrial Age into the current era. The result is a disorienting and dehumanizing mess, where the zombie apocalypse is more comforting to imagine than more of the same. It needn’t be this way.
Douglas Rushkoff — teacher, documentarian, journalist, and author — discusses insights from his recent book “Present Shock: When Everything Happens Now” with David Weinberger and a live audience at Harvard.
More info on this event here: cyber.law.harvard.edu
Stanford University announced Wednesday that it is joining forces with Harvard and MIT on developing a computer system that allows colleges to offer free online courses, a collaboration that school officials said would benefit both educators and students around the globe.
Stanford already has its own fledgling platform for delivering so-called massive open online courses, or MOOCs. But the university has decided to suspend work on it in favor of the software developed by the two East Coast universities, called edX, Vice President John Mitchell said.
Stanford still plans to offer some of its courses through Coursera, a commercial Internet course provider founded by two Stanford professors. But with the demand for online learning increasing rapidly, it makes sense for academic institutions to team up rather than compete, Mitchell said.
“Together, I think we will have a chance to produce a much better platform than each of us would be able to do individually,” he said, adding that the software that emerges from the alliance has the potential to become the “Linux of online learning.”
If you’ve ever had a violent encounter with a porcupine, it probably didn’t end well. The large rodents are best known for the coat of some 30,000 barbed quills that cover their backs, an evolutionary adaptation to protect against predators. Although they appear thin—even flimsy—once quills lodge in your flesh, they’re remarkably difficult and painful to get out.
Recently, a group of scientists led by Jeffrey Karp of Harvard decided to closely investigate just what makes these quills so effective. As they report in an article published today in the Proceedings of the National Academy of Sciences, their analysis revealed a specialized microscopic barbed structure that enables the quills to slide into tissue extremely easily but cling to it stubbornly once they’re in place.
Porter University Professor Helen Vendler, the preeminent poetry critic, has served on the faculty’s undergraduate admissions committee. Given contemporary admissions processes and pressures, she recalls “wondering how well T.S. Eliot (who had to do a preparatory year at Milton Academy before he could risk admittance, and whose mother was in consultation with Harvard and Milton officials before deciding what to do with him after he finished high school in St. Louis) would have fared, or Wallace Stevens (admitted as a special student to do only three years’ study), or E.E. Cummings (admittedly, a faculty child).” Accordingly, she proposed that alumni interviewers receive some guidance on how to understand, attract, and evaluate applicants whose creative talents might otherwise be overlooked, and wrote this essay, subsequently posted on the Office of Admissions website.
Anyone who has seen application folders knows the talents of our potential undergraduates, as well as the difficulties overcome by many of them. And anyone who teaches our undergraduates, as I have done for over 30 years, knows the delight of encountering them. Each of us has responded warmly to many sorts of undergraduates: I’ve encountered the top Eagle Scout in the country, a violinist who is now part of a young professional quartet, a student who backpacked solo through Tierra del Fuego, and other memorable writers, pre-meds, theater devotees, Lampoon contributors on their way to Hollywood, and more. They have come from both private and public schools and from foreign countries.
We hear from all sides about “leadership,” “service,” “scientific passion,” and various other desirable qualities that bring about change in the world. The fields that receive the most media attention (economics, biology, technology, political theory, psychology) occupy the public mind more than fields—perhaps more influential in the long run—in the humanities: poetry, philosophy, foreign languages, drama. W.H. Auden famously said—after seeing the Spanish Civil War—that “poetry makes nothing happen.” And it doesn’t, when the “something” desired is the end of hostilities, a government coup, an airlift, or an election victory. But those “somethings” are narrowly conceived. The cultural resonance of the characters of Greek epic and tragedy—Achilles, Oedipus, Antigone—and the crises of consciousness they embody—have been felt long after the culture that gave them birth has disappeared. Gandhi’s philosophical conception of nonviolent resistance has penetrated far beyond his own country and beyond his own century. Music makes nothing happen, either, in the world of reportable events (which is the media world); but the permanence of Beethoven in revolutionary consciousness has not been shaken. We would know less of New England without Emily Dickinson’s “seeing New Englandly,” as she put it. Books are still considering Lincoln’s speeches—the Gettysburg Address, the Second Inaugural—long after the events that prompted them vanished into the past. Nobody would remember the siege of Troy if Homer had not sung it, or Guernica if Picasso had not painted it. The Harlem Renaissance would not have occurred as it did without the stimulus of Alain Locke, Harvard’s first black Rhodes Scholar. Modern philosophy of mind would not exist as it does without the rigors of Wittgenstein’s Philosophical Investigations, nor would our idea of women’s rights have taken the shape it has without Woolf’s claim for a room of her own.
We are eager to harbor the next Homer, the next Kant, or the next Dickinson. There is no reason why we shouldn’t expect such a student to spend his or her university years with us. Emerson did; Wallace Stevens did; Robert Frost did; Frank O’Hara and John Ashbery and Fairfield Porter and Adrienne Rich did; and had universities harbored women in residence when Dickinson came of age, she might have been glad to be here. She and Woolf could be the writers they were because their fathers had extensive private libraries; women without such resources were deprived of the chance to be all they could be. Universities are the principal educators, now, of men and women alike, and they produce the makers of culture. Makers of culture last longer in public memory than members of Parliament, representatives, and senators; they modify the mind of their century more, in general, than elected officials. They make the reputation of a country. Michelangelo outlasts the Medici and the popes in our idea of Italy; and, as one French poet said, “le buste/ Survit à la cité”: art outlives the cities that gave it birth.
Newsweek is dead. The 80-year-old magazine will cease publication at the end of the year, a teary-eyed Tina Brown said last Thursday. Before we sink too deeply into grief, let’s all remember what lies beyond these earthly, stapled pages. Newsweek may have passed away, its paper turned to dust, but the Newsweek spirit carries on, not as matter or material, but in a state of pure electron flux, a ghostly form that rides the WiFi waves around us. Its words will rise off the printing press and be transformed into an energy that’s everywhere at once, but also nowhere. The magazine will become an online angel—a Web-based publication that penetrates our minds with truth and light. In death, it will be reborn and find everlasting life. …
Sorry, I’m getting all mixed up. I’ve been having a little trouble focusing since I read Newsweek’s cover story from Oct. 15—the one with a picture of a hand reaching up into the clouds and a headline promising that “Heaven Is Real.” It’s a personal account of meeting God, excerpted from a memoir published this week, called Proof of Heaven: A Neurosurgeon’s Journey Into the Afterlife. A neurosurgeon? More than that! The author, Eben Alexander III, makes a point of saying that he’s a skeptic and a scientist, a skeptical scientist who happens to have spent some time (did he mention?) at a little school in Boston called Harvard University. This science-minded Harvard skeptic never thought he’d find the truth of Jesus Christ. But the facts are just the facts: Alexander has been graced with the divine, and he’ll share that grace with us. He’s become a neuro-prophet.
This experiment in out-of-body consciousness began in the fall of 2008, when a case of bacterial meningitis put Alexander in a coma and “shut down” his “entire cortex.” What he means by that is never clear—you might think this state would be synonymous with death, which is sort of what Alexander claims, even though he’s now alive and writing books. But it’s a waste of time to quibble over details, since according to the author, the fact of his brain’s inactivation is the only thing that could possibly explain what happened next. While Alexander was in the coma, and his brain was “totally off-line,” he drifted from this world of Harvard neuroscience into a land of pink and puffy clouds, and chanting flocks of angels, and a glowing orb that speaks telepathically, and a blue-eyed lady-friend, and lots and lots of butterflies. You would not believe how many butterflies there are in Heaven.
Unchecked and Unbalanced: Obama Has Continued Many of His Predecessor’s Most Controversial Counter-Terror Policies. Here’s Why
At the outset of his new book, Power and Constraint, Harvard law professor Jack Goldsmith makes the case that President Obama has continued many of his predecessor’s most controversial counterterrorism policies. From preventive detention to the state secrets privilege to military commissions, Goldsmith asserts, Obama has adopted practices that he criticized in his presidential campaign.
This claim of continuity rankles Obama supporters who believe that the president’s approach to counterterrorism evinces a respect for the rule of law that his predecessor lacked. But the claim is not a new one. It has been put forward both by conservatives who consider the continuity a validation of President Bush’s approach and by liberals who consider it a betrayal. Indeed, even Obama’s staunchest defenders acknowledge some unexpected similarities between the two administrations in national security matters.
More provocative is Goldsmith’s argument about why this is the case. He contends that, contrary to conventional wisdom, the Bush era was one of unparalleled oversight and accountability. After 9/11 the executive branch initially assumed broad and intrusive powers, which it exercised largely in secret. The media, aided by Freedom of Information Act requests from NGOs, uncovered these secret acts. Congress, the courts, and internal agency watchdogs then pushed back and trimmed the president’s powers. By the time Obama took office, existing policies reflected a rigorous application of the constitutional system of checks and balances. By continuing those policies, Obama did not abandon the reformist commitments he made during his campaign, as some believe. The policies already had been reformed, and whether they ended up in the “right” place is, Goldsmith asserts, beside the point.
Goldsmith is no mere observer of the events he describes. In his brief tenure as the head of the Justice Department’s Office of Legal Counsel (OLC), Goldsmith made the unprecedented decision to withdraw two standing OLC opinions: the “torture memos” authored by John Yoo. The memos, he found, provided a flimsy legal justification for subjecting terrorist suspects to “enhanced interrogation techniques” that included waterboarding, shackling suspects in “stress positions,” confining them in small boxes, subjecting them to extreme temperatures, and preventing them from sleeping. With the exception of waterboarding, however, Goldsmith did not dispute the legality of the practices—only the quality of the memos authorizing them—and so allowed them to continue.
This is from 2005, and here you see Obama making some highly salient points about the response to Katrina and the gaps in understanding the true situation before the storm.
In my opinion this isn’t really news; it’s something that books and news stories covered exhaustively, including debunking the egregiously sensationalist reporting from Fox. Now we have Fox dredging up Rev. Wright and Katrina again - which tells me that they are desperate, and that they’ve got to dredge this non-news and non-muck up again just to keep the GOP base barely engaged in the election.
Most of the time, when I am blindsided by a seemingly perplexing psychological phenomenon—prephonological spelling, for instance, or the downside of social intelligence—it doesn’t take more than a day’s research to convince me that said phenomenon makes sense after all, at least on some level, at least well enough to write about. I often come away questioning whether it was really that surprising in the first place.
But it’s safe to say that babies and numbers—specifically one-year-olds and the number four—are an exception. What am I talking about? In studies published in 2003 and 2005, Lisa Feigenson of Johns Hopkins and Susan Carey of Harvard brought 12-to-14-month-old infants into a lab for a manual search task. The infants first watched as an experimenter placed one Ping-Pong ball at a time into a box. “Look at this!” the experimenter said. “What’s in my box?” The babies, who, thanks to the presence of a spiraling ball chute, were pretty motivated to retrieve the balls, subsequently reached into the box themselves to pull them out.
Except, this being an experiment, the box also had a hidden slit in the back. Thus, after depositing the balls, the experimenter could remove any number of them without the baby being any the wiser. The researchers were interested in whether an infant who had watched three balls enter the box, but could only retrieve two, would scour the box for a longer period of time than if he’d been able to retrieve all three. This would suggest he was able to mentally represent three individual objects.
With news that nearly half of the 279 Harvard undergraduates enrolled in an “Introduction to Congress” class have been accused of cheating on the final exam, the university has been immersed in some understandable soul-searching. Blame has been cast far and wide — on everything from today’s entitled youth to the temptations of the take-home final to Harvard’s tendency toward grade inflation.
In the end, culpability will lie with the 125 students in question, who allegedly turned in strikingly similar answers on a take-home test last spring. So far, some have complained that the test instructions — which described the exam as “open-book, open-note, open-Internet etc.” — were misleadingly vague. But the instructions also stated that “students may not discuss the exam with others.” The trouble, it seems, is that students approached the test with the understanding that no one, from their classmates to their professor, took the course as seriously as they should have from the start.
That’s an important component of this cheating scandal, and it speaks to a larger problem with undergraduate education at Harvard and beyond. Government 1310 was a survey course with a reputation for requiring little effort. On an online course evaluation site, one former student wrote that the class required “four hours of work every three weeks. Pretty chill.” Some of last year’s students maintain that Assistant Professor Matthew Platt didn’t seem to mind if people skipped his lectures. And, as with many Harvard courses, graduate teaching assistants took on a heavy share of the instruction.