Another Strange New AI Video From Bob Schneider: "God Is Real"

goddamnedfrank12/23/2023 1:12:08 pm PST

Turning a big dial that says “Collateral damage” on it and constantly looking back at the audience for approval like a contestant on The Price is Right

According to the sources, the increasing use of AI-based systems like Habsora allows the army to carry out strikes on residential homes where a single Hamas member lives on a massive scale, even those who are junior Hamas operatives. Yet testimonies of Palestinians in Gaza suggest that since October 7, the army has also attacked many private residences where there was no known or apparent member of Hamas or any other militant group residing. Such strikes, sources confirmed to +972 and Local Call, can knowingly kill entire families in the process.

According to the sources who spoke to +972 and Local Call, the targets in Gaza that have been struck by Israeli aircraft can be divided roughly into four categories. The first is “tactical targets,” which include standard military targets such as armed militant cells, weapon warehouses, rocket launchers, anti-tank missile launchers, launch pits, mortar bombs, military headquarters, observation posts, and so on.

The second is “underground targets” — mainly tunnels that Hamas has dug under Gaza’s neighborhoods, including under civilian homes. Aerial strikes on these targets could lead to the collapse of the homes above or near the tunnels.

The third is “power targets,” which includes high-rises and residential towers in the heart of cities, and public buildings such as universities, banks, and government offices. The idea behind hitting such targets, say three intelligence sources who were involved in planning or conducting strikes on power targets in the past, is that a deliberate attack on Palestinian society will exert “civil pressure” on Hamas.

Compared to previous Israeli assaults on Gaza, the current war — which Israel has named “Operation Iron Swords,” and which began in the wake of the Hamas-led assault on southern Israel on October 7 — has seen the army significantly expand its bombing of targets that are not distinctly military in nature. These include private residences as well as public buildings, infrastructure, and high-rise blocks, which sources say the army defines as “power targets” (“matarot otzem”).

The bombing of power targets, according to intelligence sources who had first-hand experience with its application in Gaza in the past, is mainly intended to harm Palestinian civil society: to “create a shock” that, among other things, will reverberate powerfully and “lead civilians to put pressure on Hamas,” as one source put it.

Several of the sources, who spoke to +972 and Local Call on the condition of anonymity, confirmed that the Israeli army has files on the vast majority of potential targets in Gaza — including homes — which stipulate the number of civilians who are likely to be killed in an attack on a particular target. This number is calculated and known in advance to the army’s intelligence units, who also know shortly before carrying out an attack roughly how many civilians are certain to be killed.

In one case discussed by the sources, the Israeli military command knowingly approved the killing of hundreds of Palestinian civilians in an attempt to assassinate a single top Hamas military commander. “The numbers increased from dozens of civilian deaths [permitted] as collateral damage as part of an attack on a senior official in previous operations, to hundreds of civilian deaths as collateral damage,” said one source.

The entire point of using an AI system to mass-select targets is to crush a populace through the deliberate maximization of “collateral damage,” which belongs in quotes because damage is no longer collateral when maximizing it is an explicit goal of putting pressure on the enemy. Again, the entire current use case for AI systems is to launder theft at an absolutely staggering scale: the machine’s output gets painted as a new, unbiased product, arrived at via rules you aren’t allowed to examine or understand, which removes human culpability from the AI’s owners. In this case the crime is the theft of human lives, at a scale so large that each individual atrocity drowns in the onrushing tide, because the initial targeting decision can be offloaded to an unthinking machine that has no idea what a human being even is.