As part of the Gaza war, the Israel Defense Forces (IDF) has used artificial intelligence to rapidly and automatically perform much of the process of determining what to bomb. Israel has greatly expanded its bombing of the Gaza Strip, which in previous wars had been limited by the Israeli Air Force (IAF) running out of targets.

These tools include the Gospel, an AI which automatically reviews surveillance data looking for buildings, equipment and people thought to belong to the enemy, and, upon finding them, recommends bombing targets to a human analyst, who may then decide whether to pass them along to the field. Another is Lavender, an "AI-powered database" which lists tens of thousands of Palestinian men linked by AI to Hamas or Palestinian Islamic Jihad, and which is also used for target recommendation.

Critics have argued the use of these AI tools puts civilians at risk, blurs accountability, and results in militarily disproportionate violence in violation of international humanitarian law.

The Gospel

Israel uses an AI system dubbed "Habsora" ("the Gospel") to determine which targets the Israeli Air Force will bomb.[1] It automatically provides a targeting recommendation to a human analyst,[2][3] who decides whether to pass it along to soldiers in the field.[3] The recommendations can be anything from individual fighters, rocket launchers and Hamas command posts[2] to the private homes of suspected Hamas or Islamic Jihad members.[4]

AI can process intelligence far faster than humans.[5][6] Retired Lt Gen. Aviv Kohavi, head of the IDF until 2023, stated that the system could produce 100 bombing targets in Gaza a day, with real-time recommendations of which ones to attack, where human analysts might produce 50 a year.[7] A lecturer interviewed by NPR estimated these figures as 50–100 targets in 300 days for 20 intelligence officers, versus 200 targets within 10–12 days for the Gospel.[8]

Technological background

Artificial intelligences, despite the name, are not capable of thought or consciousness.[9] Instead, they are systems developed to automate, by other means, tasks that humans accomplish using intelligence. The Gospel uses machine learning,[10] in which an AI is tasked with identifying commonalities in vast amounts of data (e.g. scans of cancerous tissue, photos of facial expressions, surveillance of Hamas members identified by human analysts), then looking for those commonalities in new material.[11]

What information the Gospel uses is not known, but it is thought[a] to combine surveillance data from diverse sources in enormous amounts.[14]

Recommendations are based on pattern-matching: a person with enough similarities to other people labelled as enemy combatants may be labelled a combatant themselves.[10]
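The following toy sketch illustrates how similarity-based labelling of this kind works in general, and why its output is a statistical score rather than evidence or reasoning. It is not based on any published detail of the Gospel or Lavender; the feature vectors, labels and threshold are invented purely for illustration.

```python
# Illustrative sketch only: a toy nearest-neighbour "pattern matcher".
# All feature values, labels and the threshold are invented; this shows the
# generic machine-learning idea, not any real targeting system.
from math import sqrt

# Hypothetical feature vectors (e.g. counts of observed behaviours) with labels
# previously assigned by human analysts.
labelled_examples = [
    ([5.0, 1.0, 3.0], "combatant"),
    ([4.5, 0.5, 2.5], "combatant"),
    ([0.5, 4.0, 0.0], "civilian"),
    ([1.0, 3.5, 0.5], "civilian"),
]

def similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def label(person, threshold=0.95):
    """Label a new vector by its most similar labelled example.

    The result is a statistical match, not evidence: a person whose data merely
    resembles the "combatant" examples will be labelled a combatant.
    """
    best_score, best_label = max(
        (similarity(person, example), lab) for example, lab in labelled_examples
    )
    return (best_label if best_score >= threshold else "unknown"), best_score

print(label([4.8, 0.9, 2.8]))  # resembles the "combatant" examples -> ("combatant", ~0.99)
print(label([2.0, 2.0, 1.0]))  # ambiguous pattern, below the threshold -> ("unknown", ~0.86)
```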

Regarding the suitability of AIs for the task, NPR cited Heidy Khlaaf, engineering director of AI Assurance at the technology security firm Trail of Bits, as saying "AI algorithms are notoriously flawed with high error rates observed across applications that require precision, accuracy, and safety."[8] Bianca Baggiarini, a lecturer at the Australian National University's Strategic and Defence Studies Centre, wrote that AIs are "more effective in predictable environments where concepts are objective, reasonably stable, and internally consistent." She contrasted this with telling the difference between a combatant and non-combatant, which even humans frequently cannot do.[15]

Khlaaf went on to point out that such a system's decisions depend entirely on the data it is trained on,[b] and are not based on reasoning, factual evidence or causation, but solely on statistical probability.[16]

Operation

The IAF ran out of targets to strike[17] in the 2014 war and the 2021 crisis.[18] In an interview on France 24, investigative journalist Yuval Abraham of +972 Magazine stated that, to maintain military pressure and because of political pressure to continue the war, the military would bomb the same places twice.[19] Since then, the integration of AI tools has significantly sped up the selection of targets.[20] In early November, the IDF stated that more than 12,000 targets in Gaza had been identified by the target administration division[21] that uses the Gospel.[2] NPR wrote on December 14 that it was unclear how many targets from the Gospel had been acted upon, but that the Israeli military said it was by then striking as many as 250 targets a day.[8]

The bombing, too, has intensified to what the December 14 article called an astonishing pace:[22] the Israeli military stated at the time that it had struck more than 22,000 targets inside Gaza,[22] at a daily rate more than double that of the 2021 conflict,[23] more than 3,500 of them since the collapse of the truce on December 1.[22] Early in the offensive, the head of the Air Force stated that his forces struck only military targets, but added: "We are not being surgical."[24]

Once a recommendation is accepted, another AI, Fire Factory, cuts the time needed to assemble an attack from hours to minutes[25] by calculating munition loads, prioritizing and assigning targets to aircraft and drones, and proposing a schedule,[26] according to a pre-war Bloomberg article that described such AI tools as tailored for a military confrontation and proxy war with Iran.[25]

One change noted by The Guardian is that, since senior Hamas leaders disappear into tunnels at the start of an offensive, systems such as the Gospel have allowed the IDF to locate and attack a much larger pool of more junior Hamas operatives. It cited an official who worked on targeting decisions in previous Gaza operations as saying that while the homes of junior Hamas members had previously not been targeted for bombing, the official believed the houses of suspected Hamas operatives were now targeted regardless of rank.[17] In the France 24 interview, Abraham of +972 Magazine characterized this as enabling the systematization of dropping a 2,000 lb bomb into a home to kill one person and everybody around them, something that had previously been done to a very small group of senior Hamas leaders.[27]

NPR cited a report by +972 Magazine and its sister publication Local Call as asserting that the system is being used to manufacture targets so that Israeli military forces can continue to bombard Gaza at an enormous rate, punishing the general Palestinian population. NPR noted it had not verified this; it was unclear how many targets were being generated by AI alone, but there had been a substantial increase in targeting, with an enormous civilian toll.[23]

In principle, the combination of a computer's speed to identify opportunities and a human's judgment to evaluate them can enable more precise attacks and fewer civilian casualties.[28] Israeli military and media have emphasized this capacity to minimize harm to non-combatants.[16][29] Richard Moyes, researcher and head of the NGO Article 36, pointed to "the widespread flattening of an urban area with heavy explosive weapons" to question these claims,[29] while Lucy Suchman, professor emeritus at Lancaster University, described the bombing as "aimed at maximum devastation of the Gaza Strip".[8]

The Guardian wrote that when a strike was authorized on the private homes of those identified as Hamas or Islamic Jihad operatives, target researchers knew in advance the expected number of civilians killed, and each target had a file containing a collateral damage score stipulating how many civilians were likely to be killed in a strike.[30] According to a senior Israeli military source, operatives use a "very accurate" measurement of the rate of civilians evacuating a building shortly before a strike: "We use an algorithm to evaluate how many civilians are remaining. It gives us a green, yellow, red, like a traffic signal."[29]

2021 use

Kohavi compared the target division using the Gospel to a machine and stated that once the machine was activated in the war of May 2021, it generated 100 targets a day, with half of them being attacked, in contrast with 50 targets in Gaza per year beforehand.[31] Approximately 200 targets came from the Gospel out of the 1,500 targets Israel struck in Gaza in the war,[23] including both static and moving targets according to the military.[32]

The Jewish Institute for National Security of America's after-action report identified an issue, stating that the system had data on what was a target, but lacked data on what was not.[33] The system depends entirely on training data,[16] and intelligence that human analysts had examined and deemed not to constitute a target had been discarded, risking bias. The institute's vice president for policy, Blaise Misztal, expressed his hope that this had since been rectified.[32]
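The training-data problem the report describes is a generic one in machine learning: a model given only positive examples, with the discarded negative examples unavailable, has nothing to contrast against, so ambiguous cases tend to score as if they matched. The sketch below is a minimal illustration under that assumption; all numbers are invented and it does not describe the actual system.

```python
# Illustrative sketch only: why discarding "not a target" examples biases a
# similarity-based model. Invented numbers; a generic machine-learning point.
from math import sqrt

def similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

targets = [[5.0, 1.0], [4.0, 2.0]]      # examples analysts confirmed as targets
non_targets = [[1.0, 4.0], [0.5, 5.0]]  # examples analysts examined and discarded

def score_with_negatives(x):
    # Balanced data: how much more x resembles targets than discarded examples.
    return max(similarity(x, t) for t in targets) - max(similarity(x, n) for n in non_targets)

def score_without_negatives(x):
    # Negatives discarded: only resemblance to known targets is measured, so
    # anything that vaguely resembles them scores well by default.
    return max(similarity(x, t) for t in targets)

ambiguous = [2.0, 3.0]  # actually closer to the discarded examples
print(score_with_negatives(ambiguous))     # negative score: leans "not a target"
print(score_without_negatives(ambiguous))  # ~0.87: looks like a target by default
```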

Organization

The Gospel is used by the military's target administration division (also called the Directorate of Targets[3] or Targeting Directorate[34]), which was formed in 2019 in the IDF's intelligence directorate[21] to address the air force running out of targets to bomb,[17] and which Kohavi described as "powered by AI capabilities" and including hundreds of officers and soldiers.[31] In addition to its wartime role, The Guardian wrote that the division had helped the IDF build a database of between 30,000 and 40,000 suspected militants in recent years, and that systems such as the Gospel had played a critical role in building lists of individuals authorized to be assassinated.[21]

The Gospel was developed by Unit 8200 of the Israeli Intelligence Corps.[35]

Lavender

The Guardian described Lavender as an AI-powered database, based on the testimonies of six intelligence officers given to +972 Magazine/Local Call and shared with the newspaper. The six said Lavender had played a central role in the war, rapidly processing data to identify potential junior operatives to target, at one point listing as many as 37,000 Palestinian men linked by AI to Hamas or Palestinian Islamic Jihad (PIJ).[36] The details of Lavender's operation and how it comes to its conclusions are not included in the accounts published by +972/Local Call, but after a sample of the list was found to have a 90% accuracy rate, the IDF approved Lavender's sweeping use for recommending targets. According to the officers, it was used alongside the Gospel, which targeted buildings and structures instead of individuals.[37]

Citing multiple sources, The Guardian wrote that in previous wars, identifying someone as a legitimate target would be discussed and then signed off by a legal adviser; after 7 October, the process was dramatically accelerated, there was pressure for more targets, and to meet the demand the IDF came to rely heavily on Lavender for a database of individuals judged to have the characteristics of a PIJ or Hamas militant.[38] The Guardian quoted one source: "I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time."[39] A source who justified the use of Lavender to help identify low-ranking targets said that in wartime there is no time to carefully go through the identification process with every target, and that rather than invest manpower and time in a junior militant, "you're willing to take the margin of error of using artificial intelligence."[40]

The IDF issued a statement that some of the claims portrayed are baseless while others reflect a flawed understanding of IDF directives and international law, and that the IDF does not use an AI system that identifies terrorist operatives or tries to predict whether a person is a terrorist. Instead, it claims information systems are merely one of the types of tools that help analysts gather and optimally analyze intelligence from various sources for the process of identifying military targets, and according to IDF directives, analysts must conduct independent examinations to verify that the targets meet the relevant definitions in accordance with international law and the additional restrictions of the IDF directives.[41]

The statement went on to say the "system" in question is not a system, nor a list of confirmed military operatives eligible to attack, only a database to cross-reference intelligence sources in order to produce up-to-date layers of information on the military operatives of terrorist organizations.[41]

Experts in ethics, AI, and international humanitarian law have criticized the use of such AI systems along ethical and legal lines, arguing that they violate basic principles of international humanitarian law, such as military necessity, proportionality, and the distinction between combatants and civilians.[42]

Allegations of bombing homes

The Guardian cited the intelligence officers' testimonies published by +972 and Local Call as saying that Palestinian men linked to Hamas's military wing were considered potential targets regardless of rank or importance,[43] and that low-ranking Hamas and PIJ members would be preferentially targeted at home, with one saying the system was built to look for them in these situations, when attacking would be much easier.[44] Two of the sources said attacks on low-ranking militants were typically carried out with dumb bombs, destroying entire homes and killing everyone there, with one saying you do not want to waste expensive bombs, which are in short supply, on unimportant people.[45] Citing unnamed conflict experts, The Guardian wrote that if Israel has been using dumb bombs to flatten the homes of thousands of Palestinians who were linked with AI assistance to militant groups in Gaza, it could help explain what the newspaper called the shockingly high death toll of the war.[46] An Israeli official speaking to +972 also stated that the Israeli program "Where's Daddy?" tracked suspected militants until they returned home, at which point "the IDF bombed them in homes without hesitation, as a first option. It's much easier to bomb a family's home."[47]

The IDF's response to the publication of the testimonies said that, unlike Hamas, it is committed to international law and only strikes military targets and military operatives, does so in accordance with proportionality and precautions, and thoroughly examines and investigates exceptions;[48] that a member of an organized armed group or a direct participant in hostilities is a lawful target under international humanitarian law and the policy of all law-abiding countries;[49] that it "makes various efforts to reduce harm to civilians to the extent feasible in the operational circumstances ruling at the time of the strike"; that it chooses the proper munition in accordance with operational and humanitarian considerations; that aerial munitions without an integrated precision-guide kit are standard weaponry in developed militaries; that onboard aircraft systems used by trained pilots ensure high precision of such weapons; and that the clear majority of munitions it uses are precision-guided.[50]

Family homes were also hit in Southern Lebanon, in a residential area of Bint Jbeil, killing two brothers, Ali Ahmed Bazzi and Ibrahim Bazzi, and Ibrahim's wife Shorouq Hammond.[51] The brothers were both Australian citizens; Ali lived locally but Ibrahim was visiting from Sydney to bring his wife home to Australia.[52][51] Hezbollah claimed Ali as one of their fighters,[53][54] and also included the civilian family members in a Hezbollah funeral.[55][51]

Allegations of pre-authorised civilian kill limits

According to the testimonies, the IDF imposed pre-authorised limits on the number of civilians it deemed acceptable to kill in order to kill one Hamas militant. The Guardian cited +972 and Local Call as reporting that this number was over 100 for top-ranking Hamas officials, with one of the sources saying there was a calculation for how many civilians could be killed for a brigade commander, how many for a battalion commander, and so on. One of the officers said that for junior militants this number was 15 in the first week of the war, and at one point was as low as five. Another said it had been as high as 20 uninvolved civilians for one operative, regardless of rank, military importance, or age.[56] The Guardian wrote that experts in international humanitarian law who spoke to the newspaper expressed alarm.[57]

The IDF's response said that IDF procedures require assessing the anticipated military advantage and collateral damage for each target, that such assessments are made individually, not categorically, that the IDF does not carry out strikes when the expected collateral damage is excessive relative to the military advantage,[58] and that the IDF outright rejects the claim regarding any policy to kill tens of thousands of people in their homes.[59]

Limits of human review

The Guardian cited Moyes as saying that a commander who is handed a computer-generated list of targets may not know how the list was generated or be able to question the targeting recommendations, and is in danger of losing the ability to meaningfully consider the risk of civilian harm.[42]

In an opinion piece in Le Monde, reporter Élise Vincent [fr] wrote that automated weapons are divided into fully automated systems, which are not really on the market, and lethal autonomous weapons, which in principle allow human control, and that this division allows Israel to claim the Gospel falls on the side of the more appropriate use of force. She cited Laure de Roucy-Rochegonde, a researcher at the Institut français des relations internationales, as saying the war could render these blurred categories obsolete and give new impetus to a stricter regulatory definition, "significant human control", which human rights activists, including Article 36, have been advocating. She quoted de Roucy-Rochegonde as saying it is not known what kind of algorithm the Israeli army uses, or how the data has been aggregated, which would not be a problem if they did not lead to a life-or-death decision.[60]

Diligence

Dr. Marta Bo, a researcher at the Stockholm International Peace Research Institute, noted that even with humans in the loop, there is a risk of "automation bias": over-reliance on systems, giving those systems too much influence over decisions that need to be made by humans.[42]

Suchman observed that the huge volume of targets is likely putting pressure on the human reviewers, saying that "in the face of this kind of acceleration, those reviews become more and more constrained in terms of what kind of judgment people can actually exercise." Tal Mimran, a lecturer at Hebrew University in Jerusalem who has previously worked with the government on targeting, added that the pressure will make analysts more likely to accept the AI's targeting recommendations, whether or not they are correct, and that they may be tempted to make life easier for themselves by going along with the machine's recommendations, which could create a "whole new level of problems" if the machine is systematically misidentifying targets.[61]

Accountability

Khlaaf noted the difficulty of pursuing accountability when AI is involved. Humans retain culpability, but who is responsible if the targeting system fails and the failure cannot be traced to any one mistake by one person? The NPR article went on: "Is it the analyst who accepted the AI recommendation? The programmers who made the system? The intelligence officers who gathered the training data?"[61]

Reactions

United Nations Secretary-General António Guterres said he was "deeply troubled" by reports that Israel used artificial intelligence in its military campaign in Gaza, saying the practice puts civilians at risk and blurs accountability.[62] Speaking about the Lavender system, Marc Owen Jones, a professor at Hamad Bin Khalifa University, stated, "Let's be clear: This is an AI-assisted genocide, and going forward, there needs to be a call for a moratorium on the use of AI in the war".[63] Ben Saul, a United Nations special rapporteur, stated that if reports about Israel's use of AI were true, then "many Israeli strikes in Gaza would constitute the war crimes of launching disproportionate attacks".[64] Ramesh Srinivasan, a professor at UCLA, stated, "Corporate America Big Tech is actually aligned with many of the Israeli military's actions. The fact that AI systems are being used indicates there's a lack of regard by the Israeli state. Everybody knows these AI systems will make mistakes."[65]

Notes

  1. ^ The Guardian, citing unnamed experts, wrote that "AI-based decision support systems for targeting" would typically "analyse large sets of information from a range of sources, such as drone footage, intercepted communications, surveillance data," and "movements and behaviour patterns of individuals and large groups."[12] NPR cited Blaise Misztal of the Jewish Institute for National Security of America as saying the data likely comes from a wide variety of sources, including such things as cellphone messages, satellite imagery, drone footage and seismic sensors.[13] The data is aggregated and classified by other AI systems before being fed into the Gospel.[2]
  2. ^ It is well known in the field that an AI imitating the decisions of humans may imitate their mistakes and prejudices, resulting in what is known as algorithmic bias.

References

  1. ^ Lee, Gavin (12 December 2023). "Understanding how Israel uses 'Gospel' AI system in Gaza bombings". France24. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
  2. ^ a b c d Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. The Gospel is actually one of several AI programs being used by Israeli intelligence, according to Tal Mimran, a lecturer at Hebrew University in Jerusalem who has worked for the Israeli government on targeting during previous military operations. Other AI systems aggregate vast quantities of intelligence data and classify it. The final system is the Gospel, which makes a targeting recommendation to a human analyst. Those targets could be anything from individual fighters, to equipment like rocket launchers, or facilities such as Hamas command posts.
  3. ^ a b c Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. A brief blog post by the Israeli military on November 2 lays out how the Gospel is being used in the current conflict. According to the post, the military's Directorate of Targets is using the Gospel to rapidly produce targets based on the latest intelligence. The system provides a targeting recommendation for a human analyst who then decides whether to pass it along to soldiers in the field.

    "This isn't just an automatic system," Misztal emphasizes. "If it thinks it finds something that could be a potential target, that's flagged then for an intelligence analyst to review."

    The post states that the targeting division is able to send these targets to the IAF and navy, and directly to ground forces via an app known as "Pillar of Fire," which commanders carry on military-issued smartphones and other devices.
  4. ^ Davies, Harry; McKernan, Bethan; Sabbagh, Dan (December 2023). "'The Gospel': how Israel uses AI to select bombing targets in Gaza". The Guardian. Archived from the original on 2 December 2023. Retrieved 1 April 2024. Multiple sources familiar with the IDF's targeting processes confirmed the existence of the Gospel to +972/Local Call, saying it had been used to produce automated recommendations for attacking targets, such as the private homes of individuals suspected of being Hamas or Islamic Jihad operatives.
  5. ^ Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. Algorithms can sift through mounds of intelligence data far faster than human analysts, says Robert Ashley, a former head of the U.S. Defense Intelligence Agency. Using AI to assist with targeting has the potential to give commanders an enormous edge.

    "You're going to make decisions faster than your opponent, that's really what it's about," he says.
  6. ^ Baggiarini, Bianca (8 December 2023). "Israel's AI can produce 100 bombing targets a day in Gaza. Is this the future of war?". The Conversation. Archived from the original on 20 February 2024. Retrieved 1 April 2024. Militaries and soldiers frame their decision-making through what is called the "OODA loop" (for observe, orient, decide, act). A faster OODA loop can help you outmanoeuvre your enemy. The goal is to avoid slowing down decisions through excessive deliberation, and instead to match the accelerating tempo of war. So the use of AI is potentially justified on the basis it can interpret and synthesise huge amounts of data, processing it and delivering outputs at rates that far surpass human cognition.
  7. ^ Baggiarini, Bianca (8 December 2023). "Israel's AI can produce 100 bombing targets a day in Gaza. Is this the future of war?". The Conversation. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
  8. ^ a b c d Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024.
  9. ^ Mueller, John Paul; Massaron, Luca (2016). Machine Learning For Dummies®. Hoboken, New Jersey: John Wiley & Sons. ISBN 978-1-119-24551-3. p. 13: Machine learning relies on algorithms to analyze huge datasets. Currently, machine learning can't provide the sort of AI that the movies present. Even the best algorithms can't think, feel, present any form of self-awareness, or exercise free will.
  10. ^ a b Baggiarini, Bianca (8 December 2023). "Israel's AI can produce 100 bombing targets a day in Gaza. Is this the future of war?". The Conversation. Archived from the original on 20 February 2024. Retrieved 1 April 2024. How does the system produce these targets? It does so through probabilistic reasoning offered by machine learning algorithms.

    Machine learning algorithms learn through data. They learn by seeking patterns in huge piles of data, and their success is contingent on the data's quality and quantity. They make recommendations based on probabilities.

    The probabilities are based on pattern-matching. If a person has enough similarities to other people labelled as an enemy combatant, they too may be labelled a combatant themselves.
  11. ^ Mueller, John Paul; Massaron, Luca (2016). Machine Learning For Dummies®. Hoboken, New Jersey: John Wiley & Sons. ISBN 978-1-119-24551-3. p. 33: The secret to machine learning is generalization. The goal is to generalize the output function so that it works on data beyond the training set. For example, consider a spam filter. Your dictionary contains 100,000 words (actually a small dictionary). A limited training dataset of 4,000 or 5,000 word combinations must create a generalized function that can then find spam in the 2^100,000 combinations that the function will see when working with actual data.
  12. ^ Davies, Harry; McKernan, Bethan; Sabbagh, Dan (December 2023). "'The Gospel': how Israel uses AI to select bombing targets in Gaza". The Guardian. Archived from the original on 2 December 2023. Retrieved 1 April 2024. Precisely what forms of data are ingested into the Gospel is not known. But experts said AI-based decision support systems for targeting would typically analyse large sets of information from a range of sources, such as drone footage, intercepted communications, surveillance data and information drawn from monitoring the movements and behaviour patterns of individuals and large groups.
  13. ^ Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. Although it's not known exactly what data the Gospel uses to make its suggestions, it likely comes from a wide variety of different sources. The list includes things like cell phone messages, satellite imagery, drone footage and even seismic sensors, according to Blaise Misztal, vice president for policy at the Jewish Institute for National Security of America, a group that facilitates military cooperation between Israel and the United States.
  14. ^ Inskeep, Steve. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. The system is called the Gospel. And basically, it takes an enormous quantity of surveillance data, crunches it all together and makes recommendations about where the military should strike.
  15. ^ Baggiarini, Bianca (8 December 2023). "Israel's AI can produce 100 bombing targets a day in Gaza. Is this the future of war?". The Conversation. Archived from the original on 20 February 2024. Retrieved 1 April 2024. Some claim machine learning enables greater precision in targeting, which makes it easier to avoid harming innocent people and using a proportional amount of force. However, the idea of more precise targeting of airstrikes has not been successful in the past, as the high toll of declared and undeclared civilian casualties from the global war on terror shows.

    Moreover, the difference between a combatant and a civilian is rarely self-evident. Even humans frequently cannot tell who is and is not a combatant.

    Technology does not change this fundamental truth. Often social categories and concepts are not objective, but are contested or specific to time and place. But computer vision together with algorithms are more effective in predictable environments where concepts are objective, reasonably stable, and internally consistent.
  16. ^ a b c Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. The Israeli military did not respond directly to NPR's inquiries about the Gospel. In the November 2 post, it said the system allows the military to "produce targets for precise attacks on infrastructures associated with Hamas, while causing great damage to the enemy and minimal harm to those not involved," according to an unnamed spokesperson.

    But critics question whether the Gospel and other associated AI systems are in fact performing as the military claims. Khlaaf notes that artificial intelligence depends entirely on training data to make its decisions.

    "The nature of AI systems is to provide outcomes based on statistical and probabilistic inferences and correlations from historical data, and not any type of reasoning, factual evidence, or 'causation,'" she says.
  17. ^ a b c Davies, Harry; McKernan, Bethan; Sabbagh, Dan (December 2023). "'The Gospel': how Israel uses AI to select bombing targets in Gaza". The Guardian. Archived from the original on 2 December 2023. Retrieved 1 April 2024. The target division was created to address a chronic problem for the IDF: in earlier operations in Gaza, the air force repeatedly ran out of targets to strike. Since senior Hamas officials disappeared into tunnels at the start of any new offensive, sources said, systems such as the Gospel allowed the IDF to locate and attack a much larger pool of more junior operatives.

    One official, who worked on targeting decisions in previous Gaza operations, said the IDF had not previously targeted the homes of junior Hamas members for bombings. They said they believed that had changed for the present conflict, with the houses of suspected Hamas operatives now targeted regardless of rank.
  18. ^ Lee, Gavin (12 December 2023). "Understanding how Israel uses 'Gospel' AI system in Gaza bombings". France24. Archived from the original on 20 February 2024. Retrieved 1 April 2024. Yuval Abraham: "Now, sources that I've spoken to that have operated the Gospel and have served in that center [...] they said the use of artificial intelligence is being incr- increasing trend in the military because in the past, the military ran out of targets in 2014 and 2021.
  19. ^ Lee, Gavin (12 December 2023). "Understanding how Israel uses 'Gospel' AI system in Gaza bombings". France24. Archived from the original on 20 February 2024. Retrieved 1 April 2024. Yuval Abraham: I mean one source recalled how, for example, in 2021 and 2014, y'know, they ran out of targets. They had nothing left to bomb. There was nothing but quality to bomb. But there was political pressure to continue the war. There was a need to continue the pressure in Gaza. So one source recalled how in 2014, they would bomb the same places twice. When you have artificial intelligence, when you have automation, when you can create so many targets, often spending, y'know, less than a minute on a target that, at the end of the day, is killing families, y'know? So, so, so that allows you to continue wars, often even for political purposes, it could be, for much longer than you could in the past.
  20. ^ Davies, Harry; McKernan, Bethan; Sabbagh, Dan (December 2023). "'The Gospel': how Israel uses AI to select bombing targets in Gaza". The Guardian. Archived from the original on 2 December 2023. Retrieved 1 April 2024.
  21. ^ a b c Davies, Harry; McKernan, Bethan; Sabbagh, Dan (December 2023). "'The Gospel': how Israel uses AI to select bombing targets in Gaza". The Guardian. Archived from the original on 2 December 2023. Retrieved 1 April 2024. In early November, the IDF said "more than 12,000" targets in Gaza had been identified by its target administration division.

    The activities of the division, formed in 2019 in the IDF's intelligence directorate, are classified.

    However a short statement on the IDF website claimed it was using an AI-based system called Habsora (the Gospel, in English) in the war against Hamas to "produce targets at a fast pace".

    [...] In recent years, the target division has helped the IDF build a database of what sources said was between 30,000 and 40,000 suspected militants. Systems such as the Gospel, they said, had played a critical role in building lists of individuals authorised to be assassinated.
  22. ^ a b c Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. The pace is astonishing: In the wake of the brutal attacks by Hamas-led militants on October 7, Israeli forces have struck more than 22,000 targets inside Gaza, a small strip of land along the Mediterranean coast. Just since the temporary truce broke down on December 1, Israel's Air Force has hit more than 3,500 sites.
  23. ^ a b c Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. A report by the Israeli publication +972 Magazine and the Hebrew-language outlet Local Call asserts that the system is being used to manufacture targets so that Israeli military forces can continue to bombard Gaza at an enormous rate, punishing the general Palestinian population.

    NPR has not independently verified those claims, and it's unclear how many targets are currently being generated by AI alone. But there has been a substantial increase in targeting, according to the Israeli military's own numbers. In the 2021 conflict, Israel said it struck 1,500 targets in Gaza, approximately 200 of which came from the Gospel. Since October 7, the military says it has struck more than 22,000 targets inside Gaza – a daily rate more than double that of the 2021 conflict.

    The toll on Palestinian civilians has been enormous.
  24. ^ Davies, Harry; McKernan, Bethan; Sabbagh, Dan (December 2023). "'The Gospel': how Israel uses AI to select bombing targets in Gaza". The Guardian. Archived from the original on 2 December 2023. Retrieved 1 April 2024. Israel's military has made no secret of the intensity of its bombardment of the Gaza Strip. In the early days of the offensive, the head of its air force spoke of relentless, "around the clock" airstrikes. His forces, he said, were only striking military targets, but he added: "We are not being surgical."
  25. ^ a b Newman, Marissa (16 July 2023). "Israel Quietly Embeds AI Systems in Deadly Military Operations". Bloomberg. Retrieved 4 April 2024. In recent months, Israel has been issuing near-daily warnings to Iran over its uranium enrichment, vowing it will not allow the country to obtain nuclear weapons under any circumstances. Should the two enter into a military confrontation, the IDF anticipates that Iranian proxies in Gaza, Syria and Lebanon would retaliate, setting the stage for the first serious multi-front conflict for Israel since a surprise attack by Egypt and Syria 50 years ago sparked the Yom Kippur War.

    AI-based tools like Fire Factory are tailored for such a scenario, according to IDF officials. "What used to take hours now takes minutes, with a few more minutes for human review," said Col. Uri, who heads the army's digital transformation unit [...] "With the same amount of people, we do much more."
  26. ^ Newman, Marissa (16 July 2023). "Israel Quietly Embeds AI Systems in Deadly Military Operations". Bloomberg. Retrieved 4 April 2024. Though the military won't comment on specific operations, officials say that it now uses an AI recommendation system that can crunch huge amounts of data to select targets for air strikes. Ensuing raids can then be rapidly assembled with another artificial intelligence model called Fire Factory, which uses data about military-approved targets to calculate munition loads, prioritize and assign thousands of targets to aircraft and drones, and propose a schedule.
  27. ^ Lee, Gavin (12 December 2023). "Understanding how Israel uses 'Gospel' AI system in Gaza bombings". France24. Archived from the original on 20 February 2024. Retrieved 1 April 2024. Yuval Abraham: "What we're talking about is, a policy of dropping a bomb that weighs two thousand pounds, on a home, in order to assassinate one person, okay? Now, in the past, imagine before artificial intelligence and automation, you would do that, say, for a group of very small senior leaders of Hamas, killing them and knowingly killing everybody around them [...] when you automate that process, when you have a need to strike hundreds and thousands of targets, you can do so in a systematic way..."
  28. ^ Baggiarini, Bianca (8 December 2023). "Israel's AI can produce 100 bombing targets a day in Gaza. Is this the future of war?". The Conversation. Archived from the original on 20 February 2024. Retrieved 1 April 2024. In principle, machine learning systems may enable more precisely targeted attacks and fewer civilian casualties.
  29. ^ a b c Davies, Harry; McKernan, Bethan; Sabbagh, Dan (December 2023). "'The Gospel': how Israel uses AI to select bombing targets in Gaza". The Guardian. Archived from the original on 2 December 2023. Retrieved 1 April 2024. In the IDF's brief statement about its target division, a senior official said the unit "produces precise attacks on infrastructure associated with Hamas while inflicting great damage to the enemy and minimal harm to non-combatants".

    The precision of strikes recommended by the "AI target bank" has been emphasised in multiple reports in Israeli media. The Yedioth Ahronoth daily newspaper reported that the unit "makes sure as far as possible there will be no harm to non-involved civilians".

    A former senior Israeli military source told the Guardian that operatives use a "very accurate" measurement of the rate of civilians evacuating a building shortly before a strike. "We use an algorithm to evaluate how many civilians are remaining. It gives us a green, yellow, red, like a traffic signal."

    [...] "Look at the physical landscape of Gaza," said Richard Moyes, a researcher who heads Article 36, a group that campaigns to reduce harm from weapons. "We're seeing the widespread flattening of an urban area with heavy explosive weapons, so to claim there's precision and narrowness of force being exerted is not borne out by the facts."
  30. ^ Davies, Harry; McKernan, Bethan; Sabbagh, Dan (December 2023). "'The Gospel': how Israel uses AI to select bombing targets in Gaza". The Guardian. Archived from the original on 2 December 2023. Retrieved 1 April 2024. Multiple sources told the Guardian and +972/Local Call that when a strike was authorised on the private homes of individuals identified as Hamas or Islamic Jihad operatives, target researchers knew in advance the number of civilians expected to be killed.

    Each target, they said, had a file containing a collateral damage score that stipulated how many civilians were likely to be killed in a strike.
  31. ^ a b Davies, Harry; McKernan, Bethan; Sabbagh, Dan (December 2023). "'The Gospel': how Israel uses AI to select bombing targets in Gaza". The Guardian. Archived from the original on 2 December 2023. Retrieved 1 April 2024. Aviv Kochavi, who served as the head of the IDF until January, has said the target division is "powered by AI capabilities" and includes hundreds of officers and soldiers.

    In an interview published before the war, he said it was "a machine that produces vast amounts of data more effectively than any human, and translates it into targets for attack".

    According to Kochavi, "once this machine was activated" in Israel's 11-day war with Hamas in May 2021, it generated 100 targets a day. "To put that into perspective, in the past we would produce 50 targets in Gaza per year. Now, this machine produces 100 targets a single day, with 50% of them being attacked."
  32. ^ a b Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. Misztal's group documented one of the first trials of the Gospel, during a 2021 conflict in Gaza between Israel and the militant groups Hamas and Islamic Jihad. According to press reports and statements from the military itself, Israel used the Gospel and other AI programs to identify likely targets such as rocket launchers. The system was used to identify static targets as well as moving targets as they appeared on the battlefield. According to press reports, it identified around 200 targets in the conflict.

    But it was not without its problems. The after-action report by Misztal's group noted that, while the AI had plenty of training data for what constituted a target, it lacked data on things that human analysts had decided were not targets. The Israeli military hadn't collected the target data its analysts had discarded, and as a result the system's training had been biased.

    "It's been two years since then, so it's something that, hopefully, they've been able to rectify," Misztal says.
  33. ^ "Gaza Conflict 2021 Assessment: Observations and Lessons" (PDF).
  34. ^ Leshem, Ron (30 June 2023). "IDF possesses Matrix-like capabilities, ex-Israeli army chief says". Ynetnews. Retrieved 26 March 2024.
  35. ^ Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 1 April 2024. According to posts on the Israeli military's website, the Gospel was developed by Israel's signals intelligence branch, known as Unit 8200. The system is relatively new — one of the earliest mentions was a top innovation award that it won in 2020.
  36. ^ McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024. The Israeli military's bombing campaign in Gaza used a previously undisclosed AI-powered database that at one stage identified 37,000 potential targets based on their apparent links to Hamas, according to intelligence sources involved in the war.

    The testimony from the six intelligence officers, all who have been involved in using AI systems to identify Hamas and Palestinian Islamic Jihad (PIJ) targets in the war, was given to the journalist Yuval Abraham for a report published by the Israeli-Palestinian publication +972 Magazine and the Hebrew-language outlet Local Call.

    Their accounts were shared exclusively with the Guardian in advance of publication. All six said that Lavender had played a central role in the war, processing masses of data to rapidly identify potential "junior" operatives to target. Four of the sources said that, at one stage early in the war, Lavender listed as many as 37,000 Palestinian men who had been linked by the AI system to Hamas or PIJ.
  37. ^ McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024. Details about the specific kinds of data used to train Lavender's algorithm, or how the programme reached its conclusions, are not included in the accounts published by +972 or Local Call. However, the sources said that during the first few weeks of the war, Unit 8200 refined Lavender's algorithm and tweaked its search parameters.

    After randomly sampling and cross-checking its predictions, the unit concluded Lavender had achieved a 90% accuracy rate, the sources said, leading the IDF to approve its sweeping use as a target recommendation tool.

    Lavender created a database of tens of thousands of individuals who were marked as predominantly low-ranking members of Hamas's military wing, they added. This was used alongside another AI-based decision support system, called the Gospel, which recommended buildings and structures as targets rather than individuals.
  38. ^ McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024. In earlier military operations conducted by the IDF, producing human targets was often a more labour-intensive process. Multiple sources who described target development in previous wars to the Guardian, said the decision to "incriminate" an individual, or identify them as a legitimate target, would be discussed and then signed off by a legal adviser.

    In the weeks and months after 7 October, this model for approving strikes on human targets was dramatically accelerated, according to the sources. As the IDF's bombardment of Gaza intensified, they said, commanders demanded a continuous pipeline of targets.

    "We were constantly being pressured: 'Bring us more targets.' They really shouted at us," said one intelligence officer. "We were told: now we have to fuck up Hamas, no matter what the cost. Whatever you can, you bomb."

    To meet this demand, the IDF came to rely heavily on Lavender to generate a database of individuals judged to have the characteristics of a PIJ or Hamas militant.
  39. ^ McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024.
  40. ^ McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024. Another source, who justified the use of Lavender to help identify low-ranking targets, said that "when it comes to a junior militant, you don't want to invest manpower and time in it". They said that in wartime there was insufficient time to carefully "incriminate every target".

    "So you're willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it," they added.
  41. ^ a b "Israel Defence Forces' response to claims about use of 'Lavender' AI database in Gaza". The Guardian. 3 April 2024. Retrieved 4 April 2024. Some of the claims portrayed in your questions are baseless in fact, while others reflect a flawed understanding of IDF directives and international law. Following the murderous attack by the Hamas terror organization on October 7, the IDF has been operating to dismantle Hamas' military capabilities.

    [...]The process of identifying military targets in the IDF consists of various types of tools and methods, including information management tools, which are used in order to help the intelligence analysts to gather and optimally analyze the intelligence, obtained from a variety of sources. Contrary to claims, the IDF does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist. Information systems are merely tools for analysts in the target identification process. According to IDF directives, analysts must conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives.

    The "system" your questions refer to is not a system, but simply a database whose purpose is to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organizations. This is not a list of confirmed military operatives eligible to attack.
  42. ^ a b c Davies, Harry; McKernan, Bethan; Sabbagh, Dan (December 2023). "'The Gospel': how Israel uses AI to select bombing targets in Gaza". The Guardian. Archived from the original on 2 December 2023. Retrieved 2 April 2024. For some experts who research AI and international humanitarian law, an acceleration of this kind raises a number of concerns.

    Dr Marta Bo, a researcher at the Stockholm International Peace Research Institute, said that even when "humans are in the loop" there is a risk they develop "automation bias" and "over-rely on systems which come to have too much influence over complex human decisions".

    Moyes, of Article 36, said that when relying on tools such as the Gospel, a commander "is handed a list of targets a computer has generated" and they "don't necessarily know how the list has been created or have the ability to adequately interrogate and question the targeting recommendations".

    "There is a danger," he added, "that as humans come to rely on these systems they become cogs in a mechanised process and lose the ability to consider the risk of civilian harm in a meaningful way."
  43. ^ McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024. In the weeks after the Hamas-led 7 October assault on southern Israel, in which Palestinian militants killed nearly 1,200 Israelis and kidnapped about 240 people, the sources said there was a decision to treat Palestinian men linked to Hamas's military wing as potential targets, regardless of their rank or importance.
  44. ^ McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024. The testimonies published by +972 and Local Call may explain how such a western military with such advanced capabilities, with weapons that can conduct highly surgical strikes, has conducted a war with such a vast human toll.

    When it came to targeting low-ranking Hamas and PIJ suspects, they said, the preference was to attack when they were believed to be at home. "We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity," one said. "It's much easier to bomb a family's home. The system is built to look for them in these situations."
  45. ^ McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024. Two sources said that during the early weeks of the war they were permitted to kill 15 or 20 civilians during airstrikes on low-ranking militants. Attacks on such targets were typically carried out using unguided munitions known as "dumb bombs", the sources said, destroying entire homes and killing all their occupants.

    "You don't want to waste expensive bombs on unimportant people – it's very expensive for the country and there's a shortage [of those bombs]," one intelligence officer said. Another said the principal question they were faced with was whether the "collateral damage" to civilians allowed for an attack.

    "Because we usually carried out the attacks with dumb bombs, and that meant literally dropping the whole house on its occupants. But even if an attack is averted, you don't care – you immediately move on to the next target. Because of the system, the targets never end. You have another 36,000 waiting."
  46. ^ McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024. According to conflict experts, if Israel has been using dumb bombs to flatten the homes of thousands of Palestinians who were linked, with the assistance of AI, to militant groups in Gaza, that could help explain the shockingly high death toll in the war.
  47. ^ Rommen, Rebecca (7 April 2024). "Israel's 'Where's Daddy?' AI system helps target suspected Hamas militants when they're at home with their families, report says". Yahoo! News. Retrieved 4 October 2024.
  48. ^ "Israel Defence Forces' response to claims about use of 'Lavender' AI database in Gaza". The Guardian. 3 April 2024. Retrieved 4 April 2024. Contrary to Hamas, the IDF is committed to international law and acts accordingly. As such, the IDF directs its strikes only towards military targets and military operatives and carries out strikes in accordance with the rules of proportionality and precautions in attacks. Exceptional incidents undergo thorough examinations and investigations.
  49. ^ "Israel Defence Forces' response to claims about use of 'Lavender' AI database in Gaza". The Guardian. 3 April 2024. Retrieved 4 April 2024. According to international humanitarian law, a person who is identified as a member of an organized armed group (like the Hamas' military wing), or a person who directly participates in hostilities, is considered a lawful target. This legal rule is reflected in the policy of all law-abiding countries, including the IDF's legal practice and policy, which did not change during the course of the war.
  50. ^ "Israel Defence Forces' response to claims about use of 'Lavender' AI database in Gaza". The Guardian. 3 April 2024. Retrieved 4 April 2024. As for the manner of carrying out the strikes – the IDF makes various efforts to reduce harm to civilians to the extent feasible in the operational circumstances ruling at the time of the strike.

    In this regard, the IDF reviews targets before strikes and chooses the proper munition in accordance with operational and humanitarian considerations, taking into account an assessment of the relevant structural and geographical features of the target, the target's environment, possible effects on nearby civilians, critical infrastructure in the vicinity, and more. Aerial munitions without an integrated precision-guide kit are standard weaponry in developed militaries worldwide. The IDF uses such munitions while employing onboard aircraft systems to calculate a specific release point to ensure a high level of precision, used by trained pilots. In any event, the clear majority of munitions used in strikes are precision-guided munitions.
  51. ^ a b c Dyett, Greg (28 December 2023). "Australian man, his wife and brother killed in air strike". SBS News. Retrieved 26 June 2024. Local media in Lebanon says an Israeli war plane fired a missile at a number of homes in Lebanon's Bint Jbeil area. A missile strike killed 27-year-old Ibrahim Bazzi, his brother Ali Bazzi and Ibrahim's wife Shorouk Hammoud. Ms Hammoud had recently acquired an Australian visa and she and her husband Ibrahim were planning a life in Australia. Afif Bazzi (Mayor of Bint Jbeil): "It was a surprise that the Israelis hit a civilian neighbourhood, people are living normally, they have not fled. We did not flee Bint Jbeil, all residents are still in Bint Jbeil. We hear the bombardment and the shelling but it was still far away, the town was neutral but we were surprised that a civilian neighbourhood was hit, civilians, a groom who came from Australia to take his bride. They were spending time together along with his brother at his brother's house, really it was a surprise for us." (translation by SBS World News)
  52. ^ "Australian man and his brother killed in Lebanon after building hit by Israeli air strike, family says". ABC News. 27 December 2023. Retrieved 10 August 2024.
  53. ^ Bourke, Latika; Ireland, Olivia (28 December 2023). "Australian killed in Lebanon strike was Hezbollah fighter, militant group says". The Sydney Morning Herald. Retrieved 10 August 2024.
  54. ^ "Australian reportedly killed in Lebanon by airstrike". Australian Financial Review. 27 December 2023. Retrieved 10 August 2024.
  55. ^ "Military-style funeral held for Australian 'Hezbollah fighter' killed by Israeli air strike in Lebanon". ABC News. 28 December 2023. Retrieved 10 August 2024.
  56. ^ McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024. Such a strategy risked higher numbers of civilian casualties, and the sources said the IDF imposed pre-authorised limits on the number of civilians it deemed acceptable to kill in a strike aimed at a single Hamas militant. The ratio was said to have changed over time, and varied according to the seniority of the target.

    According to +972 and Local Call, the IDF judged it permissible to kill more than 100 civilians in attacks on a top-ranking Hamas official. "We had a calculation for how many [civilians could be killed] for the brigade commander, how many [civilians] for a battalion commander, and so on," one source said.

    [...] One source said that the limit on permitted civilian casualties "went up and down" over time, and at one point was as low as five. During the first week of the conflict, the source said, permission was given to kill 15 non-combatants to take out junior militants in Gaza. However, they said estimates of civilian casualties were imprecise, as it was not possible to know definitively how many people were in a building.

    Another intelligence officer said that more recently in the conflict, the rate of permitted collateral damage was brought down again. But at one stage earlier in the war they were authorised to kill up to "20 uninvolved civilians" for a single operative, regardless of their rank, military importance, or age.
  57. ^ McKernan, Bethan; Davies, Harry (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 4 April 2024. Experts in international humanitarian law who spoke to the Guardian expressed alarm at accounts of the IDF accepting and pre-authorising collateral damage ratios as high as 20 civilians, particularly for lower-ranking militants. They said militaries must assess proportionality for each individual strike.
  58. ^ "Israel Defence Forces' response to claims about use of 'Lavender' AI database in Gaza". The Guardian. 3 April 2024. Retrieved 4 April 2024. For each target, IDF procedures require conducting an individual assessment of the anticipated military advantage and collateral damage expected. Such assessments are not made categorically in relation to the approval of individual strikes. The assessment of the collateral damage expected from a strike is based on a variety of assessment methods and intelligence-gathering measures, in order to achieve the most accurate assessment possible, considering the relevant operational circumstances. The IDF does not carry out strikes when the expected collateral damage from the strike is excessive in relation to the military advantage.
  59. ^ "Israel Defence Forces' response to claims about use of 'Lavender' AI database in Gaza". The Guardian. 3 April 2024. Retrieved 4 April 2024. The IDF outright rejects the claim regarding any policy to kill tens of thousands of people in their homes.
  60. ^ Vincent, Elise (15 December 2023). "Israel's use of AI in bombings raises questions over rules of war". Le Monde.fr. Archived from the original on 20 February 2024. Retrieved 14 April 2024. Automated weapons today fall into two main categories: Fully-automated lethal weapons systems, of which there are no real examples on the market, and lethal autonomous weapons (LAWs), which in principle allow humans to have control. The vast majority of Western military powers – and Israel, with Habsora – now claim to have opted for LAWs and can therefore claim to be on the more appropriate side of the use of force.

    Laure de Roucy-Rochegonde, also a researcher at IFRI and the author of a thesis on the regulation of autonomous weapons systems, said the specifics of the war between Israel and Hamas could render these blurred categories obsolete and reinvigorate another regulatory concept, that of "significant human control." It's a stricter definition that some human rights activists, including the NGO Article 36, have been pushing for without much success. "The problem is that we don't know what kind of algorithm is being used [by the Israeli army], or how the data has been aggregated. It wouldn't be a problem if there wasn't a life-or-death decision at the end of it," said de Roucy-Rochegonde.
  61. ^ a b Brumfiel, Geoff. "Israel is using an AI system to find targets in Gaza. Experts say it's just the start". NPR. Archived from the original on 20 February 2024. Retrieved 2 April 2024. The huge volume of targets is also likely putting pressure on the humans asked to review them, says Suchman. "In the face of this kind of acceleration, those reviews become more and more constrained in terms of what kind of judgment people can actually exercise," she says.

    Mimran adds that, under pressure, analysts will be more likely to accept the AI's targeting recommendations, regardless of whether they are correct. Targeting officers may be tempted to think that "life will be much easier if we flow with the machine and accept its advice and recommendations," he says. But it could create a "whole new level of problems" if the machine is systematically misidentifying targets.

    Finally, Khlaaf points out that the use of AI could make it more difficult to pursue accountability for those involved in the conflict. Although humans still retain the legal culpability for strikes, it's unclear who is responsible if the targeting system fails. Is it the analyst who accepted the AI recommendation? The programmers who made the system? The intelligence officers who gathered the training data?
  62. ^ "UN chief 'deeply troubled' by reports Israel using AI to identify Gaza targets". France 24. 5 April 2024. Retrieved 5 April 2024.
  63. ^ "'AI-assisted genocide': Israel reportedly used database for Gaza kill lists". Al Jazeera. Retrieved 12 April 2024.
  64. ^ "'AI-assisted genocide': Israel reportedly used database for Gaza kill lists". Al Jazeera. Retrieved 16 April 2024.
  65. ^ "Big Tech 'basically complicit' in civilian deaths in Gaza". Al Jazeera. Retrieved 23 April 2024.