Summary
Transcript
In 2021, the commander of Israeli intelligence published a book on designing a special machine that would resolve what he described as a human bottleneck for locating and approving targets in war. A recent investigation by +972 Magazine and Local Call reveals that the Israeli army has developed an artificial intelligence-based program known as Lavender, which does exactly that. According to six Israeli intelligence officers with firsthand experience, the Lavender AI machine determined who to kill and was obeyed with military discipline.
During the first weeks of the war, the Lavender system designated about 37,000 Palestinians as targets and directed airstrikes on their homes. Despite knowing that the system makes errors about 10% of the time, there was no requirement to check the machine's data. The Israeli army systematically attacked the targeted individuals at night in their homes while their whole family was present. An automated system known as "Where's Daddy?" was used to track the targeted individuals and carry out bombings when they entered their families' residences.
The obvious result was that thousands of women and children were wiped out by Israeli airstrikes. According to these Israeli intelligence officers, the IDF bombed targets in their homes as a first option, and on several occasions, entire families were murdered when the actual target was not inside. In one instance, four buildings were destroyed, along with everyone inside, because a single target was in one of them. When it came to targets marked as low-level by the Lavender AI system, cheaper bombs were used, which destroyed entire buildings, killing mostly civilians and entire families.
This was done because the IDF did not want to waste expensive bombs on people they deemed unimportant. It was decided that for every low-level Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians. And if the target was a senior Hamas official, more than 100 civilians were acceptable. Most of these AI targets were never tracked. Before the war, the Lavender software analyzed information collected on the 2.3 million residents of the Gaza Strip through a system of mass surveillance, assessed the likelihood of each person being a militant, and gave each a rating from 1 to 100. If the rating was high enough, they were killed along with their entire family. Lavender flagged individuals whose patterns resembled those of Hamas operatives, including police and civil defense workers, relatives of militants, and residents with similar names and nicknames. This sort of tracking system has existed in the US for years.
What I will be providing you, and the fine gentlemen of the Secret Service, is a list of every threat made about the president since February 3 and a profile of every threat maker. And these are like existing targets. Exhibit A: Oakland resident Justin Pinsky posted on a message board, "Romania has a storied history of executing their leaders. Couldn't they do us a solid and take out Bush?" How is this all possible? Keyword selectors: "attack," "take out," "Bush."
So think of it as a Google search, except instead of searching only what people make public, we're also looking at everything they don't. So emails, chats, SMS, whatever. Yeah, but which people? The whole kingdom, Snow White. And while many people claim that Israel controls the US, Joe Biden said that Israel serves US interests: There's no apology to be made. None. It is the best $3 billion investment we make.
Were there no Israel, the United States of America would have to invent an Israel to protect her interests in the region. The United States would have to go out and invent an Israel. Reporting for Infowars, this is Greg Reese.