I received the following comments, from someone who actually works for Palantir, in response to Mr. Foust's erroneous description of the software. A second response to those comments, from a colleague, is pasted below. Note the very perceptive concluding comment about showing your work in Washington.
Dave,
I feel a responsibility to clarify his allegations.
His excerpt:
In war zones, too, many decisions to kill are at least partly automated. Software programs such as Palantir collect massive amounts of information about IEDs, analyze without human input, and spit out lists of likely targets. No human could possibly read, understand, analyze, and output so much information in such a short period of time.
Palantir does zero analysis on its own. It is simply a database that pulls reporting from a multitude of systems (such as M3, CIDNE, BATS, etc.). The software includes analytic tools (map imagery, graph and linking tools similar to Analyst Notebook) that allow users to sift through the data more quickly, in a format they can easily use to identify anomalies and make predictions about enemy patterns of life and, yes, IED locations and likely future emplacements. Palantir of course requires human input to do this.
In no way does Palantir autonomously "spit out lists of likely targets." That would be awesome, but we're not there yet. In fact, Palantir's entire goal is to provide the means for analysts to read and understand this massive amount of data. It is certainly not a one-person task, but with hundreds of analysts using Palantir throughout AF (and elsewhere), I consider it a compliment that Mr. Foust perceived our success as that of an autonomous system. That is a testament to the hard work of all its users to locate and defeat the IED networks that have taken so many of our soldiers, sailors, marines, and airmen out of the fight.
Anyway, I figured I'd weigh in to clarify the misunderstanding.
In response to the above comments I received this one:
Dave,
That's exactly right, and it makes me question how much due diligence Foust does before writing. Palantir's whole philosophy (seriously, you can't spend more than five minutes on their website without running into it) is that purely algorithmic approaches to data analysis are doomed to fail against adaptive (read: human) phenomena. The human is indispensable, and the goal of the platform is to empower the human, not replace him. If someone were using it that way, selecting criteria and then labeling everyone who matches as a target, it would be a terrible misunderstanding of the system. Such a claim would require a citation, though, and as someone wrote recently, the defining feature of contemporary Washington is that no one shows their work.
V/R
Dave
The false fear of autonomous weapons
December 20, 2012
Last month, Human Rights Watch raised eyebrows with a provocatively titled report about autonomous weaponry that can select targets and fire at them without human input. “Losing Humanity: The Case Against Killer Robots,” blasts the headline, and argues that autonomous weapons will increase the danger to civilians in conflict.
(Continued at the link below)