Monday, April 1, 2013

Drone On

photo: Sheika04 at en.wikipedia
In a Times Op-Ed piece entitled “Smart Drones” (NYT, 3/16/13), Bill Keller remarks, “Imagine that the decision to kill a suspected enemy is not made by an operator in a distant control room, but by the machine itself. Imagine that an aerial robot studies the landscape below, recognizes hostile activity, calculates that there is minimal risk of collateral damage, and then, with no human in the loop, pulls the trigger.” What if the objects of man’s own creation develop a life of their own and eventually overpower him? Could you envision a smart gun that came to a verdict about its owner’s character and pulled the trigger? What if the computer typing these words decides that the audience who will read them is not ready and proceeds to edit the passage independently? What if it decides it has a better way of doing things than the ones you might have chosen?

As farfetched as this seems, we have numerous examples of computers not letting us do this or that function in everyday life. We treat these computers or software programs with a great deal of tolerance, like a parent with a child in the throes of his or her terrible twos, but we are in denial about the fact that some cybernetic protocols exist in a morally ambiguous universe, one that is not directly the product of human consciousness. Certain programs simply will not be stopped once they are started, as we have seen in the case of the stock market, where complex algorithms have undone their hosts.

But to return to the subject of drones. No one likes to be the executioner. It’s easier to have a machine do the job. Words and phrases, in fact, are created as euphemisms for torture or murder. “Extraordinary rendition” is one. “Termination with extreme prejudice” is another. Imagine that you still have a death penalty for murder, but the electric chair, gas chamber or lethal injection is no longer administered by a person, and the witnesses too are all robots. That would indeed be a crime.

1 comment:

  1. jylle benson-gauss, April 1, 2013 at 7:51 PM

    Robert Sheckley wrote a short story, "Watchbird", in 1953, foreseeing drones and addressing the questions you pose (available at Project Gutenberg for a free read). Interesting that our complex technological creations are as morally neutral as is nature itself, which we seek to outdo. Ah, hubris!
    I'd add to your euphemisms 'success' when used to describe a culture or species. It often actually refers to ruthlessness in obliterating (real or perceived) competitors. Don't get me started.

