Wednesday, August 19, 2009

Blade Runner

This book is actually titled Do Androids Dream of Electric Sheep?, by Philip K. Dick, and the movie Blade Runner was based on it. It is set around 2020 after a nuclear war, in a world where androids are prevalent and convincingly human. In fact, the main character's job is to track down androids masquerading as humans and kill them, ostensibly because they are a danger to society.

The book deals with the question of what constitutes humanity, and it was actually quite good. The bounty hunter has to go after androids that look and act just like humans, except that they don't feel empathy. It really made me muse on artificial intelligence and how humanity will cope with it when it arrives. If humans are able to program a sort of empathy into machines, how will those machines be any different from humans? Would machines that actually feel emotions ever be able to get into heaven?

Thinking about it more, I'm not really sure what would distinguish humans from Really Good Robots. In Blade Runner, androids are discovered because they lack empathy. But isn't it possible for a human not to be empathic? I have it in my mind that autistic people sometimes do not exhibit a ton of empathy. Are they sub-human? What do you think the line of separation between humans and robotic replicas would be?

I usually don't consume science fiction, even though it is often very similar to my chosen genre of fantasy; I'm not really sure why that is. Perhaps I just really like books that use medieval weaponry and therefore focus on physical prowess, which science fiction rarely features. Then again, I can't imagine I would much like a book about modern knife-fighters. For whatever reason, I consistently enjoy science fiction less than fantasy. Even this book, which was pretty good, started dragging toward the end, and I couldn't wait for it to finish so I could start something else.
