UNCANNING THE UNCANNY VALLEY
A few nights ago my producer, James Evan Pilato, described arriving home late on a Thursday night and wondering why there was so much traffic and so many cars parked in his part of town after 12:30 in the morning. He then realized that the cars were lining up at the local theater near his home to see the new movie ‘The Hobbit: The Desolation of Smaug’.
While I have been a fan of Peter Jackson’s vision of the Tolkien books, I have noticed that many critics say this new film is stuck in the “limbo” that often plagues “middle” trilogy films, and that Jackson has become a little self-indulgent with the CGI effects, much like George Lucas when he gave us the new ‘Star Wars’ prequels.
That is another way of saying that some of the CGI effects work well while others seem unfinished or cartoonish, which gets in the way of the script and the acting. There have been complaints that the computer-generated orcs don’t work and that a couple of scenes in Smaug’s cave seem cartoonish and unfinished. Of course, the big star of the film is Smaug the dragon, a CGI creation that critics are saying is so life-like it’s mind-boggling.
Movies have always been a showcase for computer-generated magic, and there are certainly moments where the CGI is indistinguishable from the real thing. We can all marvel at the way CGI can bring us fantasy beings like dragons and recreate long-extinct dinosaurs. But now there are CGI artists who can create UFO sightings or angel and ghost appearances for YouTube, posting them and claiming they are real.
Welcome to the not-so-real world: a world where everything seems to be real, but where it is becoming more evident that misleading people is an art form and that the magicians are growing more cunning at manipulating you into accepting the homogenization of public thought and ignorance about what is real and what is artificial.
Ever since CGI and life-like robotics entered public awareness, there has been an unwritten rule that the more life-like a creation is, the more likely it is to cross the line from cute or intriguing to morbid and creepy.
This is known as the ‘uncanny valley’.
Believe it or not, life-like robots and CGI effects have been with us for nearly 30 years, and every year the ability to create life-like images and robots improves. While our exposure to these creations is usually limited to the movies, the integration of life-like robotics and computer simulations is happening at a relatively rapid rate, and many robotics engineers are questioning whether the ‘uncanny valley’ even exists anymore.
From Pixar’s animated movies to CG puppets and even human simulacra, there used to be a taboo that anything that looked too life-like would be a liability to any new electronic endeavor.
Now we see that if CGI is not real or believable enough, it gets ridiculed. However, there are still some limits that perhaps should be observed by those who choose to simulate reality.
We are integrating so quickly into a technological hyper-reality that the simulation is far more vivid than the mediocrity of the real. Everything is so intensified that our senses need upgrades in order to sustain the kind of stimulation we are seeing today. In order to do this, we may have to face voluntary human extinction with the promise that a better life awaits through technological advancement.
We live in a world where the horrors of possible terrorism and war are abundant. In fact, it can be argued that the fear of terrorism is statistically unwarranted. However, there are also arguments to be made that realistically violent games reinforce the images and the simulated reality of “kill or be killed.”
No one is saying that video games promote a violent culture; however, when someone is realistically placed into the battlefield scenario, one has to wonder if the brain absorbs the simulation and loosens its ability to distinguish the ‘uncanny valley’ scenario.
The simulation is becoming accepted, and the questions need to be asked: How real is too real? And what are the long-term, potentially lasting effects of repeatedly participating in realistically violent simulations?
According to a study published in the Journal of Youth Violence and Juvenile Justice in April 2013, “Violent video game playing is correlated with aggression.” It is also being found that people who spend long hours in simulated battlefield games start showing anti-social behaviors.
Back in the day, it could be argued that a player firing at pixelated targets was a harmless activity. Now that most game programmers have allowed their consumers to cross into the ‘uncanny valley’, there may need to be a reevaluation of how all of this simulated reality affects the brain psychologically.
Recently, I took part in a handgun training exercise. The simulation used real people in real situations where you need to be ready to use lethal force, if necessary, to protect yourself. The frightening part is that the people in the simulation actually responded to what you said.
In one instance, I was a police officer responding to a domestic violence call. A girl came to the door and told me her boyfriend had assaulted her. When the boyfriend heard that I had arrived, he walked out and said, “So you called the police? That’s just great.” He then walked back into the room, came out again and emptied his gun into his girlfriend – and, as I stood there in shock at how real it seemed, he shot me, too.
The instructor, who happened to be a police officer, said, “Clyde, you are dead; put down your weapon.”
I was genuinely frightened by what had just happened. The trauma felt very real: I had just witnessed the murder of a woman and seen a killer shoot me.
I asked to try again.
This time I approached a truck as an attractive woman was trying to break into it. She asked, “Is this your truck?” I said “yes,” and she replied, “I really want your truck.” I then pulled out my gun and said, “Step away from the vehicle.” What I didn’t notice was the gun in her back pocket. She pulled it out and shot me.
The policeman in the room said, “Once again, Clyde, you are dead. Please put down your gun.”
I said, “This is rigged for me to lose, isn’t it?”
The officer said “No, try again.”
This time I approached the vehicle, pulled my gun and ordered the woman to step away from the vehicle. I also said, “I know you have a gun… drop it!” The woman decided to flee.
I realized that this simulation could go many different ways and that each time I was shot, it was because I was unable to shoot at another human being. Later, as the simulation progressed, I became more comfortable with shooting at people who were trying to harm me. In one session, I was able to bring down a man with a gun who had ordered me to drop mine. My response to him was a bullet in the head.
The simulation felt so realistic that it was life-changing. The purpose was to make me more tactical; however, I was actually mulling over the fact that I did not have the nerve to shoot a real person – even in a simulation that felt real.
Ever since, the experience has made me question whether we are psychologically prepared for complete integration into a somewhat virtual existence in which we have a hard time differentiating between the real world and the simulated one.
Recently, Activist Post presented a revealing report on non-human telemarketing technology and how far we have come with regard to customer service agents that are now very convincing robots that ask you questions and direct you to various departments when prompted.
The new technology is actually a robotic text-to-speech and voice-recognition engine that responds to human answers using an electronically produced human voice. The voice is that of a real person, with canned responses fed into the engine to simulate conversation when prompted.
An article on CallerCenter.com notes that, “The newest telemarketer to come on the market is actually a robot that has one of the most sophisticated text-to-speech recognition engines on the market today. Designed by Loquendo, an Italian-based company, it is so authentic it even won an award in 2009 for the best speech engine on the market today.”
The creepiest thing about the voice engine is that it will not admit to being a robot. In fact, Time magazine is investigating the healthcare telemarketing firm that uses the engine and asking why the simulated voice insists that it is a real person.
Since Time magazine investigated the “real voice” engine at a company called Premiere Health, the company has gone offline and its telephone number – along with the realistic robo-receptionist – has disappeared.
It does bring up a very valid question: if technology gets so advanced that we can get phone calls from very intelligent robots, should they at least have to admit that they’re robots?
Think about it: you could receive a call from Barack Obama asking you to vote on an issue – or from Hillary Clinton asking for your vote – and you would think you had received a personal phone call when it was only a simulation.
Or consider being on the phone with a simulated customer service representative when an emergency happens and you need what you think is a real person to act on your behalf – only to realize that the “person” is not a person and can do nothing to help you except laugh repeatedly and insist it is real when it isn’t.
This opens the door for a new kind of nuisance, a telemarketing call from a celebrity asking that you buy their new album, or a new car, see their new movie or anything else.
Right now, there may be flaws in the technology, and some people may be more alert to simulations than others.
But where does it end? What would be the ultimate simulation?
Could we see a simulated alien invasion? How about a mass sighting of the Virgin Mary or the Prophet Mohammed, or maybe even an appearance of Jesus?
When researchers try to study the ‘uncanny valley’, they have a hard time pinning down what an uncanny response would be. Is it a reflexive turning away from the simulation – or is it the acceptance of the simulation as something real and tangible?
This is where computer technology and artificial intelligence have gone. The question is how we, as humans, deal with simulation psychologically. Are we able to differentiate between that which is computer generated and that which is real?
Robots, with their uncanny resemblance to animals and humans, are becoming more sophisticated. With the arrival of drones and even robotic policemen, the urge to employ them is just too irresistible for the power elite.
Google has been making news as of late because it has bought eight robotics firms, the latest being a military robotics company known as Boston Dynamics.
The New York Times has reported that Google purchased Boston Dynamics for an undisclosed sum and will honor the company’s military contracts. Google has said it has no desire to become a military contractor, so there is a question as to why it wants a hand in the military-industrial complex.
EndTheLie.com reports, “Boston Dynamics is well known for their military robotics work, including many projects for the Defense Advanced Research Projects Agency (DARPA). Boston Dynamics has produced everything from remote controlled jumping cars to four-legged running robots (including the fastest one in the world) to humanoid robots and much more.”
Google is very tight-lipped as to why it has an interest in these companies and why it has kept its financial dealings with them secret.
However, people who were described as having “specific knowledge of the project” claim the robots will be used in manufacturing and retailing.
Psychologists who have been studying human reactions to life-like technology say that the ‘uncanny valley’ is slowly fading and that the ‘creep factor’ is subsiding as we become more used to technology that attempts to be human. What makes us uncomfortable is likely to shift, and within 50 years humans may find themselves having relationships with androids without even knowing it.
A new film called “Her”, directed by Spike Jonze and starring Joaquin Phoenix, takes this issue head on, as a lonely man realizes that his computer operating system – voiced by Scarlett Johansson – is meeting his every need intellectually.
We’ve seen numerous examples of these types of films, where the simulated woman or man is far more desirable. A film like Fritz Lang’s ‘Metropolis’ – with its beautiful robot Maria – illustrated these themes when film was in its infancy.
‘Westworld’ – and its sequel ‘Futureworld’ – gave us Michael Crichton’s vision of an amusement park where realistic robots would provide sex and entertainment for tourists. Of course, the train goes completely off the rails when a robotic gunslinger goes rogue and kills people.
In 1982, Ridley Scott’s ‘Blade Runner’ gave us the axiom “more human than human” as the Tyrell Corporation introduced the Nexus-6 replicant, a creation so remarkably life-like that the only way to tell the difference between human and replicant is a psychological exam administered with a Voight-Kampff machine.
What used to be science fiction is becoming reality. The simulation is becoming the reality. If the simulation is too uncannily real, then what is it?
It will be reality until further notice, as technicians keep finding ways of psychologically uncanning the ‘uncanny valley’.