A new webcam has been launched that has the eerily accurate appearance of a human eye. Would you want it to watch over you?
There are plenty of PC users who are already perturbed by the mere presence of a webcam, a camera that seems to be watching you while you sit in front of your computer screen. With worries about internet security becoming ever more pressing, it’s only natural to feel a bit uneasy about this potential breach of privacy. On the other hand, what could be more comforting than the familiarity of a human face?
You might presume that this was the sympathetic thought process behind the creation of this creepy monstrosity, but the reality is that a webcam resembling a real human eye – complete with a robotic blinking motion – is the stuff of nightmares. Just check out the video above, if you dare.
Not only does the product itself look like a rejected prop from a grisly David Cronenberg film, but the intertitles only make the atmosphere creepier: “Imagine Eyecam waking up on its own”; “Imagine Eyecam observing every one of your steps”; “Imagine Eyecam embodying someone else”. Oh well, I wasn’t planning on getting any sleep tonight anyway…
Despite the promo video having more in common with a trailer for a new series of Black Mirror than the bland marketing material we’re used to seeing, this device really does exist. The Eyecam, or Anthropomorphic Webcam, was created by Marc Teyssier at the Saarland Human-Computer Interaction Lab, and its purpose is not to give you nightmares (that’s just the inevitable side-effect), but rather to “speculate on the past, present and future of technology”.
More specifically, it’s to make you more aware of the fact that we are constantly under surveillance. As the video explains: “Sensing devices are everywhere, up to the point where we become unaware of their presence”. You can be sure that nobody would forget a disembodied human eye perching on their workstation, so by that metric it’s a roaring success, even if watching it in action has sent me on a trip to the uncanny valley that’s taken years off my life.
The questions that this installation was designed to provoke about the wider problem of surveillance include:
Should the device be transparent and invisible to the user?
What are the next social and ethical challenges of IoT?
What is the balance between mediation and intrusion?
How can we design for the right amount of agency in smart sensing devices?
How can we reinforce privacy and show the user they are being watched?
How can we design smart devices to be present where needed, but respectfully absent when not?
Whatever the solutions are to these soul-searching conundrums of our internet age, can we all please agree that decontextualised human body parts should not be one of them? Please?