Plato’s Cave 2.0: Virtual Reality to decipher animal behavior
From animals in virtual realities we can learn how to tackle today's engineering challenges by adopting their alternative strategies.
Being imprisoned in Plato’s Cave, unable to tell the difference between reality and illusion, is one of the greatest thought experiments in epistemology. Individuals are tethered to a wall in the cave while projections of objects are presented to them. Having no experiences other than these illusions, they adopt them as their reality. The allegory of Plato’s Cave highlights the subjectivity of our reality and how it is shaped and limited by our perception of the world and our experiences. With such limits on our perception, how do we grasp parts of reality that are inaccessible to us? We can ponder whether our world is real or an illusion, whether what we perceive is what really surrounds us or whether we construct our world from impressions. Nowadays we can voluntarily enter Plato’s Cave by using Virtual Reality (VR).
VR, as most people know it, is an immersive experience designed for computer game enthusiasts. But VR has far wider applications. By reverse engineering Plato’s Cave and descending into these realistic illusions, we gain insights beyond what is ordinarily possible. In manufacturing, VR supports complex assembly; in healthcare, it enhances surgery and the training of new techniques. These are VR applications for humans, but what is even less commonly known is that this technology is increasingly being applied in the biological sciences. VR now enables scientists to study animals under controlled yet naturalistic conditions.
At the Max Planck Institute of Animal Behavior (MPIAB), Iain Couzin is leading a team of researchers in the Department of Collective Behavior who are employing VR to study decision-making in animals. Hemal Naik, PhD student in Couzin’s group, has recently published a review on the achievements made so far and the current challenges of studying animals in virtual environments.
Naik, who works on computer vision for animal behavior, is especially interested in what we can learn from animals and how it can be applied in engineering. “With these experiments we can address fundamental questions of how animals behave: respond, move, fly,” Naik says. “Nature as our greatest role model inspires the construction of extremely efficient micro-robots, smart sensor design, and self-navigating drones. To perform efficient biomimicry, we first need to understand the underlying mechanisms of animal behavior.”
An excellent example where bioinspiration can address engineering challenges is autonomous driving. Schools of fish and flocks of birds constantly engage in collective movement without colliding, yet achieving consistent traffic flow while avoiding collisions remains a major challenge in autonomous driving. How do animal groups coordinate movement free of collisions while remaining responsive and cohesive? To navigate busy streets without colliding with others, we must integrate a large amount of information: where will the others be as we move forward, that is, at what distance are they, and at what speed and in which direction are they moving? To calculate the distance of objects from an autonomous car, we apply the same mechanism we ourselves use to obtain depth cues: stereopsis. This requires two eyes or cameras, matching the objects present in both images, and triangulation. We do this instantaneously, but recognizing objects and processing these data in real time is a serious computational burden for machines. Perhaps our conception of how to address these kinds of problems is limited by our experience, and looking at animal collectives can inspire us to solve them differently.
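The triangulation behind stereopsis reduces to a single formula. As a minimal sketch of that geometry (assuming an idealized pinhole camera; the function name and the numbers are illustrative, not taken from the review):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from stereo triangulation: Z = f * B / d (pinhole camera model).

    focal_px     -- focal length expressed in pixels
    baseline_m   -- distance between the two eyes/cameras, in meters
    disparity_px -- horizontal shift of the matched object between the two images
    """
    if disparity_px <= 0:
        raise ValueError("object not matched, or effectively at infinity")
    return focal_px * baseline_m / disparity_px

# Cameras 12 cm apart with a 700 px focal length observe an object
# whose matched image points are 14 px apart:
print(stereo_depth(700, 0.12, 14))  # roughly 6.0 meters
```

The formula itself is trivial; the expensive part, as the article notes, is the matching step that produces the disparity for every object in every frame, in real time.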
Here we might get caught in Plato’s Cave ourselves: from our own perception we tend to infer what the world is like, but mechanisms of perception have evolved independently across species. Different organisms have evolved specific sensory apparatus to maximize fitness in their ecological niches. Just because we estimate distance and perceive light, sound, and vibration in a certain way, we might assume this to be the objective representation of the world. But there are more physical quantities that can be perceived, and different animals use different algorithms to solve the same task and extract the same information from their environment. For depth perception, these come in the form of non-stereoscopic depth cues such as motion parallax – triangulating the distance to an object using the observer’s own body movement – or shadows, vertical distance to the horizon, retinal image size, and texture. Understanding how these alternative depth-perception mechanisms perform may let us replace complex collision-avoidance calculations with much more efficient methods derived from animals.
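Motion parallax uses the same triangulation as stereopsis, but the baseline is created over time by the observer's own sideways movement rather than by a second eye. A minimal sketch, again assuming a pinhole camera and illustrative numbers:

```python
def parallax_depth(focal_px: float, observer_shift_m: float, image_shift_px: float) -> float:
    """Depth from motion parallax for a static object: Z = f * dx / dp,
    where dx is how far the observer moved sideways and dp is how far
    the object's image shifted as a result. One camera suffices; the
    observer's movement plays the role of the stereo baseline.
    """
    if image_shift_px == 0:
        raise ValueError("no parallax observed: object effectively at infinity")
    return focal_px * abs(observer_shift_m) / abs(image_shift_px)

# An observer shifting 2 cm sideways sees a static object's image move
# 4 px in a camera with a 600 px focal length:
print(parallax_depth(600, 0.02, 4))  # roughly 3.0 meters
```

The appeal of this cue is that it avoids the per-frame matching between two images entirely; the object only needs to be tracked in a single image stream while the observer moves.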
Being able to address such advanced questions with sophisticated state-of-the-art methods has come a long way. In the early days, when decision-making was studied in a biological laboratory, experiments could only be performed with a physical model of a natural stimulus and with direct intervention – a disturbance that most likely altered the animal’s behavior. In the 1990s, biologists started presenting stimuli to animals using projectors and screens. These initial computer-generated experiments lacked feedback, and there was no correction of perspective as the animal moved. In 1992, the CAVE setup gave rise to the first immersive experience in which an observer enters a room and is presented with images on the surrounding walls. Tracking the observer’s movements allowed the illusion to be corrected in real time. This method was successfully adapted for studying animal behavior, and the gain was enormous: animals could navigate freely in a seemingly natural environment without further disturbance.
With the introduction of VR systems for animals, additional obstacles popped up: as different species perceive the world differently, they need carefully tailored stimuli that cover their spectrum of environmental perception. Whereas we only see light in the range of 380–750 nm, birds additionally perceive shorter-wavelength UV light, bees detect polarized light, bats sense ultrasound, and some animals have no color vision or a different chromatic range. Certainly, presenting appropriate stimuli is limited by our understanding of their senses and by our capability to reproduce natural stimuli, inaccessible to us, that cover a species’ full spectrum of perception.
Engineering has solved several of these obstacles to providing input stimuli appropriate for organisms that sense differently than we do. Stimuli can be presented at a frequency high enough to avoid flickering, enlarged fields of view can be covered for animals with non-binocular vision, and tracking of different locomotion types such as swimming, running, and flying – synchronized with perspective-corrected stimulus delivery – is being achieved.
Yet VR remains limited to certain species: not all types of locomotion can be sufficiently tracked, and not all perceptual demands can be met. To overcome these difficulties, we are obliged to leave Plato’s Cave ourselves and broaden our horizons. That is why Naik calls for interdisciplinarity: “We are confronting many problems in human societies. Existing technology consumes too much energy, requires too much computation or space and we have been working hard to try to solve these issues. I believe many solutions are given by nature already and we have barely started exploiting these. VR-like tools will allow us to tap into efficient and sustainable tech solutions. In order to build such tools, we need experts from biology, computer science and other engineering fields to combine their efforts.”
About the author of the recent review
Hemal Naik has gathered the key developments achieved in virtual reality for animals – a field that might be completely alien to the VR community. The review aims to bridge the gap between biologists and the tech world.
During his PhD, Naik became familiar with the biologists’ point of view. He guides the tech-minded reader firmly through the approaches and efforts made in biology so far and encourages interdisciplinarity to meet the exciting challenges to come.
Written by Bianca Schell, Department of Collective Behavior