Is it small or just far away?
Virtual-reality room shows how we can be blind to the size of our environment.
We've all heard that seeing is believing, but scientists know it can work the other way around: what we believe shapes what we see.
Researchers have constructed strange environments in order to pick apart how our eyes and brain work together to present us with an understanding of what's going on around us. Classic experiments, which often turn up at science museums, show that we can be easily fooled into believing objects are smaller, bigger, lighter or darker than they really are, just by putting them against a different backdrop.
Now Andrew Glennerster and his colleagues at Oxford University, UK, have constructed a virtual reality in which people don't notice that the room they are in has expanded. This fools them into thinking that two objects, one seen before and the other after the expansion, are the same size, when in fact one is several times larger.
Volunteers were asked to don goggles that put them in a plain, virtual room. They were then asked to walk to their right, so that they could view the other side of the room. While they were moving, the virtual room ballooned to four times its size, and the view was adjusted so that as objects became bigger, they also moved further away.
This clever tweaking of the environment means that static photographs taken of both sides of the room would look much the same.
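The geometry behind this trick can be sketched in a few lines. Under a simple pinhole (perspective) projection, scaling every point in a scene by the same factor about the viewpoint leaves its projected image position unchanged, so the expanded room photographs identically to the original. This is a minimal illustration of that principle, not the researchers' actual rendering code; the function names and values are made up for the example.

```python
def project(point, focal_length=1.0):
    """Pinhole projection of a 3D point (x, y, z) onto the image plane."""
    x, y, z = point
    return (focal_length * x / z, focal_length * y / z)

cube_corner = (0.5, 0.25, 2.0)            # a point in the original room
scale = 4                                  # the room balloons four-fold
expanded = tuple(scale * c for c in cube_corner)

print(project(cube_corner))   # (0.25, 0.125)
print(project(expanded))      # (0.25, 0.125) -- same image position
```

Because the object's size and its distance grow by the same factor, a single static view carries no information about the change; only cues such as stereo disparity and motion parallax could, in principle, reveal it.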
But the people in the virtual room weren't static: they could turn their heads and walk about. So their depth perception and cues from motion should have helped them to work out the change in size.
But they didn't. "They don't notice that anything odd is happening," says Glennerster. When volunteers did report something odd, it was that their strides seemed to be getting longer or shorter as they walked to and fro: they were more apt to attribute the strangeness to changes in their own walking than to a change in the size of the room.
To quantify the illusion, the researchers placed a cube in mid-air on the first side of the room and asked participants to remember how big it was. The subjects then compared it with a second cube in the expanded environment. The volunteers consistently misjudged the size of the second cube, by a factor of two to four. The team reports the findings in Current Biology [1].
To judge the size and distance of objects, people often rely on depth perception, which comes partly from comparing the slightly different views arriving at each eye. But this experiment shows that stereoscopic vision isn't always good at extracting such information.
Visual specialist Michael Morgan from City University in London, UK, says this "neat" experiment calls into question what binocular vision is really for. "The study indicates that its function is really not to estimate details of far-away objects," says Morgan. "It's for near-vision and for reaching for objects in near-vision."
Experiments into visual illusions help scientists to understand human perception, and also to construct computer algorithms that may help robots to process visual information. "Humans do something different from what computers do," says Glennerster. Knowing these differences can help in creating more compelling virtual-reality environments.
Studies like this could help towards improving everything from computer games to the technologies by which doctors remotely operate surgical instruments.
1. Glennerster, A. et al. Current Biology 16, 428-432; DOI: 10.1016/j.cub.2006.01.019 (2006).