Crumb Trail
     an impermanent travelogue
email: guesswho @ guesswhere.com

Tuesday, February 21, 2006
 

We don't always believe what our eyes tell us.

Work in computer vision has shown that viewing a scene with two eyes, or walking around it, provides enough information to calculate its 3D structure. Finding out how far away things are by this method, however, requires knowledge of the separation of the eyes or the distance walked. There is good evidence that the human visual system uses both these pieces of information when making judgments of 3D size, shape, and distance.
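To see why that baseline knowledge matters, here is a minimal sketch of textbook stereo triangulation under a pinhole-camera model (my own illustration, not the study's analysis; the focal length, eye separation, and disparity values are made up):

```python
# Minimal sketch of textbook stereo triangulation (pinhole model).
# All numbers are illustrative, not taken from the study.

def depth_from_disparity(disparity, baseline, focal_length):
    """Absolute depth Z = f * B / d for a rectified stereo pair."""
    return focal_length * baseline / disparity

focal_length = 0.017    # metres, roughly eye-like (assumed)
true_baseline = 0.065   # metres between the eyes (assumed)
disparity = 0.0005      # metres on the image plane (assumed)

# With the correct baseline, triangulation recovers the true depth.
print(depth_from_disparity(disparity, true_baseline, focal_length))      # about 2.2 m

# Misjudge the baseline and every absolute depth scales with the error,
# even though relative depths (ratios between objects) are unchanged.
print(depth_from_disparity(disparity, true_baseline * 4, focal_length))  # about 8.8 m
```

The point of the sketch: disparity alone fixes the scene only up to scale, and the assumed baseline supplies the missing scale factor.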

In the new work, performed at the University of Oxford, Dr. Andrew Glennerster and colleagues use an immersive virtual-reality display to show that the human visual system cannot be carrying out the same type of 3D reconstruction that is used in computer vision. People experiencing the virtual-reality display failed to notice when the virtual scene around them quadrupled in size as they walked around, and, as a result, they made gross errors in judging the size of objects. Intriguingly, these results imply that observers are more willing to adjust their estimate of the separation between the eyes or the distance walked than to accept that the scene around them has changed in size. More broadly, these findings mark a significant shift in the debate about the way in which the brain forms a stable representation of the world--that is, the world as it is perceived to exist independent of head and eye movements.
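One way to see why the quadrupled room can go unnoticed: under the same simple pinhole model as above (again my illustration, not the authors' analysis), scaling every distance in the scene by a factor k produces exactly the images you would get from the original scene viewed with a baseline of B/k. The retinal data alone cannot break that tie; only a trusted estimate of eye separation or distance walked can, and the experiment suggests observers quietly discount that estimate rather than give up the assumed stability of the room.

```python
# Scale-ambiguity sketch under a pinhole model (illustrative values, not from the paper).
# Angular size and disparity of a single object, the two cues available to the observer.

def angular_size(object_size, distance):
    return object_size / distance                     # small-angle approximation

def disparity(distance, baseline, focal_length=0.017):
    return focal_length * baseline / distance

baseline = 0.065          # assumed interocular separation, metres
size, dist = 0.5, 2.0     # a 0.5 m object at 2 m (made-up values)
k = 4                     # scene expansion factor, as in the experiment

# Interpretation A: the scene quadrupled, baseline as assumed.
expanded = (angular_size(size * k, dist * k), disparity(dist * k, baseline))

# Interpretation B: the scene is unchanged, but the baseline is a quarter of the assumed value.
shrunk_baseline = (angular_size(size, dist), disparity(dist, baseline / k))

print(expanded == shrunk_baseline)   # True: the two interpretations give identical image cues
```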

If things appeared to quadruple in size as I walked around I'd lie down and rest, assuming that my systems were out of whack since things like that don't happen. Often.

posted by back40 | 2/21/2006 12:27:00 PM
