Learn More about the Book
This dissertation, "The Perception of Object Motion During Self-motion" by Diederick Christian Niehorster, was obtained from The University of Hong Kong (Pokfulam, Hong Kong) and is being sold pursuant to the Creative Commons Attribution 3.0 Hong Kong License. The content of this dissertation has not been altered in any way; only the formatting has been changed to facilitate printing and reading. All rights not granted by the above license are retained by the author.
Abstract:
When we stand still and do not move our eyes and head, the motion of an object in the world, or the absence thereof, is directly given by the motion or quiescence of the retinal image. Self-motion through the world, however, complicates this retinal image. During self-motion, the whole retinal image undergoes coherent global motion, called optic flow. Self-motion therefore causes the retinal motion of objects moving in the world to be confounded by a motion component due to self-motion. How, then, do we perceive the motion of an object in the world when we ourselves are also moving?
Although non-visual information about self-motion, such as that provided by efference copies of motor commands and vestibular stimulation, might play a role in this ability, it has recently been shown that the brain possesses a purely visual mechanism that underlies scene-relative object motion perception during self-motion. According to the flow parsing hypothesis developed by Rushton and Warren (2005; Warren & Rushton, 2007, 2009b), the brain uses its sensitivity to optic flow to detect and globally remove the retinal motion due to self-motion, thereby recovering the scene-relative motion of objects.
Research into this perceptual ability has so far been of a qualitative nature. In this thesis, I therefore develop a retinal motion nulling paradigm to measure the gain with which the flow parsing mechanism uses optic flow to remove the self-motion component from an object's retinal motion. I use this paradigm to investigate how accurate scene-relative object motion perception during self-motion can be when based on visual information alone, whether the flow parsing process depends on a percept of the direction of self-motion, and how flow parsing is tuned, i.e., how it is modulated by changes in various stimulus aspects.
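For illustration only (the sketch below is hypothetical and is not taken from the dissertation or this listing), the flow parsing gain in such a nulling paradigm can be thought of as the fraction of the self-motion component that is removed from the object's retinal motion; the function name and example numbers are invented, and all velocities are assumed to be retinal speeds in deg/s.

# Under the flow parsing hypothesis, perceived scene-relative motion is
# roughly: v_perceived = v_retinal - gain * v_self, where v_self is the
# retinal motion component caused by self-motion and a gain of 1 would
# mean complete removal of that component.
def flow_parsing_gain(nulling_velocity, self_motion_component):
    """Gain implied by a nulling setting: the object velocity needed to make
    the object appear stationary in the scene, divided by the retinal motion
    component due to self-motion (both in deg/s)."""
    return nulling_velocity / self_motion_component

# Hypothetical example: nulling a 2.5 deg/s self-motion component requires
# 2.0 deg/s of object motion, implying a gain of 0.8 (incomplete flow parsing).
print(flow_parsing_gain(2.0, 2.5))  # 0.8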
The results reveal that although adding monocular or binocular depth information to the display to precisely specify the moving object's 3D position in the scene improved the accuracy of flow parsing, the flow parsing gain never reached the level required by the scene geometry. Furthermore, the flow parsing gain was lower at higher eccentricities from the focus of expansion in the flow field and was strongly modulated by changes in the angle between the self-motion and object motion components in the retinal motion of the moving object, the speeds of these components, and the density of the flow field. Lastly, flow parsing was not affected by illusory changes in the perceived direction of self-motion.
In conclusion, visual information alone is not sufficient for accurate perception of scene-relative object motion during self-motion. Furthermore, flow parsing takes the 3D position of the moving object in the scene into account and is not a uniform global subtraction process. The observed tuning characteristics are different from those of local perceived motion interactions, providing evidence that flow parsing is a separate process from these local motion interactions.
Finally, flow parsing does not depend on a prior percept of self-motion direction and instead directly uses the input retinal motion to construct percepts of scene-relative object motion during self-motion.
DOI: 10.5353/th_b5177318
Subjects:
Motion perception (Vision)