Provides contextual information beyond the central 2–5°
Contributes to perception of own limb movement during actions
Essential for optical flow:
Pattern of motion across the retina created by self-movement
Guides posture, locomotion, and orientation in space
Supports navigation through the environment and coordination with moving objects or people
Figure 18: Peripheral Vision
0.23 👓 Two Visual Systems
Vision-for-Perception (Ventral stream)
Pathway: visual cortex → temporal lobe
Fine analysis of visual scene: form, color, features
Supports object recognition and description
Information is typically conscious
Vision-for-Action (Dorsal stream)
Pathway: visual cortex → posterior parietal lobe
Provides spatial characteristics of objects and environment
Guides movement planning and online control
Processing often occurs non-consciously
Streams operate in parallel → perception and action are supported simultaneously
Figure 19: Two Visual Systems
0.24 🔗 Perception–Action Coupling
Perceptual information and motor actions are tightly connected
Visual perception continuously informs movement parameters
Eye–hand coordination as a classic example:
Spatial and temporal features of gaze align with limb kinematics
Point of gaze typically arrives at the target before the hand
Coupling ensures movements are adjusted online to match environmental demands
Figure 20: Perception–Action Coupling
0.25 ⏱️ Online Visual Corrections: Time Required
Experimental approach
Compare rapid aiming when target is visible vs. occluded after movement onset
If vision is available, corrections can be made mid-flight
If vision is removed, errors increase
Time window for corrections
Visual feedback requires ~100–160 ms to process
Corrections possible only if movement duration allows this window
Implications
Fast, ballistic movements often too brief for corrections
Slower or sustained movements benefit from visual feedback adjustments
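The timing logic above can be sketched in a few lines of Python. This is a minimal illustration, not an experimental model: the 100–160 ms latency comes from the notes, while the function name and the example movement durations are hypothetical.

```python
# Illustrative sketch: can a rapid aiming movement use visual feedback?
# The ~100-160 ms visual processing latency is taken from the notes;
# the function name and example durations below are hypothetical.

VISUAL_FEEDBACK_LATENCY_MS = (100, 160)  # approximate processing window

def correction_possible(movement_duration_ms: float,
                        latency_ms: float = VISUAL_FEEDBACK_LATENCY_MS[1]) -> bool:
    """A correction can land only if the movement outlasts the feedback latency."""
    return movement_duration_ms > latency_ms

# A ballistic 120 ms dart throw vs. a slower 500 ms reach:
print(correction_possible(120))   # False: too brief for a vision-based correction
print(correction_possible(500))   # True: feedback can adjust the reach mid-flight
```

Using the conservative (160 ms) end of the window makes the point starkly: a 120 ms ballistic movement ends before visual feedback can influence it, so it must rely on feedforward planning.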
Figure 21: Online Visual Corrections
0.26 ⏳ Time-to-Contact (τ)
In interception and avoidance tasks, vision specifies when to initiate action
Optical variable tau (τ):
Derived from the rate of expansion of an object’s image on the retina
Provides a direct estimate of time remaining until contact
At a critical expansion rate, action is automatically triggered (non-conscious)
Allows precise movement initiation in dynamic contexts:
Catching or hitting moving objects
Avoiding oncoming obstacles
Timing steps or braking when approaching surfaces or vehicles
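The τ variable can be sketched numerically. The sketch below (variable names and the ball scenario are illustrative) shows the key property from the notes: the ratio of retinal angular size to its rate of expansion recovers time-to-contact (distance ÷ speed) without distance or speed being known individually.

```python
# Minimal sketch of the tau (τ) heuristic: time-to-contact estimated purely
# from the optical expansion of an object's retinal image.

def tau(theta: float, theta_dot: float) -> float:
    """τ = θ / (dθ/dt): angular image size divided by its rate of expansion."""
    return theta / theta_dot

# Simulate a 0.5 m ball approaching at 10 m/s from 5 m away.
size, distance, speed = 0.5, 5.0, 10.0
theta = size / distance                  # small-angle approximation (radians)
theta_dot = size * speed / distance**2   # time derivative of size/distance
print(tau(theta, theta_dot))             # 0.5 s, i.e. exactly distance / speed
```

Because τ falls directly out of the optic array, an observer can initiate a catch or an avoidance step when τ drops below a critical value, with no explicit computation of distance or velocity, which is consistent with the non-conscious triggering described above.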
Figure 22: Time-to-Contact (τ)
✅ Major Takeaways (Ch. 06)
Touch & Proprioception (Obj. 1–4)
✋ Tactile feedback supports accuracy, timing, consistency, and force regulation; anesthetizing fingertips degrades performance
🧩 Proprioceptors (muscle spindles, GTOs, joint receptors) signal length/velocity, tension, and joint position → essential for posture, coordination, and movement correction
🧪 Classic methods: deafferentation (loss of afferents) and tendon vibration (illusory lengthening) reveal proprioception’s role
🏥 Training & rehab: task-specific practice that enriches cutaneous + proprioceptive feedback improves skill and recovery
Vision & Motor Control (Obj. 5–8)
🧠 Pathways: Retina → LGN → V1 → ventral (perception) and dorsal (action) streams running in parallel
🔭 Monocular vs. binocular: binocular depth boosts 3D tasks (reach–grasp, navigation, interception)
🎯 Central vs. peripheral: central = fine detail/regulatory conditions; peripheral = context, limb motion, optical flow for posture/locomotion
🔗 Perception–action coupling: gaze timing aligns with limb kinematics; eyes arrive before the hand
⏱️ Online corrections: vision-driven adjustments require ~100–160 ms; fast, ballistic movements rely more on feedforward control
⌛ Time-to-contact (τ): retinal image expansion rate specifies when to initiate action, often non-consciously
So what?
The CNS integrates touch, proprioception, and vision to plan, guide, and evaluate movement
Effective instruction/rehab leverages the right sensory cues for the task (detail vs. context; depth; feedback timing)
Designing practice that matches sensory demands (visibility, textures, loads, speeds) accelerates learning and recovery
0.27 💡 Points to the Practitioner
🔍 Assess Sensory Deficits
Movement problems may stem from touch, proprioception, or vision deficits
Examples:
Poor balance post-stroke may indicate proprioceptive loss, not just weakness
Gait instability could reflect somatosensory rather than motor deficits
👁️ Use Vision as Compensatory Strategy
Clients rely on vision to substitute for compromised sensory systems
PT Applications:
Mirror feedback for posture training
Visual targets for reaching exercises
Gait training with floor markers/visual cues
🎯 Optimize Visual Attention
Direct central vision appropriately for motor tasks
Clinical Examples:
“Look at the target” during functional reaching
Eye–hand coordination in ADL retraining
Visual tracking exercises for sports return
⏰ Consider Processing Time
Corrections require sufficient time for sensory-motor integration
Rehabilitation Applications:
Slow movement speeds for neurological clients
Adequate reaction time in fall prevention training
Progressive speed increases in sports rehab
0.28 🏥 Additional Clinical Considerations
🩺 Special Populations
Diabetic neuropathy → tactile feedback loss affecting balance/gait
Joint replacement → altered proprioception requiring retraining
Parkinson’s disease → reduced proprioceptive processing
Multiple sclerosis → variable sensory deficits affecting motor control
🎯 Advanced Training Techniques
Closed-eye balance training → forces proprioceptive reliance
Dual-task training → divided attention challenges
Sensory integration exercises → combine multiple sensory inputs
Perturbation training → unexpected challenges to reactive systems