Title: Sit, Stand and Sway: Postural State Modulates Visual Influence on Sway
Journal: Experimental Brain Research
Author information:
Kristen De Melo1, Julia De Oliveria1, and Lori Ann Vallis1
1Department of Human Health Sciences, University of Guelph, Guelph ON, Canada
Corresponding author information:
Lori Ann Vallis, Ph.D.
Human Health Sciences, University of Guelph
50 Stone Road East, Guelph, Ontario, CANADA N1G 2W1
Tel: 1-519-824-4120 ext. 54589
E-mail: lvallis@uoguelph.ca
Abstract (Words: 242/250 max)
Human locomotion and navigation involve an interrelationship between individuals, tasks, and the environment. For example, when base of support is challenged, visual input helps fine-tune postural control to maintain balance. Perception-Action (P-A) theory emphasizes the critical feedforward role of optic flow in locomotor control and adaptation. While P-A coupling has been previously studied, the nature of this relationship, sex differences, and its impact on movement control, particularly muscle activity, require further investigation. Participants (N = 24; 23 ± 3.4 years) were instrumented with upper-body kinematic markers, and right-leg muscle activation was recorded using electromyography (EMG). Three postures were assumed (seated with feet planted, seated with feet dangling, standing) while viewing three virtual optic flow fields (hallway, boardwalk, trail) on a projector screen, along with a no video control condition. A weighted center of mass (COMHTP) model assessed displacement and acceleration range, and segmental analyses calculated trunk and head range of motion (ROM). EMG signals were filtered, normalized to the no video condition, and integrated using trapezoidal methods. Significant vision × posture interactions were found for COMHTP displacement in both sexes, with greater sway during optic flow conditions, particularly when standing. Posture consistently influenced COMHTP outcomes, with standing eliciting greater displacement and acceleration. Head and trunk ROM showed selective interactions in pitch and yaw. EMG results indicated posture-related increases in vastus lateralis (females) and gastrocnemius (males). Findings suggest visual cue complexity and posture interact to influence balance-related motor behaviours and highlight the importance of considering sex in visuomotor research.
Keywords:
perception-action
vision
balance
posture
muscle activation
motion analysis
Introduction
To understand human interaction and locomotor navigation strategies, it is essential to explore the interrelationships that exist between the task, the environment, and the individual; this offers the ability to predict human motion in relation to the surroundings (Gibson, 1966). The Perception-Action (P-A) Theory explores the constraints between these three components for movement control, effective training, and skill acquisition. Past research in this area suggests that motor control strategies are optimized for individuals to locomote through changing environments, with vision playing a critical role in successful movements (Heft, 2001). Two constructs of the P-A theory that are thought to evolve together in mammals include visual perception, describing the interpretation of visual information by the brain, and action, which can be defined as something being done by the system (Gibson, 1966). These components, although independent constructs, are thought to be in a close interrelationship from birth, developing simultaneously as new visual experiences are acquired (e.g., Barbu-Roth et al., 2009; Barbu-Roth et al., 2014); this allows for adaptive motor strategies in changing environments (Warren, 2006).
A vital component of this interrelationship is optic flow, which can be defined as the pattern of motion of the visual environment caused by the movement between the observer and the visual scene (Warren et al., 2001). Individuals develop an understanding of self-motion while experiencing these flow patterns, as the visual system and associated brain areas capture information about the body position within the physical environment (Marigold, 2008). This visual information is subsequently decoded into flow patterns, which yield rotational and translational movements contributing to the perception of self-motion. Optic flow also provides critical information pertaining to motor control strategies such as speed or heading direction (e.g., Jörges et al., 2024; Warren & Hannon, 1988), depth perception (e.g., Liu et al., 2023), time to contact with obstacles (e.g., Lee & Aronson, 1974; Ramirez & López-Moliner, 2020), and environmental structure and layout (e.g., object placement and surface orientation, Le et al., 2018). Collectively, information captured in optical flow patterns can serve as a means for guiding locomotion. While other sensory systems (i.e., vestibular and somatosensory) also play a role in locomotor control, visual information, and especially optic flow, is unique in providing feedforward control information to plan and guide movements within dynamic environments (Warren, 2006).
The effects of optic flow on movement in real-world environments have also been observed in virtually simulated scenes (e.g., Berthoz et al., 1979; Chou et al., 2009; Warren et al., 2001). Within these virtual environments, providing optical flow cues (i.e., textured motion, radial expansion and contraction, objects) can elicit self-motion perception in humans in the absence of movement within the physical environment. An additional similarity between real-world and virtual environments is the ability for optic flow to elicit central nervous system (CNS) compensatory responses in reaction to visual stimuli. For example, participants in Antley and Slater's study (2010) exhibited an increase in preparatory postural muscle activation, measured via electromyography (EMG), when exposed to scenes with increased visual perceptual demands. Throughout the literature, in both real-world and virtual environments, we see that changing visual environments with differing stimuli (i.e., visually rich [e.g., complex and detailed visual input, such as a forest trail] versus less visually rich environments [e.g., sparse or minimal visual stimuli, such as a white-walled room]) may play differing roles in the perception of self-motion (e.g., Jörges & Harris, 2021). Neuroscientific evidence suggests that these differences may be underpinned by activity in visual-vestibular integration areas, such as the ventral intraparietal area and medial superior temporal area, which contribute to spatial orientation and process optic flow (Bremmer et al., 2002; Smith et al., 2012). Animal studies have further demonstrated that circuits essential for spatial navigation, including the hippocampal place and grid cells, are able to adapt dynamically to environmental salience and changes in complexity (Chen et al., 2019).
Together, these neuroscientific findings support the notion that visual scene characteristics contribute to the modulation of neural processing related to self-motion, which may, in turn, reflect evolutionary adaptations made to help interpret environmental affordances. While important scientific findings have been reported in this area of study, the behavioural consequences of presenting visual cues with varying richness, particularly within postural control and locomotor patterns, remain underexplored.
Another important consideration in the influence of optic flow on movement control is the impact of the adopted base of support (BoS) while experiencing visual information. Previous research has shown that optic flow influences postural sway in both the mediolateral and anterior-posterior directions (Phu et al., 2023); it appears that a larger adopted BoS (e.g., seated versus standing positions) results in improved postural stability (Nam et al., 2017). In situations that challenge our BoS (e.g., changing tasks or environmental conditions), visual information can provide crucial body position information that may enable the fine-tuning of postural control (Nam et al., 2017), i.e., a return to perceived 'stability'. However, if visual information is presented in a way that conflicts with other sensory information (i.e., proprioception), postural stability may become disturbed even in seemingly stable BoS positions (Wang et al., 2022). If an individual feels unstable or uncomfortable within their BoS, perhaps due to conflicting or reduced proprioceptive information, their postural sway may increase (Mochizuki et al., 2006); this further emphasizes the importance of visual flow in the generation of motor control strategies.
While P-A theory research has elucidated how visual perception and action exist in an interrelationship to control movement effectively, much of the support for this theory has been derived from studies assessing newborn (e.g., Barbu-Roth et al., 2009; Barbu-Roth et al., 2014; Forma et al., 2018) and infant (e.g., Bertenthal et al., 1997; Lee & Aronson, 1974) stages of development. We have discussed some of the available literature surrounding the role of optic flow on movement into adulthood; however, the strength of this relationship remains minimally supported, especially at a neurological level. More specifically, very little is understood about the interplay between changing visual cue complexity and body position in space, and whether BoS will influence how visual information is integrated to inform muscle activation patterns involved in maintaining postural control. Additionally, much of the literature explores the effects of optic flow on visuomotor control using aggregated male and female data (e.g., Chou et al., 2009; Wade et al., 1995; Yoo et al., 2014), overlooking potential sex-based differences in sensory processing and motor response. There is some evidence to suggest that males and females experience optical flow patterns differently, which could influence downstream motor outcomes (e.g., Raffi et al., 2014). Some differences between the sexes may stem from underlying variations in visual attention, spatial orientation strategies, or sensory weighting preferences. For example, female participants have been shown to rely more heavily on visual cues when executing visually guided movements (e.g., Gorbet et al., 2007; Barnett-Cowan et al., 2010) compared to their male counterparts. These findings underscore the importance of considering sex within optic flow studies, as visual scene manipulations may not exert uniform effects across the sexes.
This study aims to evaluate changes in muscle activity and postural sway, in the absence of overt movement, during exposure to four projected visual environments (i.e., an indoor hallway, an outdoor boardwalk, an outdoor hiking trail, and a no video control) in three different body postures (i.e., seated with feet planted, seated with feet dangling, and standing with feet shoulder-width apart) in both male and female participants. Our hypotheses depended on the condition presented. First, we expected that exposure to any of the three visually complex environments, in any postural condition, would result in greater muscle activation and postural sway compared to the no video condition due to P-A coupling (i.e., more visual information to inform self-motion; Marigold, 2008). Additionally, we anticipated that the greatest muscle activity and postural sway would occur when standing and viewing the hiking trail video, due to the small BoS paired with the increased visual perception demands of this environment (e.g., trees, branches, and rocks), as well as the perceived need to prepare for navigation through a more complex pathway (e.g., Antley & Slater, 2010). Finally, we predicted that the seated posture with feet dangling would result in greater muscle activation and increased postural sway compared to the seated posture with feet planted, due to reduced proprioceptive information from the lower limbs (Wang et al., 2022). Without reliable somatosensory input from stable ground contact, we hypothesized that the postural response would resemble that on an unstable surface (i.e., foam; Vuillerme & Pinsault, 2007), where the body increases neuromuscular effort to compensate, reflected in greater muscle activation and increased sway.
Methods
Participants
A total of 24 young adults (12 males, age 23 ± 3.4 years) provided written consent to participate in this protocol following approval by the University of Guelph Natural - Physical and Engineering Research Ethics Board (REB#-23-01-002). Participants were recruited via posters, emails, and word-of-mouth referrals. Before eligibility to participate was confirmed, participants were required to complete online questionnaires assessing their individual features and health status. The questionnaires included the Mindful Attention Awareness Scale – Lapses Only (MAAS-Lo; Carriere et al., 2008) and the Adult ADHD Self-Report Scale (ASRS-v1.1; Adler et al., 2006). Inclusion criteria included right-hand dominance (to avoid laterality effects, confirmed via Edinburgh Handedness Inventory; Oldfield, 1971), normal or corrected-to-normal vision, and an age between 18 and 35 years. Due to the nature of the current study, individuals who regularly experienced vertigo or dizziness, had any self-reported musculoskeletal or neurological disorders affecting their muscle activity, or had skin sensitivities to adhesives, including medical-grade tape, were excluded from participation. Participants wore comfortable, fitted clothing with exposed legs (e.g., shorts) as well as their preferred walking shoes throughout the protocol.
Experimental Setup
On the day of the in-person laboratory visit, and prior to the commencement of the protocol, participants were first provided with verbal instructions describing what to anticipate throughout the study, and verbal informed consent was reconfirmed. Four muscle bellies of the right leg (i.e., gastrocnemius, tibialis anterior, vastus lateralis, and biceps femoris) were then identified using contraction and palpation procedures (Hermens et al., 1999). The skin above each muscle belly was then prepped by shaving and rubbing the area with isopropyl alcohol. Four EMG bipolar electrode sets (Ambu 32 mm Sensors, Kego, Ontario, CA) were then fitted to the identified areas (Fig. 1B). A ground electrode was placed over the right patella or right fibular head (dependent on the participant's physique) and taped to ensure the electrode was secure. EMG was measured via a Bortec EMG System (1000 Hz; Bortec, Alberta, CA). Following EMG preparation, participants were fitted with 27 retroreflective markers placed at anatomical landmarks following the OptiTrack Conventional Upper Body skeleton (Motive software; Version 2.2.0, Oregon, USA). An additional two markers were placed on the iliac crests (Fig. 1B). A surrounding 10-camera kinematic system (100 Hz; OptiTrack, Natural Point, Oregon, USA) tracked the movement of these markers. In addition to the above equipment, the experimental setup required a projection screen (3.30 m x 2.50 m, spanning ceiling to floor in the lab), an overhead projector (HY300 Pro, QRJ Eokeiy, China; connected via computer), over-ear headphones (Sony WH-CH720N, Tokyo, Japan), and a height-adjustable stool with no backrest.
Experimental Procedure
The protocol started by asking participants to stand in front of and face the white projector screen. From this position, participants completed nine quiet trials of 30 seconds each (three trials in each of the three body postures). For the standing posture, participants were told to stand with their feet shoulder-width apart and hands by their sides. For the seated with feet dangling posture, a stool was adjusted to a height where participants' feet could not reach the ground; they were told to sit upright, with hands resting on the side of the chair, and not to touch the ground with their feet during the trials. In the seated with feet planted position, participants were told to sit upright, with hands resting on the side of the stool and feet firmly planted on the ground. During quiet trials, participants were instructed to remain stationary and look straight ahead; this allowed for baseline muscle activity and postural sway to be captured. Once quiet trials were collected, maximal head range of motion (ROM) trials were collected in the standing posture, which involved participants completing yaw and pitch head movements within one trial (approximately one minute). This served as a baseline measure for head ROM.
To begin the experimental trials, participants were placed again in front of the screen (approximately 1.42 m from centre) and were surrounded by black curtains. All overhead lighting was off to minimize distractions from the surrounding laboratory environment. Participants had a total visual angle of approximately 90° in front of the screen. Headphones were placed over the ears and ambient sound (appropriate for the visual environment being displayed, e.g., ventilation system low hum in hallway; bird song on hiking trail) was played to reduce laboratory background noises and create a more immersive sensation. Each trial consisted of a one-minute video that would stream in front of the participant via the computer-based projection. The videos were pre-recorded on an iPhone (iPhone 15 Pro, Apple, California, USA) through first-person perspective in three environments; this included an indoor hallway of a university building, an outdoor boardwalk, and an outside hiking trail through a conservation area (Fig. 1C). This was meant to simulate real-world environments within the virtual lab-setting.
A total of 45 experimental trials were collected. In the first 15 trials, individuals watched each type of video five times in the seated with feet planted posture. For the next 15 trials, individuals watched each type of video five times in the seated with feet dangling posture. For the final 15 trials, individuals watched each type of video five times in the standing posture. The postural order was block randomized between participants while the videos consistently played in the same order. During trials, participants were instructed to adopt a comfortable hand position for the duration of the trials, remain facing forward, and watch the video in front of them. While the videos were one minute in duration, kinematic and EMG data were sampled for only 30 seconds, approximately at the midway point of the video, to capture a period where the participants were fully immersed in the virtual setting. Once the 45 trials were collected, the markers and electrodes were removed, and testing was concluded. The total in-person testing session took approximately 90 minutes.
Fig. 1
(a) Lab experimental setup of the projector screen, stool position, safety mats, and black curtains to minimize external distractions. Ten motion capture cameras were placed surrounding the participant location. (b) A representative participant instrumented with kinematic markers and EMG electrodes. A total of 27 retroreflective markers were placed at anatomical landmarks following the OptiTrack 'Conventional Upper Body' skeleton; two additional markers were placed over the right and left iliac crests. EMG electrodes were affixed to the gastrocnemius, tibialis anterior, vastus lateralis, and biceps femoris muscles of the right leg; a ground electrode was placed over the right fibular head. (c) Sample images taken from the three visual environments: hallway (left), boardwalk (middle), and hiking trail (right).
Data Analyses
Kinematic Data Processing
Visual3D software (V3D, v2024.11.1; HAS-Motion, Ontario, CA) was used to process kinematic data. All files were first tagged to identify the trial condition. To interpolate missing data points, a 10-frame fill was used. Once missing data frames were interpolated, a 4th-order, dual-pass Butterworth low-pass filter with an 8 Hz cutoff was applied. An average of the first 100 frames of each kinematic trial was then subtracted from all data points (to remove possible trial-to-trial 'bias' in the data). Pipelines were then created within V3D to generate an estimated COM model constructed from three body segments: the head, trunk, and pelvis (COMHTP; adapted from Winter et al., 1998).
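As a rough sketch of this filtering and de-biasing step, assuming a Python/SciPy implementation (the actual processing was performed in Visual3D; function and parameter names here are illustrative):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def preprocess_marker(trajectory, fs=100.0, cutoff=8.0, baseline_frames=100):
    """Low-pass filter a marker/COM trajectory and remove its initial offset.

    A 2nd-order Butterworth applied forward and backward (dual-pass) gives a
    zero-lag, effectively 4th-order response -- one common reading of the
    "8 Hz Butterworth, dual-pass, 4th-order" description.
    """
    sos = butter(2, cutoff, btype="low", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, trajectory)
    # Subtract the mean of the first 100 frames to remove trial-to-trial bias.
    return filtered - filtered[:baseline_frames].mean()
```

The dual-pass (forward-backward) application cancels the filter's phase lag, which matters when displacement is later differentiated twice to obtain acceleration.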
COM Displacement and Acceleration
The minimum and maximum COMHTP displacement were identified for each trial in both the mediolateral (ML) and anterior-posterior (AP) directions. COMHTP acceleration was calculated as the double derivative of the COMHTP displacement, and its per-trial minimum and maximum values were likewise identified. These AP and ML maximum and minimum COMHTP displacement and acceleration values were exported from V3D into Microsoft Excel (Version 2505, Excel for Microsoft 365; Microsoft, Washington, USA) for each visual condition (hallway, boardwalk, hiking trail, and no video) and each posture (seated with feet planted, seated with feet dangling, and standing) for all participants. Within Excel, the total range per trial was computed as the difference between the maximum and minimum values for displacement and acceleration. These range values were then averaged across each experimental condition (vision x posture; 4 x 3) to calculate the dependent variables of interest: average COMHTP displacement and acceleration range along the AP and ML directions. Outlier trials were removed if they fell more than two standard deviations above or below the mean of the experimental condition.
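A minimal Python sketch of these per-trial range calculations and the two-standard-deviation outlier rule (names are illustrative; the original processing used V3D and Excel):

```python
import numpy as np

def displacement_and_acceleration_range(com, fs=100.0):
    """Per-trial COM range measures along one axis (ML or AP).

    `com` is a 1-D displacement trace; acceleration is obtained by
    double numerical differentiation of displacement.
    """
    disp_range = com.max() - com.min()
    acc = np.gradient(np.gradient(com, 1.0 / fs), 1.0 / fs)
    acc_range = acc.max() - acc.min()
    return disp_range, acc_range

def drop_outliers(values, n_sd=2.0):
    """Remove trials farther than n_sd standard deviations from the condition mean."""
    values = np.asarray(values, dtype=float)
    mean, sd = values.mean(), values.std(ddof=1)
    return values[np.abs(values - mean) <= n_sd * sd]
```

The surviving trial ranges would then simply be averaged within each vision x posture cell to form the dependent variables.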
Head and Trunk ROM
The same COMHTP model was used to calculate measures of head and trunk ROM for each trial within the V3D software. To evaluate head ROM, the joint angle was computed between the right head marker and the generated trunk segment for yaw (rotation about the longitudinal axis) and pitch (rotation about the medio-lateral axis). To evaluate trunk ROM, the joint angle was computed between the trunk and pelvis segments, also in the yaw and pitch directions. Once the joint angles were generated for each measure, the maximum and minimum angle per trial was identified. ROM was then computed as the difference between the maximum and minimum values and exported (MS Excel). Range values were then averaged across each experimental condition (vision x posture; 4 x 3) to calculate the dependent variables of interest: head and trunk ROM along the yaw and pitch directions. Once again, outlier trials were removed if they fell more than two standard deviations above or below the mean of the condition.
EMG Processing
To calculate our EMG dependent variable of interest, the trapezoidal integrated root-mean-square (RMS; e.g., Pourhashemi et al., 2025), the raw data were processed within Python (Python Software Foundation, Delaware, USA). Raw data were band-pass filtered from 10–499 Hz and then further filtered using a 4th-order Butterworth filter at 10 Hz. The resulting signal was then smoothed using a 200-ms RMS rolling window. Once complete, trapezoidal integration was performed across the signal with unit spacing (all trials had the same sampling rate; see Eq. 1), and the total value was produced in millivolts (mV).
iEMG = Σ_{i=1}^{N-1} (RMS_i + RMS_{i+1}) / 2    (Eq. 1)

where RMS_i is the i-th sample of the smoothed RMS signal, N is the number of samples in the trial, and unit spacing makes each trapezoid one sample wide.
This yielded a single value for each trial, for each visual and postural condition, across all participants. The data were exported to Microsoft Excel, where an average of the no video trials was calculated for each participant. Each video trial (hallway, boardwalk, and hiking trail) was then normalized to its corresponding no video condition by dividing the trial's RMS value by that participant's average no video RMS. Outlier trials were removed if they fell more than one standard deviation above or below the mean of the condition. The remaining values were then averaged across each experimental condition (vision x posture; 3 x 3) for each participant in preparation for statistical analyses.
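The EMG steps above can be sketched in Python as follows. One assumption is made explicit: the text does not mention rectification, but a 10 Hz low-pass applied directly to a 10–499 Hz band-passed signal would remove nearly everything, so full-wave rectification is assumed before the envelope low-pass. All names and parameters are illustrative.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def integrated_rms_emg(raw, fs=1000.0, window_ms=200):
    """Trapezoidal integrated RMS for one EMG trial (illustrative sketch).

    Steps mirror the described pipeline: 10-499 Hz band-pass, 4th-order
    10 Hz low-pass, 200-ms rolling RMS, then trapezoidal integration
    with unit sample spacing (Eq. 1).
    """
    sos_bp = butter(4, [10.0, 499.0], btype="band", fs=fs, output="sos")
    sig = sosfiltfilt(sos_bp, raw)
    # Assumed full-wave rectification before the 10 Hz envelope low-pass.
    sos_lp = butter(4, 10.0, btype="low", fs=fs, output="sos")
    env = sosfiltfilt(sos_lp, np.abs(sig))
    # 200-ms rolling RMS window (200 samples at 1000 Hz).
    w = int(window_ms * fs / 1000.0)
    rms = np.sqrt(np.convolve(env ** 2, np.ones(w) / w, mode="same"))
    # Trapezoidal integration with unit spacing, as in Eq. 1.
    return np.sum((rms[:-1] + rms[1:]) / 2.0)
```

Normalization then amounts to dividing each video trial's value by that participant's mean no video value, e.g. `trial_value / np.mean(no_video_values)`.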
Statistical Analyses
All statistical analyses were performed using SPSS (IBM Corp., USA; Version 29). Within SPSS, interaction and main effects between vision and posture were evaluated for each dependent variable. Given observed sex-related trends in initial analyses, and to avoid the potential interpretive ambiguity and complexity of a three-way interaction, all data files were split by sex. This approach enabled a clearer understanding of how postural and visual manipulations affected female and male participants individually. Repeated measures ANOVAs were conducted to assess the effects of vision (no video, hallway, boardwalk, trail) and posture (seated with feet planted, seated with feet dangling, and standing) on kinematic measures (i.e., average COMHTP displacement and acceleration range, head and trunk ROM). Statistical analyses of muscular activity across experimental conditions involved repeated measures ANOVAs of the trapezoidal integrated EMG data to assess the effects of vision and posture. When appropriate, Bonferroni pairwise comparisons were conducted, and Greenhouse-Geisser corrections were applied; statistical significance was set at p < 0.05.
Results
Effects of Vision and Posture on Average ML and AP COMHTP Displacement Range
ML COMHTP Displacement Range
For female participants, a two-way interaction effect was observed between vision*posture for average ML COMHTP displacement (F(6,10) = 4.411, p = 0.035, ηp2 = 0.329). Pairwise comparisons revealed that in the standing posture, boardwalk ML COMHTP displacement was greater than the no video condition (p = 0.001), and hallway ML COMHTP displacement was greater than the no video condition (p = 0.019); see Fig. 2, panel A. For male participants, a similar two-way interaction effect was observed between vision*posture for average ML COMHTP displacement (F(6,8) = 5.039, p = 0.031, ηp2 = 0.419). Pairwise comparisons revealed that in the standing posture, boardwalk ML COMHTP displacement was greater than the no video condition (p = 0.022); see Fig. 2, panel B.
Main effects of vision (F(3,10) = 15.440, p < 0.001, ηp2 = 0.632) and posture (F(2,10) = 27.668, p < 0.001, ηp2 = 0.755) were also observed in female participants for the average ML COMHTP displacement range. For the vision main effect, pairwise comparisons showed that the trail (p = 0.020), boardwalk (p = 0.002), and hallway (p = 0.007) visual conditions resulted in significantly greater ML COMHTP displacement than the no video visual condition. For the posture main effect, pairwise comparisons showed that the seated with feet planted (p < 0.001) and seated with feet dangling (p = 0.003) postural conditions resulted in significantly less ML COMHTP displacement than the standing position. For male participants, only a main effect of posture was found (F(2,8) = 18.76, p = 0.001, ηp2 = 0.728). Pairwise comparisons revealed that all postures were significantly different from each other (p < 0.05), with seated with feet dangling and standing resulting in greater ML COMHTP displacement than the seated with feet planted position; standing had the greatest ML displacement out of all postural conditions; see Fig. 2, panels A and B.
AP COMHTP Displacement Range
For female participants, a two-way interaction effect was observed between vision*posture for average AP COMHTP displacement (F(6,11) = 9.511, p < 0.001, ηp2 = 0.487). Pairwise comparisons revealed that in the standing posture, all visual conditions resulted in significantly different AP COMHTP displacement than the no video condition (p < 0.05); no video condition had the least amount of postural AP sway; see Fig. 2, panel C. For male participants, a similar two-way interaction effect was observed between vision*posture for average AP COMHTP displacement (F(6,10) = 6.833, p = 0.007, ηp2 = 0.432). Pairwise comparisons revealed that the hiking trail environment, while seated with feet planted, resulted in greater AP COMHTP displacement (p = 0.048). In the seated with feet dangling posture, the boardwalk condition resulted in greater AP postural sway than both hiking trail (p = 0.012) and hallway (p = 0.031). Finally, in the standing posture, boardwalk showed greater AP postural sway than the hallway visual condition (p = 0.044), and all visual conditions resulted in significantly different AP COMHTP displacement than the no video condition (p < 0.05); no video condition had the least amount of postural AP sway; see Fig. 2, panel D.
In addition, main effects of vision (F(3,11) = 9.908, p = 0.004, ηp2 = 0.498) and posture (F(2,11) = 106.85, p < 0.001, ηp2 = 0.914) were observed in the average AP COMHTP displacement range for female participants. For the vision main effect, pairwise comparisons showed that all visual conditions except the boardwalk differed from the no video condition (p < 0.05); no video had the least amount of AP COMHTP postural sway. For the posture main effect, pairwise comparisons showed that the seated with feet planted and seated with feet dangling (p < 0.001) postural conditions resulted in significantly less AP COMHTP displacement than the standing position. For male participants, main effects of vision (F(3,10) = 16.237, p < 0.001, ηp2 = 0.643) and posture (F(2,10) = 103.56, p < 0.001, ηp2 = 0.920) were observed. The pairwise analysis for vision conditions revealed that the hiking trail and boardwalk scenarios had greater AP COMHTP sway than the hallway and no video conditions, but were not significantly different from each other. It was also revealed that the hallway was not significantly different from the no video condition. For the posture main effect, pairwise comparisons showed that the seated with feet planted and seated with feet dangling (p < 0.001) postural conditions resulted in significantly less AP COMHTP displacement than the standing position; see Fig. 2, panels C and D.
Fig. 2
Mean ± SD ML and AP COMHTP displacement range across posture and vision for females and males. ML displacement: Statistically significant interaction effects (p < 0.05) were observed for both females (panel A) and males (panel B). When standing in the boardwalk scenario, postural sway exceeded the no video condition for both sexes. For the female participants only, the hallway scenario revealed a significant difference from the no video condition (p = 0.019). AP displacement: Statistically significant interaction effects (p < 0.05) were observed in females (panel C) and males (panel D). Standing posture consistently showed greater sway under all experimentally manipulated visual conditions versus no video condition (p < 0.05). Please see text for details.
Effects of Vision and Posture on Average ML and AP COMHTP Acceleration Range
ML COMHTP Acceleration Range
Overall, no significant vision*posture interactions were observed in the ML COMHTP acceleration range in either males or females (p > 0.05). Similarly, no main effects of either vision or posture were observed for ML acceleration in either sex (p > 0.05).
AP COMHTP Acceleration Range
For female participants, a two-way interaction effect was observed between vision*posture for average AP COMHTP acceleration (F(6,9) = 8.652, p = 0.006, ηp2 = 0.520). Pairwise comparisons revealed that in the standing posture, all visual conditions (hallway, 0.474 ± 0.049 m/s²; boardwalk, 0.550 ± 0.033 m/s²; hiking trail, 0.521 ± 0.042 m/s²) resulted in significantly different AP COMHTP acceleration compared to the no video condition (0.283 ± 0.056 m/s²; p < 0.05), which had the least AP acceleration. In the males, no significant vision*posture interactions were observed in the AP COMHTP acceleration range.
In males, only a main effect of posture was observed in the AP COMHTP acceleration range (F(2,11) = 6.188, p = 0.021, ηp2 = 0.407): the seated with feet planted posture produced the greatest AP acceleration (0.569 ± 0.036 m/s²) in comparison to the seated with feet dangling (0.487 ± 0.038 m/s²) and standing (0.435 ± 0.027 m/s²) positions. No main effects of either vision or posture were observed for AP acceleration in females.
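The displacement and acceleration range measures reported above can be computed directly from a COM time series. A minimal sketch follows, assuming a one-dimensional position signal and double numerical differentiation for acceleration; the variable names, sampling rate, and synthetic sway signal are illustrative and do not reproduce the study's exact processing code.

```python
import numpy as np

def com_ranges(com_position: np.ndarray, fs: float) -> tuple[float, float]:
    """Given a 1-D COM position time series (m) sampled at fs (Hz),
    return (displacement range, acceleration range).

    Displacement range: maximum minus minimum of the position signal.
    Acceleration: second derivative from double numerical
    differentiation (np.gradient); its range is max minus min.
    """
    disp_range = float(np.ptp(com_position))          # peak-to-peak displacement
    vel = np.gradient(com_position, 1.0 / fs)         # first derivative (m/s)
    acc = np.gradient(vel, 1.0 / fs)                  # second derivative (m/s^2)
    acc_range = float(np.ptp(acc))
    return disp_range, acc_range

# Illustrative example: 0.5 Hz sinusoidal sway of +/- 1 cm sampled at 100 Hz
t = np.arange(0, 10, 1 / 100)
sway = 0.01 * np.sin(2 * np.pi * 0.5 * t)
d, a = com_ranges(sway, fs=100)
```

For the sinusoid above, the analytic acceleration amplitude is 0.01·(2π·0.5)² ≈ 0.099 m/s², so the numerical range should be close to twice that value.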
Effects of Vision and Posture on Average Head ROM
Yaw Range (rotation about longitudinal axis)
No significant vision × posture interactions were observed in the yaw direction for head ROM in either males or females, nor were any main effects of vision or posture observed for yaw head ROM in either sex.
Pitch Range (rotation about medio-lateral axis)
For female participants, a two-way vision × posture interaction was observed in the pitch direction for head ROM (F(6,11) = 3.538, p = 0.021, ηp2 = 0.261). Pairwise comparisons revealed that pitch head ROM was greater in the hiking trail environment while standing (4.163 ± 0.504 degrees) than in the no video condition (2.883 ± 0.400 degrees; p = 0.027).
For male participants, a similar two-way vision × posture interaction was observed in the pitch direction for head ROM (F(6,10) = 4.076, p = 0.021, ηp2 = 0.312). Pairwise comparisons revealed that in the seated with feet dangling posture, pitch head ROM in the boardwalk scenario (4.637 ± 0.630 degrees) was greater than in the hallway condition (3.150 ± 0.362 degrees; p = 0.019). No main effects of either vision or posture were observed for pitch head ROM in either males or females.
Effects of Vision and Posture on Average Trunk ROM
Yaw Range (rotation about longitudinal axis)
For female participants, a two-way vision × posture interaction was observed in the yaw direction for trunk ROM (F(6,12) = 4.536, p = 0.007, ηp2 = 0.292). Pairwise comparisons revealed that while standing, yaw trunk ROM in the hiking trail (1.161 ± 0.068 degrees) and hallway (1.265 ± 0.108 degrees) scenarios was greater than in the no video condition (0.849 ± 0.048 degrees; p < 0.05). No vision × posture interaction effects were observed in male participants.
A main effect of posture was observed in yaw trunk ROM (F(2,12) = 20.427, p < 0.001, ηp2 = 0.650) for female participants. Pairwise comparisons revealed that the standing position resulted in greater yaw trunk ROM (1.119 ± 0.055 degrees) than both other postural conditions (seated with feet planted, 0.793 ± 0.056 degrees; seated with feet dangling, 0.788 ± 0.060 degrees; p < 0.05). In male participants, no main effects of either vision or posture were observed for yaw trunk ROM.
Pitch Range (rotation about medio-lateral axis)
No significant vision × posture interactions were observed for pitch trunk ROM in either males or females, nor were any main effects of vision or posture observed for trunk ROM in the pitch direction in either sex.
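For context, the yaw and pitch angles analyzed above can be extracted from segment rotation matrices before ROM is taken as the maximum minus minimum of each angle trace. A brief sketch follows, assuming a Z-Y-X (yaw-pitch-roll) Euler sequence with Z as the longitudinal axis and Y as the medio-lateral axis; this convention and the function names are assumptions for illustration, not the study's exact kinematic model.

```python
import numpy as np

def yaw_pitch_deg(R: np.ndarray) -> tuple[float, float]:
    """Yaw (rotation about the longitudinal Z axis) and pitch (rotation
    about the medio-lateral Y axis) in degrees from a 3x3 rotation
    matrix, assuming a Z-Y-X (yaw-pitch-roll) Euler sequence."""
    yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    pitch = np.degrees(np.arcsin(-R[2, 0]))
    return float(yaw), float(pitch)

def range_of_motion(angles_deg: np.ndarray) -> float:
    """ROM of an angle trace: maximum minus minimum, in degrees."""
    return float(np.ptp(angles_deg))
```

Applying `yaw_pitch_deg` frame by frame yields yaw and pitch time series whose `range_of_motion` corresponds to the ROM measures reported here.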
Effects of Vision and Posture on Trapezoidal Integrated EMG
Gastrocnemius
No significant vision × posture interactions were observed in the integrated EMG for the gastrocnemius in either males or females. A main effect of posture was observed for the gastrocnemius in male participants (F(2,8) = 5.263, p = 0.022, ηp2 = 0.429; see Fig. 3, panel D); however, subsequent pairwise comparisons revealed no significant differences between postural conditions: seated with feet planted (99.325 ± 3.972 mV), seated with feet dangling (88.603 ± 9.201 mV), and standing (122.072 ± 7.401 mV).
Vastus Lateralis
No significant vision × posture interactions were observed in the integrated EMG for the vastus lateralis in either males or females. However, a main effect of posture was observed in female participants (F(2,9) = 6.927, p = 0.018, ηp2 = 0.464): pairwise comparisons showed a greater integrated EMG response for the vastus lateralis in the standing position than in the seated with feet planted position (p = 0.004). See Fig. 3, panel A.
Tibialis Anterior and Biceps Femoris
No significant vision × posture interactions were observed in the integrated EMG for the tibialis anterior or biceps femoris in either males or females, nor were any main effects of vision or posture observed for integrated EMG in these muscles in either sex (p > 0.05).
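As a companion to these results, the integrated EMG measure (filtering, normalization to the no video condition, and trapezoidal integration, as described in the abstract) can be illustrated schematically. The sketch below is a simplified stand-in rather than our exact pipeline: the moving-RMS window length, the normalization constant, and all variable names are assumptions for illustration.

```python
import numpy as np

def integrated_emg(emg: np.ndarray, fs: float, reference: float,
                   window_s: float = 0.1) -> float:
    """Simplified integrated-EMG measure: remove the DC offset and
    rectify, smooth with a moving-RMS envelope, normalize to a
    reference value (e.g., the mean envelope of a no video trial),
    then integrate the normalized envelope with the trapezoidal rule."""
    rectified = np.abs(emg - np.mean(emg))
    win = max(1, int(window_s * fs))
    kernel = np.ones(win) / win
    envelope = np.sqrt(np.convolve(rectified ** 2, kernel, mode="same"))
    normalized = envelope / reference
    dt = 1.0 / fs
    # Trapezoidal rule: average adjacent samples, multiply by the time step.
    return float(np.sum((normalized[1:] + normalized[:-1]) / 2.0) * dt)

# Illustrative use with a synthetic 50 Hz oscillation sampled at 1 kHz for 1 s:
fs = 1000.0
t = np.arange(int(fs)) / fs
signal = np.sin(2 * np.pi * 50 * t)
iemg = integrated_emg(signal, fs, reference=1.0)
```

For a unit-amplitude sine the interior RMS envelope is about 1/√2, so the one-second integral lands near 0.7; real EMG would first be band-pass filtered before this stage.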
Fig. 3
Mean (± SD) integrated RMS EMG (mV) for the vastus lateralis and gastrocnemius across three postural conditions and four visual conditions in male and female participants. Vastus lateralis: No significant posture × vision interaction was observed in either sex (p > 0.05). However, a main effect of posture was observed in females (panel A; p < 0.05), with pairwise comparisons revealing greater EMG in the standing condition compared to seated with feet planted (p = 0.004). No main effects were observed in males (panel B; p > 0.05). Gastrocnemius: No significant posture × vision interaction or main effect of vision was observed in either sex (panels C and D; p > 0.05). However, a main effect of posture was found in male participants (panel D; p = 0.022), with pairwise comparisons revealing no significant differences between conditions.
Discussion
As we were interested in understanding how visual input shapes sensorimotor and postural control, this protocol was designed to manipulate visual scene complexity (no video, hallway, boardwalk, and trail scenes) and BoS (seated with feet planted, seated with feet dangling, and standing) and to assess how these manipulations influence postural sway (ML and AP displacement and acceleration), segmental movement (head and trunk ROM), and muscle activity (integrated RMS) across the sexes. Based on previous research in P-A coupling, where perception is inherently linked to the control of action (Gibson, 1966; Warren et al., 2001), our approach aimed to isolate how visual scenes of varying complexity may serve as feedforward control sources in the absence of overt movement. Through this methodology, we were able to examine how visuomotor responsiveness changes under different conditions of postural demand. Our findings partially supported our hypotheses: we predicted that visually complex environments (e.g., hallway, boardwalk, and trail) would elicit greater postural sway and sensorimotor engagement, especially under reduced BoS (e.g., seated with feet dangling and standing), due to the reliance on visual inputs. The results suggest that the complexity of a visual cue plays an important role in the modulation of postural control mechanisms, most critically in the standing posture, and that this response may not be uniform across the sexes.
Impact of Visual Cue Complexity on Postural Sway
ML and AP Postural Sway Displacement
In line with our predictions, results revealed that the COMHTP postural displacement range was greater in the standing posture, particularly when paired with visually complex scenes (i.e., hallway, boardwalk, trail). This aligns with previous literature on sensory integration in balance control, suggesting that the CNS reweights sensory inputs depending on their reliability (e.g., Peterka, 2002). When proprioceptive information is limited, visual cues may be upweighted, resulting in increased sway in visually complex environments. However, when analyzing sex-specific modulations, important differences arise, highlighting the complexity of visuomotor coupling across populations.
In the ML direction, two-way interaction effects between vision and posture were observed for both sexes; however, the drivers of these interactions were distinct between females and males. For female participants, the boardwalk and hallway visual scenes resulted in greater ML sway than the no video condition (Fig. 2A). This partially supports our hypothesis that more complex visual scenes would evoke greater postural sway due to increased visual cue complexity and optic flow. Notably, the hiking trail scene was anticipated to elicit the greatest amount of sway due to its rough and obstacle-laden terrain; however, it did not differ significantly from the no video condition. This may suggest that the hallway and boardwalk scenes presented more consistent and salient visual flow, which could more reliably induce vection and destabilize ML sway (i.e., Palmisano et al., 2015). Furthermore, in females, we observed a main effect of posture, confirming that standing resulted in the greatest sway, followed by seated with feet dangling and then seated with feet planted; this is consistent with our hypothesis that reduced proprioceptive input increases sway (Vuillerme & Pinsault, 2007). Males showed a similar interaction effect between vision and posture; however, this interaction was driven primarily by the boardwalk scene during standing (Fig. 2B). No other visual conditions had significant effects, potentially supporting the idea that males rely less on visual cues than females (Gorbet et al., 2007; Barnett-Cowan et al., 2010).
In the AP direction, both sexes demonstrated more robust and consistent visual effects. For females, all visual scenes elicited significantly greater sway than the no video condition when standing, highlighting how reduced postural support can amplify the influence of visual input, particularly in females (Fig. 2C). An interesting difference was observed in our male participants, where the visual effects were more segmented across postural conditions, with select scenes (e.g., hiking trail or boardwalk) evoking greater sway only in specific postures (Fig. 2D). We observed that the trail induced greater sway in the seated feet-planted condition, while the boardwalk produced the most sway in the feet-dangling posture. These differences may reflect sex-based strategies in processing visual motion or compensating for postural demands (e.g., Ingel et al., 2021; Merritt et al., 2007).
ML and AP Postural Sway Acceleration
COMHTP acceleration reflects dynamic postural adjustments and complements displacement measures by capturing the rate of change in sway (Yu et al., 2008). Our hypothesis that standing in visually complex environments would yield the highest acceleration was partially supported.
No significant effects emerged in the ML direction for either sex, indicating that lateral sway acceleration remained stable across conditions. This may reflect the higher baseline control and biomechanical resistance to lateral sway under both seated and standing postures, consistent with prior work indicating that the ML direction is typically more stable than the AP direction in quiet stance (Winter et al., 1996). Overall, the results align with an inverted pendulum model, in which ML adjustments are more mechanically constrained.
In the AP direction, females displayed a significant interaction effect between vision and posture, with standing in the three complex visual scenes consistently producing higher acceleration than the no video condition. This supports the notion that complex optic flow increases neuromuscular demand when BoS is limited, prompting more rapid postural corrections (Slobounov et al., 1997). It further emphasizes the greater reliance on visual cues in females compared to males, suggesting that visual information may be more heavily weighted in females’ postural control strategies. Such findings align with literature indicating sex differences in multisensory integration, where females are more visually dependent for balance, making more rapid postural adjustments in response to visual cues (e.g., Gorbet et al., 2007; Barnett-Cowan et al., 2010). In males, no interaction effect was observed between vision and posture. Instead, we observed a main effect of posture, with the seated with feet planted posture showing the greatest acceleration. This unexpected result may reflect a postural stiffening response in specific contexts, marked by abrupt, higher-frequency adjustments (e.g., Adkin et al., 2000).
Segmental Kinematics Revealing Effects of Visual Cues on Head and Trunk ROM
Segmental head and trunk kinematics revealed direction- and sex-specific modulation in response to visual and postural manipulations.
For head pitch (up-down) ROM, interaction effects were found in both female and male participants. In females, we observed greater pitch movement in the hiking trail scene compared to the no video condition while standing, and males showed greater pitch movement in the boardwalk scene compared to the hallway scene when their feet were dangling. These results suggest that visual engagement with scenes of varying complexity differed between the male and female participants in our study under different base of support conditions. These findings further support the idea that head orientation may adjust dynamically to optimize visual information gathering in more visually rich environments according to an adopted attentional or postural strategy, and that these adjustments may differ between females and males (e.g., Wikstrom et al., 2006).
Trunk ROM in the yaw direction (rotation about the longitudinal axis) demonstrated interaction effects exclusively in females, showing greater movement during the hiking trail and hallway scenes while standing. This indicates heightened responsiveness of proximal segments to visual flow under smaller BoS conditions in females. Additionally, the main effects of posture were noted in female participants, where standing was found to enhance trunk mobility. Males exhibited no such effects, further supporting a potential sex-related difference in visual-motor coupling (e.g., Gorbet et al., 2007; Barnett-Cowan et al., 2010). Pitch ROM of the trunk was not significantly affected by vision or posture, suggesting that this movement plane may be less sensitive to environmental or postural constraints in seated and standing tasks with minimal locomotor demands (i.e., absence of overt movement).
Limited Effects of Visual Cueing on Integrated EMG in Lower Limb Muscles
EMG analyses revealed minimal significant effects overall, suggesting that our visual scene manipulations may not have strongly influenced activation of the lower limb muscles we monitored under relatively static postural conditions. It is important to note that only a limited number of lower-limb muscles could be monitored, as the capacity of our EMG system constrained the number of channels available for simultaneous recording. Only the vastus lateralis in females demonstrated a main effect of posture, with standing eliciting greater integrated activity than the seated position with feet planted (Fig. 3A); this is the only finding that supports our hypothesis that reduced BoS would elevate muscle activation. Similarly, in males, we observed a main effect of posture only in the gastrocnemius muscle; however, pairwise comparisons did not identify significant differences between conditions, indicating that the source of this effect remains unclear and should be explored further. Overall, the absence of visual effects on EMG suggests that while visual input seemingly modulates sway behaviour through kinematic output, it may not directly scale the magnitude of muscle activation, at least not in the muscles sampled or under the conditions tested with our protocol. For example, Antley and Slater (2010) found increased lower spine muscle activation as participants perceived themselves to walk on a flat floor, a narrow ground-level ribbon, and an elevated narrow beam in virtual reality, despite knowing that they were walking on flat ground. This may support the idea that sway-related adjustments are more strategy-based (e.g., trunk or head modulation) than driven by changes in lower limb muscle output in static postures, while also suggesting that visual effects on EMG may emerge under conditions that challenge balance more strongly than those used in our protocol.
This is further supported by the lack of significant results found in the tibialis anterior and biceps femoris muscles for both sexes in our paradigm.
Relationship with Perception-Action Theory, Sex-Differences, and Potential Directions
Our study results align with the principles of the P-A theory, which emphasize the ongoing interrelationship between the task, the environment, and the individual (Gibson, 1966; Warren et al., 2001; Shumway-Cook & Woollacott, 2007). The observed interaction effects between visual cue complexity and posture suggest that individuals continue to modulate their motor behaviour based on task demands and perceptual influences even into adulthood. The changes observed in postural sway, segmental movement, and muscle activity across various combinations of visual and postural conditions reflect the coupling between sensory information and motor response. Focusing specifically on standing posture, where visual scenes consistently induced greater sway, we find that a reduced BoS amplifies the dependence on visual cues for maintaining postural equilibrium, particularly in females. This finding aligns with earlier studies that have shown optic flow enhances the perception of self-motion and modulates postural responses when somatosensory inputs are unreliable (Peterka, 2002). Some research indicates that visually rich environments, filled with abundant visual cues, enhance self-motion perception by providing detailed optic flow information (e.g., obstacles, textures, depth indicators; Jörges & Harris, 2021), while even minimalistic environments (e.g., dot patterns: Warren et al., 1991; or a simple four-walled room: Lee & Aronson, 1974) can support self-motion perception. These findings demonstrate the visual system's adaptability in extracting motion-relevant information under constrained conditions. However, the behavioural consequences of increasing visual richness, particularly concerning varying postural constraints, remain underexplored. Our findings begin to address this gap by showing how increasingly complex visual stimuli modulate balance control and by highlighting potential sex differences in the use of vision for postural regulation.
On this point, the sex-based differences observed in the study results further suggest that P-A coupling may not be uniform across individuals, as males and females may adopt different weighting strategies in multisensory integration. While both sexes responded to changes in posture and vision, females demonstrated a more consistent visual dependence in sway and kinematic outcomes, aligning with previous literature that indicates greater visual reliance in balance tasks among women (e.g., Gorbet et al., 2007; Barnett-Cowan et al., 2010). These differences highlight the need for future work to investigate the cognitive or neural bases of sex-specific sensory weighting strategies, particularly in the context of how optic flow is interpreted and integrated across sensory systems.
Beyond the theoretical implications, these results hold significant relevance in clinical and applied contexts. Activities of daily living, such as walking through visually crowded spaces, navigating crowds or obstacles, and responding to moving visual stimuli, all require the seamless integration of motor behaviour and perceptual input (McFadyen et al., 2022). Understanding how visual cues shape postural behaviour can inform rehabilitation strategies for individuals with sensory deficits or balance disorders. A more specific example is how virtual reality or visually enriched environments (e.g., via a projector) can be strategically employed in training programs aimed at enhancing visuomotor coupling in various populations, including but not limited to older adults, stroke patients, or those with vestibular dysfunction. In summary, this study supports the P-A theory as a framework for understanding balance control into adulthood. It acknowledges the critical role of visual information in influencing movements across different populations and contexts.
Limitations
While our findings provide valuable insight into visuomotor control, there are limitations to our protocol that must be acknowledged. To begin with, many effects, although statistically significant, were small in magnitude and should be interpreted with caution when applied to real-world contexts, even though the partial eta-squared values indicated acceptable effect sizes. Furthermore, the sample was limited in size and consisted of healthy young adults, which reduces its generalizability to the broader population. Although the aim of the current study was to assess lower limb muscle activation, the small number of muscles examined may have led to the omission of key contributors to postural control. Our EMG equipment limited us to the capture of only four muscles, and we therefore selected large muscle groups for this experiment, as they have been shown to play an important role in postural control (e.g., Di Giulio et al., 2009; Florence Tse et al., 2013; García-Massó et al., 2016).
We encouraged all participants to maintain their focus during the protocol; however, an additional limitation is that we could not fully control participants’ attentional focus during trials, and we did not track their gaze behaviour, which limits our ability to assess attentional engagement. Lastly, the visual scenes and biomechanical recordings were not temporally synced, preventing precise alignment of specific visual events with motor responses.
Conclusions
In conclusion, visual cue complexity and posture interact to shape balance-related motor behaviours across multiple physiological systems. These findings underscore the need to consider sex as a key variable in visuomotor control research and suggest that complex visual environments can differentially challenge or facilitate postural control depending on the sensorimotor context. Future studies should investigate these visuo-postural dynamics during more ecologically valid locomotor tasks and utilize neurophysiological tools (e.g., mobile electroencephalography (EEG) or functional near-infrared spectroscopy (fNIRS)) to further map sensory weighting mechanisms in action.
Funding
This work was supported by a Natural Sciences and Engineering Research Council of Canada (NSERC) Discovery Grant awarded to LAV and an Ontario Graduate Scholarship awarded to KDM.
Acknowledgement
The authors would like to thank our study participants, Kristina Kerr, who assisted with data collection, and Jerry You, who helped develop Python code for EMG processing.
Author Contribution
KDM was involved in Conceptualization/Experimental Design, Data Collection, Formal Data Processing & Analyses, Visualization/Data Preparation, and Writing and Editing of the Manuscript. JMDO was involved in Data Collection, Formal Data Processing & Analyses, Visualization/Data Preparation, and Editing of the Manuscript. LAV was involved in Conceptualization/Experimental Design, Guiding the Data Processing & Analyses, Visualization/Data Preparation, and Writing and Editing of the Manuscript.
References
Adkin AL, Frank JS, Carpenter MG, Peysar GW (2000) Postural control is scaled to level of postural threat. Gait Posture 12(2):87–93. https://doi.org/10.1016/S0966-6362(00)00057-6
Adler LA, Spencer T, Faraone SV, Kessler RC, Howes MJ, Biederman J, Secnik K (2006) Validity of the pilot Adult ADHD Self-Report Scale (ASRS) to rate adult ADHD symptoms. Ann Clin Psychiatry 18(3):145–148. https://doi.org/10.1080/10401230600801077
Antley A, Slater M (2010) The effect on lower spine muscle activation of walking on a narrow beam in virtual reality. IEEE Trans Vis Comput Graph 17(2):255–259. https://doi.org/10.1109/TVCG.2010.26
Barbu-Roth M, Anderson DI, Desprès A, Provasi J, Cabrol D, Campos JJ (2009) Neonatal stepping in relation to terrestrial optic flow. Child Dev 80(1):8–14. https://doi.org/10.1111/j.1467-8624.2008.01241.x
Barbu-Roth M, Anderson DI, Desprès A, Streeter RJ, Cabrol D, Trujillo M, Provasi J (2014) Air stepping in response to optic flows that move toward and away from the neonate. Dev Psychobiol 56(5):1142–1149. https://doi.org/10.1002/dev.21174
Barnett-Cowan M, Dyde RT, Thompson C, Harris LR (2010) Multisensory determinants of orientation perception: task-specific sex differences. Eur J Neurosci 31(10):1899–1907. https://doi.org/10.1111/j.1460-9568.2010.07199.x
Bertenthal BI, Rose JL, Bai DL (1997) Perception–action coupling in the development of visual control of posture. J Exp Psychol Hum Percept Perform 23(6):1631. https://doi.org/10.1037/0096-1523.23.6.1631
Berthoz A, Lacour M, Soechting JF, Vidal PP (1979) The role of vision in the control of posture during linear motion. Prog Brain Res 50:197–209. https://doi.org/10.1016/s0079-6123(08)60820-1
Bremmer F, Klam F, Duhamel JR, Ben Hamed S, Graf W (2002) Visual–vestibular interactive responses in the macaque ventral intraparietal area (VIP). Eur J Neurosci 16(8):1569–1586. https://doi.org/10.1046/j.1460-9568.2002.02206.x
Carriere JS, Cheyne JA, Smilek D (2008) Everyday attention lapses and memory failures: the affective consequences of mindlessness. Conscious Cogn 17(3):835–847. https://doi.org/10.1016/j.concog.2007.04.008
Chen G, Lu Y, King JA, Cacucci F, Burgess N (2019) Differential influences of environment and self-motion on place and grid cell firing. Nat Commun 10(1):630. https://doi.org/10.1038/s41467-019-08550-1
Chou YH, Wagenaar RC, Saltzman E, Giphart JE, Young D, Davidsdottir R, Cronin-Golomb A (2009) Effects of optic flow speed and lateral flow asymmetry on locomotion in younger and older adults: a virtual reality study. J Gerontol B Psychol Sci Soc Sci 64(2):222–231. https://doi.org/10.1093/geronb/gbp003
Di Giulio I, Maganaris CN, Baltzopoulos V, Loram ID (2009) The proprioceptive and agonist roles of gastrocnemius, soleus and tibialis anterior muscles in maintaining human upright posture. J Physiol 587(10):2399–2416. https://doi.org/10.1113/jphysiol.2009.168690
Florence Tse YY, Petrofsky J, Berk L, Daher N, Lohman E, Cavalcanti P, Potnis PA (2013) Postural sway and EMG analysis of hip and ankle muscles during balance tasks. Int J Ther Rehabil 20(6):280–288. https://doi.org/10.12968/ijtr.2013.20.6.280
Forma V, Anderson DI, Goffinet F, Barbu-Roth M (2018) Effect of optic flows on newborn crawling. Dev Psychobiol 60(5):497–510. https://doi.org/10.1002/dev.21634
García-Massó X, Pellicer-Chenoll M, González LM, Toca-Herrera JL (2016) The difficulty of the postural control task affects multi-muscle control during quiet standing. Exp Brain Res 234(7):1977–1986. https://doi.org/10.1007/s00221-016-4602-z
Gibson JJ (1966) The Senses Considered as Perceptual Systems. Houghton Mifflin, Boston
Gorbet DJ, Sergio LE (2007) Preliminary sex differences in human cortical BOLD fMRI activity during the preparation of increasingly complex visually guided movements. Eur J Neurosci 25(4):1228–1239. https://doi.org/10.1111/j.1460-9568.2007.05358.x
Heft H (2001) Ecological psychology in context: James Gibson, Roger Barker, and the legacy of William James's radical empiricism. Psychology Press
Hermens HJ, Freriks B, Merletti R, Stegeman D, Blok J, Rau G, Hägg G (1999) European recommendations for surface electromyography. Roessingh Res Dev 8(2):13–54
Ingel N, Vice V, Dommer C, Csonka J, Moore T, Zaleski A, Sell T (2021) Examining sex differences in visual reliance during postural control in intercollegiate athletes. Int J Sports Phys Ther 16(5):1273. https://doi.org/10.26603/001c.28099
Jörges B, Bansal A, Harris LR (2024) Precision and temporal dynamics in heading perception assessed by continuous psychophysics. PLoS ONE 19(10):e0311992. https://doi.org/10.1371/journal.pone.0311992
Jörges B, Harris LR (2021) Object speed perception during lateral visual self-motion. Atten Percept Psychophys 1–22. https://doi.org/10.3758/s13414-021-02372-4
Le HA, Baslamisli AS, Mensink T, Gevers T (2018) Three for one and one for three: flow, segmentation, and surface normals. arXiv preprint arXiv:1807.07473. https://doi.org/10.48550/arXiv.1807.07473
Lee DN, Aronson E (1974) Visual proprioceptive control of standing in human infants. Percept Psychophys 15:529–532. https://doi.org/10.3758/BF03199297
Liu S, Kersten DJ, Legge GE (2023) Effect of expansive optic flow and lateral motion parallax on depth estimation with normal and artificially reduced acuity. J Vis 23(12):3. https://doi.org/10.1167/jov.23.12.3
Marigold DS (2008) Role of peripheral visual cues in online visual guidance of locomotion. Exerc Sport Sci Rev 36(3):145–151. https://doi.org/10.1097/jes.0b013e31817bff72
McFadyen BJ, Lamontagne A, Olivier AH, Julien P, Cinelli M, Barbieri FA (2022) Proactive control to navigate our daily environments. Braz J Mot Behav 16(5):315–318. https://doi.org/10.20338/bjmb.v16i5.319
Merritt P, Hirshman E, Wharton W, Stangl B, Devlin J, Lenz A (2007) Evidence for gender differences in visual selective attention. Pers Individ Dif 43(3):597–609. https://doi.org/10.1016/j.paid.2007.01.016
Mochizuki L, Duarte M, Amadio A, Zatsiorsky V, Latash M (2006) Changes in postural sway and its fractions in conditions of postural instability. J Appl Biomech 22:51–60. https://doi.org/10.1123/jab.22.1.51
Nam H-S, Kim J-H, Lim Y-J (2017) The effect of the base of support on anticipatory postural adjustment stability. J Korean Phys Ther 29(3):135–141. https://doi.org/10.18857/jkpt.2017.29.3.135
Oldfield RC (1971) The assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia 9(1):97–113. https://doi.org/10.1016/0028-3932(71)90067-4
Palmisano S, Allison RS, Schira MM, Barry RJ (2015) Future challenges for vection research: definitions, functional significance, measures, and neural bases. Front Psychol 6:193. https://doi.org/10.3389/fpsyg.2015.00193
Peterka RJ (2002) Sensorimotor integration in human postural control. J Neurophysiol 88(3):1097–1118. https://doi.org/10.1152/jn.2002.88.3.1097
Phu S, Persiani M, Tan B, Brodie M, Gandevia S, Sturnieks DL, Lord SR (2023) The effects of optic flow on postural stability: influence of age and fall risk. Exp Gerontol 175:112146. https://doi.org/10.1016/j.exger.2023.112146
Pourhashemi N, Jaksic K, Keshavarz B, Cleworth TW (2025) The effects of delayed visual feedback on dynamic postural control. Invest Ophthalmol Vis Sci 66(6):68. https://doi.org/10.1167/iovs.66.6.68
Raffi M, Piras A, Persiani M, Squatrito S (2014) Importance of optic flow for postural stability of male and female young adults. Eur J Appl Physiol 114:71–83. https://doi.org/10.1007/s00421-013-2750-4
Rainoldi A, Melchiorri G, Caruso I (2004) A method for positioning electrodes during surface EMG recordings in lower limb muscles. J Neurosci Methods 134(1):37–43. https://doi.org/10.1016/j.jneumeth.2003.10.014
Ramirez BA, López-Moliner J (2020) Active sampling of the optic flow to predict time-to-contact. J Vis 20(11):795. https://doi.org/10.1167/jov.20.11.795
Shumway-Cook A, Woollacott MH (2007) Motor control: translating research into clinical practice
Lippincott Williams & Wilkins.
Slobounov SM, Slobounova ES, Newell KM (1997) Virtual time-to-collision and human postural
A
control J Mot Behav, 29(3), 263–281. https://doi.org/10.1080/00222899709600841
Smith AT, Wall MB, Thilo KV (2012) Vestibular inputs to human motion-sensitive visual cortex. Cerebral
cortex, 22(5), 1068–1077. https://psycnet.apa.org/doi/10.1093/cercor/bhr179
Vuillerme N, Pinsault N (2007) Re-weighting of somatosensory inputs from. the foot and the ankle for
A
controlling posture during quiet standing following trunk extensor muscles fatigue. Experimental brain
research, 183(3), 323–327. https://doi.org/10.1007/s00221-007-1047-4
Wade MG, Lindquist R, Taylor JR, Treat-Jacobson D (1995) Optical flow, spatial orientation, and the control of posture in the elderly. J Gerontol B Psychol Sci Soc Sci 50(1):P51–P54. https://doi.org/10.1093/geronb/50b.1.p51
Wang G, Yang Y, Wang J, Hao Z, Luo X, Liu J (2022) Dynamic changes of brain networks during standing balance control under visual conflict. Front Neurosci 16:1003996. https://doi.org/10.3389/fnins.2022.1003996
Warren WH (2006) The dynamics of perception and action. Psychol Rev 113(2):358
https://doi.org/10.1037/0033-295x.113.2.358
Warren WH, Hannon DJ (1988) Direction of self-motion is perceived from optical flow. Nature 336(6195):162–163. https://doi.org/10.1038/336162a0
Warren WH, Kay BA, Zosh WD, Duchon AP, Sahuc S (2001) Optic flow is used to control human walking. Nat Neurosci 4(2):213–216. https://doi.org/10.1038/84054
Warren WH, Mestre DR, Blackwell AW, Morris MW (1991) Perception of circular heading from optical flow. J Exp Psychol Hum Percept Perform 17(1):28. https://doi.org/10.1037/0096-1523.17.1.28
Wikstrom EA, Tillman MD, Kline KJ, Borsa PA (2006) Gender and limb differences in dynamic postural stability during landing. Clin J Sport Med 16(4):311–315. https://doi.org/10.1097/00042752-200607000-00005
Winter DA, Patla AE, Prince F, Ishac M, Gielo-Perczak K (1998) Stiffness control of balance in quiet standing. J Neurophysiol 80:1211–1221. https://doi.org/10.1152/jn.1998.80.3.1211
Winter DA, Prince F, Frank JS, Powell C, Zabjek KF (1996) Unified theory regarding A/P and M/L balance in quiet stance. J Neurophysiol 75(6):2334–2343. https://doi.org/10.1152/jn.1996.75.6.2334
Yoo JW, Lee DR, Sim YJ, You JH, Kim CJ (2014) Effects of innovative virtual reality game and EMG biofeedback on neuromotor control in cerebral palsy. Biomed Mater Eng 24(6):3613–3618. https://doi.org/10.3233/bme-141188
Yu E, Abe M, Masani K, Kawashima N, Eto F, Haga N, Nakazawa K (2008) Evaluation of postural control in quiet standing using center of mass acceleration: comparison among the young, the elderly, and people with stroke. Arch Phys Med Rehabil 89(6):1133–1139. https://doi.org/10.1016/j.apmr.2007.11.029