Visual routines for detecting causal interactions are tuned to motion direction

Saved in:
Bibliographic details
Published in: bioRxiv (Feb 20, 2025)
Main author: Ohl, Sven
Other authors: Rolfs, Martin
Published:
Cold Spring Harbor Laboratory Press
Subjects:
Online access: Citation/Abstract
Full text outside of ProQuest

MARC

LEADER 00000nab a2200000uu 4500
001 3168897578
003 UK-CbPIL
022 |a 2692-8205 
024 7 |a 10.1101/2023.08.22.554237  |2 doi 
035 |a 3168897578 
045 0 |b d20250220 
100 1 |a Ohl, Sven 
245 1 |a Visual routines for detecting causal interactions are tuned to motion direction 
260 |b Cold Spring Harbor Laboratory Press  |c Feb 20, 2025 
513 |a Working Paper 
520 3 |a Detecting causal relations structures our perception of events in the world. Here, we determined for visual interactions whether generalized (i.e., feature-invariant) or specialized (i.e., feature-selective) visual routines underlie the perception of causality. To this end, we applied a visual adaptation protocol to assess the adaptability of specific features in classical launching events of simple geometric shapes. We asked observers to report whether they observed a launch or a pass in ambiguous test events (i.e., the overlap between two discs varied from trial to trial). After prolonged exposure to causal launch events (the adaptor) defined by a particular set of features (i.e., a particular motion direction, motion speed, or feature conjunction), observers were less likely to see causal launches in subsequent ambiguous test events than before adaptation. Crucially, adaptation was contingent on the causal impression in launches as demonstrated by a lack of adaptation in non-causal control events. We assessed whether this negative aftereffect transfers to test events with a new set of feature values that were not presented during adaptation. Processing in specialized (as opposed to generalized) visual routines predicts that the transfer of visual adaptation depends on the feature-similarity of the adaptor and the test event. We show that the negative aftereffects do not transfer to unadapted launch directions but do transfer to launch events of different speed. Finally, we used colored discs to assign distinct feature-based identities to the launching and the launched stimulus. We found that the adaptation transferred across colors if the test event had the same motion direction as the adaptor. 
In summary, visual adaptation allowed us to carve out a visual feature space underlying the perception of causality and revealed specialized visual routines that are tuned to a launch's motion direction. Competing Interest Statement: The authors have declared no competing interest. Footnotes: * We have added new literature and provide a more detailed discussion regarding findings in causal inference, the neurophysiological implementation and the potential influence of top-down signals. The results have not changed. 
653 |a Visual perception 
653 |a Information processing 
653 |a Adaptor proteins 
653 |a Adaptation 
653 |a Visual stimuli 
653 |a Adaptability 
700 1 |a Rolfs, Martin 
773 0 |t bioRxiv  |g (Feb 20, 2025) 
786 0 |d ProQuest  |t Biological Science Database 
856 4 1 |3 Citation/Abstract  |u https://www.proquest.com/docview/3168897578/abstract/embedded/6A8EOT78XXH2IG52?source=fedsrch 
856 4 0 |3 Full text outside of ProQuest  |u https://www.biorxiv.org/content/10.1101/2023.08.22.554237v3