Multimodal Sensory Individuals

Speaker: Bence Nanay
Date: Thursday 5 February 2015 2:00 pm - 3:30 pm
Venue: University of Warwick, CV4 7AL

Room F1.11 (Engineering)

I am looking at my really loud coffee machine. When I do this, I am visually attributing properties (of, say, being black) to some kind of entity, and at the same time I am auditorily attributing properties (of, say, being loud) to some kind of entity. Question: what is the relation between these two kinds of entities — to put it simply, between the entity I hear and the entity I see? I argue that they are both parts of one multimodal sensory individual. Perception in different sense modalities acquaints us with different parts of this multimodal sensory individual. I compare the perception of multimodal sensory individuals to amodal perception and argue that in both cases mental (visual, auditory, tactile, etc.) imagery plays a crucial role in representing those parts of the perceived object with which we are not directly acquainted. I close by pointing out the usefulness of the concept of multimodal sensory individuals for understanding perceptually guided actions, cross-modal priming and cross-modal binding.