
Discovery of new multi-sensory interactions

The self-supervised model learns from statistical dependencies between stimuli in different modalities, and previous work has shown that humans also learn these relationships and that the learned relationships influence their behavior. We have recently discovered another example of surprising effects of well-learned cross-modal relationships. Graduate student Ayse Saygin and I studied visual-auditory pattern matching and showed (presented at VSS 2004; submitted to Psychological Science, 2007) that matching the moving feet of a point-light biological motion stimulus to a sound pattern is better when the feet are part of an upright walker than when they are part of an inverted walker. The advantage for upright (natural) walkers holds only when the sound pattern is consistent with footsteps, not when it has the same frequency but the wrong phase. This result shows that people are affected by learned multisensory experience with walking people.
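As a minimal illustration of what "statistical dependencies between stimuli in different modalities" can mean, and of why a same-frequency but wrong-phase sound is a weaker match, the sketch below correlates a hypothetical visual foot-motion signal with two hypothetical auditory envelopes. This is not the self-supervised model or the stimuli described above; all signals and parameters are invented for illustration.

import numpy as np

# Illustrative sketch only: quantify a cross-modal dependency as the
# correlation between a visual motion time series and an auditory envelope.
rng = np.random.default_rng(0)
t = np.linspace(0, 4, 400)                        # 4 s sampled at 100 Hz (assumed)

step_phase = 2 * np.pi * 2 * t                    # roughly two steps per second (assumed)
visual_foot_motion = np.abs(np.sin(step_phase))   # rectified foot excursion, arbitrary units

# Auditory envelope consistent with the footsteps, plus noise
audio_consistent = np.abs(np.sin(step_phase)) + 0.3 * rng.standard_normal(t.size)
# Same frequency, but shifted so sound peaks fall between foot impacts
audio_wrong_phase = np.abs(np.sin(step_phase + np.pi / 2)) + 0.3 * rng.standard_normal(t.size)

def cross_modal_correlation(x, y):
    """Pearson correlation between two modality-specific time series."""
    return np.corrcoef(x, y)[0, 1]

print("consistent footsteps:  ", cross_modal_correlation(visual_foot_motion, audio_consistent))
print("same rate, wrong phase:", cross_modal_correlation(visual_foot_motion, audio_wrong_phase))

Running this prints a strong positive correlation for the consistent sound and a markedly lower (here negative) correlation for the wrong-phase sound, capturing in toy form why phase, not just frequency, determines whether the cross-modal statistics line up.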


