
The social brain hypothesis implies that humans and other primates evolved "modules" for representing social knowledge. Alternatively, no such cognitive specializations are needed because social knowledge is already present in the world: we can simply monitor the dynamics of social interactions. Given the latter idea, what mechanism could account for coalition formation? We propose that statistical learning can provide a mechanism for fast and implicit learning of social signals. Using human participants, we compared learning of social signals with learning of arbitrary signals. We found that learning of social signals was no better than learning of arbitrary signals. Coupling faces and voices led to parallel learning, but the same was true for arbitrary shapes and sounds. However, coupling versus uncoupling social signals with arbitrary signals revealed that faces and voices are treated with perceptual priority. Overall, our data suggest that statistical learning is a viable domain-general mechanism for learning social group structure.

Keywords: social brain; embodied cognition; distributed cognition; situated cognition; multisensory; audiovisual speech; crossmodal; multimodal