Multimodality, the Parallel Architecture, and a New Linguistic Paradigm, Neil Cohn (Tilburg University)
The last decades have seen a growing recognition of the importance of multimodality for the understanding of language. Yet the full implications of multimodality have been stymied by pervasive notions of language as an amodal and arbitrary system with speech as its primary modality, ideas which have persisted for over a century. As argued in our new book A Multimodal Language Faculty (Cohn & Schilperoord 2024), this conception of language cannot account for many of the basic observations revealed in the past decades of language research. I will present an expansion of Jackendoff’s Parallel Architecture into a multimodal model of language that accounts for all unimodal behaviors across the vocal, bodily, and graphic modalities, as well as their multimodal combinations. This “grand unified” model directly allows for semiotic promiscuity across iconicity, indexicality, and symbolicity, while warranting new understandings of linguistic innateness, relativity, universals, and evolution. Altogether, this approach heralds a shift to a Multimodal Paradigm in the language sciences, re-imagining both the cognitive faculty underlying language and the notion of “language” itself.