PSYCHOLOGY DEPARTMENT INVITED SPEAKER
Human language is a fundamental biological signal with computational properties that differ from other perception-action systems: hierarchical relationships between sounds, words, phrases, and sentences, and the unbounded ability to combine smaller units into larger ones, resulting in a "discrete infinity" of expressions that are often compositional. These properties have long made language hard to account for from a biological-systems perspective and within models of cognition. In this talk, I synthesize insights from the language sciences, computation, and neuroscience that center on the idea that time can be used to combine and separate representations. I describe how a well-supported computational model from a related area of cognition capitalizes on time and rhythm in computation, and how neuroscientific experiments can then be used to determine the computational bounds on artificial neural network models. I offer examples of this approach from cognitive neuroimaging data and computational simulations, including work that leverages other existing models. I conclude by outlining a developing theory of how language is represented in the brain, one that integrates basic insights from linguistics and psycholinguistics with the currency of neural computation.