What we can learn depends on what we already know; a child who can’t count cannot learn arithmetic. Just as in other domains of learning, language acquisition is incremental. Children gradually grow their grammars over the course of development by building on prior knowledge in order to learn from their linguistic input, as they currently represent it. My research investigates how children at the earliest stages of syntax and semantics acquisition represent their input and how they learn from it. Using a combination of behavioral and computational methods, I ask how linguistic, cognitive, and conceptual development interact throughout this process.
Noise-tolerant syntax learning
At early stages of grammatical development, children’s representations of their linguistic input are immature, incomplete, and sometimes inaccurate. Acquiring any piece of grammatical knowledge therefore requires mechanisms for abstracting away from messiness in the data, as a child perceives it. One strand of my research investigates these mechanisms computationally, focusing on the case study of early syntax learning. We model learning as selection among restrictive hypotheses, embedded within a system that also produces “noise” relative to the phenomenon being acquired.
- Perkins & Hunter (under review). “Modeling regularization in language acquisition as noise-tolerant grammar selection.” [lingbuzz]
- Perkins & Hunter (2023). “Noise-tolerant learning as selection among deterministic grammatical hypotheses.” Proceedings of the 6th meeting of the Society for Computation in Linguistics (SCiL 2023).
- Maitra & Perkins (2023). “Filtering input for learning constrained grammatical variability: The case of Spanish word order.” Proceedings of the 6th meeting of the Society for Computation in Linguistics (SCiL 2023).
- Perkins, Feldman, & Lidz (2022). “The power of ignoring: Filtering input for argument structure acquisition.” Cognitive Science.
- Schneider, Perkins, & Feldman (2020). “A noisy channel model for systematizing unpredictable input variation.” Proceedings of the 44th annual Boston University Conference on Language Development (BUCLD 44).
- Perkins, Feldman, & Lidz (2017). “Learning an input filter for argument structure acquisition.” Proceedings of the 7th Workshop on Cognitive Modeling and Computational Linguistics.
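The core idea of noise-tolerant grammar selection can be illustrated with a toy sketch. Everything here is a hypothetical simplification for exposition, not the actual models in the papers above: the learner scores each deterministic grammar under a likelihood that mixes grammar-generated data with a uniform noise process, so a restrictive grammar can still win even when some of the input falls outside what it licenses.

```python
import math
from itertools import product

# Toy hypothesis space: each "grammar" is just the set of word orders it
# licenses. (Hypothetical grammars and data, chosen only for illustration.)
GRAMMARS = {
    "SVO-only": {"SVO"},
    "SOV-only": {"SOV"},
    "free-order": {"SVO", "SOV", "VSO"},
}
ALL_ORDERS = ["SVO", "SOV", "VSO"]

def log_likelihood(data, licensed, epsilon):
    """Score the data under a deterministic grammar plus a noise process:
    with probability (1 - epsilon) a sentence is drawn uniformly from the
    licensed set; with probability epsilon it is uniform noise."""
    ll = 0.0
    for sentence in data:
        p_grammar = (1 / len(licensed)) if sentence in licensed else 0.0
        p_noise = 1 / len(ALL_ORDERS)
        ll += math.log((1 - epsilon) * p_grammar + epsilon * p_noise)
    return ll

def select_grammar(data, epsilons=(0.01, 0.05, 0.1, 0.25, 0.5)):
    """Pick the (grammar, noise-rate) pair with the highest likelihood.
    The noise process absorbs exceptional data, so the learner need not
    retreat to the least restrictive grammar."""
    return max(
        product(GRAMMARS, epsilons),
        key=lambda pair: log_likelihood(data, GRAMMARS[pair[0]], pair[1]),
    )

# 90% SVO input with a sprinkling of other orders: the restrictive
# SVO-only grammar beats free-order, treating the exceptions as noise.
data = ["SVO"] * 18 + ["SOV", "VSO"]
grammar, eps = select_grammar(data)
```

Running this on the 90%-SVO sample selects the restrictive "SVO-only" grammar with a small inferred noise rate, rather than the free-order grammar that licenses everything.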
Acquiring non-local syntactic dependencies
Another strand of my research investigates how infants identify syntactic dependencies between non-local expressions, such as wh-dependencies, and how this acquisition process interacts with their acquisition of verb argument structure. These dependencies take various forms cross-linguistically, so learners must discover the form they take in the particular language they are exposed to. I use behavioral methods to diagnose infants’ early dependency representations, and computational methods to model how they acquire them.
- Perkins, Feldman, & Lidz (under review). “Mind the Gap: Learning the surface forms of movement dependencies.” [lingbuzz]
- Perkins & Lidz (2021). “Eighteen-month-old infants represent nonlocal syntactic dependencies.” Proceedings of the National Academy of Sciences.
- Perkins & Lidz (2020). “Filler-gap dependency comprehension at 15 months: The role of vocabulary.” Language Acquisition.
- Hirzel, Perkins, & Lidz (2020). “19-month-olds represent and incrementally parse filler-gap dependencies.” Poster presented at the 33rd Annual CUNY Human Sentence Processing Conference.
- Perkins (2019). “How grammars grow: Argument structure and the acquisition of non-basic syntax.” Doctoral dissertation, University of Maryland.
Relating scene and sentence percepts in verb learning
If syntactic categories are related in a principled way to conceptual categories, then learners might be able to leverage these correlations as an initial bootstrap into the target grammar. Another collection of projects studies how these correlations inform children’s early inferences about verb meaning. Specifically, I ask how children expect their representations of the argument structure of sentences to correspond to their representations of the participant structure of events.
- Perkins, Knowlton, Williams, & Lidz (2024). “Thematic content, not number matching, drives syntactic bootstrapping.” Language Learning and Development.
- Khlystova, Williams, Lidz, & Perkins (2024). “Visual perception supports 4-place event representations: A case study of TRADING.” Proceedings of the Annual Meeting of the Cognitive Science Society, Volume 46 (CogSci 2024).
- Mateu, Perkins, & Hyams (2023). “Learning unaccusativity: Evidence for split intransitivity in child Spanish.” Proceedings of the 97th Linguistic Society of America Annual Meeting.
- Williams, Perkins, He, Björnsdóttir, & Lidz (2017). “A new test of one-to-one matching between arguments and participants in verb learning.” Poster presented at the 42nd Boston University Conference on Language Development (BUCLD 42), Boston.