Jackson Petty and Bob Frank present at the BU Conference on Language Development

November 16, 2021

At the recent (virtual) BU Conference on Language Development (November 4-7, 2021), Jackson Petty and Bob Frank presented their work on “Learning structure-role alignments without linguistic bias: A computational exploration”. The paper studies the degree to which modern neural network-based language models acquire knowledge of argument-structure alternations from vast amounts of linguistic experience alone, without any innate language-specific constraints. Their results show that trained language models like BERT do come to know the regular correspondence between active and passive sentences, but struggle with more lexically idiosyncratic alternations such as dative shift.
