### Research Interests

My primary research interests lie in connecting mathematical and computational linguistics with mainstream syntax. Mathematical approaches can complement mainstream approaches, for example by identifying the place of human language in the Chomsky hierarchy, and they can also clarify existing theories by forcing their formulation to be completely explicit.

### Projects

**Semantic parsing into Abstract Meaning Representations**

- Joint work with Jonas Groschwitz, Matthias Lindemann, and Alexander Koller
- A constrained graph algebra for semantic parsing with AMRs (IWCS 2017): a typed algebra (the *Apply-Modify (AM) Algebra*) for compositionally building and decomposing AMR graphs
- AMR dependency parsing: treating AMR composition with the AM algebra as a form of dependency parsing

**Mathematical models of syntax**

**Parsing Minimalist Languages with Interpreted Regular Tree Grammars**

- TAG+ 2017 paper
- Joint work with Alexander Koller
- Using IRTG parsing, we show that MG parsing complexity is O(n^{2k+3}), where k is the number of licensing features, rather than O(n^{4k+4}) as previously thought

**Minimalist Grammars with Adjunction** - A model of adjuncts that accounts for both optionality and Cinquesque ordering.
**Multidominant Minimalist Grammars** - A minimalist grammar that generates multidominant graphs, a structure that retains information about the derivation, allowing moved constituents to be interpreted correctly in each of their positions.

**Parsing and statistical learning**

- I implemented an inside-outside expectation-maximisation CKY parser/learner based on Lari & Young (1990), aided by presentations by Collins, Prescher, and Eisner. Given a context-free grammar and a corpus, it estimates the probability of each rule.
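The core of such a learner is the inside probability computed over a CKY chart. The sketch below is not the parser described above, just a minimal illustration of inside probabilities for a toy PCFG in Chomsky normal form; the grammar, rule probabilities, and example sentence are all made up:

```python
# Minimal sketch of inside probabilities, the quantity that drives
# Lari & Young-style inside-outside EM. Toy grammar for illustration.
from collections import defaultdict

# Binary rules: (parent, left child, right child) -> probability
binary = {
    ("S", "NP", "VP"): 1.0,
    ("VP", "V", "NP"): 1.0,
}
# Lexical rules: (parent, word) -> probability
lexical = {
    ("NP", "birds"): 0.5,
    ("NP", "songs"): 0.5,
    ("V", "copy"): 1.0,
}

def inside(words):
    """CKY-style inside chart: chart[(i, j, A)] = P(A =>* words[i:j])."""
    n = len(words)
    chart = defaultdict(float)
    # Width-1 spans come from lexical rules.
    for i, w in enumerate(words):
        for (a, word), p in lexical.items():
            if word == w:
                chart[(i, i + 1, a)] += p
    # Wider spans sum over all split points and binary rules.
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for (a, b, c), p in binary.items():
                    chart[(i, j, a)] += p * chart[(i, k, b)] * chart[(k, j, c)]
    return chart

chart = inside(["birds", "copy", "songs"])
print(chart[(0, 3, "S")])  # sentence probability under S: 0.25
```

In the full EM learner, the matching outside probabilities are combined with these inside scores to get expected rule counts, which are then renormalised into new rule probabilities.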

- I extended the inside-outside algorithm to handle copying. The algorithm is being used to test the hypothesis that some birds, such as the California Thrasher, have overt copying in their songs, which would make their song patterns context-sensitive. This is a project with biologist Charles Taylor, linguist Ed Stabler, and neuroscientist Floris van Vugt.
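The copy pattern at issue is the classic copy language {ww}, the textbook example of a pattern beyond context-free power. A minimal (made-up) illustration of what "overt copying" means for a string:

```python
# Toy illustration of the copy language {ww}: strings whose second
# half repeats the first half exactly. Recognising this pattern over
# an unbounded vocabulary is beyond context-free power.
def is_copy(s: str) -> bool:
    """True iff s = w + w for some (possibly empty) string w."""
    if len(s) % 2 != 0:
        return False
    mid = len(s) // 2
    return s[:mid] == s[mid:]

print(is_copy("abab"))  # True: "ab" + "ab"
print(is_copy("abba"))  # False: a mirror image, which IS context-free
```

The contrast in the last two lines is the crucial one: mirror-image (palindromic) dependencies are context-free, while copying is not, which is why evidence of overt copying in birdsong would be significant.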

**Formal Language Theory**, especially subregular tree languages

### Other Research

**Artificial language learning of syntax** - I ran an ALL study to see whether people generalise from limited to indefinite word-level repetition. I found that they do, but with quite a lot of variability in the response pattern. Next steps are (1) category-level repetition and (2) generalisation from optionality to repetition.

**Learnability of optionality and repetition** - As adjuncts are classically optional and often repeatable, it is useful to see how formal learning algorithms approach optionality and repetition. Some learners treat optionality and repetition as a unified phenomenon.

**Ergativity**- I'm working with Lisa Travis, Jessica Coon, and Richard Compton in the McGill Ergativity Lab. We want to understand the properties of ergative languages. We are starting by putting together a survey probing the relevant properties of as many languages as we can.

**Field work**

- **Q'anjob'al** (Mayan)
- **Chuj** (Mayan)

**Multiple Multiple Spellout** - A theory of spellout accounting for free word order, PIC effects, and CED effects.

- Bare Grammar approach to **free word order** in Tagalog

- Output-output correspondence in French liaison

- Mathematical structure of c-command