mfowlie [at] coli [dot] uni-saarland [dot] de
IPA: [ˈmeɪɡən ˈfaʊli]
I am a post-doc at Universität des Saarlandes in Saarbrücken, Germany, under the supervision of Alexander Koller. I am currently focused on two projects: deep learning and Abstract Meaning Representations (AMRs). Deep learning trains many-layered neural networks, and my ultimate goal is to understand what linguistic representations, if any, arise inside a complex neural network as it analyses language. For AMRs, a recent paper with Jonas Groschwitz, Alexander Koller, and Mark Johnson (IWCS 2017) presents an algebra for building AMRs compositionally that is constrained enough to be used in grammar induction, which we are now pursuing with Antoine Venant and Christoph Teichmann.
My primary research interests lie in connecting mathematical and computational linguistics with mainstream syntax. Mathematical approaches can complement mainstream approaches, for example by identifying the place of human language in the Chomsky hierarchy, and can also clarify existing theories by forcing their formulation to be completely explicit.
I also work on Minimalist Grammars (MGs), Ed Stabler's (1997 and subsequent work) formalisation of Chomsky's (1995 and subsequent work) Minimalism. MGs are of interest because they define the right general class of languages (the multiple context-free languages, MCFLs), they are an efficient and intuitive formalism, and, as formalisations of the current work of a great many syntacticians, they provide a bridge to mainstream syntax.
I have a PhD in linguistics from UCLA. My dissertation is about adjunction: I consider how best to model adjuncts in Minimalism, what formal properties these models have, and how adjuncts can be learned. I approach learnability both in the mathematical, algorithmic sense and experimentally.
Click here for the latest news.