
Lecture series by Joe Pater, Visiting Professor hosted by the LabEx

Updated 26 February 2015

As part of the Visiting Professor seminars organized by the LabEx Empirical Foundations of Linguistics, we had the pleasure of hosting Professor Joe Pater of the Department of Linguistics at the University of Massachusetts (UMass).

The lectures took place on Fridays 12 December (14:00-16:00), 9 January (14:00-16:00), 16 January (14:30-16:30), and 23 January (14:30-16:30) at the ILPGA, 19 rue des Bernardins, 75005 Paris, salle Rousselot (1st floor).

Program:

  • Class 1 – 12/12: Weighted constraint models of phonology and learning. In this class, I'll start by introducing weighted constraint models of phonology, primarily using data from French schwa alternations. I won't assume a background in Optimality Theory, and will explain why violable constraints are useful in analyzing the French data before moving on to weighted constraints, including a probabilistic MaxEnt model that can handle variation in schwa alternations. I'll provide a relatively non-technical introduction to the computational modeling of learning with a MaxEnt model, and then discuss an artificial language learning experiment that provides support for the predictions of a MaxEnt phonotactic learning model.

The last part of the class will include material from this paper:

Moreton, Elliott, Joe Pater and Katya Pertsova. 2014. Phonological concept learning. Ms, University of North Carolina and University of Massachusetts Amherst.
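The core mechanism of a MaxEnt grammar can be sketched in a few lines: harmony is the negative weighted sum of constraint violations, and each candidate's probability is proportional to the exponential of its harmony. The constraint names, weights, and violation counts below are purely illustrative placeholders, not the actual analysis of French schwa from the lecture.

```python
import math

# Illustrative MaxEnt grammar: hypothetical constraints and weights,
# not the lecture's actual analysis of French schwa alternations.
weights = {"Max": 2.0, "*Schwa": 1.0}

# Two hypothetical output candidates for an input with an optional schwa,
# with their constraint violation counts.
candidates = {
    "with-schwa": {"Max": 0, "*Schwa": 1},    # schwa pronounced
    "schwa-deleted": {"Max": 1, "*Schwa": 0}, # schwa deleted
}

def maxent_probs(candidates, weights):
    # Harmony = negative weighted sum of violations; P(c) ∝ exp(harmony)
    harmonies = {c: -sum(weights[k] * n for k, n in viols.items())
                 for c, viols in candidates.items()}
    z = sum(math.exp(h) for h in harmonies.values())  # normalizing constant
    return {c: math.exp(h) / z for c, h in harmonies.items()}

probs = maxent_probs(candidates, weights)
```

With these toy weights the schwa-ful candidate is preferred, but the deleted variant retains nonzero probability, which is how a MaxEnt model captures variation rather than a single winner.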

  • Class 2 – 09/01: More on artificial language learning. Artificial language learning has recently become a very popular technique for studying the predictions of models of phonology and its learning. We'll survey some of the results of that literature, and discuss the promise and challenges of studying learning using web-based experiments. We'll also discuss the insights that ERP methods can give us into the kind of knowledge that is acquired in a lab experiment.
  • Class 3 – 23/01: Iterated learning. This class presents the initial results of a program of research that incorporates learning into typological modeling through iterated learning. Given a theory of grammar and a theory of learning, we'll examine how multiple rounds of learning (across or within generations) can shape the predicted typology. I'll show how typological prediction can be extended into territories generally hostile to grammatical modeling: probabilistic and systemic generalizations (e.g. feature economy). I'll also discuss results relevant to the choice between ranked and weighted constraints for typological modeling. This class will cover some of the material in Pater and Moreton (2012) and Staubs (2014), as well as some newer ongoing work.

Pater, Joe and Elliott Moreton. 2012. Structurally biased phonology: Complexity in learning and typology. In a special issue of the EFL Journal on phonology, edited by K.G. Vijayakrishnan (The Journal of the English and Foreign Languages University, Hyderabad), 1-44.

Staubs, Robert. 2014. Computational modeling of learning biases in stress typology. Doctoral dissertation, University of Massachusetts Amherst.
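The basic shape of an iterated-learning simulation can be sketched very simply. The setup below is an assumed, bare-bones stand-in (a single probabilistic variant estimated from a finite sample each generation), not the models of Pater and Moreton (2012) or Staubs (2014); it only illustrates the transmission loop itself.

```python
import random

# Bare-bones iterated learning sketch (assumed setup, for illustration):
# each generation estimates the probability of one variant from a finite
# sample of the previous generation's productions, then transmits data
# produced with that estimate.
def iterated_learning(p0, n_samples, n_generations, rng):
    p = p0
    history = [p]
    for _ in range(n_generations):
        successes = sum(rng.random() < p for _ in range(n_samples))
        p = successes / n_samples  # the next learner's estimated grammar
        history.append(p)
    return history

trajectory = iterated_learning(0.5, n_samples=20, n_generations=10,
                               rng=random.Random(1))
```

Because sampling noise compounds across generations, such chains tend to drift away from intermediate probabilities, which is one way repeated learning can reshape the typology a grammar-plus-learner pair predicts.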

  • Class 4 – 06/02: Serial Harmonic Grammar. This lecture introduces serial Harmonic Grammar, a version of Optimality Theory (OT; Prince and Smolensky 1993/2004) that reverses two of Prince and Smolensky's basic architectural decisions. One is their choice of constraint ranking over the numerically weighted constraints of its predecessor, Harmonic Grammar (HG; Legendre, Miyata and Smolensky 1990; see Smolensky and Legendre 2006 and Pater 2009 for overviews of subsequent work). The other is their choice of parallel evaluation over a version of OT in which the representation is changed and evaluated iteratively (Harmonic Serialism; Prince and Smolensky 1993/2004: ch. 2, McCarthy 2007 et seq.). I introduce serial HG with an analysis of syllabification in Imdlawn Tashlhiyt Berber (Dell and Elmedlaoui 1985, 1988, 2002), the same case that Prince and Smolensky use to introduce OT. This analysis illustrates advantages of both serialism and weighted constraints. I also discuss some of the positive consequences of the adoption of serialism for the typological predictions of HG, as well as some outstanding issues for further research on serial versions of both OT and HG.

It is based on the following paper: Pater, Joe. 2012. Serial Harmonic Grammar and Berber syllabification. In Toni Borowsky, Shigeto Kawahara, Takahito Shinya and Mariko Sugahara (eds.) Prosody Matters: Essays in Honor of Elisabeth O. Selkirk. London: Equinox Press. 43-72.
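The serial part of serial HG can be illustrated with a toy derivation: GEN proposes candidates one change away from the current form, EVAL keeps the one with the best harmony (negative weighted violations), and the derivation converges when no single change improves harmony. The constraints, weights, and one-step GEN below are hypothetical placeholders, not the Berber syllabification analysis.

```python
# Toy serial Harmonic Grammar derivation (illustrative constraints and
# GEN only, not the lecture's Berber analysis).
weights = {"Onset": 2.0, "Dep": 1.0}

def harmony(violations):
    # Harmony = negative weighted sum of constraint violations
    return -sum(weights[c] * n for c, n in violations.items())

def gen(form):
    # Hypothetical single change: epenthesize a glottal stop before a
    # vowel-initial form (adds a Dep violation, removes the Onset one).
    if form.startswith("a"):
        return [("ʔ" + form, {"Onset": 0, "Dep": 1})]
    return []

def derive(form, violations):
    # Apply the single best harmony-improving change until none remains.
    while True:
        current_h = harmony(violations)
        improvements = [(c, v) for c, v in gen(form) if harmony(v) > current_h]
        if not improvements:
            return form  # convergence: no change improves harmony
        form, violations = max(improvements, key=lambda cv: harmony(cv[1]))

result = derive("apa", {"Onset": 1, "Dep": 0})  # -> "ʔapa"
```

Here epenthesis raises harmony from -2.0 to -1.0, so one step applies; no further change improves harmony, so the derivation halts, mirroring the step-and-converge logic of Harmonic Serialism with weighted rather than ranked constraints.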

See online: http://www.labex-efl.org/?q=fr/node/281