A programme of talks on Tensors, their Decompositions, and Applications will take place at Queen Mary University of London on the afternoon of Tuesday, 16 August, 2016. Talks will be held in room 2.41 of the Francis Bancroft building.
| Time | Speaker | Title |
|------|---------|-------|
| 13:00 | Massimiliano Pontil (UCL) | Some tensor decomposition methods for machine learning |
| 14:15 | Elina Robeva (Berkeley → MIT) | Orthogonal Tensor Decomposition |
| 15:30 | Tea in the Bancroft building foyer | |
| 16:00 | Dimitrios Kartsaklis (QMUL) | Tensor-based Models of Natural Language Semantics |
| 17:15 | Drinks in the Senior Common Room bar | |
| 18:00 | Dinner at Ariana | |
If you would like to attend the dinner, please email Alex by 15 August (the day before the meeting) so that adequate space can be reserved.
We thank the EPSRC for funding this meeting through grant EP/M01245X/1, *Algebra and geometry of matroids*.
The nearest tube stations to Queen Mary are Stepney Green and Mile End. They are about equally close: roughly a seven-minute walk in either case. Both stations lie on Mile End Road, onto which Queen Mary fronts from the north side.
The Francis Bancroft building is number 31 on this campus map. Do not confuse it with the Bancroft Road building! It is a four-storey building in yellow brick with red-brick accents and blue window frames, recognisable by the outdoor seating for Mucci's restaurant. It is visible to the north here.
Room 2.41 is on the second floor: to your left as you exit the stairs (to your right if you've taken the lifts), most of the way along the main corridor and then down a side corridor. The signage within the building is ample.
Dimitrios Kartsaklis, Tensor-based Models of Natural Language Semantics. Slides.
Joint work with M. Sadrzadeh and B. Coecke.
Tensor-based models of natural language semantics provide a conceptually motivated procedure for computing the meaning of a sentence, given its grammatical structure and a vectorial representation of the meaning of its parts. The main characteristic of these models is that words of a relational nature, such as adjectives and verbs, become (multi-)linear maps acting on vectors representing words of atomic types, e.g. nouns and noun phrases. On the practical side, the tensor-based framework has proved useful in a number of NLP tasks. On the theoretical side, its rigorous mathematical foundations provide a test-bed for studying compositional aspects of language at a level deeper than most practically oriented approaches allow; for example, mathematical structures such as Frobenius algebras and bialgebras have been used to explicate functional words such as relative pronouns, to model linguistic phenomena such as coordination and intonation, and to provide accounts of quantification in distributional models. Furthermore, the deep structural similarity of the framework to concepts that explain the behaviour of quantum-mechanical systems has enabled a unique perspective on language-related problems, such as lexical ambiguity and entailment, by lifting the model to the realm of density operators and completely positive maps via Selinger's CPM construction. This talk aims to provide a comprehensive introduction to this emerging field by presenting the mathematical foundations, discussing important extensions and recent work, and (time permitting) touching on implementation issues and practical applications.
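The idea that relational words act as multilinear maps can be sketched concretely. The following toy example (my own illustration, not code from the talk, with made-up two-dimensional vectors) treats a transitive verb as an order-3 tensor that contracts with subject and object vectors to produce a sentence vector:

```python
import numpy as np

# Hypothetical 2-dimensional noun space; the vectors are invented for illustration.
dogs = np.array([0.9, 0.1])
cats = np.array([0.2, 0.8])

# A transitive verb as an order-3 tensor V, so that
# sentence_k = sum_{i,j} V[i, j, k] * subject_i * object_j.
chase = np.random.default_rng(0).random((2, 2, 2))

def compose(verb, subj, obj):
    """Contract the verb tensor with the subject and object vectors."""
    return np.einsum('ijk,i,j->k', verb, subj, obj)

s1 = compose(chase, dogs, cats)  # "dogs chase cats"
s2 = compose(chase, cats, dogs)  # "cats chase dogs"
print(s1, s2)  # distinct sentence vectors: the composition is order-sensitive
```

Because the contraction treats the subject and object slots differently, swapping them generally yields a different sentence vector, which is exactly the grammatical sensitivity these models are designed to capture.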
Massimiliano Pontil, Some tensor decomposition methods for machine learning. Slides.
Elina Robeva, Orthogonal Tensor Decomposition. Slides.
A symmetric tensor is orthogonally decomposable if it can be written as a linear combination of tensor powers of n orthonormal vectors. Such tensors are interesting because their decomposition can be found efficiently. We study their spectral properties and give a formula for all of their eigenvectors. We also give equations defining all real symmetric orthogonally decomposable tensors. Analogously, we study nonsymmetric orthogonally decomposable tensors, describing their singular vector tuples and giving polynomial equations that define them. Further, we give a description of the variety of orthogonally decomposable tensors.
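The claim that the decomposition of such tensors can be found efficiently can be illustrated with the tensor power method. The following sketch (my own illustration, not the speaker's code) builds a symmetric orthogonally decomposable order-3 tensor T = Σᵢ λᵢ vᵢ⊗vᵢ⊗vᵢ from orthonormal vectors vᵢ and recovers one of the vᵢ by repeated application of the map x ↦ T(I, x, x):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3

# Orthonormal vectors (columns of Q) and positive coefficients.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
lam = np.array([3.0, 2.0, 1.0])

# Symmetric odeco tensor T = sum_i lam[i] * v_i (x) v_i (x) v_i.
T = sum(lam[i] * np.einsum('a,b,c->abc', Q[:, i], Q[:, i], Q[:, i])
        for i in range(n))

def power_iterate(T, iters=200, seed=2):
    """Tensor power iteration: x <- T(I, x, x) / ||T(I, x, x)||."""
    x = np.random.default_rng(seed).standard_normal(T.shape[0])
    x /= np.linalg.norm(x)
    for _ in range(iters):
        x = np.einsum('abc,b,c->a', T, x, x)
        x /= np.linalg.norm(x)
    return x

v = power_iterate(T)
overlaps = np.abs(Q.T @ v)
print(overlaps)  # one entry near 1, the others near 0
```

For a generic starting point the iteration converges to one of the vᵢ, which is an eigenvector of T in the sense described in the abstract; deflating (subtracting λᵢ vᵢ⊗vᵢ⊗vᵢ) and repeating recovers the full decomposition.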