[Forum SIS] One-day workshop on Algebraic Statistics

Maria Piera Rogantin rogantin at dima.unige.it
Thu 29 Dec 2016 16:06:43 CET


Good morning everyone,

Please note that on Monday 23 January a workshop on Algebraic Statistics will be held at the Department of Mathematics of the University of Genoa, room 705, with the programme given below.
All who are interested are cordially invited to attend.

Kind regards,
Eva Riccomagno  e Maria Piera Rogantin

10:30 Gherardo Varando, Computational Intelligence Group, Department of Artificial Intelligence, Technical University of Madrid http://cig.fi.upm.es/CIGmembers/gherardo_varando
TITLE: Polynomial representations of Bayesian network classifiers
ABSTRACT: We briefly review our previous work on how to define families of polynomials that sign-represent the decision functions induced by Bayesian network classifiers (BNCs). We show how to use those representations to bound the number of decision functions representable by BNCs and to study some extensions of BNCs to multi-label classification problems. We then discuss the connections with algebraic statistics, in particular with the algebraic representation of discrete probability and conditional independence models. We conclude with some possible future lines of research.
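To make the sign-representation idea concrete, here is a minimal sketch for the simplest BNC, a naive Bayes classifier over binary features; all names and parameter values below are illustrative assumptions, not code from the talk. The induced decision function coincides with the sign of a degree-1 polynomial whose coefficients are log-odds of the network's conditional probability tables.

import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 4                                        # number of binary features (illustrative)
prior = 0.6                                  # P(C = 1), illustrative value
theta = rng.uniform(0.1, 0.9, size=(n, 2))   # theta[i, c] = P(X_i = 1 | C = c)

def nb_decision(x):
    # Exact MAP decision of the naive Bayes classifier, returned as -1/+1.
    s1 = np.log(prior) + sum(np.log(theta[i, 1] if xi else 1 - theta[i, 1])
                             for i, xi in enumerate(x))
    s0 = np.log(1 - prior) + sum(np.log(theta[i, 0] if xi else 1 - theta[i, 0])
                                 for i, xi in enumerate(x))
    return 1 if s1 > s0 else -1

# Degree-1 polynomial p(x) = b + <w, x> that sign-represents the decision.
b = np.log(prior / (1 - prior)) + np.log((1 - theta[:, 1]) / (1 - theta[:, 0])).sum()
w = np.log(theta[:, 1] / theta[:, 0]) - np.log((1 - theta[:, 1]) / (1 - theta[:, 0]))

for x in itertools.product([0, 1], repeat=n):
    assert np.sign(b + w @ np.array(x)) == nb_decision(x)
print("sign(p(x)) agrees with the classifier on all", 2 ** n, "inputs")

General BNCs need higher-degree polynomials; the naive Bayes case is the degree-1 instance of the construction.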

11:30 Luigi Malagò, Romanian Institute of Science and Technology - RIST
TITLE: On the Geometry of Neural Networks: from Amari's 20-year-old natural gradient to the present day
ABSTRACT: Artificial neural networks consist of interconnected computational units, called neurons, which act as mathematical functions. From a mathematical perspective, a neural network is commonly defined as a composition of non-linear weighted sums, which transforms input vectors into output vectors. Equivalently, from a probabilistic perspective, given an unknown probability distribution for the inputs, the network encodes a parametric conditional probability distribution, whose parameters are given by the connection weights. Information Geometry studies the geometry of statistical models with tools from differential and Riemannian geometry. Not surprisingly, one of the first applications of Information Geometry was related to the training of neural networks using the natural gradient. Neural networks were quite promising in the machine-learning community of the 1980s, but other methods later gradually became more popular. More recently, starting from 2011-12, neural networks and their use in deep learning architectures have become prominent, thanks to their state-of-the-art performance in several different fields. In this talk we review the geometry of neural networks from the perspective of Information Geometry, almost 20 years after its conception, and we present some more recent applications of natural gradient-based methods to the training of deep neural networks.
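As a pointer to what natural-gradient training looks like in the simplest possible case, here is a minimal sketch (an editorial illustration under stated assumptions, not the speaker's code): maximum-likelihood fitting of a univariate Gaussian, where the Fisher information matrix in the (mu, sigma) coordinates is diag(1/sigma^2, 2/sigma^2), so the natural gradient is the Euclidean gradient preconditioned by its inverse. The learning rate and iteration count are arbitrary choices.

import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=2.0, scale=3.0, size=5000)   # synthetic sample

mu, sigma = 0.0, 1.0     # initial model N(mu, sigma^2)
lr = 0.1                 # step size (illustrative)

for _ in range(200):
    # Euclidean gradient of the average log-likelihood in (mu, sigma).
    g_mu = np.mean(data - mu) / sigma**2
    g_sigma = np.mean((data - mu) ** 2) / sigma**3 - 1.0 / sigma
    # Fisher information of N(mu, sigma^2) is diag(1/sigma^2, 2/sigma^2);
    # the natural gradient rescales the Euclidean one by its inverse.
    mu += lr * sigma**2 * g_mu
    sigma += lr * (sigma**2 / 2.0) * g_sigma

print(f"fitted mu = {mu:.3f}, sigma = {sigma:.3f}")   # close to 2 and 3

The preconditioning makes the update invariant to reparametrization of the model, which is the property that motivates natural-gradient methods for deep networks as well.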

14:00 Giovanni Pistone, de Castro Statistics, Collegio Carlo Alberto, Moncalieri
TITLE: Geometries of the Gaussian Model
ABSTRACT: In Information Geometry it is possible to define a number of different geometrical structures on the full Gaussian model: the Fisher-Rao Riemannian manifold (L.T. Skovgaard 1981), the Wasserstein Riemannian manifold (A. Takatsu 2011), and the exponential and mixture affine manifolds (G. Pistone & C. Sempi 1995). We discuss the features of these geometries, including second-order properties (e.g. Hessians), with special emphasis on the Wasserstein case. The latter turns out to be a special case of a more general set-up introduced in 2001 by F. Otto. Applications, due to L. Malagò (work in progress), are presented.
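For orientation, two of the closed forms behind these geometries, added here for the reader's convenience (standard facts; the notation is ours, not the speaker's): the Fisher-Rao metric on the univariate model N(mu, sigma^2), and the L^2-Wasserstein distance on the full multivariate Gaussian model.

ds^2_{FR} = \frac{d\mu^2 + 2\,d\sigma^2}{\sigma^2},
\qquad
W_2^2\bigl(N(m_1,\Sigma_1),\,N(m_2,\Sigma_2)\bigr)
  = \|m_1 - m_2\|^2
  + \operatorname{tr}\Bigl(\Sigma_1 + \Sigma_2
  - 2\bigl(\Sigma_2^{1/2}\Sigma_1\Sigma_2^{1/2}\bigr)^{1/2}\Bigr).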




