[Forum SIS] Course by Prof. Tamara Broderick (MIT), PhD in Statistics, Bocconi

Sonia Petrone sonia.petrone a unibocconi.it
Tue 9 Jan 2018 10:49:53 CET


Dear colleagues, 
   I am re-sending the announcement of the doctoral course (PhD in Statistics, Bocconi University) on "Variational Bayesian methods and beyond: Bayesian inference for big data", January 9-17, 2018, and of the seminar on January 11, 2018, both given by Professor Tamara Broderick (MIT). 
Best regards, 
Sonia Petrone
-----
We are pleased to inform you that Professor Tamara Broderick (MIT, Massachusetts Institute of Technology) will be at Bocconi from January 9 to January 17, 2018, as a visiting professor within the Bocconi PhD Program in Statistics. 
 
Professor Broderick will teach a 12-hour course within the PhD in Statistics on:
“Variational Bayesian methods and beyond: Bayesian inference for big data”
Contents:
Variational Bayes, mean-field variational Bayes, latent Dirichlet allocation, stochastic gradient and stochastic variational inference, streaming and distributed methods, automatic differentiation and black-box variational inference, linear response variational Bayes, robustness quantification
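To give a flavor of the first topics on the list, here is a minimal sketch of mean-field variational Bayes for a conjugate Gaussian model with unknown mean and precision, using the standard coordinate-ascent updates. This is an illustrative example, not course material; all variable names and the toy data are my own.

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy data: true mean 5.0, true precision 1/4.
x = rng.normal(loc=5.0, scale=2.0, size=1000)
N, xbar, xsq = x.size, x.mean(), (x ** 2).sum()

# Priors: mu ~ N(mu0, (lam0 * tau)^-1), tau ~ Gamma(a0, b0).
mu0, lam0, a0, b0 = 0.0, 1.0, 1.0, 1.0

# Mean-field factorization q(mu, tau) = q(mu) q(tau);
# iterate the coordinate-ascent updates until they stabilize.
E_tau = 1.0  # initial guess for E_q[tau]
for _ in range(50):
    # Update q(mu) = N(mu_n, 1/lam_n).
    mu_n = (lam0 * mu0 + N * xbar) / (lam0 + N)
    lam_n = (lam0 + N) * E_tau
    E_mu, E_mu2 = mu_n, mu_n ** 2 + 1.0 / lam_n
    # Update q(tau) = Gamma(a_n, b_n) using moments of q(mu).
    a_n = a0 + (N + 1) / 2.0
    b_n = b0 + 0.5 * (lam0 * (E_mu2 - 2 * E_mu * mu0 + mu0 ** 2)
                      + xsq - 2 * E_mu * N * xbar + N * E_mu2)
    E_tau = a_n / b_n

print(mu_n, E_tau)  # posterior mean near 5, E[tau] near 0.25
```

With enough data, the variational posterior mean approaches the sample mean and E_q[tau] approaches the reciprocal of the sample variance, as expected for this conjugate model.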

Lectures will be in the seminar room 3-E4-SR03 (DEC Dept) on:
January   9, 2018: lecture 1, 2:30-4 pm
January 10, 2018: lecture 2, 2:30-4 pm
January 11, 2018: lecture 3, 10:30 am-12 pm
-- January 11, 2018: Seminar --
January 15, 2018: lecture 4, 2:30-4 pm
January 16, 2018: lecture 5, 2:30-4 pm
January 17, 2018: lecture 6, 10:30 am-12 pm

Professor Broderick will give a seminar on January 11, 2018 at 12:30 pm on
          Automated Scalable Bayesian Inference via Data Summarization
(DEC seminar room 3-E4-SR03, Via Röntgen 1, 3rd floor, at 12:30 pm)
 
Abstract:
The use of Bayesian models in large-scale data settings is attractive because of the rich hierarchical relationships, uncertainty quantification, and prior specification they provide. Standard Bayesian inference algorithms are often computationally expensive, however, making their direct application to large datasets difficult or infeasible. We leverage the insight that data often exhibits redundancies to instead obtain a weighted subset of the data (called a coreset) that is much smaller than the original dataset. We can then use this small coreset in any number of existing posterior inference algorithms without modification. We provide theoretical guarantees on the size and approximation quality of the coreset. The proposed approach also permits efficient construction of the coreset in both streaming and parallel settings.
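To illustrate the coreset idea from the abstract: a weighted subset of the data can stand in for the full dataset in a likelihood computation. The sketch below uses naive uniform subsampling with importance weights, which is only a baseline illustration, not the sensitivity-based coreset construction the talk describes; the data and function names are assumptions of mine.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy data: 100,000 points from a normal distribution with mean 2.
data = rng.normal(loc=2.0, scale=1.0, size=100_000)

def log_likelihood(theta, x, weights=None):
    """(Weighted) Gaussian log-likelihood with unit variance at mean theta."""
    ll = -0.5 * (x - theta) ** 2
    if weights is None:
        return ll.sum()
    return (weights * ll).sum()

# Toy "coreset": a uniform subsample with weights N/m, so the weighted
# log-likelihood is an unbiased estimate of the full-data log-likelihood.
N, m = data.size, 500
idx = rng.choice(N, size=m, replace=False)
coreset, weights = data[idx], np.full(m, N / m)

theta = 2.0
full = log_likelihood(theta, data)
approx = log_likelihood(theta, coreset, weights)
print(full, approx)  # approx tracks full while using 0.5% of the data
```

Any posterior inference algorithm that consumes a log-likelihood can then be run on the small weighted set unchanged, which is the appeal of the coreset approach; the contribution of the work in the talk is constructing the weighted subset with guarantees, rather than sampling uniformly.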
 
For further information, please feel free to contact: angela.baldassarre a unibocconi.it

Best regards, 

Sonia Petrone
PhD in Statistics Program Director
Bocconi University, Milano 


