Working group on Deep Learning

Contact persons:

Purpose and Topics

The goal of this working group is to enhance the expertise of WIAS in the emerging field of deep learning. The main focus is on a theoretical understanding of the efficiency of the deep learning approach across different classes of practical problems.

List of talks

Date & time | Location | Speaker | Title
09.06.2016, 16:00 | HVP 11A, room 4.13 | Valery Avanesov | Deep learning: introduction
30.06.2016, 16:00 | MS 39, room 406 | Nazar Buzun | Deep Feedforward Networks
07.07.2016, 16:00 | MS 39, room 406 | Egor Klochkov | Regularization for Deep Learning
14.07.2016, 16:00 | MS 39, room 406 | Pavel Dvurechensky | Optimization for Training Deep Models
18.08.2016, 16:00 | MS 39, room 406 | Pavel Dvurechensky | Convolutional Networks
25.08.2016, 16:00 | MS 39, room 406 | Andzhey Koziuk | Sequence Modeling: Recurrent and Recursive Nets
01.09.2016, 16:00 | MS 39, room 406 | Nazar Buzun | Linear Factor Models
08.09.2016, 16:00 | MS 39, room 406 | Alexandra Suvorikova | Autoencoders
15.09.2016, 16:00 | MS 39, room 406 | Cancelled
22.09.2016, 16:00 | MS 39, room 406 | Andzhey Koziuk | Representation Learning
29.09.2016, 16:00 | MS 39, room 406 | No seminar
06.10.2016, 16:00 | MS 39, room 406 | Nazar Buzun | Bellman's principle for optimal control problems
10.10.2016, 16:00 | MS 39, room 406 | Prof. Vladimir Spokoiny | Dimension Reduction
17.10.2016, 16:00 | MS 39, room 406 | Prof. Vladimir Spokoiny | Dimension Reduction
24.10.2016, 16:00 | MS 39, room 406 | Prof. Vladimir Spokoiny | Dimension Reduction
31.10.2016, 16:00 | MS 39, room 406 | Egor Klochkov | Non-negative Matrix Factorization
07.11.2016, 16:00 | MS 39, room 406 | No seminar
14.11.2016, 16:00 | HVP 11A, room 4.01 | Prof. Vladimir Spokoiny | Dimension Reduction
21.11.2016, 16:00 | MS 39, room 406 | Cancelled
28.11.2016, 16:00 | HVP 11A, room 4.01 | Andzhey Koziuk | Instrumental Variables
05.12.2016, 16:00 | MS 39, room 406 | Christian Kröning | Semi-Supervised Learning
09.01.2017, 16:00 | MS 39, room 406 | Prof. Reinhold Schneider | Hierarchical tensor representations and deep (convolutional) networks
16.01.2017, 16:00 | MS 39, room 406 | Alexandra Carpentier | Non-linear Scattering
23.01.2017, 16:00 | MS 39, room 406 | Martin Eigel | Tensor Representations
30.01.2017, 16:00 | MS 39, room 406 | Christian Kröning | Semi-Supervised Learning (contd.)
20.02.2017, 16:00 | MS 39, room 406 | No seminar
27.02.2017, 16:00 | MS 39, room 406 | No seminar
13.03.2017, 16:00 | MS 39, room 406 | John Schoenmakers | Overview on regression methods for optimal stopping and control
21.04.2017, 15:00 | MS 39, room 406 | Andzhey Koziuk | Convolutional Sparse Coding
28.04.2017, 15:00 | MS 39, room 406 | Nazar Buzun | Long Short-Term Memory networks and their application to social network user classification
05.05.2017, 15:00 | MS 39, room 406 | Alexandra Suvorikova | Wasserstein Training of Restricted Boltzmann Machines
12.05.2017, 15:00 | MS 39, room 406 | Vladimir Spokoiny and Larisa Adamyan | Adaptive Weight Clustering
19.05.2017, 15:00 | MS 39, room 406 | No seminar
26.05.2017, 15:00 | MS 39, room 406 | Pavel Gurevich | Certainty quantification in neural networks with regression
02.06.2017, 15:00 | MS 39, room 406 | John Schoenmakers | Deep Learning for stochastic optimal stopping and control

Reading list

Software and Tutorials

  • Deep Learning Tutorials
    These tutorials introduce some of the most important deep learning algorithms and show how to run them using Theano, a Python library that makes writing deep learning models easy and offers the option of training them on a GPU.
  • Theano: A Python framework for fast computation of mathematical expressions
    Theano is a Python library that allows you to define, optimize, and evaluate mathematical expressions involving multi-dimensional arrays efficiently.
  • Keras: Deep Learning library for Theano and TensorFlow
    Keras is a minimalist, highly modular neural network library, written in Python and capable of running on top of either TensorFlow or Theano. It was developed with a focus on enabling fast experimentation: being able to go from idea to result with the least possible delay is key to doing good research.
  • TensorFlow (google)
    TensorFlow is an open source software library for numerical computation using data flow graphs. Nodes in the graph represent mathematical operations, while the graph edges represent the multidimensional data arrays (tensors) communicated between them. The flexible architecture allows you to deploy computation to one or more CPUs or GPUs in a desktop, server, or mobile device with a single API. TensorFlow was originally developed by researchers and engineers working on the Google Brain Team within Google's Machine Intelligence research organization for the purposes of conducting machine learning and deep neural networks research, but the system is general enough to be applicable in a wide variety of other domains as well.
  • TensorFlow Playground
    Here one can construct and train a small neural network without any programming, using only a graphical interface.
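The TensorFlow Playground trains small feedforward networks on toy 2-D classification problems. The same idea fits in a few lines of plain numpy; the following is a minimal sketch (not the Playground's actual code, and independent of the libraries listed above): a two-layer network with a tanh hidden layer learning the XOR function by gradient descent.

```python
import numpy as np

# Toy data: XOR, a classic problem that a network with no hidden layer cannot solve.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
y = np.array([[0.0], [1.0], [1.0], [0.0]])

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 8))   # input -> hidden weights
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # Backward pass: gradients of the cross-entropy loss
    dz = (p - y) / len(X)            # dL/d(logits) for sigmoid + cross-entropy
    dW2 = h.T @ dz
    db2 = dz.sum(axis=0)
    dh = (dz @ W2.T) * (1.0 - h**2)  # tanh'(a) = 1 - tanh(a)^2
    dW1 = X.T @ dh
    db1 = dh.sum(axis=0)

    # Plain gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

predictions = (sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5).astype(float)
```

Frameworks such as Theano, Keras, and TensorFlow automate exactly the backward-pass bookkeeping written out by hand here, deriving the gradients from the forward expressions and running them efficiently on CPU or GPU.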