LIN 389C Schedule

Course outline Spring 2017

This spring we are focusing on deep learning. We are using Nielsen's Neural Networks and Deep Learning, and Deep Learning by Goodfellow/Bengio/Courville. Both books are available online.
Also look at the DeepMind course on deep learning, available here.

Every research seminar session includes a short discussion about the NN projects that students are undertaking.

We will typically hold research seminars on Tuesdays and round-tables on Thursdays.

Week 1: Jan 17, 19

This week both sessions are devoted to course planning and round-table discussion.

Week 2: Jan 24, 26

Tuesday: Discussion of the NN package we will use

Thursday: special session on variational inference for Bayesian models

Week 3: Jan 31, Feb 2

Tuesday: Reading: Nielsen, Neural Networks and Deep Learning, ch. 2, ch. 3

Week 4: Feb 7, 9

Tuesday: Reading: Nielsen, Neural Networks and Deep Learning, ch. 4, ch. 5, ch. 6

Week 5: Feb 14, 16

Tuesday: Alex Rosenfeld: how neural networks can approximate any function
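As a toy illustration of the universal-approximation idea (not taken from the talk itself): Nielsen's ch. 4 shows that a single hidden layer of sigmoid units can approximate any continuous function by building narrow "bump" functions from pairs of steep sigmoids. A minimal NumPy sketch, with hand-picked bump count and steepness:

```python
import numpy as np

def sigmoid(z):
    z = np.clip(z, -60.0, 60.0)  # avoid overflow in exp
    return 1.0 / (1.0 + np.exp(-z))

# Target function to approximate on [0, 1].
def f(x):
    return np.sin(2 * np.pi * x)

# Each pair of steep sigmoids forms a "bump" that is ~1 on one small
# interval and ~0 elsewhere; weighting each bump by the target's value
# on its interval gives a one-hidden-layer approximation.
def approximate(x, n_bumps=50, steepness=500.0):
    edges = np.linspace(0.0, 1.0, n_bumps + 1)
    out = np.zeros_like(x)
    for left, right in zip(edges[:-1], edges[1:]):
        height = f((left + right) / 2.0)  # sample target mid-interval
        bump = sigmoid(steepness * (x - left)) - sigmoid(steepness * (x - right))
        out += height * bump
    return out

x = np.linspace(0.05, 0.95, 200)
err = np.max(np.abs(approximate(x) - f(x)))
print(err)  # small max error; shrinks as n_bumps grows
```

Increasing `n_bumps` drives the error down, which is the constructive heart of the approximation argument.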

Week 6: Feb 21, 23

Tuesday: We finish up Nielsen

Week 7: Feb 28, Mar 2

Tuesday: General discussion about research methods

Week 8: Mar 7, 9

Tuesday: Discussion of Yoav Goldberg's tutorial, A Primer on Neural Network Models for Natural Language Processing

Week 9: spring break

Week 10: Mar 21, 23

Tuesday: Word2Vec and GloVe
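To illustrate what Word2Vec- and GloVe-style embeddings enable (with made-up toy vectors, not values from any trained model), here is a minimal sketch of cosine similarity and the classic vector-analogy computation:

```python
import numpy as np

# Hypothetical, hand-made embedding vectors, purely to demonstrate
# the vector-space operations; real embeddings are learned from corpora.
emb = {
    "king":  np.array([0.8, 0.65, 0.1]),
    "queen": np.array([0.8, 0.05, 0.75]),
    "man":   np.array([0.1, 0.7, 0.05]),
    "woman": np.array([0.1, 0.1, 0.7]),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# The classic analogy: king - man + woman should land near queen.
target = emb["king"] - emb["man"] + emb["woman"]
best = max(emb, key=lambda w: cosine(emb[w], target))
print(best)  # "queen"
```

With real trained embeddings the analogy is only approximate, but the mechanics (vector arithmetic plus nearest neighbor by cosine) are exactly these.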

Readings:

For those writing a course paper / second year paper with Katrin: Course paper proposal due Tuesday Mar 21: 1-2 pages, an outline of what you are planning to do

Week 11: Mar 28, 30

Tuesday:
Goodfellow/Bengio/Courville ch. 6: Deep Feedforward Networks.

Week 12: April 4, 6

Tuesday: We finish our discussion of the Levy/Goldberg paper and of the GloVe paper.

We also begin readings on autoencoders and recursive neural networks:

Additional readings that we will not discuss in class:
If you would like to read more about autoencoders, Pengxiang recommends sections 4.6, 6.2 and 7.1, 7.2 from this Bengio tutorial.

If you would like a more in-depth understanding of restricted Boltzmann machines, Su recommends Fischer and Igel, An Introduction to Restricted Boltzmann Machines.

Week 13: April 11, 13

Tuesday:

We discuss the readings on autoencoders and recursive neural networks:


For those writing a course paper / second year paper with Katrin: Course paper intermediate report due Tuesday April 11: 2-3 pages, an extension of the course proposal, revised to take comments into account and updated with your progress to date

Week 14: April 18, 20

Tuesday: LSTMs (Long Short-Term Memory networks):
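As background for the session, the standard LSTM update can be written in a few lines. A minimal NumPy forward pass (one common gate layout; the weight values here are random, just to exercise the equations):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step. W, U, b stack the four gates
    (input i, forget f, output o, candidate g) along the first axis."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b       # (4n,) pre-activations
    i = sigmoid(z[0:n])              # input gate
    f = sigmoid(z[n:2*n])            # forget gate
    o = sigmoid(z[2*n:3*n])          # output gate
    g = np.tanh(z[3*n:4*n])          # candidate cell update
    c = f * c_prev + i * g           # new cell state (additive memory)
    h = o * np.tanh(c)               # new hidden state
    return h, c

rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in))
U = rng.normal(scale=0.1, size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h = np.zeros(n_hid)
c = np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):  # run 5 time steps
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape, c.shape)
```

The additive cell-state update `c = f * c_prev + i * g` is the key design choice: it lets gradients flow across many time steps, which is what the session's readings motivate.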

Week 15: April 25, 27

Tuesday: We talk about attention. Readings from the DeepMind course on deep learning:
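One standard formulation of attention (scaled dot-product; the readings may cover other variants) fits in a few lines of NumPy. Each query scores all keys, the scores are softmax-normalized, and the output is the resulting weighted average of the values:

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: each query attends over all keys,
    producing a convex combination of the value vectors."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)          # (n_q, n_k) similarity scores
    weights = softmax(scores, axis=-1)     # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(1)
Q = rng.normal(size=(2, 8))   # 2 queries
K = rng.normal(size=(5, 8))   # 5 keys
V = rng.normal(size=(5, 8))   # 5 values
out, w = attention(Q, K, V)
print(out.shape)                           # (2, 8): one output per query
print(np.allclose(w.sum(axis=1), 1.0))     # weights normalize to 1
```

The softmax rows are the "attention weights": they make explicit which inputs each query is reading from, which is what the attention visualizations in the readings show.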


Optional reading: On this day, Su also talks about GANs. His tutorial is here.


Week 16: May 2, 4

Tuesday: Extended round table.

For those writing a course paper / second year paper with Katrin: Course paper final version due Thursday May 4. Please submit 8-10 pages, building on the progress report you submitted.

