# DSCI 572: Supervised Learning II
Welcome to Supervised Learning II! In this course, we delve into the world of deep learning using Python and PyTorch. You’ll learn about optimization, the fundamentals of neural networks, and convolutional neural networks. We’ll also explore some advanced topics such as generative adversarial networks.
## Important links
## Course learning outcomes
By the end of this course, you will be able to:
- Identify common computational issues caused by floating-point arithmetic (e.g., rounding, overflow) and program defensively against these errors.
- Explain how the gradient descent algorithm and its variants work.
- Explain the fundamental concepts of neural networks, including layers, nodes, and activation functions, and gain proficiency in implementing basic neural networks using PyTorch.
- Illustrate the process of backpropagation in neural network training.
- Explain how convolutional neural networks work and implement them for image classification using PyTorch.
- Explain and apply transfer learning and its different flavours: “out-of-the-box”, “feature extractor”, and “fine-tuning”.
- Describe at a high level the basic principles and architecture of Generative Adversarial Networks (GANs).
## Deliverables
The following deliverables will determine your course grade:
| Assessment | Weight | Where to submit |
|---|---|---|
| Lab Assignment 1 | 12% | |
| Lab Assignment 2 | 12% | |
| Lab Assignment 3 | 12% | |
| Lab Assignment 4 | 12% | |
| iClicker participation | 2% | |
| Quiz 1 | 25% | |
| Quiz 2 | 25% | |
See Calendar for the due dates.
## Teaching Team
| Role | Name |
|---|---|
| Lecture Instructor | Varada Kolhatkar |
| Lab Instructor | Varada Kolhatkar |
| Teaching Assistant | Ali Balapour |
| Teaching Assistant | Prajeet Bajpai |
| Teaching Assistant | Wenxuan (Skylar) Fang |
| Teaching Assistant | Abdul Muntakim Rafi |
## Lectures
### Format
I strongly recommend reviewing the corresponding lecture notes before each lecture. During the lectures, I will focus on the key concepts. It’s also highly advised to download the relevant datasets and run the lecture Jupyter notebooks on your own. Experimenting with the code will greatly improve your understanding.
### Lecture schedule
This course occurs during Block 4 in the 2023/24 school year.
| # | Topic | Resources and optional readings |
|---|---|---|
| 1 | Floating Point Errors | |
| 2 | Optimization and Gradient Descent | |
| 3 | Stochastic Gradient Descent | |
| 4 | Introduction to Neural Networks & PyTorch | |
| 5 | Training Neural Networks | |
| 6 | Convolutional Neural Networks Part 1 | |
| 7 | Convolutional Neural Networks Part 2 | |
| 8 | Advanced Deep Learning | |
## Installation
We are providing you with a `conda` environment file, which is available here. You can download this file, create a `conda` environment for the course, and activate it as follows:
```
conda env create -f dsci572env.yml
conda activate 572
```
In order to use this environment in Jupyter, you will have to install `nb_conda_kernels` in the environment where you have installed Jupyter (typically the `base` environment). You will then be able to select this new environment in Jupyter. If you’re unable to see the environment in Jupyter, you might have to install the kernel manually; see the documentation here. For more details, refer to your 521 lecture 7.
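The exact commands depend on your setup, but here is a minimal sketch, assuming Jupyter is installed in your `base` environment and the course environment is named `572` as above (the kernel name and display name below are only illustrative):

```
# In the environment where Jupyter is installed (typically base)
conda activate base
conda install -c conda-forge nb_conda_kernels

# If the 572 environment still doesn't show up in Jupyter, register its
# kernel manually (this assumes ipykernel is installed in the 572 environment)
conda activate 572
python -m ipykernel install --user --name 572 --display-name "Python (572)"
```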
I’ve only attempted to install this environment file on a few machines, and you may encounter issues with certain packages from the `yml` file when executing the commands above. This is not uncommon and may suggest that the specified package version is not yet available for your operating system via `conda`. When this occurs, you have a couple of options:
1. Modify the local version of the `yml` file to remove the line containing that package, and create the environment without that package.
2. Activate the environment and install the package manually, either with `conda install` or `pip install`, in the environment (see the example commands after this list).
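For example, the fallback might look like the following, where `<package-name>` is a placeholder for whichever package you removed from the `yml` file:

```
# After removing the problematic line from dsci572env.yml and
# recreating the environment, activate it:
conda activate 572

# Then install the missing package by hand (placeholder name below)
conda install <package-name>
# or, if the package isn't available via conda for your OS:
pip install <package-name>
```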
Note that this is not a complete list of the packages we’ll be using in the course, and there may be a few more packages you will need to install with `conda install` later on. But this list is enough to get you started.
## Course communication
We are all here to support your learning and success in the course and the program. Here’s how our communication will work during the course.
### Clarifications on the lecture notes or lab questions
If any clarification on the lecture material or lab questions is needed, I’ll post a Slack message on our course channel and tag you. It is your responsibility to read the messages whenever you are tagged. (I know that there are a lot of things for you to keep track of. You do not have to read all the messages, but please make sure to carefully read the messages whenever you are tagged.)
### Questions on lecture material or labs
If you have questions about the lecture material or lab questions, please post them on the course Slack channel rather than direct messaging me or the TAs. Here are the advantages of doing so:
You’ll get a quicker response.
Your classmates will benefit from the discussion.
When you ask your question on the course channel, please avoid tagging the instructor unless it’s specific to the instructor (e.g., if you notice a mistake in the lecture notes). If you tag a specific person, other teaching team members and your classmates are discouraged from responding, which decreases the response rate on the channel.
Please use a consistent convention when you ask questions on Slack to make the posts easy to search for others or future you. For example, if you want to ask a question on Exercise 3.2 from Lab 1, start your post with the label `lab1-ex3.2`. Or if you have a question on lecture 2 material, start your post with the label `lecture2`. Once the question is answered/solved, you can add a “(solved)” tag before the label (e.g., `(solved) lab1-ex3.2`). Do not delete your post even if you figure out the answer on your own; the question and the discussion can still be beneficial to others.
## Reference Material
### Deep learning resources
### Math for ML
### Other ML resources
## License
© 2023 Varada Kolhatkar, Arman Seyed-Ahmadi, Tomas Beuzen, Mike Gelbart, and Aaron Berk
Software licensed under the MIT License, non-software content licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) License. See the license file for more information.