CSCI 315: Artificial Intelligence through Deep Learning

  Professor: Simon D. Levy
  Lecture: MWF 2:45-3:45, Parmly 307

"What I cannot create, I do not understand." – Richard Feynman

Objectives

The goal of this course is to give you the skills and knowledge to participate in the exciting new field of Deep Learning. For the first half of the course you will learn to design, train, and test neural networks using the NumPy package in Python. In the second half of the course you will learn how to use the popular PyTorch Python package to train and test much more powerful deep-learning networks that exploit the Graphics Processing Unit (GPU) available on our computers. In addition to being able to design, train, and test Deep Learning networks, you will gain an understanding of the history and philosophy of AI, the current challenges it faces, and the prospects for the future.
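
To give a flavor of the first-half material, here is a minimal sketch (illustrative only, not course code) of a NumPy training loop for a single sigmoid neuron on a toy AND-gate dataset; the dataset, learning rate, and epoch count are assumptions chosen for this example, not part of any assignment.

    # Minimal illustrative sketch: one sigmoid neuron trained with plain NumPy
    # gradient descent on a toy AND-gate dataset (values chosen for illustration).
    import numpy as np

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
    y = np.array([0, 0, 0, 1], dtype=float)                      # AND targets

    rng = np.random.default_rng(0)
    w = rng.normal(size=2)   # weights
    b = 0.0                  # bias
    lr = 1.0                 # learning rate (illustrative choice)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    for epoch in range(10000):
        p = sigmoid(X @ w + b)            # forward pass: predicted outputs
        delta = (p - y) * p * (1 - p)     # error signal for squared-error loss
        w -= lr * (delta @ X) / len(y)    # gradient-descent weight update
        b -= lr * delta.mean()            # gradient-descent bias update

    print(np.round(sigmoid(X @ w + b), 2))  # predictions move toward [0, 0, 0, 1]

In the second half of the course, PyTorch's automatic differentiation takes the place of the hand-derived gradient above, and tensors and models can be moved to the GPU (for example with .to('cuda')) when one is available.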

Attendance and Preparation

I look at this course as preparation for professional work in a research or industry setting, and I expect you to act professionally: show up for every class, participate fully, and submit your work on time without excuses. Consistent with our university’s mission statement, I expect everyone to conduct themselves with honor, integrity, and civility: if you are talking, texting, or otherwise causing a distraction in class, I will ask you to leave.

Grading

    • Three hour-long in-class exams: 50%
    • Programming assignments (done individually): 50%

All work should be submitted through GitHub as Python .py files. The fast pace of the course means that no late work can be accepted. The only three exceptions to this rule are:

    • Varsity sports commitments, with prior notice
    • Academic conference commitments, with prior notice
    • Serious medical / family / personal emergencies, with an adjustment from the Office of the Dean

Given the size of the class and the amount of work involved, there will be no opportunity for extra credit if you are not happy with your grade as the end of the course approaches. Serious problems (health / family / personal emergencies) should be handled through the Office of the Dean.

The grading scale will be 93-100 A; 90-92 A-; 87-89 B+; 83-86 B; 80-82 B-; 77-79 C+; 73-76 C; 70-72 C-; 67-69 D+; 63-66 D; 60-62 D-; below 60 F.

Accommodations

Washington and Lee University makes reasonable academic accommodations for qualified students with disabilities. All undergraduate accommodations must be approved through the Office of the Dean of the College. Students requesting accommodations for this course should present an official accommodation letter within the first two weeks of the (fall or winter) term and schedule a meeting outside of class time to discuss accommodations. It is the student’s responsibility to present this paperwork in a timely fashion and to follow up about accommodation arrangements. Accommodations for test-taking should be arranged with the professor at least a week before the date of the test or exam.

Schedule, Including Due Dates and On-line Class Notes

For each exam, you are responsible for all lecture-slide material posted before that exam.

Classes meet Monday, Wednesday, and Friday; topics, readings, and due dates are listed below by week.

Week 1 (9 Jan)
    • Course Outline
    • What is (A)I?
    • The Myth of a Superhuman AI
    • Deep Learning Intro Article
    • Linear Least Squares

Week 2 (16 Jan)
    • Martin Luther King Day (no classes)
    • Perceptron Learning
    • Reading: Buduma Chapter 1
    • Due: Assignment #1

Week 3 (23 Jan)
    • Continue: Perceptron Learning
    • Limits of Perceptrons (two class meetings)

Week 4 (30 Jan)
    • Back-propagation with hidden units
    • Reading: Buduma Chapter 2
    • Back-prop II: Improvements
    • Backprop: One Weird Trick
    • Backprop Cheat Sheet
    • Review for Exam #1
    • Due: Assignment #2

Week 5 (6 Feb)
    • Exam #1
    • Discuss Exam #1
    • Reading: Buduma Chapter 4
    • Logistic Regression & Soft-Max

Week 6 (13 Feb)
    • Logistic Regression & Soft-Max (two class meetings)
    • Reading: Buduma Chapter 3
    • Due: Assignment #3
    • Intro to PyTorch (pytorch.py)

Week 7 (27 Feb)
    • Intro to PyTorch
    • PyTorch (two class meetings)
    • Reading: Buduma Chapter 3

Week 8 (6 Mar)
    • PyTorch II (two class meetings)
    • Due: Assignment #4
    • Review for Exam #2

Week 9 (13 Mar)
    • Exam #2
    • Discuss Exam #2
    • Intro to Convolutional Networks
    • Reading: Buduma Chapter 5

Week 10 (20 Mar)
    • Convolutional Networks
    • Recurrent Networks (two class meetings)
    • Optional Reading: The Unreasonable Effectiveness of Recurrent Networks (alluding to this classic)
    • Due: Assignment #5

Week 11 (27 Mar)
    • Recurrent Networks (continued)
    • Reading: Buduma Chapter 7
    • Reading: Understanding LSTM Networks
    • Autoencoder networks
    • Attention / Transformers
    • Attention Is All You Need

Week 12 (3 Apr)
    • ChatGPT intro
    • Geoff Hinton Interview
    • Review for Exam #3
    • Exam #3

Finals Week (10 Apr)
    • Work on Assignment #6
    • Due: Assignment #6