This course introduces the fundamentals of machine learning, security, privacy, adversarial machine learning, and game theory. Students will study different machine learning algorithms, practice their implementations, and analyze their security vulnerabilities through a series of homework assignments and projects.
Please contact the instructor if you have questions regarding the course materials. The syllabus is here.
The following table outlines the schedule for the course. We will update it as the quarter progresses.
Date | Lecture | Content | Materials | Homework |
---|---|---|---|---|
1/23 | Course Intro | | Slides, reading 1, reading 2 | |
1/25 | Supervised Learning I | Regression, classification, gradient | Slides, reading 1, reading 2, reading 3 | |
1/30 | Supervised Learning II | PAC learnability, supervised learning in adversarial settings | Slides, reading 1, reading 2 | |
2/1 | Unsupervised Learning I | Clustering, PCA, matrix completion | Slides, reading 1, reading 2 | |
2/6 | Homework 1 Walkthrough, Q&A | | Notes | |
2/8 | Talk: From Heatmaps to Structural and Counterfactual Explanations | | Talk abstract and bio, Recording | |
2/13 | Unsupervised Learning II | Unsupervised learning in adversarial settings, categories of attacks on machine learning | Slides, reading 1 | |
2/15 | Attacks at Decision Time | Evasion attacks, anomaly detection | Slides, reading 1, reading 2 | |
2/20 | Modeling Decision-Time Attacks | | Slides, reading 1, reading 2 | |
2/22 | White-Box/Black-Box Decision-Time Attacks and Physical Attacks | | | |
2/27 | Homework 2 Walkthrough, Q&A | | Notes | |
3/1 | Defending Against Decision-Time Attacks I | Optimal evasion-robust classification | | |
3/6 | Defending Against Decision-Time Attacks II | Feature-level protection, randomized smoothing | Slides, reading 1 | |
3/8 | Midterm Exam | | | |
3/20 | Knowledge-Enriched Robust Learning Models | | Slides, reading 1 | |
3/22 | Defending Against Decision-Time Attacks III | Adversarial retraining | Slides, reading 1 | |
3/27 | Data Poisoning Attacks | Binary classification, SVM, unsupervised learning, matrix factorization, general framework | Slides, reading 1 | |
3/29 | Defending Against Poisoning Attacks | Data sub-sampling, outlier removal | | |
4/3 | Trustworthy Federated Learning | Certifiably robust federated learning against training-time adversaries | Slides, reading 1 | |
4/5 | Privacy in Trustworthy Machine Learning | Membership inference attacks, model inversion attacks | Slides, reading 1 | |
4/10 | Guest Lecture | | | |
4/12 | Homework 3 Walkthrough, Q&A | | Notes, CMU SafeBench, AWS Instruction | |
4/17 | Differentially Private Machine Learning | Differentially private data generative models | Slides, reading 1 | |
4/19 | Guest Lecture | | Slides | |
4/24 | Trustworthy Generative Models | Trustworthy diffusion models against training- and testing-time adversaries | Slides, reading 1 | |
4/26 | Trustworthy Foundation Models | Training foundation models and diverse trustworthiness issues of foundation models | Slides, reading 1 | |
5/1 | Final Review, Final Project Presentations | | | |
5/3 | Final Exam | | | |
The course will involve three programming homework assignments, a midterm exam, and a final exam. Unless otherwise noted by the instructor, all work in this course is to be completed independently. If you are ever uncertain how to complete an assignment, you can go to office hours or engage in high-level discussions about the problem with your classmates on the Piazza boards.
Grades will be assigned as follows:
The expectations for the course are that students will attend every class, do any readings assigned for class, and actively and constructively participate in class discussions. Class participation will be assessed based on contributions to the discourse both in class, through discussion and questions, and outside of class, through posting and responding on the Piazza forum.
More information about course requirements will be made available leading up to the start of classes.
This course will include topics related to computer security and privacy. As part of this investigation, we may cover technologies whose abuse could infringe on the rights of others. As computer scientists, we rely on the ethical use of these technologies. Unethical use includes circumventing existing security or privacy mechanisms for any purpose, or disseminating, promoting, or exploiting vulnerabilities of these services. Any activity outside the letter or spirit of these guidelines will be reported to the proper authorities and may result in dismissal from the class and possibly more severe academic and legal sanctions.
The University of Illinois at Urbana-Champaign Student Code should also be considered as a part of this syllabus. Students should pay particular attention to Article 1, Part 4: Academic Integrity. Read the Code at the following URL: http://studentcode.illinois.edu/.
Academic dishonesty may result in a failing grade. Every student is expected to review and abide by the Academic Integrity Policy: http://studentcode.illinois.edu/. Ignorance is not an excuse for any academic dishonesty. It is your responsibility to read this policy to avoid any misunderstanding. Do not hesitate to ask the instructor(s) if you are ever in doubt about what constitutes plagiarism, cheating, or any other breach of academic integrity.
To obtain disability-related academic adjustments and/or auxiliary aids, students with disabilities must contact both the course instructor and Disability Resources and Educational Services (DRES) as soon as possible. To ensure that disability-related concerns are properly addressed from the beginning, students with disabilities who require assistance to participate in this class should contact DRES and see the instructor as soon as possible. If you need accommodations for any sort of disability, please speak to me after class, make an appointment to see me, or see me during my office hours. DRES provides students with academic accommodations, access, and support services. To contact DRES, you may visit 1207 S. Oak St., Champaign, call 333-4603 (V/TDD), or e-mail disability@uiuc.edu. Please refer to http://www.disability.illinois.edu/.
Emergency response recommendations can be found at the following website: http://police.illinois.edu/emergency-preparedness/. I encourage you to review this website and the campus building floor plans website within the first 10 days of class: http://police.illinois.edu/emergency-preparedness/building-emergency-action-plans/.
Any student who has suppressed their directory information pursuant to the Family Educational Rights and Privacy Act (FERPA) should self-identify to the instructor to ensure protection of the privacy of their attendance in this course. See http://registrar.illinois.edu/ferpa for more information on FERPA.