Advanced Topics in Optimization for Machine Learning

CS 7301: Spring 2021 Course on Advanced Topics in Optimization for Machine Learning

Video Lectures

Video lectures are available on this YouTube playlist: https://www.youtube.com/playlist?list=PLGod0_zT9w92_evaYrf3-rE67AmgPJoUU

GitHub Link to all Demos

https://github.com/rishabhk108/OptimizationDemos

Link to Google Spreadsheet for Paper Review and Project Topics

https://docs.google.com/spreadsheets/d/1UHHFlo_8QAvmXjWqoU02Calq86S-ewYl7Jczjhgr0wY/edit?usp=sharing

Deadline for finalizing the papers to cover: February 26th

Deadline for finalizing the project topic: March 5th

Topics Covered in this Course

  • Week 1
    • Logistics, Outline of this Course
    • Continuous Optimization in ML
    • Convex Sets and Basics of Convexity
  • Week 2: Gradient Descent and Family
    • Convex Functions, Properties, Minima, Subgradients
    • Gradient Descent and Line Search (a minimal sketch appears after this topic list)
  • Week 3: Gradient Descent Cont.
    • Accelerated Gradient Descent
    • Projected and Proximal Gradient Descent
  • Week 4
    • Projected GD and Conditional GD (Constrained Case)
    • Second Order Methods (Newton, Quasi-Newton, BFGS, LBFGS)
  • Week 5
    • Second Order Methods Completed
    • Barzilai-Borwein and Conjugate Gradient Descent
    • Coordinate Descent Family
  • Week 6
    • Stochastic Gradient and Family (SGD, SVRG)
    • SGD for Non-Convex Optimization. Modern variants of SGD, particularly for deep learning (e.g., AdaGrad, Adam, AdaDelta, RMSProp, Momentum)
  • Week 7
    • Submodular Optimization: Basics, Definitions, Properties, and Examples.
  • Week 8
    • Submodular Information Measures: Conditional Gain, Submodular Mutual Information, Submodular Span, Submodular Multi-Set Mutual Information
  • Week 9
    • Submodular Minimization and Continuous Extensions of Submodular Functions. Submodular Minimization under constraints
  • Week 10
    • Submodular Maximization Variants, Submodular Set Cover, Approximate Submodularity. Algorithms under different constraints and monotone/non-monotone settings. Also, distributed and streaming algorithms, DS Optimization, and Submodular Optimization under Submodular Constraints (see the greedy sketch after this list)
  • Week 11
    • Applications of Discrete Optimization: Data Subset Selection, Data Summarization, Feature Selection, Active Learning, etc.
  • Rest of the Weeks
    • Paper Presentations/Project Presentations by the Students
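To give a flavor of the Week 2-3 material, here is a minimal sketch of gradient descent with a backtracking (Armijo) line search on a toy quadratic. It is not part of the course materials or the linked demos; the objective, step-size parameters, and stopping tolerance are illustrative assumptions.

```python
import numpy as np

def gradient_descent(f, grad, x0, alpha0=1.0, beta=0.5, c=1e-4, tol=1e-6, max_iter=1000):
    """Gradient descent with backtracking (Armijo) line search (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop once the gradient is small
            break
        t = alpha0
        # Shrink the step until the sufficient-decrease (Armijo) condition holds.
        while f(x - t * g) > f(x) - c * t * np.dot(g, g):
            t *= beta
        x = x - t * g
    return x

# Toy strongly convex quadratic f(x) = 0.5 x^T A x - b^T x (assumed example problem).
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x_star = gradient_descent(f, grad, x0=np.zeros(2))
print(x_star, np.linalg.solve(A, b))  # the iterate should closely match the closed-form solution
```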
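For the submodular weeks (7-10), the sketch below shows the standard greedy algorithm for maximizing a monotone submodular function under a cardinality constraint. The `marginal_gain` oracle and the toy coverage function are hypothetical stand-ins chosen for illustration, not code from the course.

```python
def greedy_max(marginal_gain, ground_set, k):
    """Greedy maximization of a monotone submodular f under |S| <= k (illustrative sketch).

    marginal_gain(S, e) should return f(S ∪ {e}) - f(S).
    """
    S = set()
    for _ in range(k):
        # Pick the element with the largest marginal gain w.r.t. the current set.
        candidates = [e for e in ground_set if e not in S]
        if not candidates:
            break
        best = max(candidates, key=lambda e: marginal_gain(S, e))
        if marginal_gain(S, best) <= 0:
            break
        S.add(best)
    return S

# Toy coverage function f(S) = |union of sets covered by S| (assumed example).
coverage = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6}, "d": {1, 6}}
covered = lambda S: set().union(*(coverage[e] for e in S)) if S else set()
marginal_gain = lambda S, e: len(covered(S) | coverage[e]) - len(covered(S))

print(greedy_max(marginal_gain, coverage.keys(), k=2))  # e.g. {'a', 'c'}
```

For a monotone submodular function, this greedy rule achieves at least a (1 - 1/e) fraction of the optimal value (Nemhauser, Wolsey, and Fisher, 1978).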

Grading

  • 10% for Class Participation (Interaction, asking questions, answering questions)
  • 30% Assignments (2 Assignments, one on continuous optimization and one on discrete optimization)
  • 30% Paper Presentations (1-2 papers per student)
  • 30% for the Final Project
    • Take a new dataset/problem and study how existing optimization algorithms perform on it
    • Take an existing problem and compare optimization algorithms using your own from-scratch implementations
    • Design an ML optimization toolkit with algorithms implemented from scratch -- if one of you would like to extend my current Python demos for optimization, that would be an awesome contribution, and I might pick it up for my future classes and acknowledge you :)

Other Similar Courses

Resources/Books/Papers
