# Welcome to Math 516, Spring 2009

(last updated May 28, 2009)
New!
• HW 5 (last one!) is posted below.
• Starting the week of April 20, we will meet in MOR 219 at 9:02-10:20.
• For option 2 in the course project, some test problems are:
| Data set   | n     | p   | Optimal obj (C=1, rbf kernel) |
|------------|-------|-----|-------------------------------|
| Heart      | 270   | 13  | -131.348028                   |
| Diabetes   | 768   | 8   | -329.617579                   |
| German num | 1000  | 24  | -396.768591                   |
| a6a        | 11220 | 123 | -4140.468504                  |

Problem format: Row 1 contains n and p. Row 2i contains a_i and the number of "nonzeros" in z_i; row 2i+1 contains the indices and the values of the nonzeros in z_i, i=1,...,n. (Some of the nonzeros actually have value zero; this is an artifact of the data set, and you are free to remove the zero entries. In Matlab, you can use the command importdata to read the data files into a matrix with 2 columns.)
These data sets are (after reformatting) from the libsvm data set collection. If you are interested in learning more about current research on methods for SVM, see the introduction of a paper I recently wrote with Sangwoon Yun: cgd_svm.pdf
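As an illustration of the format above, here is a minimal Python reader (the handout suggests Matlab's importdata; this sketch is just an alternative, not part of the course materials). It assumes the indices and values on row 2i+1 alternate as index/value pairs and that indices are 1-based; the function name parse_svm_data is mine.

```python
def parse_svm_data(lines):
    """Read the course data format: row 1 holds n and p; for each
    example i, row 2i holds the label a_i and the number of nonzeros
    of z_i, and row 2i+1 holds those nonzeros (assumed here to
    alternate as index/value pairs, with 1-based indices)."""
    it = iter(lines)
    n, p = map(int, next(it).split())
    labels, vectors = [], []
    for _ in range(n):
        a_i, nnz = next(it).split()
        labels.append(float(a_i))
        entries = next(it).split()
        z = [0.0] * p                     # dense copy of the sparse z_i
        for k in range(int(nnz)):
            idx, val = int(entries[2 * k]), float(entries[2 * k + 1])
            if val != 0.0:                # drop the artifact zero entries
                z[idx - 1] = val
        vectors.append(z)
    return labels, vectors
```

For example, a file whose lines are "2 3", "1 2", "1 0.5 3 -1.0", "-1 1", "2 2.0" describes n=2 examples in p=3 dimensions, with labels 1 and -1.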

Course Info:
• Instructor: Paul Tseng (Click here for my webpage)
• E-mail: tseng@math.washington.edu
• Office: PDL C-344
• Office Hours: Tuesdays 2:00-3:30, or whenever I am free
• Lecture: M, W, F 9:30-10:20 AM (changed to M, F 9:02-10:20), Room MOR 219
• Textbook: Dimitri P. Bertsekas, Nonlinear Programming, 2nd edition, Athena Scientific, 1999.
• Course Syllabus

• Prerequisites:
Linear algebra (matrices, vectors, eigenvalues, etc.), multivariable calculus (partial derivatives, tangent plane, linear approximation, vectors, scalar product, max/min), real analysis (sequences, limits, continuity and differentiability of functions, Taylor polynomial). Prior knowledge of optimization is helpful, but not necessary.

• About this course:
Numerical optimization permeates applications from machine learning and data classification to portfolio selection, robotic control, and predicting the 3D structure of proteins. This course gives an overview of numerical optimization, i.e., numerical algorithms for solving optimization problems. Algorithms include those for unconstrained smooth optimization (steepest descent, Newton, conjugate gradient, quasi-Newton), for smooth optimization with convex constraints (Frank-Wolfe, gradient projection, manifold minimization), and for smooth optimization with nonconvex constraints (penalty, multiplier, sequential quadratic programming), as well as interior-point methods and problem decomposition by Lagrangian relaxation. Issues such as convergence, rate of convergence, and complexity will be studied. Some programming is involved.
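As a small taste of the first topic, here is an illustrative steepest-descent sketch with an Armijo backtracking line search, applied to a toy ill-conditioned quadratic. The parameter choices (sigma, beta) and the test function are mine, chosen only for illustration; the course follows the textbook's treatment.

```python
def steepest_descent(f, grad, x0, tol=1e-8, max_iter=10_000):
    """Minimize f by steepest descent with Armijo backtracking:
    shrink the step t until f(x - t*g) <= f(x) - sigma*t*||g||^2."""
    sigma, beta = 1e-4, 0.5          # Armijo parameters (illustrative)
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        gnorm2 = sum(gi * gi for gi in g)
        if gnorm2 <= tol * tol:      # stop when the gradient is small
            break
        t = 1.0
        while f([xi - t * gi for xi, gi in zip(x, g)]) > f(x) - sigma * t * gnorm2:
            t *= beta                # backtrack until sufficient decrease
        x = [xi - t * gi for xi, gi in zip(x, g)]
    return x

# Toy quadratic f(x) = 0.5*(x1^2 + 10*x2^2), minimized at (0, 0).
f = lambda x: 0.5 * (x[0]**2 + 10 * x[1]**2)
grad = lambda x: [x[0], 10 * x[1]]
x_star = steepest_descent(f, grad, [1.0, 1.0])
```

On this quadratic the iterates converge linearly, with the rate degrading as the condition number grows, which is one of the issues (rate of convergence) studied in the course.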

• Handouts/Slides (Pdf files)

• Homeworks (Pdf files)

Links of Interest: