Kai-Wei Chang and Cho-Jui Hsieh: dual coordinate descent for L1- and L2-loss dual SVM, multi-class (Crammer & Singer), L1-regularized classifiers.
Rong-En Fan: major help on all issues.
Guo-Xun Yuan and Chia-Hua Ho: L1-regularized classifiers.
Hsiang-Fu Yu and Fang-Lan Huang: dual coordinate descent for L2-regularized logistic regression.
Hsiang-Fu Yu: Python interface.
Chia-Hua Ho: solvers for linear support vector regression.
Bo-Yu Chu and Chia-Hua Ho: code for parameter selection.
Mu-Chu Lee and Wei-Lin Chiang: abstraction of two major operations (dot and axpy) in solvers; combining two matrix-vector multiplications in the Newton method into one.
Chih-Yang Hsia and Ya Zhu: code for the improved trust-region update rule in the primal-based Newton method (version 2.11).
Hsiang-Fu Yu and Hsin-Yuan Huang: SciPy support in the Python interface (version 2.11).
Chih-Jen Lin: Project lead and maintainer.