Kai-Wei Chang and Cho-Jui Hsieh: dual coordinate descent for L1- and L2-loss dual SVM, multi-class classification (Crammer & Singer), and L1-regularized classifiers.
Rong-En Fan: help on all aspects of the project.
Guo-Xun Yuan and Chia-Hua Ho: L1-regularized classifiers.
Hsiang-Fu Yu and Fang-Lan Huang: dual coordinate descent for L2-regularized logistic regression.
Hsiang-Fu Yu: Python interface.
Chia-Hua Ho: solvers for linear support vector regression.
Bo-Yu Chu and Chia-Hua Ho: code for parameter selection.
Mu-Chu Lee and Wei-Lin Chiang: abstraction of two major operations (dot and axpy) in the solvers; combining two matrix-vector multiplications in the Newton method into one.
Chih-Yang Hsia and Ya Zhu: code for the improved trust-region update rule in the primal-based Newton method (version 2.11).
Hsiang-Fu Yu and Hsin-Yuan Huang: SciPy support in the Python interface (version 2.11).
Chih-Yang Hsia and Wei-Lin Chiang: preconditioned conjugate gradient (CG) for the Newton method (version 2.20).
Jui-Yang Hsia: code for SVR parameter selection (version 2.30).
Hung-Yi Chou and Pin-Yen Lin: one-class SVM solver (version 2.40).
Ching-Pei Lee and Leonardo Galli: line-search Newton method (version 2.40).
Wei-Lin Chiang and Jui-Nan Yen: code for not regularizing the bias term, and preparation of the 2.40 release (version 2.40).
Hsiang-Fu Yu, Wei-Lin Chiang, Jui-Nan Yen, Yu-Sheng Li, and Leonardo Galli: code and settings for version 2.42.
Hsiang-Fu Yu, Jui-Nan Yen, Yu-Sheng Li, and Wei-Lin Chiang helped to create the installation through PyPI for version 2.43.
Guan-Ting Chen helped to improve the implementation of working-set selection in the one-class SVM solver and prepared the release of version 2.46.
He-Zhe Lin helped to add a flag for recalculating w in dual solvers and Hung-Chih Chiang helped to prepare the release of version 2.48.
Chih-Jen Lin: Project lead and maintainer.