An Efficient Alternating Newton Method for Learning Factorization Machines
Introduction
This tool solves factorization machines by an alternating Newton method. It converges faster than a popular stochastic gradient method. Details and comparisons can be found in the following paper.
Wei-Sheng Chin, Bo-Wen Yuan, Meng-Yuan Yang, and Chih-Jen Lin.
An Efficient Alternating Newton Method for Learning Factorization Machines.
Technical Report, 2016. (experimental code)
We provide MATLAB scripts because, being built on optimized matrix operations, they are often as efficient as a C/C++ implementation.
If you find this tool useful, please cite the above work.
Download MATLAB Scripts
Please download
code.zip.
The zipped file includes several files and a toy data set. The script example.m demonstrates how to use our solver, which is implemented in fm_train.m. We also provide fm_predict.m for prediction. See the detailed usage below.
[w, U, V] = fm_train(y, X, lambda_w, lambda_U, lambda_V, d, epsilon, do_pcond, sub_rate);
Input parameters are
- y: training labels, an l-dimensional binary vector. Each element should be either +1 or -1.
- X: training instances. X is an l-by-n matrix if you have l training instances in an n-dimensional feature space.
- lambda_w: the regularization coefficient of the linear term.
- lambda_U, lambda_V: the regularization coefficients of the two interaction matrices.
- d: dimension of the latent space.
- epsilon: stopping tolerance in (0,1). Use a larger value if the training time is too long.
- do_pcond: a flag. Use 1/0 to enable/disable the diagonal preconditioner.
- sub_rate: sampling rate in (0,1] to select instances for the sub-sampled Hessian matrix.
Outputs are
- w: linear coefficients. An n-dimensional vector.
- U, V: the interaction (d-by-n) matrices.
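The parameters above can be put together as in the following sketch. The hyperparameter values are illustrative only, not recommendations; y and X are assumed to have been loaded already, e.g., from the included toy data set, with y in {+1, -1} and X an l-by-n (preferably sparse) matrix.

```matlab
% Illustrative hyperparameters; tune them for your own data.
lambda_w = 0.0625;          % regularization of the linear term
lambda_U = 0.0625;          % regularization of interaction matrix U
lambda_V = 0.0625;          % regularization of interaction matrix V
d = 4;                      % dimension of the latent space
epsilon = 0.01;             % stopping tolerance in (0,1)
do_pcond = 1;               % enable the diagonal preconditioner
sub_rate = 0.1;             % sample 10% of instances for the sub-sampled Hessian

[w, U, V] = fm_train(y, X, lambda_w, lambda_U, lambda_V, d, ...
                     epsilon, do_pcond, sub_rate);
```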
y_tilde = fm_predict(X, w, U, V);
For the input parameters, see the usage of fm_train. The output is an l-dimensional column vector containing the predicted values of the input instances.
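For example, after training, predictions on a held-out set can be obtained as follows. The variables Xtest and ytest are assumed to be in the same format as the training data; since fm_predict returns real values, the sign is taken for binary classification.

```matlab
% Predict on held-out instances and measure accuracy.
y_tilde = fm_predict(Xtest, w, U, V);   % real-valued predictions
accuracy = mean(sign(y_tilde) == ytest);
```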
Please read the COPYRIGHT notice before using this tool. Please send comments and suggestions to Chih-Jen Lin.