An Efficient Alternating Newton Method for Learning Factorization Machines

Machine Learning Group at National Taiwan University
Contributors


Introduction

This tool trains factorization machines by an alternating Newton method, which converges faster than a popular stochastic gradient method. Details and comparisons can be found in the following paper.

Wei-Sheng Chin, Bo-Wen Yuan, Meng-Yuan Yang, and Chih-Jen Lin. An Efficient Alternating Newton Method for Learning Factorization Machines. Technical report, 2016. (experimental code)

We provide MATLAB scripts because, being built on optimized matrix operations, they are often as efficient as a C/C++ implementation.

If you find this tool useful, please cite the above work.


Download MATLAB Scripts

Please download code.zip. The zipped file includes several files and a toy data set. The file example.m demonstrates how to use our solver, which is implemented in fm_train.m. We also provide fm_predict.m for prediction. Detailed usage is given below.

[w, U, V] = fm_train(y, X, lambda_w, lambda_U, lambda_V, d, epsilon, do_pcond, sub_rate);

Input parameters are
- y: training labels, an l-dimensional column vector,
- X: training instances, an l-by-n data matrix,
- lambda_w: regularization coefficient of the linear term,
- lambda_U, lambda_V: regularization coefficients of the two latent matrices,
- d: the number of latent dimensions,
- epsilon: the stopping tolerance,
- do_pcond: a flag to enable (1) or disable (0) the diagonal preconditioner, and
- sub_rate: the sampling rate of the sub-sampled Hessian matrix.

Outputs are
- w: the learned linear coefficients, an n-dimensional vector, and
- U, V: the learned d-by-n latent matrices.
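As a quick illustration, the following sketch trains a model on a small synthetic data set. The chosen values for the regularization coefficients, d, epsilon, do_pcond, and sub_rate are illustrative assumptions rather than recommended settings; see example.m in the package for the authors' example on the included toy data.

l = 100; n = 20;                                 % numbers of instances and features (made up)
X = sprand(l, n, 0.3);                           % random sparse l-by-n data matrix
y = sign(randn(l, 1));                           % random +1/-1 labels
lambda_w = 0.1; lambda_U = 0.1; lambda_V = 0.1;  % regularization coefficients
d = 4;                                           % number of latent dimensions
epsilon = 1e-3;                                  % stopping tolerance
do_pcond = 1;                                    % assumed: 1 enables the diagonal preconditioner
sub_rate = 0.1;                                  % assumed: sampling rate of the sub-sampled Hessian
[w, U, V] = fm_train(y, X, lambda_w, lambda_U, lambda_V, d, epsilon, do_pcond, sub_rate);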

y_tilde = fm_predict(X, w, U, V);

For the input parameters, see fm_train's usage above. The output is the predicted values of the input instances, an l-dimensional column vector.
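Continuing the sketch above, predictions can then be obtained and evaluated as follows; the root-mean-square error shown is only one possible measure and is not computed by the provided scripts.

y_tilde = fm_predict(X, w, U, V);       % l-dimensional column vector of predictions
rmse = sqrt(mean((y - y_tilde).^2));    % e.g., training RMSE
fprintf('training RMSE: %g\n', rmse);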


Please read the COPYRIGHT notice before using these scripts. Please send comments and suggestions to Chih-Jen Lin.