FastForest: Learning Gradient-Boosted Regression Trees for Classification, Regression and Ranking


Nov. 11, 2017

“FastForest: Learning Gradient-Boosted Regression Trees for Classification, Regression and Ranking” is supported by a Ca’ Foscari University of Venice Starting Grant (Fondo Primo Insediamento).

Abstract. Gradient-Boosted Regression Trees (GBRTs) are today considered one of the most effective machine learning tools. Indeed, they are exploited within a Learning-to-Rank (LtR) framework by major Web companies including Microsoft, Google, Yahoo, Amazon, and Facebook. However, learning high-quality GBRT models is remarkably expensive. The goal of our proposal is to design novel learning algorithms for building more efficient and effective GBRT models.
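As background, the following is a minimal sketch of how gradient boosting fits an ensemble of regression trees under squared loss. It is illustrative only and is not the FastForest algorithm: the function names (fit_gbrt, predict_gbrt) and hyperparameter values are hypothetical choices for this sketch, and it relies on scikit-learn and NumPy.

```python
# Minimal gradient boosting of regression trees (squared loss).
# Illustrative sketch only; not the project's algorithm.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gbrt(X, y, n_trees=100, learning_rate=0.1, max_depth=3):
    """Fit shallow regression trees, each trained on the residuals
    (the negative gradients of the squared loss) left by the
    current ensemble prediction."""
    base = y.mean()                      # initial constant model
    pred = np.full(len(y), base)
    trees = []
    for _ in range(n_trees):
        residuals = y - pred             # negative gradient of 0.5 * (y - pred)^2
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)
        pred += learning_rate * tree.predict(X)
        trees.append(tree)
    return base, trees

def predict_gbrt(base, trees, X, learning_rate=0.1):
    """Score new instances by accumulating the trees' contributions."""
    pred = np.full(X.shape[0], base)
    for tree in trees:                   # every tree must be traversed
        pred += learning_rate * tree.predict(X)
    return pred
```

Note that scoring a document with such a model requires traversing every tree in the ensemble, so prediction cost grows linearly with ensemble size; this is the efficiency concern addressed, for instance, by the parallel traversal techniques of [1] and the ensemble pruning of [3].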

Short Term Visits

Presentations

Publications

[1], [2], [3], [4].

References

[1]   Francesco Lettich, Claudio Lucchese, Franco Maria Nardini, Salvatore Orlando, Raffaele Perego, Nicola Tonellotto, and Rossano Venturini. Parallel traversal of large ensembles of decision trees. IEEE Transactions on Parallel and Distributed Systems, PP(99), to appear.

[2]   Claudio Lucchese and Franco Maria Nardini. Efficiency/effectiveness trade-offs in learning to rank. In Machine Learning and Knowledge Discovery in Databases - European Conference, ECML PKDD 2018, 2018.

[3]   Claudio Lucchese, Franco Maria Nardini, Salvatore Orlando, Raffaele Perego, Fabrizio Silvestri, and Salvatore Trani. X-CLEaVER: Learning ranking ensembles by growing and pruning trees. ACM Transactions on Intelligent Systems and Technology, to appear.

[4]   Claudio Lucchese, Franco Maria Nardini, Salvatore Orlando, Raffaele Perego, and Salvatore Trani. Selective gradient boosting for effective learning to rank. In SIGIR ’18: Proceedings of the 41st International ACM SIGIR Conference on Research and Development in Information Retrieval, 2018.
