Abacus.AI at NeurIPS 2022

The Abacus.AI team has two papers appearing at the Conference on Neural Information Processing Systems (NeurIPS) 2022. NeurIPS is a top machine learning conference held every December. Abacus.AI is also planning a social event during NeurIPS – stay tuned for more information!
We give brief descriptions of both NeurIPS 2022 papers below.

On the Generalizability and Predictability of Recommender Systems

Despite the widespread use of recommender systems at companies such as Amazon, YouTube, and Netflix, designing recommender system models still requires substantial human effort. While it is common in areas such as computer vision or natural language processing to start with pre-trained models, recommender system datasets tend to be extremely diverse and heterogeneous, which makes it hard to reuse a model trained on one dataset for another.

In this work, we conduct the first large-scale study of recommender system approaches, comparing 18 algorithms and 100 sets of hyperparameters across 85 datasets and 315 metrics. We find that the best algorithms and hyperparameters are highly dependent on the dataset and performance metric; however, there are also strong correlations between the performance of each algorithm and various meta-features of the datasets. Motivated by these findings, we create RecZilla, a meta-learning approach to recommender systems that uses a model to predict the best algorithm and hyperparameters for new, unseen datasets. Abacus.AI is pleased to announce that we are releasing all of the RecZilla code and pre-trained models, so that everybody can train their own recommender system models!
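
To make the meta-learning recipe concrete, here is a minimal sketch in Python: it computes a few dataset meta-features and trains one regressor per algorithm configuration to predict performance on unseen datasets. The specific meta-features and the random-forest regressor here are illustrative assumptions, not the actual RecZilla pipeline (see the code link below for that).

```python
# Minimal sketch of a RecZilla-style meta-learner (illustrative only).
# Assumes prior experiments produced a table of dataset meta-features
# and the performance of each algorithm configuration on each dataset.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def meta_features(interactions: np.ndarray) -> np.ndarray:
    """Compute simple meta-features from a user-item interaction matrix."""
    n_users, n_items = interactions.shape
    density = interactions.astype(bool).mean()
    return np.array([n_users, n_items, density])

def train_meta_models(X_meta: np.ndarray, y_perf: np.ndarray) -> list:
    """Fit one regressor per (algorithm, hyperparameter) configuration.
    X_meta: (n_datasets, n_meta_features); y_perf: (n_datasets, n_configs)."""
    models = []
    for j in range(y_perf.shape[1]):
        m = RandomForestRegressor(n_estimators=100).fit(X_meta, y_perf[:, j])
        models.append(m)
    return models

def select_best_config(models: list, new_dataset: np.ndarray) -> int:
    """Predict every config's performance on an unseen dataset; pick the best."""
    feats = meta_features(new_dataset).reshape(1, -1)
    preds = [m.predict(feats)[0] for m in models]
    return int(np.argmax(preds))
```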

Paper: https://arxiv.org/abs/2206.11886
Code: https://github.com/naszilla/reczilla
Video: https://www.youtube.com/watch?v=NkNdxF5chZY

NAS-Bench-Suite-Zero: Accelerating Research on Zero Cost Proxies

Have you ever wanted to predict the performance of a neural network before training it?

Zero-cost proxies (ZC proxies) are a recent technique for predicting architecture performance, aimed at significantly speeding up neural architecture search (NAS) algorithms. Although these techniques show great promise, they are still largely under-explored.
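
To give a flavor of the idea, here is a minimal sketch of one simple ZC proxy, the gradient norm of an untrained network on a single minibatch, in PyTorch. The toy model and data shapes are illustrative assumptions; the proxies actually benchmarked in the paper are implemented in the codebase linked below.

```python
# Minimal sketch of one simple zero-cost proxy: the gradient norm of an
# untrained network on a single minibatch (illustrative only).
import torch
import torch.nn as nn

def grad_norm_proxy(model: nn.Module, x: torch.Tensor, y: torch.Tensor) -> float:
    """Score an architecture without training it: larger gradient norms at
    initialization are taken as a rough signal of trainability."""
    model.zero_grad()
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()
    total = 0.0
    for p in model.parameters():
        if p.grad is not None:
            total += p.grad.norm().item()
    return total

# Usage: score a freshly initialized candidate on one random minibatch.
net = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32 * 3, 128),
                    nn.ReLU(), nn.Linear(128, 10))
x = torch.randn(16, 3, 32, 32)
y = torch.randint(0, 10, (16,))
print(grad_norm_proxy(net, x, y))
```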

In this work, we create NAS-Bench-Suite-Zero by evaluating 13 ZC proxies across 28 tasks, yielding by far the largest dataset (and unified codebase) for ZC proxies. It enables orders-of-magnitude faster experiments on ZC proxies while avoiding confounding factors stemming from different implementations. To demonstrate the usefulness of NAS-Bench-Suite-Zero, we run a large-scale analysis of ZC proxies, including the first information-theoretic analysis, and conclude that they capture substantial complementary information. Motivated by these findings, we show that incorporating all 13 ZC proxies into the surrogate models used by NAS algorithms can improve their predictive performance by up to 42%.
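
As a rough sketch of that last step, the snippet below treats each architecture's ZC proxy scores as input features to a surrogate performance predictor. The random-forest surrogate is an illustrative assumption; the NAS algorithms studied in the paper use their own surrogate models.

```python
# Minimal sketch: combine ZC proxy scores into a surrogate performance
# predictor for NAS (illustrative random-forest surrogate).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def fit_surrogate(X_proxies: np.ndarray, y_acc: np.ndarray) -> RandomForestRegressor:
    """X_proxies: (n_architectures, 13) ZC proxy scores per architecture;
    y_acc: (n_architectures,) validation accuracies from a NAS benchmark."""
    return RandomForestRegressor(n_estimators=200).fit(X_proxies, y_acc)

def rank_candidates(surrogate: RandomForestRegressor,
                    candidate_proxies: np.ndarray) -> np.ndarray:
    """Rank unseen architectures by predicted accuracy, best first."""
    preds = surrogate.predict(candidate_proxies)
    return np.argsort(-preds)
```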

Paper: https://openreview.net/pdf?id=yWhuIjIjH8k
Code: https://github.com/automl/NASLib/tree/zerocost

If you’re interested in learning more, you can come to our poster sessions and social event at the NeurIPS Conference, Nov. 28 – Dec. 3 in New Orleans, Louisiana, USA, or virtually Nov. 28 – Dec. 9. Visit https://neurips.cc/ to register.
