recpack.algorithms
The algorithms module in recpack contains a wide array of state-of-the-art collaborative filtering algorithms. Also included are some baseline algorithms, as well as several reusable building blocks such as commonly used loss functions and sampling methods.
Example of use:
import numpy as np
from scipy.sparse import csr_matrix

from recpack.algorithms import Random

X = csr_matrix(np.array([[1, 0, 1], [1, 1, 0], [1, 1, 0]]))

# Set hyperparameter values
algo = Random(K=3)
# Fit the algorithm to the training data
algo.fit(X)
# Get random recommendations for each nonzero user
predictions = algo.predict(X)
# predictions is a csr_matrix; inspect the scores with
predictions.toarray()
Baselines
In recpack, baseline algorithms are algorithms that are not personalized. Use these baselines if you wish to quickly test a pipeline, or for comparison in experiments.
- Baseline algorithm recommending the most popular items in the training data.
- Uniform random algorithm: each item has an equal chance of being recommended.
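For instance, a minimal sketch of fitting the popularity baseline (this assumes recpack exposes it as Popularity with a K parameter for the number of items scored; check the class reference for the exact signature):

import numpy as np
from scipy.sparse import csr_matrix

from recpack.algorithms import Popularity

X = csr_matrix(np.array([[1, 0, 1], [1, 1, 0], [1, 1, 0]]))

# Count item interactions in the training data
algo = Popularity(K=2)
algo.fit(X)

# Every user receives the same scores for the K most popular items
predictions = algo.predict(X)
print(predictions.toarray())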
Item Similarity Algorithms
Item similarity algorithms exploit relationships between items to make recommendations. At prediction time, the user is represented by the items they have interacted with.
- Implementation of the SLIM model.
- Item K Nearest Neighbours model.
- Item Probabilistic Nearest Neighbours model.
- Computes similarities between items as the similarity between their NMF item embeddings.
- Computes similarities between items as the similarity between their SVD embeddings.
- Prod2Vec algorithm from the paper "E-commerce in Your Inbox: Product Recommendations at Scale".
- Clustered Prod2Vec implementation outlined in "E-commerce in Your Inbox: Product Recommendations at Scale" (https://arxiv.org/abs/1606.07154).
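As an illustration, a sketch using the ItemKNN model (assuming a K parameter for the number of neighbours kept per item):

import numpy as np
from scipy.sparse import csr_matrix

from recpack.algorithms import ItemKNN

X = csr_matrix(np.array([[1, 0, 1], [1, 1, 0], [1, 1, 0]]))

# Fit an item-to-item similarity model, keeping 2 neighbours per item
algo = ItemKNN(K=2)
algo.fit(X)

# At prediction time a user is represented by their interaction history,
# so scores are a weighted sum over the items they interacted with
predictions = algo.predict(X)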
Hybrid Similarity Algorithms
Hybrid similarity algorithms use a combination of user and item similarities to generate recommendations.
- Unified Nearest Neighbour algorithm combining user and item neighbourhood methods.
Factorization Algorithms
Factorization algorithms factorize the interaction matrix into a user embedding matrix (U) and an item embedding matrix (V), which can be used to reconstruct the original interaction matrix: R ≈ UV^T.
- Non-negative matrix factorization.
- Singular Value Decomposition used as a matrix factorization algorithm.
- WMF algorithm by Yifan Hu, Yehuda Koren and Chris Volinsky.
- Matrix factorization using the BPR-OPT objective, optimized with SGD.
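To make R ≈ UV^T concrete, here is a small numpy sketch (independent of recpack) that builds rank-2 user and item embeddings with a truncated SVD and reconstructs the interaction matrix:

import numpy as np

R = np.array([[1, 0, 1], [1, 1, 0], [1, 1, 0]], dtype=float)

# Truncated SVD with k = 2 latent dimensions
U_full, s, Vt = np.linalg.svd(R, full_matrices=False)
k = 2
U = U_full[:, :k] * s[:k]  # user embeddings, shape (num_users, k)
V = Vt[:k, :].T            # item embeddings, shape (num_items, k)

# Low-rank reconstruction approximating the original interactions
R_hat = U @ V.T
print(np.round(R_hat, 2))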
Autoencoder Algorithms
Autoencoder algorithms aim to learn a function f such that f(X) approximates X. More information on autoencoders can be found on Wikipedia.
- RecVAE algorithm, as first discussed in "RecVAE: A New Variational Autoencoder for Top-N Recommendations with Implicit Feedback", I. Shenbin et al.
- MultVAE algorithm, as first discussed in "Variational Autoencoders for Collaborative Filtering", D. Liang et al.
- Implementation of the EASEr algorithm.
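For example, a sketch fitting the EASEr model (assuming it is exposed as EASE with an l2 regularization strength; the exact parameter name should be checked against the class reference):

import numpy as np
from scipy.sparse import csr_matrix

from recpack.algorithms import EASE

X = csr_matrix(np.array([[1, 0, 1], [1, 1, 0], [1, 1, 0]]))

# EASEr learns a closed-form item-to-item weight matrix B,
# such that X @ B reconstructs X
algo = EASE(l2=10.0)
algo.fit(X)

predictions = algo.predict(X)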
Session-Based Algorithms
- A recurrent neural network for session-based recommendations.
- A recurrent neural network for session-based recommendations.
- Sequence and Time Aware Neighbourhoods algorithm.
- Recommends the item that most likely follows a user's last interaction.
Time Aware Algorithms
- Framework for time aware variants of the ItemKNN algorithm.
- Time aware variant of ItemKNN which uses an exponential decay function at prediction time and cosine similarity.
- Time aware variant of ItemKNN which uses a hard-coded decay matrix and cosine or Pearson similarity.
- Time aware variant of ItemKNN which uses an exponential decay function and cosine similarity.
- Time aware variant of ItemKNN which uses a logarithmic decay function.
- Time aware variant of ItemKNN which uses an exponential decay function and Pearson similarity.
- Framework for time aware variants of ItemKNN that consider the time between two interactions when computing similarity between two items.
- Time aware variant of ItemKNN that considers the time between two interactions when computing similarity between two items, as well as the age of an event.
- Time aware variant of ItemKNN that considers the time between two interactions when computing similarity between two items.
Abstract Base Classes
Recpack algorithm implementations inherit from one of these base classes. These base classes provide the basic building blocks to easily create new algorithm implementations that can be used within the recpack evaluation framework.
For more information on how to create your own recpack algorithm, see Creating your own algorithms.
- Base class for all recpack algorithm implementations.
- Base algorithm for algorithms that fit an item-to-item similarity model.
- Base algorithm for algorithms that fit an item-to-item similarity model with K similar items for every item.
- Base class for factorization algorithms.
- Base class for PyTorch algorithms optimized by means of gradient descent/ascent.
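As a hedged sketch of what a new algorithm built on these base classes might look like (this assumes, per "Creating your own algorithms", that subclasses of Algorithm implement _fit and _predict; the class and attribute names below are hypothetical):

import numpy as np
from scipy.sparse import csr_matrix

from recpack.algorithms.base import Algorithm


class InteractionCount(Algorithm):
    """Hypothetical example: score every item by its training interaction count."""

    def _fit(self, X: csr_matrix):
        # Popularity vector: total interactions per item
        self.item_counts_ = np.asarray(X.sum(axis=0)).flatten()
        return self

    def _predict(self, X: csr_matrix) -> csr_matrix:
        # Give every user with history the same popularity scores
        active_users = np.asarray(X.sum(axis=1)).flatten() > 0
        scores = np.zeros(X.shape)
        scores[active_users] = self.item_counts_
        return csr_matrix(scores)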
Stopping Criterion
When creating an algorithm that learns a model iteratively, we need a way to decide which is the best model, and when to stop. The Stopping Criterion module provides this functionality.
- StoppingCriterion provides a wrapper around any loss function used in the validation stage of an iterative algorithm.
- Raised when the Early Stopping condition is met.
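A rough sketch of the early-stopping loop this enables. The import path, constructor arguments, and update signature below are assumptions for illustration, not the verified recpack API:

from recpack.algorithms.stopping_criterion import (
    EarlyStoppingException,
    StoppingCriterion,
)

# Stand-in for a real validation loss that would score the current model
losses = iter([0.9, 0.7, 0.69, 0.69, 0.69, 0.69])

def validation_loss():
    return next(losses)

# Assumed arguments: minimize the wrapped loss, stop after 3 updates
# without sufficient improvement
criterion = StoppingCriterion(validation_loss, minimize=True,
                              stop_early=True, max_iter_no_change=3)

for epoch in range(6):
    # ... train the model for one epoch here ...
    try:
        criterion.update()  # assumed to call validation_loss internally
    except EarlyStoppingException:
        print(f"stopped early at epoch {epoch}")
        break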
Loss Functions
Recommendation models learned iteratively by means of gradient descent (or ascent) require a loss function. In this module you will find some of the most common loss functions, which can be used with any TorchMLAlgorithm.
To use these loss functions in a StoppingCriterion, we also provide metric wrappers around the raw loss functions.
- Covariance loss.
- WARP loss.
- Metric wrapper around the WARP loss.
- Bayesian Personalized Ranking loss.
- Wrapper around the Bayesian Personalized Ranking loss.
- VAE loss function for use with autoencoders.
- Bayesian Personalized Ranking Max loss.
- TOP1 loss.
- TOP1 Max loss.
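As an example, a minimal sketch of calling the BPR loss on a batch of similarity scores (assuming it is exposed as bpr_loss(positive_sim, negative_sim) in recpack.algorithms.loss_functions):

import torch

from recpack.algorithms.loss_functions import bpr_loss

# Scores a model assigned to sampled positive and negative items
positive_sim = torch.tensor([2.1, 0.8, 1.5])
negative_sim = torch.tensor([0.3, 1.0, 0.2])

# BPR pushes positives to outrank negatives by minimizing
# -log(sigmoid(positive_sim - negative_sim))
loss = bpr_loss(positive_sim, negative_sim)
print(loss)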
Samplers
In multiple recommendation algorithms (e.g. BPRMF), sampling methods play an important role. As such, recpack contains a number of commonly used sampling methods.
- Samples linked positive and negative interactions for users.
- Sampler that samples positives with replacement.
- Samples num_negatives negatives for each positive.
- Samples batches of user, input sequence pairs.
- Samples num_negatives negatives for every positive in a sequence.
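A sketch of how such a sampler might be used, with heavy caveats: the class name PositiveNegativeSampler, its constructor arguments, and the shape of what sample yields are all assumptions to be checked against the API reference:

import numpy as np
from scipy.sparse import csr_matrix

from recpack.algorithms.samplers import PositiveNegativeSampler

X = csr_matrix(np.array([[1, 0, 1], [1, 1, 0], [1, 1, 0]]))

# Assumed interface: sample(X) yields batches of
# (users, positive items, negative items)
sampler = PositiveNegativeSampler(num_negatives=1, batch_size=2)
for users, positives, negatives in sampler.sample(X):
    print(users, positives, negatives)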
Utility Functions
The util
module contains a number of utility functions
used across algorithms.
Use these to simplify certain tasks (such as batching) when creating a new algorithm.
- Get batches from an iterable.
- Samples rows from the matrices.
- Naively converts a sparse csr_matrix to a torch Tensor.
- Converts a torch Tensor to a sparse csr_matrix.
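For instance, a small sketch of the batching helper (assuming it is exposed as get_batches(iterable, batch_size) in recpack.algorithms.util):

from recpack.algorithms.util import get_batches

users = list(range(10))

# Assumed behaviour: yields successive chunks of at most batch_size elements
for batch in get_batches(users, batch_size=4):
    print(list(batch))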