package recommendation
Type Members
- class ALS extends Estimator[ALSModel] with ALSParams with DefaultParamsWritable
Alternating Least Squares (ALS) matrix factorization.
ALS attempts to estimate the ratings matrix R as the product of two lower-rank matrices, X and Y, i.e. X * Yt = R. Typically these approximations are called 'factor' matrices. The general approach is iterative: during each iteration, one of the factor matrices is held constant while the other is solved for using least squares. The newly solved factor matrix is then held constant while solving for the other factor matrix.

This is a blocked implementation of the ALS factorization algorithm that groups the two sets of factors (referred to as "users" and "products") into blocks and reduces communication by sending only one copy of each user vector to each product block on each iteration, and only for the product blocks that need that user's feature vector. This is achieved by pre-computing some information about the ratings matrix to determine the "out-links" of each user (which blocks of products it will contribute to) and the "in-link" information for each product (which of the feature vectors it receives from each user block it will depend on). This allows us to send only an array of feature vectors between each user block and product block, and have the product block find the users' ratings and update the products based on these messages.
For implicit preference data, the algorithm used is based on "Collaborative Filtering for Implicit Feedback Datasets", available at https://doi.org/10.1109/ICDM.2008.22, adapted for the blocked approach used here.
Essentially, instead of finding the low-rank approximations to the rating matrix R, this finds the approximations for a preference matrix P, where an element of P is 1 if the corresponding rating r is greater than 0, and 0 if r is less than or equal to 0. The ratings then act as 'confidence' values related to the strength of indicated user preferences rather than explicit ratings given to items.

Note: the input rating dataset to the ALS implementation should be deterministic. Nondeterministic data can cause failures when fitting the ALS model. For example, an order-sensitive operation like sampling after a repartition makes the dataset output nondeterministic, as in dataset.repartition(2).sample(false, 0.5, 1618). Checkpointing the sampled dataset or adding a sort before sampling can help make the dataset deterministic. Usage sketches for explicit-feedback and implicit-feedback training follow this entry.

- Annotations
- @Since( "1.3.0" )
- class ALSModel extends Model[ALSModel] with ALSModelParams with MLWritable
Model fitted by ALS.
- Annotations
- @Since( "1.3.0" )
Value Members
- object ALS extends DefaultParamsReadable[ALS] with Logging with Serializable
An implementation of ALS that supports generic ID types, specialized for Int and Long. This is exposed as a developer API for users who do need other ID types. But it is not recommended because it increases the shuffle size and memory requirement during training. For simplicity, users and items must have the same type. The number of distinct users/items should be smaller than 2 billion.
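Since the companion object mixes in DefaultParamsReadable[ALS], a configured (unfitted) estimator can be round-tripped to storage; the generic-ID developer train method is not shown here. A sketch, with a hypothetical path:

    // Persist the estimator's parameters and restore them later.
    als.write.overwrite().save("/tmp/als-estimator")
    val restoredAls = ALS.load("/tmp/als-estimator")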
- object ALSModel extends MLReadable[ALSModel] with Serializable
- Annotations
- @Since( "1.6.0" )