algorithm to learn these latent factors.

There are various other algorithms, classes, and functions available as part of the mllib package. Let us now work through a demonstration.

The following example uses collaborative filtering with the ALS algorithm to build a recommendation model and evaluate it on the training data. Save the rating data below as test.data:

```
1,1,5.0
1,2,1.0
1,3,5.0
1,4,1.0
2,1,5.0
2,2,1.0
2,3,5.0
2,4,1.0
3,1,1.0
3,2,5.0
3,3,1.0
3,4,5.0
4,1,1.0
4,2,5.0
4,3,1.0
4,4,5.0
```
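Each line of this file is a user,item,rating triple. The parsing that the Spark job performs below (split on commas, then cast to int/int/float) can be sanity-checked in plain Python, independently of Spark; the sample lines here are taken from the data file above:

```python
# Parse "user,item,rating" lines the same way the Spark job does:
# split on commas, then cast user and item to int and rating to float.
lines = ["1,1,5.0", "1,2,1.0", "2,1,5.0"]

def parse(line):
    user, item, rating = line.split(',')
    return (int(user), int(item), float(rating))

triples = [parse(l) for l in lines]
print(triples)  # [(1, 1, 5.0), (1, 2, 1.0), (2, 1, 5.0)]
```

In the Spark script, the same triple is wrapped in a `Rating` object rather than a plain tuple.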

```
--------------------------------------recommend.py----------------------------------------
from __future__ import print_function
from pyspark import SparkContext
from pyspark.mllib.recommendation import ALS, MatrixFactorizationModel, Rating

if __name__ == "__main__":
    sc = SparkContext(appName="Pyspark mllib Example")

    # Load and parse the user,item,rating data
    data = sc.textFile("test.data")
    ratings = data.map(lambda l: l.split(',')) \
        .map(lambda l: Rating(int(l[0]), int(l[1]), float(l[2])))

    # Build the recommendation model using Alternating Least Squares
    rank = 10
    numIterations = 10
    model = ALS.train(ratings, rank, numIterations)

    # Evaluate the model on the training data
    testdata = ratings.map(lambda p: (p[0], p[1]))
    predictions = model.predictAll(testdata) \
        .map(lambda r: ((r[0], r[1]), r[2]))
    ratesAndPreds = ratings.map(lambda r: ((r[0], r[1]), r[2])) \
        .join(predictions)
    MSE = ratesAndPreds.map(lambda r: (r[1][0] - r[1][1])**2).mean()
    print("Mean Squared Error = " + str(MSE))

    # Save and load the model
    model.save(sc, "target/tmp/myCollaborativeFilter")
    sameModel = MatrixFactorizationModel.load(sc, "target/tmp/myCollaborativeFilter")
--------------------------------------recommend.py----------------------------------------
```
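The evaluation step joins each (user, item) key's actual rating with its predicted rating, then averages the squared differences. Stripped of Spark, that computation is just the following; the rating/prediction pairs here are made up for illustration:

```python
# ratesAndPreds after the join is a collection of
# ((user, item), (actual_rating, predicted_rating)) pairs;
# MSE is the mean of (actual - predicted)**2 over all pairs.
rates_and_preds = [
    ((1, 1), (5.0, 4.9)),
    ((1, 2), (1.0, 1.2)),
    ((2, 1), (5.0, 5.1)),
]

mse = sum((actual - pred) ** 2
          for (_, (actual, pred)) in rates_and_preds) / len(rates_and_preds)
print(mse)
```

For these made-up values the result is about 0.02; the Spark version computes exactly this quantity, distributed across the cluster via `map` and `mean`.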

`$SPARK_HOME/bin/spark-submit recommend.py`

`Mean Squared Error = 1.20536041839e-05`