
Cloud Computing - Spark RDD




Lesson #1475 - Filter Function


Spark Filter Function

In Spark, the filter transformation returns a new dataset formed by selecting those elements of the source on which the supplied function returns true. In other words, it retains only the elements that satisfy the given condition.
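The same predicate semantics can be sketched with plain Scala collections, without a Spark cluster: filter keeps exactly the elements for which the predicate returns true. The values below are illustrative.

```scala
// filter keeps only the elements for which the predicate returns true
val nums = List(10, 20, 35, 40)
val kept = nums.filter(x => x != 35)

println(kept)  // List(10, 20, 40)
```

Spark's RDD filter behaves the same way, except the predicate is applied in parallel across partitions.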

Example of the Filter Function
In this example, we filter the given data and retrieve all the values except 35.
    To open the Spark shell in Scala mode, run the command below.

$ spark-shell



    Create an RDD using parallelized collection.

scala> val data = sc.parallelize(List(10,20,35,40))

Now, we can view the resulting data using the following command.
scala> data.collect



    Apply the filter function, passing the predicate expression to evaluate.

scala> val filterfunc = data.filter(x => x != 35)

    Now, we can view the resulting data using the following command.

scala> filterfunc.collect
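The steps above can also be packaged as a standalone Spark application instead of an interactive spark-shell session. This is a minimal sketch: the object name FilterExample, the app name, and the local[*] master are illustrative choices, and Spark must be on the classpath.

```scala
// Minimal standalone sketch of the filter example above (illustrative names).
import org.apache.spark.sql.SparkSession

object FilterExample {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark locally using all available cores
    val spark = SparkSession.builder()
      .appName("FilterExample")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    val data     = sc.parallelize(List(10, 20, 35, 40))
    val filtered = data.filter(x => x != 35)  // drop every 35

    // collect brings the filtered elements back to the driver: 10, 20, 40
    println(filtered.collect().mkString(", "))

    spark.stop()
  }
}
```

In the shell session above, sc is created for you; in a standalone application you obtain it from the SparkSession as shown.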