Spark Filter Function
In Spark, the filter function returns a new dataset formed by selecting those elements of the source on which the given function returns true. In other words, it retrieves only the elements that satisfy the given condition.
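Since RDD.filter shares its predicate semantics with the filter method on Scala's standard collections, the behavior can be illustrated with a minimal plain-Scala sketch that needs no Spark installation (the object name FilterDemo is just an illustrative choice):

```scala
// Filter keeps only the elements for which the predicate returns true,
// mirroring how Spark's RDD.filter selects elements from a dataset.
object FilterDemo {
  def main(args: Array[String]): Unit = {
    val data = List(10, 20, 35, 40)
    // Keep every element except 35, matching the RDD example below.
    val filtered = data.filter(x => x != 35)
    println(filtered.mkString(", "))  // 10, 20, 40
  }
}
```

The only difference in Spark is that the predicate is shipped to the executors and applied to each partition of the distributed dataset, and the result is a new RDD rather than a local list.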
Example of Filter function
In this example, we filter the given data and retrieve all the values except 35.
To open Spark in Scala mode, run the following command.

$ spark-shell
Create an RDD using parallelized collection.
scala> val data = sc.parallelize(List(10, 20, 35, 40))
Now, we can read the generated result by using the following command.

scala> data.collect
Apply the filter function and pass the expression required to perform the filtering.
scala> val filterfunc = data.filter(x => x != 35)
Now, we can read the generated result by using the following command.

scala> filterfunc.collect