Lesson #776: Cartesian Function


Spark Cartesian Function

In Spark, the cartesian function computes the Cartesian product of two datasets and returns all possible pairs of elements. Every element of one dataset is paired with every element of the other dataset.
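The pairing behaviour can be sketched without Spark at all: a minimal plain-Scala example (assumed for illustration, not part of the lesson) that pairs every element of one list with every element of another, which is exactly what cartesian does across RDD partitions.

```scala
// A sketch of Cartesian-product semantics using plain Scala collections:
// each element of `a` is paired with each element of `b`.
object CartesianSketch extends App {
  val a = List(1, 2, 3)
  val b = List(3, 4, 5)
  val product = for (x <- a; y <- b) yield (x, y)
  println(product)
  // List((1,3), (1,4), (1,5), (2,3), (2,4), (2,5), (3,3), (3,4), (3,5))
}
```

Note that for two datasets of sizes n and m, the product has n × m elements, so on real RDDs this operation can be very expensive.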
Example of the cartesian function
In this example, we compute the Cartesian product of two datasets.
  • To open Spark in Scala mode, run the command below.

$ spark-shell



  • Create an RDD using a parallelized collection.

scala> val data1 = sc.parallelize(List(1,2,3))

  • Now, we can view the result using the following command.

scala> data1.collect



  • Create another RDD using a parallelized collection.

scala> val data2 = sc.parallelize(List(3,4,5))

  • Now, we can view the result using the following command.

scala> data2.collect



  • Apply the cartesian() function to return the Cartesian product of the elements.

scala> val cartesianfunc = data1.cartesian(data2)

  • Now, we can view the result using the following command.

scala> cartesianfunc.collect
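Given the two RDDs above, collect should return the nine pairings of the elements of data1 with the elements of data2. A sketch of the expected shell output is shown below; Spark does not guarantee the ordering of the pairs, so the order you see may differ.

```scala
// Expected contents (ordering may vary across runs and cluster layouts):
res2: Array[(Int, Int)] = Array((1,3), (1,4), (1,5), (2,3), (2,4), (2,5), (3,3), (3,4), (3,5))
```

Because both inputs contain the element 3, the pair (3,3) appears in the result: cartesian pairs elements by position across datasets, not by distinct values.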