...

Cloud Computing - Spark




Lesson #755 - Spark Installation


How do you install Spark?

Spark is a sub-project of Hadoop. Therefore, it is better to install Spark on a Linux-based system. The following steps show how to install Apache Spark.

Step 1: Verifying Java Installation
Java installation is one of the mandatory prerequisites for installing Spark. Try the following command to verify the Java version.

$java -version

If Java is already installed on your system, you will see the following response.

java version "1.7.0_71"
Java(TM) SE Runtime Environment (build 1.7.0_71-b13)
Java HotSpot(TM) Client VM (build 25.0-b02, mixed mode)


If you do not have Java installed on your system, install Java before proceeding to the next step.
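
For example, on a Debian or Ubuntu system, a JDK can be installed with the package manager. The package name below (openjdk-7-jdk, matching the Java 1.7 runtime shown above) is an assumption and may differ on your distribution.

$ sudo apt-get update
$ sudo apt-get install openjdk-7-jdk
$ java -version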


Step 2: Verifying Scala installation
Spark requires the Scala language, so let us verify the Scala installation using the following command.

$scala -version

If Scala is already installed on your system, you will see the following response −

Scala code runner version 2.11.6 -- Copyright 2002-2013, LAMP/EPFL

If you do not have Scala installed on your system, then proceed to the next step for Scala installation.

Step 3: Downloading Scala
Download the latest version of Scala by visiting the following link: Download Scala. For this tutorial, we are using the scala-2.11.6 version. After downloading, you will find the Scala tar file in the download folder.
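
If you prefer the command line, the same archive can usually be fetched directly with wget. The scala-lang.org archive URL below is an assumption; adjust it if the download link in this lesson points elsewhere.

$ wget https://www.scala-lang.org/files/archive/scala-2.11.6.tgz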


Step 4: Installing Scala
Follow the steps given below for installing Scala.

Extract the Scala tar file

Type the following command for extracting the Scala tar file.

$ tar xvf scala-2.11.6.tgz

Move the Scala software files.

Use the following commands for moving the Scala software files to the respective directory (/usr/local/scala).

$ su -
Password:
# cd /home/Hadoop/Downloads/
# mv scala-2.11.6 /usr/local/scala
# exit


Set PATH for Scala
Use the following command for setting the PATH for Scala.

$ export PATH=$PATH:/usr/local/scala/bin
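
This export only lasts for the current shell session. To keep Scala on the PATH permanently, you can add the same line to the ~/.bashrc file, just as is done for Spark later in this lesson; this extra step is a suggestion, not part of the original instructions.

$ echo 'export PATH=$PATH:/usr/local/scala/bin' >> ~/.bashrc
$ source ~/.bashrc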

Verifying the Scala Installation

After installation, it is better to verify it. Use the following command for verifying the Scala installation.

$scala -version

If Scala is now installed on your system, you will see the following response −

Scala code runner version 2.11.6 -- Copyright 2002-2013, LAMP/EPFL


Step 5: Downloading Apache Spark
Download the latest version of Spark by visiting the following link: Download Spark. For this tutorial, we are using the spark-1.3.1-bin-hadoop2.6 version. After downloading it, you will find the Spark tar file in the download folder.
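
As with Scala, the release can also be fetched from the command line. The Apache archive URL below is an assumption based on the file name used in this lesson.

$ wget https://archive.apache.org/dist/spark/spark-1.3.1/spark-1.3.1-bin-hadoop2.6.tgz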


Step 6: Installing Spark
Follow the steps given below for installing Spark.

Extracting the Spark tar
Use the following command for extracting the Spark tar file.

$ tar xvf spark-1.3.1-bin-hadoop2.6.tgz

Moving Spark software files
Use the following commands for moving the Spark software files to the respective directory (/usr/local/spark).

$ su -
Password:

# cd /home/Hadoop/Downloads/
# mv spark-1.3.1-bin-hadoop2.6 /usr/local/spark
# exit

Setting up the environment for Spark

Add the following line to the ~/.bashrc file. It means adding the location where the Spark software files are located to the PATH variable.

export PATH=$PATH:/usr/local/spark/bin

Use the following command for sourcing the ~/.bashrc file.

$ source ~/.bashrc
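
As a quick sanity check (an extra step, not part of the original instructions), you can confirm that the Spark binaries are now on your PATH.

$ which spark-shell
/usr/local/spark/bin/spark-shell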


Step 7: Verifying the Spark Installation
Write the following command for opening the Spark shell.

$ spark-shell

If Spark is installed successfully, then you will find the following output.

Spark assembly has been built with Hive, including Datanucleus jars on classpath
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/06/04 15:25:22 INFO SecurityManager: Changing view acls to: hadoop
15/06/04 15:25:22 INFO SecurityManager: Changing modify acls to: hadoop
15/06/04 15:25:22 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); users with modify permissions: Set(hadoop)
15/06/04 15:25:22 INFO HttpServer: Starting HTTP Server
15/06/04 15:25:23 INFO Utils: Successfully started service 'HTTP class server' on port 43292.

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_71)
Type in expressions to have them evaluated.
Spark context available as sc
scala>
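
Once the scala> prompt appears, you can try a small computation to confirm that the Spark context works. The session below is an illustrative sketch, not part of the original lesson output.

scala> val data = sc.parallelize(1 to 100)
data: org.apache.spark.rdd.RDD[Int] = ParallelCollectionRDD[0] at parallelize at <console>:21

scala> data.reduce(_ + _)
res0: Int = 5050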