Updated: 2022-06-07 05:58:22
I had that error message. It may have several root causes, but here is how I investigated and solved the problem (on Linux):
Instead of running `spark-submit` directly, try `bash -x spark-submit` to see which line fails. In my case, the trace showed that this command was being run:

```
/usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java -cp '/opt/spark-2.2.0-bin-hadoop2.7/conf/:/opt/spark-2.2.0-bin-hadoop2.7/jars/*' -Xmx1g org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main --name 'Spark shell' spark-shell
```
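To see how `bash -x` exposes what a wrapper script actually runs, here is a minimal, self-contained sketch (the demo script is hypothetical; with Spark you would run `bash -x spark-submit ...` the same way):

```shell
# bash -x echoes each command to stderr (prefixed with '+') before executing
# it, so the last '+' line printed before an error is the command that failed.
demo=$(mktemp)
printf 'echo hello from the wrapper\n' > "$demo"
bash -x "$demo" 2>&1 | grep '^+'
# prints: + echo hello from the wrapper
```

With `spark-submit`, the interesting trace line is the final `java -cp ...` invocation it builds.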
So, spark-submit launches a java process and can't find the org.apache.spark.launcher.Main class using the files in /opt/spark-2.2.0-bin-hadoop2.7/jars/*
(see the -cp option above). I did an ls in this jars folder and counted 4 files instead of the whole Spark distribution (~200 files).
It was probably a problem during the installation process. So I reinstalled Spark, checked the jars folder, and it worked like a charm.
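Before reinstalling, a quick sanity check like this can confirm the diagnosis. This is a sketch; the Spark path is the one from my install, so adjust it to yours:

```shell
# A healthy Spark 2.2 distribution ships roughly 200 jars; if this prints
# only a handful, the installation is incomplete.
count_jars() {
  find "$1" -maxdepth 1 -name '*.jar' | wc -l
}

# Example usage (path is an assumption; use your own SPARK_HOME):
# count_jars /opt/spark-2.2.0-bin-hadoop2.7/jars
```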
So, you should check the classpath of the java command (the -cp option) and make sure the jars folder it references contains the complete Spark distribution. Hope it helps.