
Spark installation - Error: Could not find or load main class org.apache.spark.launcher.Main

Updated: 2022-03-18 08:19:08

I had that error message. It may have several root causes, but this is how I investigated and solved the problem (on Linux):

  • Instead of launching spark-submit directly, try bash -x spark-submit to see which line fails.
  • Repeat that process several times (since spark-submit calls nested scripts) until you find the underlying command that gets invoked; in my case it was something like:

/usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java -cp '/opt/spark-2.2.0-bin-hadoop2.7/conf/:/opt/spark-2.2.0-bin-hadoop2.7/jars/*' -Xmx1g org.apache.spark.deploy.SparkSubmit --class org.apache.spark.repl.Main --name 'Spark shell' spark-shell
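
To reproduce that tracing step, here is a minimal sketch (assuming the same /opt/spark-2.2.0-bin-hadoop2.7 install path; in Spark 2.x, spark-submit execs bin/spark-class, which then builds the java command shown above):

# trace the outer script first; the final traced lines show what it execs
bash -x /opt/spark-2.2.0-bin-hadoop2.7/bin/spark-submit --version

# then trace the nested script it calls, until the failing java command appears
bash -x /opt/spark-2.2.0-bin-hadoop2.7/bin/spark-class org.apache.spark.deploy.SparkSubmit --version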

So, spark-submit launches a java process but can't find the org.apache.spark.launcher.Main class using the files in /opt/spark-2.2.0-bin-hadoop2.7/jars/* (see the -cp option above). I did an ls in this jars folder and counted 4 files instead of the whole Spark distribution (~200 files). It was probably a problem during the installation process, so I reinstalled Spark, checked the jars folder, and it worked like a charm.
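
A quick way to run the same check (paths are from my install; the spark-launcher jar is where org.apache.spark.launcher.Main normally lives):

ls /opt/spark-2.2.0-bin-hadoop2.7/jars | wc -l                 # a complete 2.2.0 distribution has ~200 jars
ls /opt/spark-2.2.0-bin-hadoop2.7/jars/spark-launcher_*.jar    # must exist, or the launcher class can't be found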

So, you should:

  • check the java command (the -cp option)
  • check your jars folder (does it at least contain all the spark-*.jar files?); a combined check is sketched below
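
Put together, a minimal sanity check could look like this (the SPARK_HOME variable and the unzip test are my additions, not part of the original steps):

SPARK_HOME=/opt/spark-2.2.0-bin-hadoop2.7
ls "$SPARK_HOME"/jars/spark-*.jar >/dev/null || echo "Spark jars missing - reinstall"
# confirm the launcher class is actually inside the launcher jar
unzip -l "$SPARK_HOME"/jars/spark-launcher_*.jar | grep 'launcher/Main.class'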

Hope it helps.