Updated: 2021-10-23 18:48:10
I had the same problem. What ended up working for me was the --driver-class-path parameter used with spark-submit.
The main thing is to add the entire Spark class path to --driver-class-path.
Here are my steps:
My driver class path ended up looking like this:
--driver-class-path /home/hadoop/jars/mysql-connector-java-5.1.35.jar:/etc/hadoop/conf:/usr/lib/hadoop/:/usr/lib/hadoop-hdfs/:/usr/lib/hadoop-mapreduce/:/usr/lib/hadoop-yarn/:/usr/lib/hadoop-lzo/lib/:/usr/share/aws/emr/emrfs/conf:/usr/share/aws/emr/emrfs/lib/:/usr/share/aws/emr/emrfs/auxlib/*
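Put together, the full spark-submit invocation might look like the sketch below. The main class `com.example.MyJob` and the application jar `my-job.jar` are placeholders for your own job; the classpath entries are the ones from the answer above and will differ on other clusters.

```shell
#!/bin/sh
# Driver classpath: the JDBC driver jar first, then the cluster's
# Hadoop/EMR directories (paths from the answer above; adjust to your setup).
DRIVER_CP="/home/hadoop/jars/mysql-connector-java-5.1.35.jar:/etc/hadoop/conf:/usr/lib/hadoop/:/usr/lib/hadoop-hdfs/:/usr/lib/hadoop-mapreduce/:/usr/lib/hadoop-yarn/:/usr/lib/hadoop-lzo/lib/:/usr/share/aws/emr/emrfs/conf:/usr/share/aws/emr/emrfs/lib/:/usr/share/aws/emr/emrfs/auxlib/*"

# Assemble the command first so it can be inspected before running.
# com.example.MyJob and my-job.jar are hypothetical placeholders.
CMD="spark-submit --driver-class-path $DRIVER_CP --class com.example.MyJob my-job.jar"

echo "$CMD"
# To actually launch the job, run: eval "$CMD"
```

Note that --driver-class-path only affects the driver JVM; if executors also need the JDBC driver (for example when reading via spark.read.jdbc in parallel), you may additionally need to ship the jar with --jars.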
This worked on EMR 4.1 using Java with Spark 1.5.0. I had already added the MySQL JAR as a dependency in the Maven pom.xml.
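For reference, a Maven dependency matching the jar version above would look like the following (mysql:mysql-connector-java are the standard Maven coordinates for this driver; bump the version to whatever your project needs):

```xml
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>5.1.35</version>
</dependency>
```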
You may also want to look at this answer as it seems like a cleaner solution. I haven't tried it myself.