
No suitable driver found for jdbc in Spark

Updated: 2022-05-16 03:36:23

There are 3 possible solutions:

  1. You might want to assemble your application with your build manager (Maven, SBT), so you won't need to add the dependencies in your spark-submit cli (a minimal sbt sketch follows option 2's explanation below).
  2. You can use the following option in your spark-submit cli:

--jars $(echo ./lib/*.jar | tr ' ' ',')
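
For instance, a complete command might look like the sketch below; the master URL, main class, and application jar (com.example.MyApp, target/myapp.jar) are placeholders, not part of the original answer:

spark-submit \
  --master local[*] \
  --class com.example.MyApp \
  --jars $(echo ./lib/*.jar | tr ' ' ',') \
  target/myapp.jar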

Explanation: Supposing that you have all your jars in a lib directory in your project root, this will read all the libraries and add them to the application submit.
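
Going back to option 1, a minimal sbt-assembly setup might look like the sketch below; the plugin version, Spark version, and the PostgreSQL driver are assumed examples, not from the original answer. First enable the plugin in project/plugins.sbt:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "2.1.5")

Then declare the JDBC driver in build.sbt, marking Spark itself as "provided" so the fat jar bundles the driver but not Spark:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"  % "3.5.1" % "provided", // supplied by the cluster at runtime
  "org.postgresql"    % "postgresql" % "42.7.3"              // JDBC driver, bundled into the fat jar
)

Running sbt assembly then produces a single jar that already contains the driver, so spark-submit needs no extra --jars flag.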

You can also try to configure these two variables, spark.driver.extraClassPath and spark.executor.extraClassPath, in the SPARK_HOME/conf/spark-defaults.conf file, and set their values to the path of the jar file. Ensure that the same path exists on worker nodes.
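
For instance, the two entries could look like this sketch; the path to the PostgreSQL driver jar is an assumed example:

spark.driver.extraClassPath   /opt/spark/extra-jars/postgresql-42.7.3.jar
spark.executor.extraClassPath /opt/spark/extra-jars/postgresql-42.7.3.jar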