Spark-submit cannot find a local file

Updated: 2022-05-24 18:24:34

Supposing you want to spark-submit to YARN a Python script located at /home/user/scripts/spark_streaming.py, the correct syntax is as follows:

spark-submit --master yarn --deploy-mode client /home/user/scripts/spark_streaming.py

You can interchange the ordering of the various flags, but the script itself must be at the end; if your script accepts arguments, they should follow the script name (e.g. see this example for calculating pi with 10 decimal digits).
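For instance, if your script accepted its own arguments (the two values below are purely illustrative placeholders), the invocation would look like this:

spark-submit --master yarn --deploy-mode client /home/user/scripts/spark_streaming.py arg1 arg2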

For executing locally with, say, 2 cores, you should use --master local[2]; use --master local[*] for all available local cores (no --deploy-mode flag in either case).
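As a sketch, the equivalent local invocations for the same script would be:

spark-submit --master local[2] /home/user/scripts/spark_streaming.py   # 2 local cores
spark-submit --master local[*] /home/user/scripts/spark_streaming.py   # all available local cores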

Check the docs for more info (although admittedly they are rather poor in pyspark demonstrations).

PS The mention of Jupyter, as well as the path shown in your error message, are extremely puzzling...

UPDATE: Seems that PYSPARK_DRIVER_PYTHON=jupyter messes up everything, funneling the execution through Jupyter (which is undesirable here, and it may explain the weird error message). Try modifying the environment variables in your .bashrc as follows:

export SPARK_HOME="/usr/local/spark"  # do not include /bin
export PYSPARK_PYTHON=python
export PYSPARK_DRIVER_PYTHON=python
export PYSPARK_DRIVER_PYTHON_OPTS=""

Then source your .bashrc.
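A minimal way to apply the change and retry (assuming your .bashrc is at the usual ~/.bashrc and reusing the command from above):

source ~/.bashrc
spark-submit --master yarn --deploy-mode client /home/user/scripts/spark_streaming.py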