且构网


How do I prevent EMR Spark step retries?

Updated: 2022-05-16 03:35:41

You should try setting spark.task.maxFailures to 1 (the default is 4).
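For example, the property can be passed when submitting the application. This is a sketch; the main class and JAR name below are placeholders, not from the original question:

```shell
# Disable task retries: with spark.task.maxFailures=1, the first task
# failure fails the job, so the step does not keep retrying the work.
# com.example.MyApp and my-app.jar are hypothetical placeholders.
spark-submit \
  --conf spark.task.maxFailures=1 \
  --class com.example.MyApp \
  my-app.jar
```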

Meaning:

Number of failures of any particular task before giving up on the job. The total number of failures spread across different tasks will not cause the job to fail; a particular task has to fail this number of attempts. Should be greater than or equal to 1. Number of allowed retries = this value - 1.
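The same configuration can be supplied when the Spark step is added to the EMR cluster, for instance via the AWS CLI. A minimal sketch, assuming a hypothetical cluster ID, step name, main class, and S3 JAR location:

```shell
# Add a Spark step whose tasks are not retried.
# j-XXXXXXXXXXXXX, MyStep, com.example.MyApp, and the S3 path are placeholders.
aws emr add-steps \
  --cluster-id j-XXXXXXXXXXXXX \
  --steps 'Type=Spark,Name=MyStep,ActionOnFailure=CONTINUE,Args=[--conf,spark.task.maxFailures=1,--class,com.example.MyApp,s3://my-bucket/my-app.jar]'
```

Note that spark.task.maxFailures controls retries of individual Spark tasks inside one application run, not EMR's handling of the step itself; the step's ActionOnFailure setting determines what the cluster does after the application fails.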