Updated: 2023-10-16 20:21:52
I was eventually able to solve the problem. The main issue was the broken "Applications" configuration, which has to look like this instead:
Applications=[{
'Name': 'Spark'
},
{
'Name': 'Hive'
}],
And the final "Steps" element:
Steps=[{
'Name': 'lsr-step1',
'ActionOnFailure': 'TERMINATE_CLUSTER',
'HadoopJarStep': {
'Jar': 'command-runner.jar',
'Args': [
"spark-submit", "--class", "org.apache.spark.examples.SparkPi",
"s3://support.elasticmapreduce/spark/1.2.0/spark-examples-1.2.0-hadoop2.4.0.jar", "10"
]
}
}]
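For context, both of these fragments are keyword arguments to the boto3 EMR client's `run_job_flow` call. Here is a minimal sketch of how they might fit together; the cluster name, release label, instance settings, and IAM roles are assumptions for illustration, not part of the original answer:

```python
# The corrected "Applications" list: each application is its own dict.
applications = [
    {'Name': 'Spark'},
    {'Name': 'Hive'},
]

# The "Steps" element, using command-runner.jar to invoke spark-submit.
steps = [{
    'Name': 'lsr-step1',
    'ActionOnFailure': 'TERMINATE_CLUSTER',
    'HadoopJarStep': {
        'Jar': 'command-runner.jar',
        'Args': [
            'spark-submit', '--class', 'org.apache.spark.examples.SparkPi',
            's3://support.elasticmapreduce/spark/1.2.0/spark-examples-1.2.0-hadoop2.4.0.jar', '10',
        ],
    },
}]

# Full keyword arguments for run_job_flow. Name, ReleaseLabel, Instances,
# and the two roles below are hypothetical placeholders.
run_job_flow_kwargs = {
    'Name': 'spark-pi-cluster',        # assumed cluster name
    'ReleaseLabel': 'emr-5.36.0',      # assumed EMR release
    'Applications': applications,
    'Steps': steps,
    'Instances': {
        'MasterInstanceType': 'm5.xlarge',
        'SlaveInstanceType': 'm5.xlarge',
        'InstanceCount': 3,
        'KeepJobFlowAliveWhenNoSteps': False,
    },
    'JobFlowRole': 'EMR_EC2_DefaultRole',
    'ServiceRole': 'EMR_DefaultRole',
}

# To actually launch the cluster (requires AWS credentials):
#   import boto3
#   emr = boto3.client('emr', region_name='us-east-1')
#   response = emr.run_job_flow(**run_job_flow_kwargs)
```

The key point is that `Applications` must be a list of `{'Name': ...}` dicts, one per application, rather than a single merged dict.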