Updated: 2023-12-05 13:16:28
You should be able to do it by setting the write mode to "update" (or "upsert") and passing your script as "script" (the exact setting name depends on your ES version).
EsSpark.saveToEs(rdd, "spark/docs", Map("es.mapping.id" -> "id", "es.write.operation" -> "update", "es.update.script.inline" -> "your script"))
Probably you want to use "upsert".
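As a hedged sketch of the upsert variant (assuming the elasticsearch-hadoop Scala API; the script-related keys such as es.update.script.inline, es.update.script.lang, and es.update.script.params vary across ES versions, and the counter field and delta parameter below are illustrative only):

```scala
import org.elasticsearch.spark.rdd.EsSpark

// Sketch only: assumes an RDD of documents carrying an "id" field and a running ES cluster.
// With "upsert", the document is inserted when the id is absent and the script runs when it exists.
EsSpark.saveToEs(rdd, "spark/docs", Map(
  "es.mapping.id"           -> "id",                                  // field used as the document id
  "es.write.operation"      -> "upsert",                              // insert-or-script-update
  "es.update.script.inline" -> "ctx._source.counter += params.delta", // example update script
  "es.update.script.lang"   -> "painless",
  "es.update.script.params" -> "delta:1"                              // parameter passed to the script
))
```

Verify the exact script setting names against the elasticsearch-hadoop documentation for your ES version before relying on them.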
There are some good unit tests in the Cascading integration within the same library; those settings should work for Spark as well, since both use the same writer.
I suggest reading the unit tests to pick the correct settings for your ES version.