
How to force DataFrame evaluation in Spark

Updated: 2023-10-15 09:50:10

I guess simply getting the underlying RDD from the DataFrame and triggering an action on it should achieve what you're looking for.

df.withColumn("test", myUDF($"id")).rdd.count // forcing an action on the RDD surfaces the proper exceptions
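
For completeness, here is a minimal, self-contained sketch of the idea, assuming a local SparkSession and a hypothetical UDF that throws at runtime; the name myUDF and the failure condition are illustrative, not taken from the original answer:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf

val spark = SparkSession.builder().master("local[*]").appName("ForceDataFrameEval").getOrCreate()
import spark.implicits._

val df = Seq(1, 2, 3).toDF("id")

// Hypothetical UDF that fails for one input value. Because DataFrame
// transformations are lazy, withColumn alone does not run it.
val myUDF = udf((id: Int) => { if (id == 2) throw new RuntimeException("bad id"); id })

val withTest = df.withColumn("test", myUDF($"id"))

// Dropping down to the underlying RDD and running an action forces
// evaluation, so the RuntimeException is raised here rather than later.
withTest.rdd.count

Any other action (collect, show, write) would also force evaluation; counting the RDD is simply a cheap way to do it without keeping the results. Going through .rdd also avoids the case where the optimizer prunes the unused test column, which is likely why a plain df.count may not trigger the UDF at all.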