How to transpose a DataFrame in Spark 1.5 (no pivot operator available)?

Not sure how efficient this is, but you can use collect to get all the distinct days, add a column for each day, and then use groupBy and sum:

// get distinct days from data (this assumes there are not too many of them):
val days: Array[String] = df.select("Day")
    .distinct()
    .collect()
    .map(_.getAs[String]("Day"))

// add column for each day with the Sale value if days match:
val withDayColumns = days.foldLeft(df) { 
    case (data, day) => data.selectExpr("*", s"IF(Day = '$day', Sales, 0) AS $day")
}

// drop the original columns and sum the per-day columns for each customer:
val result = withDayColumns
    .drop("Day")
    .drop("Sales")
    .groupBy("Customer")
    .sum(days: _*)

result.show()

Which prints (almost) what you wanted:

+--------+--------+--------+--------+--------+--------+--------+
|Customer|sum(Tue)|sum(Thu)|sum(Sun)|sum(Fri)|sum(Mon)|sum(Wed)|
+--------+--------+--------+--------+--------+--------+--------+
|       1|      10|      15|       0|       2|      12|       0|
|       2|       0|       4|      10|       3|       0|       5|
+--------+--------+--------+--------+--------+--------+--------+
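
For reference, here is a minimal input df that reproduces the output above. This is a sketch: the column names Customer, Day, and Sales come from the snippet, while the row values are a hypothetical reconstruction consistent with the sums shown.

// assuming a Spark 1.5 shell, where sc and sqlContext are already defined:
import sqlContext.implicits._

// one row per (Customer, Day) pair; values chosen to match the sums above:
val df = sc.parallelize(Seq(
    (1, "Tue", 10), (1, "Thu", 15), (1, "Fri", 2), (1, "Mon", 12),
    (2, "Thu", 4), (2, "Sun", 10), (2, "Fri", 3), (2, "Wed", 5)
)).toDF("Customer", "Day", "Sales")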

I'll leave it to you to rename / reorder the columns if needed.
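
If it helps, one way to do that is another fold over the day names. This is a sketch; the week ordering in orderedDays is an assumption you may want to adjust:

// strip the sum(...) wrapper from the generated names and
// put the day columns back into week order:
val orderedDays = Seq("Mon", "Tue", "Wed", "Thu", "Fri", "Sun")
val renamed = orderedDays.foldLeft(result) {
    case (data, day) => data.withColumnRenamed(s"sum($day)", day)
}
renamed.select("Customer", orderedDays: _*).show()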