
How to convert a custom Java class to a Spark Dataset

Updated: 2022-06-16 21:59:35

Your code in the comments for creating a DataFrame is correct. However, there is a problem with the way Test is defined: createDataFrame can only build a DataFrame from a Java Bean, and your Test class is not one. Once you fix that, you can use the following code to create a DataFrame:

Dataset<Row> dataFrame = spark.createDataFrame(listOfTestClasses, Test.class);
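
Fixing that means giving Test a public no-argument constructor and a getter/setter pair for every field. A minimal sketch of what the class could look like follows; the fields id and name are hypothetical, since the original class is not shown:

// Hypothetical Java Bean version of Test: public no-arg constructor plus
// getters and setters for every field. Implementing Serializable lets Spark
// ship instances to executors.
public class Test implements java.io.Serializable {
    private int id;
    private String name;

    public Test() { }

    public int getId() { return id; }
    public void setId(int id) { this.id = id; }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}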

These lines then create a typed Dataset:

Encoder<Test> encoder = Encoders.bean(Test.class);
Dataset<Test> dataset = spark.createDataset(listOfTestClasses, encoder);
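
For completeness, here is a minimal end-to-end sketch, assuming the hypothetical Test bean above and a local SparkSession; the class name TestToDataset and the sample values are illustrative only:

import java.util.Arrays;
import java.util.List;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoder;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class TestToDataset {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("TestToDataset")
                .master("local[*]")
                .getOrCreate();

        Test t1 = new Test();
        t1.setId(1);
        t1.setName("first");
        Test t2 = new Test();
        t2.setId(2);
        t2.setName("second");
        List<Test> listOfTestClasses = Arrays.asList(t1, t2);

        // Untyped DataFrame: the schema is inferred from the bean's getters.
        Dataset<Row> dataFrame = spark.createDataFrame(listOfTestClasses, Test.class);
        dataFrame.show();

        // Typed Dataset<Test>: the bean encoder maps rows back to Test objects.
        Encoder<Test> encoder = Encoders.bean(Test.class);
        Dataset<Test> dataset = spark.createDataset(listOfTestClasses, encoder);
        dataset.show();

        spark.stop();
    }
}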