[Posted]: 2017-04-27 16:12:28
[Question]:
I have a DataFrame (df) with the following structure:
Data

label   pa_age   pa_gender_category
10000   32.0     male
25000   36.0     female
45000   68.0     female
15000   24.0     male
Goal

I want to build a RandomForest classifier for the column 'label', with the columns 'pa_age' and 'pa_gender_category' as features.

Process followed
// Imports needed for the stages below
import org.apache.spark.ml.feature.{IndexToString, StringIndexer}
import org.apache.spark.ml.classification.RandomForestClassifier

// Transform the labels column into label indices
val labelIndexer = new StringIndexer()
  .setInputCol("label")
  .setOutputCol("indexedLabel")
  .fit(df)
// Transform column pa_gender_category into label indices
val featureTransformer = new StringIndexer()
  .setInputCol("pa_gender_category")
  .setOutputCol("pa_gender_category_label")
  .fit(df)
// Convert indexed labels back to original labels.
val labelConverter = new IndexToString()
  .setInputCol("prediction")
  .setOutputCol("predictedLabel")
  .setLabels(labelIndexer.labels)
// Train a RandomForest model.
val rf = new RandomForestClassifier()
  .setLabelCol("indexedLabel")
  .setFeaturesCol("indexedFeatures")
  .setNumTrees(10)
Expected output of the steps above:

label   pa_age   pa_gender_category   indexedLabel   pa_gender_category_label
10000   32.0     male                 1.0            1.0
25000   36.0     female               2.0            2.0
45000   68.0     female               3.0            2.0
10000   24.0     male                 1.0            1.0
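For reference, the indexed columns above would be produced by applying the two fitted indexer models in sequence. This is a minimal sketch using the `labelIndexer` and `featureTransformer` defined earlier; note that in practice `StringIndexer` assigns indices starting at 0.0 (ordered by label frequency), so the exact index values may differ from the table above:

```scala
// Apply both fitted StringIndexerModels to add the indexed columns.
val indexed = featureTransformer.transform(labelIndexer.transform(df))
indexed.select("label", "indexedLabel", "pa_gender_category", "pa_gender_category_label").show()
```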
Now I need to transform the data into 'label' and 'features' format:
val featureCreater = new VectorAssembler().setInputCols(Array("pa_age", "pa_gender_category"))
.setOutputCol("features").fit(df)
Pipeline
val pipeline = new Pipeline().setStages(Array(labelIndexer, featureTransformer,
featureCreater, rf, labelConverter))
Problem
error: value fit is not a member of org.apache.spark.ml.feature.VectorAssembler
val featureCreater = new VectorAssembler().setInputCols(Array("pa_age", "pa_gender_category_label")).setOutputCol("features").fit(df)
Basically, the step that transforms the data into label-and-features format is where I'm running into trouble.
Is my process/pipeline correct here?
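For context on the error: `VectorAssembler` is a `Transformer`, not an `Estimator`, so it has no `fit` method. It is configured directly and either applied with `transform` or placed in the pipeline, which invokes it during `pipeline.fit`. A minimal sketch of the corrected assembly (assuming Spark 2.x and the stages defined above; `rf2` is a hypothetical renamed classifier whose features column is made to match the assembler's output):

```scala
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.classification.RandomForestClassifier
import org.apache.spark.ml.feature.VectorAssembler

// VectorAssembler is a Transformer: configure it without calling fit.
// It should consume the indexed gender column, not the raw string one.
val featureCreater = new VectorAssembler()
  .setInputCols(Array("pa_age", "pa_gender_category_label"))
  .setOutputCol("features")

// The classifier's features column must match the assembler's output column.
val rf2 = new RandomForestClassifier()
  .setLabelCol("indexedLabel")
  .setFeaturesCol("features")
  .setNumTrees(10)

val pipeline = new Pipeline().setStages(
  Array(labelIndexer, featureTransformer, featureCreater, rf2, labelConverter))
val model = pipeline.fit(df)
```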
[Discussion]:
Tags: scala apache-spark