【Posted】: 2018-06-29 16:31:04
【Problem Description】:
I need to upload data into an existing model, and this has to happen daily. I suspect some change is needed in the index file, but I can't figure out what. I tried pushing data under the same model name, but the previously ingested (parent) data got deleted.
Any help would be appreciated.
Here is the ingestion JSON file:
{
  "type" : "index",
  "spec" : {
    "dataSchema" : {
      "dataSource" : "mksales",
      "parser" : {
        "type" : "string",
        "parseSpec" : {
          "format" : "json",
          "dimensionsSpec" : {
            "dimensions" : [
              "Address",
              "City",
              "Contract Name",
              "Contract Sub Type",
              "Contract Type",
              "Customer Name",
              "Domain",
              "Nation",
              "Contract Start End Date",
              "Zip",
              "Sales Rep Name"
            ]
          },
          "timestampSpec" : {
            "format" : "auto",
            "column" : "time"
          }
        }
      },
      "metricsSpec" : [
        { "type" : "count", "name" : "count" },
        { "type" : "doubleSum", "name" : "Price", "fieldName" : "Price" },
        { "type" : "doubleSum", "name" : "Sales", "fieldName" : "Sales" },
        { "type" : "longSum", "name" : "Units", "fieldName" : "Units" }
      ],
      "granularitySpec" : {
        "type" : "uniform",
        "segmentGranularity" : "day",
        "queryGranularity" : "none",
        "intervals" : ["2000-12-01T00:00:00Z/2030-06-30T00:00:00Z"],
        "rollup" : true
      }
    },
    "ioConfig" : {
      "type" : "index",
      "firehose" : {
        "type" : "local",
        "baseDir" : "mksales/",
        "filter" : "mksales.json"
      },
      "appendToExisting" : false
    },
    "tuningConfig" : {
      "type" : "index",
      "targetPartitionSize" : 10000000,
      "maxRowsInMemory" : 40000,
      "forceExtendableShardSpecs" : true
    }
  }
}
【Discussion】:
-
Can you share more details, e.g. the ingestion task JSON?
-
There is a property, appendToExisting. Should I set it to true?
-
Are you sure about the intervals? They cover a huge period: "intervals" : ["2000-12-01T00:00:00Z/2030-06-30T00:00:00Z"]. And is all of your ingested data in the single file "mksales.json"?
-
I used dummy values for the intervals; I hope that does not affect the ingestion task. No, mksales.json contains the incremental data.
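Editor's note: based on the discussion above, a minimal sketch of the change being suggested, assuming Druid's native batch `index` task, is to flip `appendToExisting` to `true` in the `ioConfig` so the task adds new segments to the existing `mksales` datasource instead of replacing the data in the covered interval. This is a sketch of the commenter's suggestion, not a verified fix; all field names mirror the spec in the question. Note the spec already sets `forceExtendableShardSpecs : true` in `tuningConfig`, which append mode requires in this Druid-era native batch task.

```json
"ioConfig" : {
  "type" : "index",
  "firehose" : {
    "type" : "local",
    "baseDir" : "mksales/",
    "filter" : "mksales.json"
  },
  "appendToExisting" : true
}
```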
Tags: druid