[Title]: How to create a csv file from json
[Posted]: 2021-04-11 04:05:57
[Question]:

I have JSON like this:

{
    "connectEnd": 1366.2749999930384,
    "connectStart": 175.91999999422114,
    "decodedBodySize": 3360,
    "domComplete": 10424.984999990556,
    "domContentLoadedEventEnd": 6581.454999992275,
    "domContentLoadedEventStart": 6581.454999992275,
    "domInteractive": 6581.420000002254,
    "domainLookupEnd": 175.91999999422114,
    "domainLookupStart": 12.000000002444722,
    "duration": 10425.015000000712,
    "encodedBodySize": 1279,
    "entryType": "navigation",
    "fetchStart": 0.22499999613501132,
    "initiatorType": "navigation",
    "loadEventEnd": 10425.015000000712,
    "loadEventStart": 10424.994999993942,
    "name": "https://something/login",
    "nextHopProtocol": "http/1.1",
    "redirectCount": 0,
    "redirectEnd": 0,
    "redirectStart": 0,
    "requestStart": 1366.394999990007,
    "responseEnd": 2062.7999999996973,
    "responseStart": 2059.7599999891827,
    "secureConnectionStart": 414.94000000238884,
    "serverTiming": [],
    "startTime": 0,
    "transferSize": 2679,
    "type": "navigate",
    "unloadEventEnd": 0,
    "unloadEventStart": 0,
    "workerStart": 0,
    "workerTiming": []
}

I used papaparse to convert the JSON to csv, and I got this:

"Request time","Time to first byte","Response time","Request-response time","Cache lookup plus response time","Dom interactive","Dom complete","Transfer size","Duration","Domain lookup time","Connect time"
693.3649999991758,3.040000010514632,3.040000010514632,696.4050000096904,2062.5750000035623,6581.420000002254,10424.984999990556,2679,10425.015000000712,163.91999999177642,1190.3549999988172

I plan to use a Jenkins plugin called Benchmark Evaluator.

This plugin only accepts csv in the following format: csv table image link

My problem statement: how do I change the parsed csv structure into the required csv format? Is there an npm package that can directly give me what I want, or does the parsed csv need to be converted afterwards?
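Since the required format is only shown as an image, one reasonable reading is a header row of metric names followed by a single data row. A minimal sketch (no library needed) that flattens one timing object into such a two-line CSV, using a small `entry` object standing in for the full JSON above, would be:

```javascript
// Flatten a single timing object into a two-line CSV:
// a header row of keys and one data row of values.
// NOTE: the exact column layout Benchmark Evaluator expects
// is not visible here, so treat this as a starting point.
const entry = {
  connectEnd: 1366.2749999930384,
  connectStart: 175.91999999422114,
  duration: 10425.015000000712,
  name: "https://something/login"
};

const keys = Object.keys(entry);
const csv = [
  keys.join(','),                                   // header row
  keys.map(k => JSON.stringify(entry[k])).join(',') // data row; strings get quoted
].join('\r\n');

console.log(csv);
```

`JSON.stringify` leaves numbers unquoted and wraps strings in double quotes, which matches the usual CSV convention.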

[Comments]:

  • Google "node.js csv"..

标签: javascript csv papaparse


[Solution 1]:

A more elegant way to convert JSON to CSV is with the map function, without any framework:

// json3.items is assumed to be an array of flat objects
var json = json3.items
var fields = Object.keys(json[0])
// Replace null values with empty strings in the output
var replacer = function(key, value) { return value === null ? '' : value }
var csv = json.map(function(row){
  return fields.map(function(fieldName){
    return JSON.stringify(row[fieldName], replacer)
  }).join(',')
})
csv.unshift(fields.join(',')) // add header row
csv = csv.join('\r\n')
console.log(csv)

Output:

title,description,link,timestamp,image,embed,language,user,user_image,user_link,user_id,geo,source,favicon,type,domain,id
"Apple iPhone 4S Sale Cancelled in Beijing Amid Chaos (Design You Trust)","Advertise here with BSA Apple cancelled its scheduled sale of iPhone 4S in one of its stores in China’s capital Beijing on January 13. Crowds outside the store in the Sanlitun district were waiting on queues overnight. There were incidents of scuffle between shoppers and the store’s security staff when shoppers, hundreds of them, were told that the sales [...]Source : Design You TrustExplore : iPhone, iPhone 4, Phone","http://wik.io/info/US/309201303","1326439500","","","","","","","","","wikio","http://wikio.com/favicon.ico","blogs","wik.io","2388575404943858468"
"Apple to halt sales of iPhone 4S in China (Fame Dubai Blog)","SHANGHAI – Apple Inc said on Friday it will stop selling its latest iPhone in its retail stores in Beijing and Shanghai to ensure the safety of its customers and employees. Go to SourceSource : Fame Dubai BlogExplore : iPhone, iPhone 4, Phone","http://wik.io/info/US/309198933","1326439320","","","","","","","","","wikio","http://wikio.com/favicon.ico","blogs","wik.io","16209851193593872066"

Or use this less dense syntax, with JSON.stringify adding quotes around strings while leaving numbers unquoted:

const items = json3.items
const replacer = (key, value) => value === null ? '' : value // specify how you want to handle null values here
const header = Object.keys(items[0])
const csv = [
  header.join(','), // header row first
  ...items.map(row => header.map(fieldName => JSON.stringify(row[fieldName], replacer)).join(','))
].join('\r\n')

console.log(csv)
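Both snippets above assume an array of objects in `json3.items`. The JSON in the question is a single object, so one way to reuse the same logic is to wrap it in an array first; here `timing` is a hypothetical variable standing in for the parsed response:

```javascript
// The question's payload is one object, not an array,
// so wrap it before applying the same header/row mapping.
const timing = {
  duration: 10425.015000000712,
  redirectCount: 0,
  name: "https://something/login"
};

const items = [timing]; // single object wrapped in an array
const replacer = (key, value) => value === null ? '' : value;
const header = Object.keys(items[0]);
const csv = [
  header.join(','), // header row first
  ...items.map(row => header.map(f => JSON.stringify(row[f], replacer)).join(','))
].join('\r\n');

console.log(csv);
```

This yields one header row and one data row, which is the shape a single navigation-timing entry naturally produces.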

[Comments]:
