【Question Title】: Print multiple row outputs in a CSV file
【Posted】: 2019-10-16 21:43:25
【Question】:

Suppose

A='First'
B='Random'
C='Degree'
D='Largest'

A='Second'
B='Odd'
C='Inclined'
D='Maximum'

A='Third'
B='Even'
C='Steep'
D='Smallest'

A='Fourth'
B='Prime'
C='Gradient'
D='Minimum'

c = ['Group', 'Number', 'Angle', 'Max value']

df = pd.DataFrame([[A, B, C, D]], columns=c)
print (df)

#to csv
df.to_csv('Output.csv', encoding='utf-8', index=False)

Actual output:

It only shows the last output, like this, instead of every value. If there are N sets of values, the CSV must store all N of them, one row below another.

Group  Number Angle    Max Value 
Fourth Prime  Gradient Minimum

Expected output:

【Comments】:

  • You need to append the data to the dataframe in a loop, then save it as CSV
  • You are overwriting your variables A, B, C, D each time
  • @ZarakiKenpachi - Could you suggest a way to append this data to a dataframe and then write it out as CSV?
  • @Akshay K. First you need to put the data into a list or a dict as shown below, then you will see: pandas.pydata.org/pandas-docs/stable/reference/api/…
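The loop-plus-append approach suggested in the comments can be sketched like this (a minimal sketch: the per-group values are assumed to arrive from some loop; collecting rows in a plain list and building the DataFrame once at the end is the idiomatic pattern, since row-wise `DataFrame.append` was deprecated and later removed from pandas):

```python
import pandas as pd

c = ['Group', 'Number', 'Angle', 'Max value']
rows = []  # collect every (A, B, C, D) set instead of overwriting the variables

# stand-in for whatever loop produces A, B, C, D in the real code
for A, B, C, D in [
    ('First', 'Random', 'Degree', 'Largest'),
    ('Second', 'Odd', 'Inclined', 'Maximum'),
    ('Third', 'Even', 'Steep', 'Smallest'),
    ('Fourth', 'Prime', 'Gradient', 'Minimum'),
]:
    rows.append([A, B, C, D])

# build the DataFrame once, from all collected rows
df = pd.DataFrame(rows, columns=c)
df.to_csv('Output.csv', encoding='utf-8', index=False)
```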

Tags: python pandas csv sorting printing


【Solution 1】:

You should follow this procedure:

First, collect the values into lists, one list per row, like this:

A = ['First', 'Random', 'Degree', 'Largest']
B = ['Second', 'Odd', 'Inclined', 'Maximum']
C = ['Third', 'Even', 'Steep', 'Smallest']
D = ['Fourth', 'Prime', 'Gradient', 'Minimum']

Then

c = ['Group', 'Number', 'Angle', 'Max value']
df = pd.DataFrame([A, B, C, D], columns=c)
print (df)

#to csv
df.to_csv('Output.csv', encoding='utf-8', index=False)

Output:

    Group  Number     Angle Max value
0   First  Random    Degree   Largest
1  Second     Odd  Inclined   Maximum
2   Third    Even     Steep  Smallest
3  Fourth   Prime  Gradient   Minimum

【Discussion】:

    【Solution 2】:

    Every time you redefine the A B C D variables, they get overwritten. You should add them to a nested list, or use a loop:

    data = [
        ['First', 'Random', 'Degree', 'Largest'],
        ['Second', 'Odd', 'Inclined', 'Maximum'],
        ['Third', 'Even', 'Steep', 'Smallest'],
        ['Fourth', 'Prime', 'Gradient', 'Minimum']
    ]
    
    df = pd.DataFrame(data, columns=c)
    

    Or...

    A = ['First', 'Random', 'Degree', 'Largest']
    B = ['Second', 'Odd', 'Inclined', 'Maximum']
    C = ['Third', 'Even', 'Steep', 'Smallest']
    D = ['Fourth', 'Prime', 'Gradient', 'Minimum']
    
    df = pd.DataFrame([A, B, C, D], columns=c)
    
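The loop variant mentioned above could look like the following sketch (using `pd.concat`, since `DataFrame.append` was removed in pandas 2.0; one single-row frame is concatenated per iteration):

```python
import pandas as pd

c = ['Group', 'Number', 'Angle', 'Max value']
df = pd.DataFrame(columns=c)  # start with an empty frame

for row in [
    ['First', 'Random', 'Degree', 'Largest'],
    ['Second', 'Odd', 'Inclined', 'Maximum'],
    ['Third', 'Even', 'Steep', 'Smallest'],
    ['Fourth', 'Prime', 'Gradient', 'Minimum'],
]:
    # wrap each row in its own one-row DataFrame and concatenate it
    df = pd.concat([df, pd.DataFrame([row], columns=c)], ignore_index=True)
```

Note that repeated `concat` copies the frame on every iteration, so for many rows the list-then-DataFrame approach above is faster.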

    【Discussion】:

      【Solution 3】:

      You are overwriting your variables A, B, C, D each time, so at the end they only hold the values from your last iteration.

      The way the variables are constructed is a bit counter-intuitive, but the following works for your case:

      A = ['First', 'Second', 'Third', 'Fourth']
      B = ['Random', 'Odd', 'Even', 'Prime']
      C = ['Degree', 'Inclined', 'Steep', 'Gradient']
      D = ['Largest', 'Maximum', 'Smallest', 'Minimum']
      
      c = ['Group', 'Number', 'Angle', 'Max value']
      
      df = pd.DataFrame(data=[A, B, C, D])
      
      df = df.T
      df.columns = c
      print (df)
      
      #to csv
      df.to_csv('Output.csv', encoding='utf-8', index=False)
      

          Group  Number     Angle Max value
      0   First  Random    Degree   Largest
      1  Second     Odd  Inclined   Maximum
      2   Third    Even     Steep  Smallest
      3  Fourth   Prime  Gradient   Minimum
      

      【Discussion】:

        【Solution 4】:

        You can use the following code:

        import pandas as pd
        
        data = [
            ['First', 'Random', 'Degree', 'Largest'],
            ['Second', 'Odd', 'Inclined', 'Maximum'],
            ['Third', 'Even', 'Steep', 'Smallest'],
            ['Fourth', 'Prime', 'Gradient', 'Minimum']
        ]
        
        
        c = ['Group', 'Number', 'Angle', 'Max value']
        
        df = pd.DataFrame(data, columns=c)
        print (df)
        
        df.to_csv('Output.csv', encoding='utf-8', index=False)
        

        The output is:

        And this is the CSV file you get:

        【Discussion】:

          【Solution 5】:

          Try this code, it is simple:

          dictionary = {'Group': ['First', 'Second', 'Third', 'Fourth'],
                        'Number': ['Random', 'Odd', 'Even', 'Prime'],
                        'Angle': ['Degree', 'Inclined', 'Steep', 'Gradient'],
                        'Max value': ['Largest', 'Maximum', 'Smallest', 'Minimum']}
          df = pd.DataFrame(dictionary)
          print (df)
          #to csv
          df.to_csv('Output.csv', encoding='utf-8', index=False)
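If the N rows are produced one at a time, as in the question, another option is to append each row to the CSV file as it is produced, writing the header only for the first row. A minimal sketch, using a hypothetical filename `Output_append.csv`:

```python
import os
import pandas as pd

path = 'Output_append.csv'  # hypothetical filename for this example
if os.path.exists(path):
    os.remove(path)  # start fresh so the example is repeatable

c = ['Group', 'Number', 'Angle', 'Max value']
for A, B, C, D in [
    ('First', 'Random', 'Degree', 'Largest'),
    ('Second', 'Odd', 'Inclined', 'Maximum'),
]:
    row = pd.DataFrame([[A, B, C, D]], columns=c)
    # append mode; write the header only while the file does not exist yet
    row.to_csv(path, mode='a', index=False,
               header=not os.path.exists(path))
```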
          

          【Discussion】:
