【Question Title】: Uploading Log file to S3 bucket?
【Posted】: 2020-10-06 15:21:27
【Question】:

I have a Python program that runs fine. I decided to add some logging to track progress and record each step in a log file. Since this is my first time using Python's logging library, I've run into a problem. The goal is to record the steps in a file and then upload that file to S3. What am I missing? Please see the code below.

start_time = time.time()
logging.basicConfig(filename='myLogFile.log', format='%(asctime)s %(levelname)s %(name)s: %(message)s', datefmt='%d-%b-%y %H:%M:%S', level=logging.INFO)
logger = logging.getLogger("GlueJob")
logging.info("Program started ....")
logger.setLevel(logging.INFO)
log_stringio = io.StringIO()
handler = logging.StreamHandler(log_stringio)
logger.addHandler(handler)

# Do something .................
logging.info("List has all objects from S3 ... good")

# Do something ........................
logging.info("All created lists are populated with elements from S3 ... good")

# Do something ...........................
logging.info("Dictionary and Dataframe has been created ... good")

# Do something .......................
logging.info("Convert dataframe to csv ... good")

# here is the problem ....... Logfile is not uploading to S3 ### What am I missing??
s3.Bucket('my-bucket').upload_file(Filename='myLogFile.log', Key='/Asset_Filename_Database/folder1/folder2/myLogFile.log')

print("Process Finished --- %s seconds ---" % (time.time() - start_time))

Thanks!!!

【Discussion】:

  • Note: I'm using the Boto3 resource.
  • What error are you getting?
  • No error... the upload just doesn't happen...

标签: python-3.x amazon-s3 logging aws-glue


【Solution 1】:

When you use a leading `/` in the key name, as in `Key='/Asset_Filename_Database/`, it creates a nameless folder; use `Key='Asset_Filename_Database` instead.
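S3 keys are plain strings; the console only derives "folders" by splitting the key on `/`, so a leading slash produces an empty first path segment, which appears as a nameless folder. A quick sketch of that split, without touching S3:

```python
# The S3 console groups objects into "folders" by splitting the key on '/'.
key_with_slash = '/Asset_Filename_Database/folder1/folder2/myLogFile.log'
key_without_slash = 'Asset_Filename_Database/folder1/folder2/myLogFile.log'

print(key_with_slash.split('/'))     # first segment is '' -- the nameless folder
print(key_without_slash.split('/'))  # first segment is 'Asset_Filename_Database'
```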


I tried running this example with all three interfaces (object, client, and bucket), and it works for me.

import logging
import io
import time
import boto3

start_time = time.time()
# Root logger writes to myLogFile.log
logging.basicConfig(filename='myLogFile.log', format='%(asctime)s %(levelname)s %(name)s: %(message)s', datefmt='%d-%b-%y %H:%M:%S', level=logging.INFO)
logger = logging.getLogger("GlueJob")
logging.info("Program started ....")
logger.setLevel(logging.INFO)
# Extra handler that also captures "GlueJob" records in memory
log_stringio = io.StringIO()
handler = logging.StreamHandler(log_stringio)
logger.addHandler(handler)

logging.info("List has all objects from S3 ... good")
logging.info("All created lists are populated with elements from S3 ... good")
logging.info("Dictionary and Dataframe has been created ... good")
logging.info("Convert dataframe to csv ... good")

s3 = boto3.resource('s3')
s3_client = boto3.client('s3')

# 1) client interface
s3_client.upload_file('myLogFile.log', 'test-kayd-bucket', 'client/myLogFile.log')
# 2) Object interface
s3.Object('test-kayd-bucket', 'object/myLogFile.log').upload_file('myLogFile.log')
# 3) Bucket interface -- note the key has no leading '/'
s3.Bucket('test-kayd-bucket').upload_file(Filename='myLogFile.log', Key='bucket/myLogFile.log')

print("Process Finished --- %s seconds ---" % (time.time() - start_time))
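Since the question already captures the "GlueJob" records in a `StringIO` buffer, the log text can also be pushed to S3 straight from memory with `put_object`, with no local file at all. A minimal sketch; the bucket and key names here are placeholders, not from the original post:

```python
import io
import logging

def upload_log_buffer(bucket: str, key: str, buffer: io.StringIO) -> None:
    """Upload the captured log text directly from memory (no temp file)."""
    import boto3  # imported here so the logging part runs without AWS configured
    boto3.client('s3').put_object(Bucket=bucket, Key=key,
                                  Body=buffer.getvalue().encode('utf-8'))

# Capture "GlueJob" records in an in-memory buffer, as in the question
logger = logging.getLogger("GlueJob")
logger.setLevel(logging.INFO)
log_stringio = io.StringIO()
logger.addHandler(logging.StreamHandler(log_stringio))

logger.info("Convert dataframe to csv ... good")

# Then, for example:
# upload_log_buffer('my-bucket', 'Asset_Filename_Database/myLogFile.log', log_stringio)
```

Note that `put_object` uploads in a single request, so this is suitable for log files well under the multipart threshold; `upload_file` handles large files automatically.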

【Discussion】:
