[Question title]: How to publish and download artifacts from multiple jobs into one location on a pipeline?
[Posted]: 2025-12-19 06:45:11
[Question description]:

Original question (see the update in the next section)

I want to download the files generated by multiple jobs into a single folder on an Azure pipeline. This is the schema I want to accomplish:

jobs:
- job: job1
  pool: {vmImage: 'Ubuntu-16.04'}
  steps:
  - bash: |
      printf "Hello from job1\n" > $(Pipeline.Workspace)/file.1
  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: $(Pipeline.Workspace)/file.1

- job: job2
  pool: {vmImage: 'Ubuntu-16.04'}
  steps:
  - bash: |
      printf "Hello from job2\n" > $(Pipeline.Workspace)/file.2
  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: $(Pipeline.Workspace)/file.2

- job: check_prev_jobs
  dependsOn: [job1, job2]
  pool: {vmImage: 'Ubuntu-16.04'}
  steps:
  - bash: |
      mkdir -p $(Pipeline.Workspace)/previous_artifacts
  - task: DownloadPipelineArtifact@2
    inputs:
      source: current
      path: $(Pipeline.Workspace)/previous_artifacts

where the directory $(Pipeline.Workspace)/previous_artifacts contains only file.1 and file.2, with no job1 and job2 directories containing /file.1 and /file.2 respectively.

Thanks!

Update

Using @Yujun Ding-MSFT's answer, I created the following azure-pipelines.yml file:

stages:
- stage: generate
  jobs:
    - job: Job_1
      displayName: job1
      pool:
        vmImage: ubuntu-20.04
      variables:
          JOB_NAME: $(Agent.JobName)
          DIR: $(Pipeline.Workspace)/$(JOB_NAME)
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
          cd $DIR
          printf "Time from job1\n" > $JOB_NAME.time
          printf "Hash from job1\n" > $JOB_NAME.hash
          printf "Raw from job1\n" > $JOB_NAME.raw
          printf "Nonsense from job1\n" > $JOB_NAME.nonsense
        displayName: Generate files
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact
        inputs:
          path: $(DIR)
          artifactName: job1
      
    - job: Job_2
      displayName: job2
      pool:
        vmImage: ubuntu-20.04
      variables:
          JOB_NAME: $(Agent.JobName)
          DIR: $(Pipeline.Workspace)/$(JOB_NAME)
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
          cd $DIR
          printf "Time from job2\n" > $JOB_NAME.time
          printf "Hash from job2\n" > $JOB_NAME.hash
          printf "Raw from job2\n" > $JOB_NAME.raw
          printf "Nonsense from job2\n" > $JOB_NAME.nonsense
        displayName: Generate files
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact copy
        inputs:
          path: $(DIR)
          artifactName: job2

- stage: analyze
  jobs:
    - job: download_display
      displayName: Download and display
      pool:
        vmImage: ubuntu-20.04
      variables:
          DIR: $(Pipeline.Workspace)/artifacts
      steps:
      - checkout: self
      - bash: |
          mkdir -p $DIR 
      - task: DownloadPipelineArtifact@2
        displayName: Download Pipeline Artifact
        inputs:
          path: $(DIR)
          patterns: '**/*.time'
      - bash: |
          ls -lR $DIR
        displayName: Check dir content

However, as shown in the screenshot below, I still get each .time file in a separate job-specific directory:

Unfortunately, it seems to me that what I want may not be achievable with pipeline artifacts, as described in this Microsoft discussion. That would be a shame, given that build artifacts are currently deprecated.
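For reference, the per-artifact directories can be flattened after the download with a short bash step. The following standalone sketch is not from the linked discussion; the directory layout is an assumption mimicking what DownloadPipelineArtifact@2 produces. It simulates two downloaded artifacts and merges their files into one folder:

```shell
#!/usr/bin/env bash
set -euo pipefail

# DIR stands in for the download target, e.g. $(Pipeline.Workspace)/artifacts.
# For this demo we use a temp dir with fake per-job artifact folders.
DIR="${DIR:-$(mktemp -d)}"
mkdir -p "$DIR/job1" "$DIR/job2"
echo "Time from job1" > "$DIR/job1/job1.time"
echo "Time from job2" > "$DIR/job2/job2.time"

# Move every file out of its per-artifact subdirectory into DIR itself,
# then delete the emptied subdirectories (GNU find/mv, as on the ubuntu agents).
find "$DIR" -mindepth 2 -type f -exec mv -t "$DIR" {} +
find "$DIR" -mindepth 1 -type d -empty -delete

ls "$DIR"   # job1.time and job2.time, with no per-job directories left
```

In a pipeline, the same two `find` commands would run in a `bash` step right after the download task.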

[Question discussion]:

    Tags: azure-devops azure-pipelines azure-pipelines-yaml


    [Solution 1]:

    In your current situation, we suggest adding the keyword artifactName to your PublishPipelineArtifact task. I modified your script and tested it on my side. Hope this helps:

    trigger: none
    
    # pool:
    #   vmImage: ubuntu-latest
    
    jobs:
    - job: Job_1
      displayName: job 1
      pool:
        vmImage: ubuntu-20.04
      steps:
      - checkout: self
        persistCredentials: True
      - task: Bash@3
        displayName: Bash Script
        inputs:
          targetType: inline
          script: 'printf "Hello from job1\n" > $(Pipeline.Workspace)/file.1'
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact
        inputs:
          path: $(Pipeline.Workspace)/file.1
          artifactName: job1
      
    - job: Job_2
      displayName: job2
      pool:
        vmImage: ubuntu-20.04
      steps:
      - checkout: self
      - task: Bash@3
        displayName: Bash Script copy
        inputs:
          targetType: inline
          script: 'printf "Hello from job2\n" > $(Pipeline.Workspace)/file.2'
      - task: PublishPipelineArtifact@1
        displayName: Publish Pipeline Artifact copy
        inputs:
          path: $(Pipeline.Workspace)/file.2
          artifactName: job2
          
    - job: Job_3
      displayName: Agent job
      dependsOn:
      - Job_1
      - Job_2
      pool:
        vmImage: ubuntu-20.04
      steps:
      - checkout: self
      - task: Bash@3
        displayName: Bash Script
        inputs:
          targetType: inline
          script: ' mkdir -p  $(Pipeline.Workspace)/previous_artifacts'
      - task: DownloadPipelineArtifact@2
        displayName: Download Pipeline Artifact
        inputs:
          path: '$(Pipeline.Workspace)/previous_artifacts'
    

    Attached are my test results:

    Update: Because the jobs run in different sessions, we cannot simply copy files or use a publish-artifact step to merge the artifacts of the two jobs. I modified your yaml file; this may help:

    stages:
    - stage: generate
      jobs:
        - job: Job_1
          displayName: job1
          pool:
            vmImage: ubuntu-20.04
          variables:
              JOB_NAME: $(Agent.JobName)
              DIR: $(Pipeline.Workspace)/$(JOB_NAME)
          steps:
          - checkout: self
          - bash: |
              mkdir -p $DIR 
              cd $DIR
              printf "Time from job1\n" > $JOB_NAME.time
              printf "Hash from job1\n" > $JOB_NAME.hash
              printf "Raw from job1\n" > $JOB_NAME.raw
              printf "Nonsense from job1\n" > $JOB_NAME.nonsense
            displayName: Generate files
          - task: PublishPipelineArtifact@1
            displayName: Publish Pipeline Artifact
            inputs:
              path: $(DIR)
              artifactName: job1
    
        - job: Job_2
          displayName: job2
          dependsOn: 
          - Job_1
          pool:
            vmImage: ubuntu-20.04
          variables:
              JOB_NAME: $(Agent.JobName)
              DIR: $(Pipeline.Workspace)/$(JOB_NAME)
          steps:
          - checkout: self
          - bash: |
              mkdir -p $DIR 
              cd $DIR
              printf "Time from job2\n" > $JOB_NAME.time
              printf "Hash from job2\n" > $JOB_NAME.hash
              printf "Raw from job2\n" > $JOB_NAME.raw
              printf "Nonsense from job2\n" > $JOB_NAME.nonsense
            displayName: Generate files
          - task: DownloadPipelineArtifact@2
            displayName: Download Pipeline Artifact
            inputs:
              buildType: 'current'
              artifactName: 'job1'
              targetPath: '$(DIR)'
          - task: PublishPipelineArtifact@1
            displayName: Publish Pipeline Artifact copy
            inputs:
              path: $(DIR)
              artifactName: job2
     
    - stage: analyze
      jobs:
        - job: download_display
          displayName: Download and display
          pool:
            vmImage: ubuntu-20.04
          variables:
              DIR: $(Pipeline.Workspace)/artifacts
          steps:
          - checkout: self
          - bash: |
              mkdir -p $DIR 
          - task: DownloadPipelineArtifact@2
            displayName: Download Pipeline Artifact
            inputs:
              buildType: 'current'
              artifactName: 'job2'
              itemPattern: '**/*.time'
              targetPath: '$(DIR)'
              
          - bash: |
              ls -lR $DIR
            displayName: Check dir content
    

    Attached build result:
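    If a truly flat folder is required in the analyze stage, a small bash step after the download can merge the per-artifact directories. This is only an illustrative sketch, not part of the tested answer above; the globs assume GNU tools on the ubuntu agent and that only *.time files were downloaded:

    ```yaml
          - task: DownloadPipelineArtifact@2
            displayName: Download Pipeline Artifact
            inputs:
              buildType: 'current'
              itemPattern: '**/*.time'
              targetPath: '$(DIR)'
          - bash: |
              # Move each *.time file out of its per-artifact directory,
              # then remove the emptied directories.
              mv "$DIR"/*/*.time "$DIR"/
              rmdir "$DIR"/*/
              ls -l "$DIR"
            displayName: Flatten downloaded artifacts
    ```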

    [Discussion]: