【Posted】: 2025-12-19 06:45:11
【Problem Description】:
Original question (see the update in the next section)
I want to download files generated by several jobs into a single folder in an Azure pipeline. This is the schematic of what I am trying to achieve:
jobs:
- job: job1
  pool: {vmImage: 'Ubuntu-16.04'}
  steps:
  - bash: |
      printf "Hello from job1\n" > $(Pipeline.Workspace)/file.1
  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: $(Pipeline.Workspace)/file.1
- job: job2
  pool: {vmImage: 'Ubuntu-16.04'}
  steps:
  - bash: |
      printf "Hello from job2\n" > $(Pipeline.Workspace)/file.2
  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: $(Pipeline.Workspace)/file.2
- job: check_prev_jobs
  dependsOn: [job1, job2]  # i.e. all other jobs
  pool: {vmImage: 'Ubuntu-16.04'}
  steps:
  - bash: |
      mkdir -p $(Pipeline.Workspace)/previous_artifacts
  - task: DownloadPipelineArtifact@2
    inputs:
      source: current
      path: $(Pipeline.Workspace)/previous_artifacts
where the directory $(Pipeline.Workspace)/previous_artifacts contains only file.1 and file.2, without job1 and job2 subdirectories holding /file.1 and /file.2 respectively.
Thanks!
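As far as I understand, DownloadPipelineArtifact@2 with source: current downloads every artifact of the run into its own subfolder named after the artifact, which is exactly the layout I want to avoid. A minimal sketch of one possible variant, assuming the publish steps name their artifacts job1 and job2, would be to download each artifact explicitly into the same path, since a single named download copies the artifact's contents directly into path without the extra folder level:

- job: check_prev_jobs
  dependsOn: [job1, job2]
  pool: {vmImage: 'Ubuntu-16.04'}
  steps:
  - task: DownloadPipelineArtifact@2
    inputs:
      source: current
      artifact: job1   # a single named artifact lands directly in 'path'
      path: $(Pipeline.Workspace)/previous_artifacts
  - task: DownloadPipelineArtifact@2
    inputs:
      source: current
      artifact: job2
      path: $(Pipeline.Workspace)/previous_artifacts

This avoids the per-artifact subfolders, but it requires listing every artifact by name, which does not scale to an arbitrary number of jobs.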
Update
Following @Yujun Ding-MSFT's answer, I created the following azure-pipelines.yml file:
stages:
- stage: generate
  jobs:
  - job: Job_1
    displayName: job1
    pool:
      vmImage: ubuntu-20.04
    variables:
      JOB_NAME: $(Agent.JobName)
      DIR: $(Pipeline.Workspace)/$(JOB_NAME)
    steps:
    - checkout: self
    - bash: |
        mkdir -p $DIR
        cd $DIR
        printf "Time from job1\n" > $JOB_NAME.time
        printf "Hash from job1\n" > $JOB_NAME.hash
        printf "Raw from job1\n" > $JOB_NAME.raw
        printf "Nonsense from job1\n" > $JOB_NAME.nonsense
      displayName: Generate files
    - task: PublishPipelineArtifact@1
      displayName: Publish Pipeline Artifact
      inputs:
        path: $(DIR)
        artifactName: job1
  - job: Job_2
    displayName: job2
    pool:
      vmImage: ubuntu-20.04
    variables:
      JOB_NAME: $(Agent.JobName)
      DIR: $(Pipeline.Workspace)/$(JOB_NAME)
    steps:
    - checkout: self
    - bash: |
        mkdir -p $DIR
        cd $DIR
        printf "Time from job2\n" > $JOB_NAME.time
        printf "Hash from job2\n" > $JOB_NAME.hash
        printf "Raw from job2\n" > $JOB_NAME.raw
        printf "Nonsense from job2\n" > $JOB_NAME.nonsense
      displayName: Generate files
    - task: PublishPipelineArtifact@1
      displayName: Publish Pipeline Artifact copy
      inputs:
        path: $(DIR)
        artifactName: job2
- stage: analyze
  jobs:
  - job: download_display
    displayName: Download and display
    pool:
      vmImage: ubuntu-20.04
    variables:
      DIR: $(Pipeline.Workspace)/artifacts
    steps:
    - checkout: self
    - bash: |
        mkdir -p $DIR
    - task: DownloadPipelineArtifact@2
      displayName: Download Pipeline Artifact
      inputs:
        path: $(DIR)
        patterns: '**/*.time'
    - bash: |
        ls -lR $DIR
        cd $DIR
      displayName: Check dir content
However, as shown in the screenshot below, each .time file still ends up in its own job-related directory:
Unfortunately, it seems to me that what I want may not be achievable with Pipeline Artifacts, as described in this Microsoft discussion. That would be a pity, given that Build Artifacts are currently deprecated.
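One workaround I can sketch (my own assumption, not something confirmed in that discussion) is to accept the per-artifact folders that DownloadPipelineArtifact@2 creates and flatten them with an extra bash step in the analyze job. Since the files are already prefixed with the job name, flattening should not cause name collisions:

    - bash: |
        # Move every downloaded file up into $DIR, then drop the emptied artifact folders.
        find "$DIR" -mindepth 2 -type f -exec mv -t "$DIR" {} +
        find "$DIR" -mindepth 1 -type d -empty -delete
        ls -lR "$DIR"
      displayName: Flatten downloaded artifacts

mv -t is GNU coreutils syntax, which is available on the ubuntu-20.04 hosted image used above.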
【Discussion】:
Tags: azure-devops azure-pipelines azure-pipelines-yaml