【Question Title】: In Ansible, how do I synchronize two folders on the same remote machine?
【Posted】: 2016-12-12 09:27:15
【Question Description】:

I have the following simple task:

Copy everything in folder A to folder B. Since I have many hosts in one group, I use the following YAML task definition:

- name: Sync /etc/spark/conf to $SPARK_HOME/conf
  synchronize: src=/etc/spark/conf dest={{spark_home}}/conf
  delegate_to: "{{item}}"
  with_items: "{{play_hosts}}"
  tags: spark

However, running ansible-playbook gives me the following errors:

TASK [cloudera : Sync /etc/spark/conf to $SPARK_HOME/conf] *********************
failed: [52.53.220.119 -> 52.53.200.0] (item=52.53.200.0) => {"cmd": "/usr/bin/rsync --delay-updates -F --compress --archive --rsh 'ssh -i /home/peng/.ssh/saphana.pem -S none -o StrictHostKeyChecking=no' --rsync-path=\"sudo rsync\" --out-format='<<CHANGED>>%i %n%L' \"/etc/spark/conf\" \"52.53.220.119:/opt/spark/spark-1.6.2-bin-hadoop2.4/conf\"", "failed": true, "item": "52.53.200.0", "msg": "Warning: Identity file /home/peng/.ssh/saphana.pem not accessible: No such file or directory.\nPermission denied (publickey).\r\nrsync: connection unexpectedly closed (0 bytes received so far) [sender]\nrsync error: unexplained error (code 255) at io.c(226) [sender=3.1.0]\n", "rc": 255}
failed: [52.53.200.193 -> 52.53.200.0] (item=52.53.200.0) => {"cmd": "/usr/bin/rsync --delay-updates -F --compress --archive --rsh 'ssh -i /home/peng/.ssh/saphana.pem -S none -o StrictHostKeyChecking=no' --rsync-path=\"sudo rsync\" --out-format='<<CHANGED>>%i %n%L' \"/etc/spark/conf\" \"52.53.200.193:/opt/spark/spark-1.6.2-bin-hadoop2.4/conf\"", "failed": true, "item": "52.53.200.0", "msg": "Warning: Identity file /home/peng/.ssh/saphana.pem not accessible: No such file or directory.\nPermission denied (publickey).\r\nrsync: connection unexpectedly closed (0 bytes received so far) [sender]\nrsync error: unexplained error (code 255) at io.c(226) [sender=3.1.0]\n", "rc": 255}
ok: [52.53.200.0 -> 52.53.200.0] => (item=52.53.200.0)
ok: [52.53.220.119 -> 52.53.220.119] => (item=52.53.220.119)
failed: [52.53.200.193 -> 52.53.220.119] (item=52.53.220.119) => {"cmd": "/usr/bin/rsync --delay-updates -F --compress --archive --rsh 'ssh -i /home/peng/.ssh/saphana.pem -S none -o StrictHostKeyChecking=no' --rsync-path=\"sudo rsync\" --out-format='<<CHANGED>>%i %n%L' \"/etc/spark/conf\" \"52.53.200.193:/opt/spark/spark-1.6.2-bin-hadoop2.4/conf\"", "failed": true, "item": "52.53.220.119", "msg": "Warning: Identity file /home/peng/.ssh/saphana.pem not accessible: No such file or directory.\nPermission denied (publickey).\r\nrsync: connection unexpectedly closed (0 bytes received so far) [sender]\nrsync error: unexplained error (code 255) at io.c(226) [sender=3.1.0]\n", "rc": 255}
failed: [52.53.200.0 -> 52.53.220.119] (item=52.53.220.119) => {"cmd": "/usr/bin/rsync --delay-updates -F --compress --archive --rsh 'ssh -i /home/peng/.ssh/saphana.pem -S none -o StrictHostKeyChecking=no' --rsync-path=\"sudo rsync\" --out-format='<<CHANGED>>%i %n%L' \"/etc/spark/conf\" \"52.53.200.0:/opt/spark/spark-1.6.2-bin-hadoop2.4/conf\"", "failed": true, "item": "52.53.220.119", "msg": "Warning: Identity file /home/peng/.ssh/saphana.pem not accessible: No such file or directory.\nPermission denied (publickey).\r\nrsync: connection unexpectedly closed (0 bytes received so far) [sender]\nrsync error: unexplained error (code 255) at io.c(226) [sender=3.1.0]\n", "rc": 255}
ok: [52.53.200.193 -> 52.53.200.193] => (item=52.53.200.193)
failed: [52.53.220.119 -> 52.53.200.193] (item=52.53.200.193) => {"cmd": "/usr/bin/rsync --delay-updates -F --compress --archive --rsh 'ssh -i /home/peng/.ssh/saphana.pem -S none -o StrictHostKeyChecking=no' --rsync-path=\"sudo rsync\" --out-format='<<CHANGED>>%i %n%L' \"/etc/spark/conf\" \"52.53.220.119:/opt/spark/spark-1.6.2-bin-hadoop2.4/conf\"", "failed": true, "item": "52.53.200.193", "msg": "Warning: Identity file /home/peng/.ssh/saphana.pem not accessible: No such file or directory.\nPermission denied (publickey).\r\nrsync: connection unexpectedly closed (0 bytes received so far) [sender]\nrsync error: error in rsync protocol data stream (code 12) at io.c(226) [sender=3.1.0]\n", "rc": 12}
failed: [52.53.200.0 -> 52.53.200.193] (item=52.53.200.193) => {"cmd": "/usr/bin/rsync --delay-updates -F --compress --archive --rsh 'ssh -i /home/peng/.ssh/saphana.pem -S none -o StrictHostKeyChecking=no' --rsync-path=\"sudo rsync\" --out-format='<<CHANGED>>%i %n%L' \"/etc/spark/conf\" \"52.53.200.0:/opt/spark/spark-1.6.2-bin-hadoop2.4/conf\"", "failed": true, "item": "52.53.200.193", "msg": "Warning: Identity file /home/peng/.ssh/saphana.pem not accessible: No such file or directory.\nPermission denied (publickey).\r\nrsync: connection unexpectedly closed (0 bytes received so far) [sender]\nrsync error: unexplained error (code 255) at io.c(226) [sender=3.1.0]\n", "rc": 255}

Apparently Ansible is trying to build every permutation pair between my 3 hosts and sync between each pair (so rsync runs 9 times). How can I avoid this and tell Ansible to rsync only locally, on each host itself?
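One way to see where the fan-out comes from (a sketch I am adding, not part of the original post): looping over play_hosts with with_items runs the task once per (current host, item) pair, so a plain debug task with the same loop prints exactly the combinations that synchronize tried to rsync.

- name: Show every (host, item) pair the loop produces
  debug:
    msg: "running on {{ inventory_hostname }}, delegated to {{ item }}"
  with_items: "{{ play_hosts }}"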

Update: I have changed the task definition to use delegate.host:

- name: Sync /etc/spark/conf to $SPARK_HOME/conf
  synchronize: src=/etc/spark/conf dest={{spark_home}}/conf
  delegate_to: delegate.host
  tags: spark

But it is clearly not interpreted correctly by the Ansible engine; the debug log shows that it is not substituted with a host IP address:

ESTABLISH SSH CONNECTION FOR USER:

SSH: EXEC ssh -C -q -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o ConnectTimeout=10 -o ControlPath=/home/peng/.ansible/cp/ansible-ssh-%h-%p-%r delegate.host '/bin/sh -c '"'"'( umask 77 && mkdir -p "echo $HOME/.ansible/tmp/ansible-tmp-1470667606.38-157157938048153" && echo ansible-tmp-1470667606.38-157157938048153="echo $HOME/.ansible/tmp/ansible-tmp-1470667606.38-157157938048153" ) && sleep 0'"'"''

This looks like a deprecated feature. I am using Ansible 2.1.0.0.

【Question Discussion】:

  • It looks like you need to replace delegate.host with a real hostname or IP address. I don't know exactly why it works in my case, but it is clearly related to the scp_if_ssh = True setting in ansible.cfg (sketched below); without it I get errors as well.
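For reference, a minimal ansible.cfg sketch of where that setting lives (my addition, not from the commenter; the value shown is the one the comment mentions):

# scp_if_ssh makes the ssh connection plugin transfer files with scp instead of sftp
[ssh_connection]
scp_if_ssh = True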

Tags: ansible rsync ansible-playbook ansible-2.x


【Solution 1】:

Solved:

- name: Sync /etc/spark/conf to $SPARK_HOME/conf
  synchronize: src=/etc/spark/conf dest={{spark_home}} copy_links=true
  delegate_to: "{{ inventory_hostname }}"
  tags: spark

delegate.host has presumably been removed in favour of this variable.
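The same fix in YAML dict syntax, purely as a re-rendering of the task above (no options changed), for anyone who prefers that form:

- name: Sync /etc/spark/conf to $SPARK_HOME/conf
  synchronize:
    src: /etc/spark/conf
    dest: "{{ spark_home }}"
    copy_links: yes
  # delegating to the current host makes src and dest the same machine
  delegate_to: "{{ inventory_hostname }}"
  tags: spark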

【Discussion】:

  • No, delegate.host is just a placeholder string in the documentation, like put.your.host.name.here. With the scp_if_ssh flag set, whatever I put there ran against a single target host.
  • I see, so it was never a variable in the first place. That is why you deleted your first answer. I think the documentation should state this more clearly.
  • @tribbloid: what exactly is "{{ inventory_hostname }}"; is it the server's address, or a group defined in your hosts file?
  • @firasKoubaa It is the name of the current host the play is running on, as it appears in the inventory, which makes the src and dest systems the same machine. If you specifically need the fully qualified domain name, ansible_fqdn is a good choice (a quick sketch comparing the two follows below).
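A minimal sketch (my addition) that prints both values per host, assuming gather_facts is enabled so that ansible_fqdn is populated:

- name: Compare inventory_hostname and ansible_fqdn
  debug:
    msg: "inventory_hostname={{ inventory_hostname }}  ansible_fqdn={{ ansible_fqdn }}"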