Get Entire AWS S3 Bucket Contents with Ansible

I ran into this issue the other day while putting together a simple deploy playbook. For this particular project, we store artifacts in S3, and I needed to grab several jar files from the same bucket. Unfortunately, the Ansible S3 module's get operation does not support recursive copy. While I could have accomplished this with a shell command and the AWS CLI, I preferred to have Ansible perform the get.

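For reference, the shell-based alternative I decided against would look something like the task below. This is only a sketch: the bucket name and paths are placeholders, and it assumes the AWS CLI is installed on the host running the task.

  - name: Download artifacts with the AWS CLI instead of the S3 module
    command: "aws s3 cp s3://bucketname/path/to/artifact/ destinationfolder/ --recursive"
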
The solution I ended up implementing uses list mode to register a variable containing the list of artifact keys, then loops through that list and gets each artifact. Simple, and it works great. The working tasks are below.

tasks:
  # List the object keys under the artifact prefix and register the result
  - name: List s3 objects
    aws_s3:
      bucket: "bucketname"
      prefix: "path/to/artifact/"
      mode: list
    register: s3objects

  # Loop over the returned keys and download each one, using the basename
  # filter to strip the prefix path from the destination filename
  - name: Download s3objects
    aws_s3:
      bucket: "bucketname"
      object: "{{ item }}"
      mode: get
      dest: "destinationfolder/{{ item|basename }}"
    with_items: "{{ s3objects.s3_keys }}"

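If you want to see what list mode actually returns before wiring up the loop, a quick debug task will print the registered keys. This assumes the same register name used in the tasks above.

  - name: Show the keys returned by list mode
    debug:
      var: s3objects.s3_keys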