Get Entire AWS S3 Bucket Contents with Ansible

I ran into this issue the other day while putting together a simple deploy playbook. For this particular project, we store artifacts in S3, and I needed to grab several JAR files from the same bucket. Unfortunately, the get mode of the Ansible aws_s3 module does not support recursive copy. While I could have accomplished this with a shell command and the AWS CLI, I preferred to have Ansible perform the get.

The solution I ended up implementing uses list mode to register a variable holding the list of artifact keys, then loops through that list and gets each artifact. Simple, and it works great. The working tasks are below.

tasks:
  # List every object key under the artifact prefix and register
  # the result for use in the next task.
  - name: List s3 objects
    aws_s3:
      bucket: "bucketname"
      prefix: "path/to/artifact/"
      mode: list
    register: s3objects

  # Loop over the returned keys and download each one; the basename
  # filter strips the prefix so only the file name is used locally.
  - name: Download s3objects
    aws_s3:
      bucket: "bucketname"
      object: "{{ item }}"
      mode: get
      dest: "destinationfolder/{{ item|basename }}"
    with_items: "{{ s3objects.s3_keys }}"
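
One caveat: if any of the objects were created as "folders" (for example through the S3 console), the bucket will contain zero-byte directory-marker keys ending in a slash, and list mode returns those alongside the real objects. Passing such a key to mode: get fails, since there is no file content to download. Here is a minimal sketch of one way to guard against that, assuming any key ending in "/" is a directory marker (bucket and destination names are the same placeholders as above):

  - name: Download s3objects
    aws_s3:
      bucket: "bucketname"
      object: "{{ item }}"
      mode: get
      dest: "destinationfolder/{{ item|basename }}"
    with_items: "{{ s3objects.s3_keys }}"
    # Skip zero-byte directory-marker keys such as "path/to/artifact/",
    # which have no file content to fetch.
    when: not item.endswith('/')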

One thought on “Get Entire AWS S3 Bucket Contents with Ansible”

  1. Hi Tyler

    How were you able to ignore the "prefix1" key?

    s3_keys returns:
    ['prefix1/', 'prefix1/key1', 'prefix1/key2']

    For example, I am getting the error below when
    prefix = partners/stream/Testkit/ and dest = /opt/partners/1/

    failed
    "item": "partners/stream/Testkit/",
    "msg": "attempted to take checksum of directory: /opt/partners/1/"

    "msg": "LIST operation complete",
    "s3_keys": [
    "partners/stream/Testkit/",
    "partners/stream/Testkit/114TestData.csv",
    "partners/stream/Testkit/114testdata.sh",
    "partners/stream/Testkit/220TestData.csv",
    "partners/stream/Testkit/220testdata.sh",
    "partners/stream/Testkit/adept-testkit-3.0.100.jar"
    ]
    }
