MuleSoft Technical Guides

Integrate Amazon S3 with Mule

By the MuleSoft Integration Team
February 24, 2022

Amazon Web Services’ S3 stands for “Simple Storage Service”. It is a scalable cloud storage solution offered to developers over the Internet.

Amazon S3 uses the concepts of Buckets & Objects to store data. It offers an easy, user-friendly, fast & on-demand way to store & retrieve data online.

MuleSoft provides an Amazon S3 connector that allows us to perform various operations on Objects & Buckets.

In this blog, we will walk through the purpose & working of the below S3 connector operations:

  • Create Bucket
  • Delete Bucket
  • Create Object
  • Get Object
  • Delete Object

Prerequisite Steps to be followed:

  • There are 2 ways to add the Amazon S3 connector to your project:
  1. Add the AWS S3 connector module from Exchange to the Studio palette.
  2. Or go to Anypoint Platform > Exchange > All assets > Provided by MuleSoft, then search for & select the “Amazon S3 Connector” for Mule 4. On the connector’s page, click “Dependency Snippets”, copy the snippet under Maven & paste it inside your pom file under the <dependencies> tag.
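The copied snippet has the following general shape. The coordinates & version below are illustrative; always copy the exact snippet shown on Exchange for your connector version.

```xml
<!-- Illustrative only: copy the exact snippet from Exchange -->
<dependency>
    <groupId>com.mulesoft.connectors</groupId>
    <artifactId>mule-amazon-s3-connector</artifactId>
    <version>x.y.z</version> <!-- use the version shown on Exchange -->
    <classifier>mule-plugin</classifier>
</dependency>
```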

  • Make sure you have an account created with Amazon Web Services. You can refer to this link to sign up for a free account.
  • Note down the Security Credentials named “Access Key ID” & “Secret Access Key” from your AWS account. To create a new set of credentials, click on the “Create New Access Key” button under the “Security Credentials” option.
  • Create a global connector configuration for “Amazon S3 Configuration”, which will be common to & used in the implementation of all the S3 connectors mentioned below.

For the “Basic” Connection, we will need to provide the details for “Access Key” & “Secret Key” (refer to the previous point).
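In XML, the resulting global configuration looks roughly like the sketch below. Element & attribute names may vary slightly between connector versions, and the property placeholders are assumptions standing in for your actual credentials.

```xml
<!-- Sketch of a global Amazon S3 configuration; property
     placeholders keep the credentials out of the code -->
<s3:config name="Amazon_S3_Configuration">
    <s3:connection accessKey="${aws.accessKey}"
                   secretKey="${aws.secretKey}"/>
</s3:config>
```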

  1. Create Bucket

A Bucket is a container used to store Objects. Whenever an Object is added to a Bucket with versioning enabled, a distinct property called “Version ID” is assigned to the Object internally.

We have created a simple flow using an HTTP Listener (to receive the POST call), Loggers (to log the required information to the console) & most importantly, the “Create Bucket” connector from the Amazon S3 module.

Select the “Amazon_S3_Configuration” set up in global-config for the “Connector Configuration” field.

For the “Bucket name” field, we can enter the value directly or set it dynamically. In our case, we are reading the value dynamically from the POST call’s payload.

Set the name of the bucket in the payload while hitting the URL.

In our case a bucket named “for-blog” is created successfully.
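As a rough sketch, the flow’s XML could look like the following. The listener path & configuration names are assumptions; the key point is that the bucket name is read from the request payload.

```xml
<flow name="create-bucket-flow">
    <!-- POST { "bucketName": "for-blog" } to this endpoint -->
    <http:listener config-ref="HTTP_Listener_config"
                   path="/bucket" allowedMethods="POST"/>
    <logger level="INFO" message='#["Creating bucket: " ++ payload.bucketName]'/>
    <!-- Bucket name is read dynamically from the POST payload -->
    <s3:create-bucket config-ref="Amazon_S3_Configuration"
                      bucketName="#[payload.bucketName]"/>
    <logger level="INFO" message="Bucket created successfully"/>
</flow>
```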

2. Delete Bucket

Now we will make use of the “Delete Bucket” connector from the S3 module to delete the Bucket created in the previous section of the blog.

Here also, we will set the value of “Bucket name” dynamically using the payload.

On triggering the URL with the name of the Bucket to be deleted, we see that the “for-blog” named bucket is deleted from the S3 storage.
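A sketch of the corresponding flow, under the same assumptions as above (listener path & config names are illustrative):

```xml
<flow name="delete-bucket-flow">
    <http:listener config-ref="HTTP_Listener_config"
                   path="/bucket" allowedMethods="DELETE"/>
    <!-- Bucket name again comes from the request payload -->
    <s3:delete-bucket config-ref="Amazon_S3_Configuration"
                      bucketName="#[payload.bucketName]"/>
    <logger level="INFO" message="Bucket deleted"/>
</flow>
```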

3. Create Object

An S3 Object contains the data we want to store in a Bucket. The Object is labeled with the value of the “Key” passed while creating it.

For “Create Object” connector, we will need to provide the values of the below fields:

  • Bucket name: We are using an already created Bucket named “poc-aws-api” directly.
  • Key: Here we enter the name of the Object to be created, dynamically, using the payload.
  • Object content: We will pass the below data, in JSON format, as part of the payload.


    {
      "objectName": "newobject",
      "description": "description of the newobject comes here !"
    }


After successfully triggering the URL, we can see that an Object named “newobject” is created under the “poc-aws-api” bucket.
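The flow can be sketched roughly as below. The key is taken from the payload field, and (as an assumption about the connector’s default behavior) the object content is taken from the message payload itself; check the connector reference for your version.

```xml
<flow name="create-object-flow">
    <http:listener config-ref="HTTP_Listener_config"
                   path="/object" allowedMethods="POST"/>
    <!-- Key comes from the payload; the object content is
         assumed to default to the message payload itself -->
    <s3:create-object config-ref="Amazon_S3_Configuration"
                      bucketName="poc-aws-api"
                      key="#[payload.objectName]"/>
</flow>
```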

4. Get Object

In this scenario, we will use the “Get Object” S3 connector to fetch the data stored in the Object.

We are also using the File connector’s “Write” operation to save the fetched data to a file at a local location.

In “Get object” connector’s configuration, we will mention the below details:

  • Bucket name: The name of the Bucket where the Object is stored.
  • Key: The name of the Object to be fetched, passed via a variable named “objectName”.

On execution of the flow, the details of the object are successfully fetched & downloaded locally as a JSON file.
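A rough sketch of this flow is shown below. The query-parameter source for the variable, the listener path & the File configuration name are all assumptions for illustration.

```xml
<flow name="get-object-flow">
    <http:listener config-ref="HTTP_Listener_config"
                   path="/object" allowedMethods="GET"/>
    <!-- Store the requested object name in a variable -->
    <set-variable variableName="objectName"
                  value="#[attributes.queryParams.objectName]"/>
    <s3:get-object config-ref="Amazon_S3_Configuration"
                   bucketName="poc-aws-api"
                   key="#[vars.objectName]"/>
    <!-- Write the fetched content to a local JSON file -->
    <file:write config-ref="File_Config"
                path="#[vars.objectName ++ '.json']"/>
</flow>
```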

5. Delete Object

In this section, we will use the “Delete object” S3 connector to delete the Object created in the 3rd section of the blog.

In the “Delete object” connector’s configuration, we will mention the below details:

  • Bucket name: The name of the Bucket where the Object is stored.
  • Key: The name of the Object to be deleted, passed as part of the payload.

After triggering the URL, we can see the mentioned Object is deleted from the Bucket.
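The flow mirrors the earlier ones; a sketch under the same illustrative assumptions:

```xml
<flow name="delete-object-flow">
    <http:listener config-ref="HTTP_Listener_config"
                   path="/object" allowedMethods="DELETE"/>
    <!-- Key of the object to delete comes from the payload -->
    <s3:delete-object config-ref="Amazon_S3_Configuration"
                      bucketName="poc-aws-api"
                      key="#[payload.objectName]"/>
    <logger level="INFO" message="Object deleted"/>
</flow>
```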

Best Practice:

For the sake of simplicity, we have mentioned the configuration details directly.

However, it is always recommended to externalize these details into a properties file & encrypt the sensitive data.
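For example, with the Mule Secure Configuration Properties module, the credentials could be kept in an encrypted file. The file & key names below are illustrative assumptions.

```xml
<!-- Sketch: encrypted properties file holding the AWS credentials;
     the decryption key itself is supplied at deploy time -->
<secure-properties:config name="Secure_Properties"
                          file="secure-config.yaml"
                          key="${encryption.key}"/>
```

Encrypted values in the file can then be referenced in the S3 configuration using the `${secure::...}` prefix, e.g. `${secure::aws.accessKey}`.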


Click here to check out/download the sample code for AWS S3 integration in Anypoint Platform.
