MuleSoft – Splunk Integration

Published on February 10, 2021
Author: MuleSoft Integration Team

Pushing Mule application logs to Splunk

Logging is an essential part of monitoring, troubleshooting issues and production errors, and visualizing data. Logging must be consistent and reliable so that the information can be used to discover relevant data. Popular external logging tools include ELK and Splunk.
MuleSoft provides its own mechanism for storing application logs, but CloudHub limits log retention to 100 MB or 30 days of logs. This blog walks through MuleSoft–Splunk integration.

For a robust logging setup, it is essential to have an external log analytics tool to further monitor the application.
In this post we will use Splunk as the external logging tool and integrate it with MuleSoft using the Log4j2 HTTP appender to send Mule application logs to Splunk. Logging to Splunk can be enabled both on CloudHub and on-premise.
First things first, we need to create a token in Splunk.

1. Go to Settings > Data > Data Inputs

2. Go to Settings > Data > Data Inputs > New Token

3. After clicking New Token, click HTTP Event Collector and add log4j as the source, since we will be sending logs from Log4j to Splunk.

4. Complete all the steps, and you will get the token value. This token will be used to connect to Splunk from the log4j file in the MuleSoft application. The next steps involve configuring the HTTP appender in the log4j file to connect to Splunk.

5. Once you have created the token, make sure to enable it under Global Settings. You can also enable SSL for this token and set the port; by default, the port is 8088.

6. Add the following snippet to log4j2.xml in the Mule application. If the URL is HTTPS, SSL can also be configured (see the HTTPS variant after the snippet).

<Http name="Splunk" url="http://host:port/services/collector/raw">
    <Property name="Authorization" value="Splunk {{Token-Value}}"/>
    <PatternLayout pattern="%m%n"/>
    <!--<TrustStore location="<path to truststore>" password="<truststore-password>"/>-->
</Http>

Then add this reference inside the AsyncRoot element:

    <AppenderRef ref="Splunk"/>
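If the HTTP Event Collector endpoint is served over HTTPS (for example with a self-signed certificate), the appender URL and trust material change accordingly. A minimal sketch, assuming Log4j2's SSL support for the Http appender; the host, truststore path, and password below are placeholders:

<!-- Hypothetical HTTPS variant of the Splunk appender; replace host, truststore path, and password with real values -->
<Http name="Splunk" url="https://host:8088/services/collector/raw">
    <Property name="Authorization" value="Splunk {{Token-Value}}"/>
    <PatternLayout pattern="%m%n"/>
    <SSL>
        <!-- Truststore containing the Splunk HEC certificate chain -->
        <TrustStore location="/path/to/truststore.jks" password="truststore-password"/>
    </SSL>
</Http>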

7. For better log analysis and monitoring, it is recommended to use JSON logs. For this, we can either use a JSON layout or log the information in JSON format from the application itself; a sketch of the latter is shown below.
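The original snippet is not reproduced here; as an illustration, one way to emit JSON-formatted log entries from a Mule flow is to build the message with DataWeave inside a Logger component. The flow name and fields below are placeholders:

<!-- Hypothetical Logger component emitting a JSON log entry -->
<logger level="INFO" doc:name="Order event (JSON)"
        message='#[output application/json --- {
            correlationId: correlationId,
            flowName: "process-order-flow",
            event: "Order received",
            timestamp: now()
        }]'/>

Alternatively, the PatternLayout in the Splunk appender can be swapped for Log4j2's JsonLayout so that every log line is emitted as JSON.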

8. Once you start the application, you can see logs flowing into Splunk. To check, click the “Search & Reporting” option in Splunk.

9. After that, click “Data Summary”, open the “Sourcetypes” tab, search for log4j (the source set in step 3), and select log4j.

10. On selecting it, you can see the logs being pushed to Splunk.

Pushing MuleSoft Anypoint Platform logs to Splunk

Sending MuleSoft Anypoint Platform (CloudHub) logs requires a slightly different process. CloudHub uses its own default logging mechanism, so to use our own logging we need to make specific changes to the log4j file to override the default log4j configuration for CloudHub. Below are the steps to be followed.

1. Raise a support ticket with MuleSoft to disable CloudHub application logs. Once that is done, you will see a “Disable Application Logs” option at runtime while deploying the application.

2. The next step is to add the CloudHub log appenders to the log4j2.xml file in the Mule application.

An example log4j2 file with a custom CloudHub appender is shown below.

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="INFO" name="cloudhub" packages="com.mulesoft.ch.logging.appender">

    <Appenders>
        <RollingFile name="FILE"
                     fileName="/opt/mule/mule-CURRENT/logs/mule-${sys:domain}.log"
                     filePattern="/opt/mule/mule-CURRENT/logs/mule-${sys:domain}-%i.log">
            <PatternLayout pattern="[%d{MM-dd HH:mm:ss.SSS}] %-5p %c{1} [%t]: %m%n"/>
            <DefaultRolloverStrategy max="10"/>
            <Policies>
                <SizeBasedTriggeringPolicy size="10 MB"/>
            </Policies>
        </RollingFile>

        <Log4J2CloudhubLogAppender name="CLOUDHUB"
                                   addressProvider="com.mulesoft.ch.logging.DefaultAggregatorAddressProvider"
                                   applicationContext="com.mulesoft.ch.logging.DefaultApplicationContext"
                                   appendRetryIntervalMs="${sys:logging.appendRetryInterval}"
                                   appendMaxAttempts="${sys:logging.appendMaxAttempts}"
                                   batchSendIntervalMs="${sys:logging.batchSendInterval}"
                                   batchMaxRecords="${sys:logging.batchMaxRecords}"
                                   memBufferMaxSize="${sys:logging.memBufferMaxSize}"
                                   journalMaxWriteBatchSize="${sys:logging.journalMaxBatchSize}"
                                   journalMaxFileSize="${sys:logging.journalMaxFileSize}"
                                   clientMaxPacketSize="${sys:logging.clientMaxPacketSize}"
                                   clientConnectTimeoutMs="${sys:logging.clientConnectTimeout}"
                                   clientSocketTimeoutMs="${sys:logging.clientSocketTimeout}"
                                   serverAddressPollIntervalMs="${sys:logging.serverAddressPollInterval}"
                                   serverHeartbeatSendIntervalMs="${sys:logging.serverHeartbeatSendIntervalMs}"
                                   statisticsPrintIntervalMs="${sys:logging.statisticsPrintIntervalMs}">
            <PatternLayout pattern="[%d{MM-dd HH:mm:ss}] %-5p %c{1} [%t]: %m%n"/>
        </Log4J2CloudhubLogAppender>

        <Http name="Splunk" url="http://host:port/services/collector/raw">
            <Property name="Authorization" value="Splunk {{Token-Value}}"/>
            <PatternLayout pattern="%m%n"/>
            <!--<TrustStore location="<path to truststore>" password="<truststore-password>"/>-->
        </Http>
    </Appenders>

    <Loggers>
        <AsyncRoot level="INFO">
            <AppenderRef ref="FILE"/>
            <AppenderRef ref="CLOUDHUB"/>
            <AppenderRef ref="Splunk"/>
        </AsyncRoot>
        <AsyncLogger name="com.gigaspaces" level="ERROR"/>
        <AsyncLogger name="com.j_spaces" level="ERROR"/>
        <AsyncLogger name="com.sun.jini" level="ERROR"/>
        <AsyncLogger name="net.jini" level="ERROR"/>
        <AsyncLogger name="org.apache" level="WARN"/>
        <AsyncLogger name="org.apache.cxf" level="WARN"/>
        <AsyncLogger name="org.springframework.beans.factory" level="WARN"/>
        <AsyncLogger name="org.mule" level="INFO"/>
        <AsyncLogger name="com.mulesoft" level="INFO"/>
        <AsyncLogger name="org.jetel" level="WARN"/>
        <AsyncLogger name="Tracking" level="WARN"/>
    </Loggers>

</Configuration>

3. Once you deploy the application to MuleSoft Anypoint Platform (CloudHub) and disable CloudHub logs, it will use the log4j2 configuration we created.

Feel free to drop your questions about MuleSoft Splunk integration in the comments section.
