To Analyze Business and API Data Using ELK

This topic describes how to use the Elastic Stack (ELK) to analyze the business data and API analytics generated by Anypoint Platform Private Cloud Edition. It also describes how to use Filebeat to process Anypoint Platform log files and insert them into an Elasticsearch database. After inserting the files into Elasticsearch, you can analyze them with Kibana.

The following image shows an example of how API data appears in Kibana.

[Image: Kibana dashboard displaying API data]

Requirements

To export external analytics data to ELK, you must have the following software:

  • Mule Runtime, version 3.8.4 or later

  • Runtime Manager agent, version 1.7.0 or later

  • Elasticsearch, version 5.6.2

  • Filebeat, version 5.6.2

  • Kibana, version 5.6.2

Prerequisites

  • Ensure that you have installed and configured your Mule runtime instances.

  • Ensure that you have registered these instances with Runtime Manager.

Edit the wrapper.conf Properties File

In your Mule runtime installation, edit the following properties in the conf/wrapper.conf file.

  1. Set the analytics_enabled property to true.

    wrapper.java.additional.<n>=-Danypoint.platform.analytics_enabled=true
  2. Remove the URI value from the analytics_base_uri property.

    wrapper.java.additional.<n>=-Danypoint.platform.analytics_base_uri=
  3. Set the on_prem property to true.

    wrapper.java.additional.<n>=-Danypoint.platform.on_prem=true
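
Taken together, the three properties look like the following sketch. The slot numbers 4 through 6 are assumptions; use the next unused wrapper.java.additional.<n> indexes in your own conf/wrapper.conf.

    # Hypothetical slot numbers; replace 4-6 with the next unused
    # indexes in your wrapper.conf.
    wrapper.java.additional.4=-Danypoint.platform.analytics_enabled=true
    wrapper.java.additional.5=-Danypoint.platform.analytics_base_uri=
    wrapper.java.additional.6=-Danypoint.platform.on_prem=true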

Enable Event Tracking

To export business and event tracking data, you must enable event tracking in Runtime Manager.

  1. From Anypoint Platform, click Runtime Manager.

  2. Click Servers.

  3. Click the row containing the server where you want to configure ELK.

  4. Click Manage Server, then click the Plugins tab.

  5. Under Event Tracking, select Business Events from the Levels drop-down.

  6. Click the switch to enable event tracking on ELK.

    Runtime Manager displays the ELK Integration dialog.

  7. Configure the log files as necessary.

  8. Click Apply.

Enable API Analytics

To export API analytics, you must enable API analytics in Runtime Manager.

  1. From Anypoint Platform, click Runtime Manager.

  2. Click Servers.

  3. Click the row containing the server where you want to configure ELK.

  4. Click the switch to enable API analytics, then click the switch next to ELK.

  5. Configure the log files as necessary.

  6. Click Apply.

Install and Configure the Filebeat Agent

You must install the Filebeat agent on each server where you have Mule runtime instances.

  1. Download and install Filebeat for your operating system.
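
For example, on an RPM-based Linux system you can install the required 5.6.2 build from Elastic's artifact repository. The URL below follows Elastic's standard download layout for Filebeat 5.6.2; verify it against the Filebeat downloads page before use.

    # Download and install Filebeat 5.6.2 (RPM-based systems).
    curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-5.6.2-x86_64.rpm
    sudo rpm -vi filebeat-5.6.2-x86_64.rpm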

Configure the Mule Filebeat Module

You must install the Filebeat module on each server where you have Mule runtime instances.

  1. Download and expand the Filebeat Module archive from the following URL (a shell sketch of this step and the manual copy in step 4 follows this list):

    https://s3.us-east-2.amazonaws.com/elk-integration/elk-integration-09-29-17/elk-integration.tar.gz
  2. Add the absolute path of each log file to the filebeat.yml file:

    • Add the event log paths to the var.paths variable of business_events as shown in the following example:

      business_events:
          enabled: true
          ...
          var.paths:
            - /var/mule/logs/instance1_events.log
            - /var/mule/logs/instance2_events.log
          ...
    • Add the API analytics log paths to the var.paths variable of api_analytics as shown in the following example:

      api_analytics:
          enabled: true
          ...
          var.paths:
            - /var/mule/logs/instance1_api-analytics-elk.log
            - /var/mule/logs/instance2_api-analytics-elk.log
  3. Add your Elasticsearch host to the hosts property:

    output.elasticsearch:
      ...
      hosts: ["http://your_elastic_installation:9200"]
      ...
  4. Update the Filebeat configuration:

    • If you installed Filebeat using a Linux package manager, run the following script included in the Filebeat module download:

       setup_mule_module.sh
    • If you installed Filebeat using another method, you must:

      • Copy filebeat.template.mule.json and filebeat.yml to the root installation folder of Filebeat.

      • Copy the mule module folder to the module folder of your Filebeat installation.

  5. Start Filebeat as a service on your system.

    For example, if you installed the Filebeat RPM package:

    sudo /etc/init.d/filebeat start
  6. Configure Filebeat to start automatically during boot:

    sudo chkconfig --add filebeat
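
The following shell sketch ties together step 1 and the manual-install path of step 4. The extracted folder name elk-integration and the Filebeat installation root /usr/share/filebeat are assumptions; substitute the paths from your own environment.

    # Step 1: download and expand the Filebeat module archive.
    curl -L -O https://s3.us-east-2.amazonaws.com/elk-integration/elk-integration-09-29-17/elk-integration.tar.gz
    tar -xzf elk-integration.tar.gz
    cd elk-integration    # hypothetical extracted folder name

    # Step 4 (manual install): copy the configuration files into Filebeat.
    # /usr/share/filebeat is an assumed installation root; adjust as needed.
    sudo cp filebeat.template.mule.json filebeat.yml /usr/share/filebeat/
    sudo cp -r module/mule /usr/share/filebeat/module/    # assumed archive layout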

Install the Elasticsearch Geoip and User Agent Plugins

You must install the following Elasticsearch plugins:

  • Geoip: determines the geographical location of IP addresses stored in your logs.

  • User Agent: determines information about a browser or operating system based on HTTP requests.
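
In Elasticsearch 5.6.2, both capabilities ship as ingest plugins (ingest-geoip and ingest-user-agent) that you install with the elasticsearch-plugin tool from your Elasticsearch installation directory. Restart each Elasticsearch node after installing them.

    bin/elasticsearch-plugin install ingest-geoip
    bin/elasticsearch-plugin install ingest-user-agent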

Configure Kibana and Import the MuleSoft Kibana Dashboards

After installing Filebeat and Elasticsearch, you must configure Kibana to be able to consume data from Anypoint Platform.

MuleSoft provides a set of default Kibana configurations that you can use to analyze business and API data. These include dashboards, searches, and visualizations.

  1. Configure an Index Pattern

    You must create an Elasticsearch index for the Anypoint Platform data.

    1. Generate an initial set of data.

      Elasticsearch creates the index only after it receives data, and Kibana cannot recognize an index pattern until that index exists. For example, you can send a request to a test API to generate an initial set of data.

    2. In the Kibana management console, create an index pattern with mule-* as the value.

  2. Obtain the Index Pattern ID

    After creating the index pattern, you must obtain its ID. The ID is visible in the URL when viewing the mule-* index pattern. For example, in the following image the index pattern ID is AV7OmqBs1r9syiCBxyee.

    [Image: Kibana URL showing the index pattern ID]
  3. Download the Mule Kibana configuration files from the following URL:

    https://s3.us-east-2.amazonaws.com/elk-integration/elk-integration-09-29-17/dashboards.tar.gz

    This archive contains default dashboards, searches, and visualizations that you can use to analyze Anypoint Platform data.

  4. Add the Index Pattern ID to the searchSourceJSON Property of searches.json.

    Modify searches.json to include the index pattern ID you obtained in the previous step. You must update every occurrence of searchSourceJSON in this file (a shell sketch of this substitution follows this list):

    "kibanaSavedObjectMeta": {
            "searchSourceJSON": "{\"index\":\"AV7OmqBs1r9syiCBxyee\", .......
     }
  5. Import each of the configuration files into your Kibana installation.

    You must import the files in the following order:

    1. dashboards.json

    2. searches.json

    3. visualizations.json
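
As referenced in step 4, one way to perform the searchSourceJSON substitution is a global search-and-replace over the escaped \"index\" values. This is a sketch, not part of the official procedure: it assumes GNU sed, that every \"index\" value in searches.json should point at your mule-* index pattern, and that you substitute your own pattern ID for the example ID shown.

    # Point every escaped "index" value in searches.json at your index
    # pattern ID (AV7OmqBs1r9syiCBxyee is the example ID from above).
    sed -i 's/\\"index\\":\\"[^\\]*\\"/\\"index\\":\\"AV7OmqBs1r9syiCBxyee\\"/g' searches.json

After the substitution, spot-check one kibanaSavedObjectMeta block to confirm the ID matches your index pattern before importing the files.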