Fetching Data From APIGEE and pushing into Mirth Via RabbitMQ

This will be an interesting blog post. In this post I will explain the following:

  1. Fetching the data from Apigee (URL re-routing) in Java
  2. Pushing the fetched Apigee data into a RabbitMQ queue
  3. Pulling the data from RabbitMQ with Mirth Connect


Apigee is available in both an enterprise edition and a free tier; you can sign up on the Apigee site. It serves many purposes and use cases, but here I am using Apigee purely as a URL re-routing mechanism.

Take any freely available source of JSON data. Sign in to Apigee and click on API Proxies. On the page that opens, click the +PROXY button at the top right. This opens a new page with the following information:


Once the above screen appears, select the first option, "Reverse Proxy", which acts as the URL re-routing mechanism. You have an actual URL, but instead of exposing it to clients you publish a different URL that is mapped to the original one.

After selecting the first option, click Next. You will then see the screen below:


On the above screen, in Proxy Name, fill in the name you wish to give the proxy; in my case I provided vibinces-eval-test. In Proxy Base Path, provide a sub-context for your API; I provided apigeejsonprofile. In Existing API, provide the full URL of the existing JSON API. Description is an optional field; you can provide it or leave it blank.

Once it was created, my URL looked like this: http://vibinces-eval-test.apigee.net/apigeejsonprofile; you will get a URL reflecting the names you chose. In the Security tab, it is advisable to select CORS headers on browse, because you can easily hit cross-origin errors when accessing the data from browsers that are not verified properly. I'm also using no authorization for the API.


On the next tab you can see how the details you provided are converted into your URL. Conveniently, Apigee provides two URLs: one for testing (beta) and one for production.


Now your URL re-router is created: if you hit http://vibinces-eval-test.apigee.net/apigeejsonprofile, you will see the JSON data that actually lives at the original URL.


I’m going to create the below two classes:

  1. FetchJsonFromApigee.java
  2. PushApigeeDataToRabbitMQ.java

1. FetchJsonFromApigee.java:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class FetchJsonFromApigee {

    public static String call_me() throws Exception {
        String url = "http://vibinces-eval-test.apigee.net/apigeejsonprofile";
        URL obj = new URL(url);
        HttpURLConnection con = (HttpURLConnection) obj.openConnection();
        con.setRequestProperty("User-Agent", "Mozilla/5.0");
        int responseCode = con.getResponseCode();
        System.out.println("Response Code : " + responseCode);
        BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream()));
        String inputLine;
        StringBuffer response = new StringBuffer();
        while ((inputLine = in.readLine()) != null) {
            response.append(inputLine); // accumulate the JSON payload
        }
        in.close();
        System.out.println("response : " + response.toString());
        return response.toString();
    }

    public String sendingMessage() throws Exception {
        String pushedJsonMessage = FetchJsonFromApigee.call_me();
        return pushedJsonMessage;
    }
}

2. PushApigeeDataToRabbitMQ.java:

import java.io.IOException;
import java.util.concurrent.TimeoutException;

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;

/**
 * @author Vibinchander.V
 */
public class PushApigeeDataToRabbitMQ {

    private final static String QUEUE_NAME = "TestQueuing";

    public static void passMessage(String message) throws IOException, TimeoutException {
        ConnectionFactory factory = new ConnectionFactory();
        Connection connection = factory.newConnection();
        Channel channel = connection.createChannel();
        channel.queueDeclare(QUEUE_NAME, false, false, false, null);
        channel.basicPublish("", QUEUE_NAME, null, message.getBytes());
        System.out.println(" [x] Sent '" + message + "'");
        channel.close();
        connection.close();
    }

    public static void main(String[] args) throws IOException, TimeoutException {
        FetchJsonFromApigee getData = new FetchJsonFromApigee();
        try {
            String passMessage = getData.sendingMessage();
            passMessage(passMessage); // push the fetched JSON into RabbitMQ
        } catch (Exception e) {
            e.printStackTrace();
        }
        System.out.println("Executed Main Method !!!");
    }
}

When you run PushApigeeDataToRabbitMQ, you can see that the data is fetched from Apigee and pushed into the RabbitMQ message queue.

3. Write a JAR file that will pull data from RabbitMQ:

I'm writing the class below; it will be used inside Mirth and will act as a consumer that pulls the data out of RabbitMQ.

import java.io.IOException;
import java.util.concurrent.TimeoutException;

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
import com.rabbitmq.client.QueueingConsumer;

/**
 * @author Vibinchander.V
 */
public class QueueConsumer {

    public String returnMessage(String queueName) throws IOException, TimeoutException {
        ConnectionFactory factory = new ConnectionFactory();
        Connection connection = factory.newConnection();
        Channel channel = connection.createChannel();
        channel.queueDeclare(queueName, false, false, false, null);
        boolean noAck = false; // manual acknowledgement
        QueueingConsumer consumerVal = new QueueingConsumer(channel);
        channel.basicConsume(queueName, noAck, consumerVal);
        QueueingConsumer.Delivery delivery = null;
        try {
            delivery = consumerVal.nextDelivery(); // blocks until a message arrives
        } catch (InterruptedException ie) {
            Thread.currentThread().interrupt();
        }
        channel.basicAck(delivery.getEnvelope().getDeliveryTag(), false);
        return new String(delivery.getBody());
    }
}

Package the above program into a JAR file and place it inside Mirth's custom-lib folder; you might have to include other JAR files (such as the RabbitMQ client) as well. Inside the Mirth source connector's JavaScript Reader, write the code below:

var queueConsumer = new org.envision.queuing.QueueConsumer();
msg = queueConsumer.returnMessage("TestQueuing");
return msg;

When you run PushApigeeDataToRabbitMQ (which calls FetchJsonFromApigee), the data is fetched from Apigee, pushed to the RabbitMQ queue, and immediately pulled by the Mirth consumer.

Happy Integrations!!!!!



How to install and work with RabbitMQ – Part 1

This blog is about installing and working with one of the most useful message queuing systems, RabbitMQ.

Before beginning, it is important to understand what it actually does. RabbitMQ is essentially a server used for reliable queuing of messages. It uses a protocol called AMQP (Advanced Message Queuing Protocol), which allows data to be queued at a high rate and then processed.

RabbitMQ is built in Erlang, a language famous for exactly this kind of system, and it uses the Open Telecom Platform (OTP) framework for clustering and failover.

Previously, a disadvantage of RabbitMQ was that everything had to be handled via the command prompt, and not everyone using the server was tech-savvy enough for that. Later versions added a management console that lets even a layman work with it comfortably.

It is also worth knowing that RabbitMQ is not the only option: Apache has a similar offering, called Apache Kafka. We will discuss both in forthcoming posts.


Step 1: Install Erlang. We need to install Erlang and its corresponding components before installing RabbitMQ; RabbitMQ will not work if Erlang is not installed. Download the latest version of Erlang from here.

Step 2: Once the installation is done, set the ERLANG_HOME environment variable. If you are using Windows 10, you don't need to set it up manually; it will be configured automatically, as shown below.

Step 3: Install the latest version of RabbitMQ from here. The installation is pretty straightforward; if Erlang installed without any problem, this will go as smooth as butter.

Step 4: RabbitMQ runs by default as a Windows service, so we don't need to invoke it explicitly. At this stage, however, everything has to be done through the command-line interface, which is tedious, so it is better to work with the web management console.

  • Open command line interface with Admin access.
  • Navigate to C:\Program Files (x86)\RabbitMQ Server\rabbitmq_server-3.3.4\sbin 
  • Run the command rabbitmq-plugins.bat enable rabbitmq_management to enable the plugin of web management
  • Once the above command completes, reinstall the RabbitMQ service using the following commands:
  • rabbitmq-service.bat stop
    rabbitmq-service.bat install
    rabbitmq-service.bat start
  • Note: you can remain in the same directory to perform this.


If everything works fine, as shown in the screenshot above, you can now open the RabbitMQ web console at http://localhost:15672/. The default username and password for this console are both guest.

I will post more POCs integrating RabbitMQ with Mirth Connect in future posts.

Happy Integrations!!!!!




Automate Import/Export channels functionality – Part2

This code will consist of the data needed for Mirth channel (B) in the server 2.

Basically, this channel reads the JSON message, decodes the incoming encoded message, automatically imports those channels, and then deploys them. It is responsible for all of the importing operations, and all of this happens without any manual intervention.
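The "encoded message" here is base64: each channel's XML is base64-encoded on the exporting side and decoded back before import (Mirth's FileUtil.encode/FileUtil.decode do this). A minimal plain-Java sketch of the round-trip, with an illustrative stand-in for the channel XML:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Base64RoundTrip {
    public static void main(String[] args) {
        // Illustrative stand-in for an exported channel's XML
        String channelXml = "<channel><id>demo</id></channel>";

        // Exporting side: encode the XML so it can be embedded in the JSON payload
        String encoded = Base64.getEncoder().encodeToString(channelXml.getBytes(StandardCharsets.UTF_8));

        // Importing side: decode it back before deserializing the channel
        String decoded = new String(Base64.getDecoder().decode(encoded), StandardCharsets.UTF_8);

        System.out.println(decoded.equals(channelXml)); // true
    }
}
```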

Please use the code below in the source transformer of the channel, and then proceed to connect Mirth server 1 and Mirth server 2.


//Define and initialize Mirth controller instances
var channelController = ChannelController.getInstance();
var codeTemplateController = CodeTemplateController.getInstance();
var configurationController = ConfigurationController.getInstance();

//Get list of existing libraries & channel dependencies here
var existingLibraries = codeTemplateController.getLibraries(null, true);
var channelDependencies = configurationController.getChannelDependencies();
var restoreChannelGroups = channelController.getChannelGroups(null);
var restoreLibraries = codeTemplateController.getLibraries(null, true);
var restoreChannelTagSet = configurationController.getChannelTags();
var restoreChannelDependencies = configurationController.getChannelDependencies();
var restoreDeployedChannels = channelController.getDeployedChannels(null);

//Get channel metadata
var channelMetaDataMap = configurationController.getChannelMetadata();

var abortDeploymentAndRestoreBackup = false;
var fileSuffix = DateUtil.getCurrentDate("MMddyyyyHHmmss");
var backupFileName = "channel backups/backup-" + fileSuffix + ".json";
channelMap.put("backupFileName", backupFileName);

backup(restoreChannelGroups, restoreLibraries, restoreChannelTagSet, restoreChannelDependencies, restoreDeployedChannels);

var serializer = ObjectXMLSerializer.getInstance();
var jsonMessage = msg;
var toBeDeployedList = new Packages.java.util.ArrayList();
var groups = new Packages.java.util.HashSet();

//Populate existing channel groups on the mirth instance
var existingGroups = channelController.getChannelGroups(null);
if (existingGroups === null || existingGroups === undefined) {
existingGroups = new Packages.java.util.ArrayList();
}

//Iterate through the groups (from the json message received)
for (var groupCounter = 0; groupCounter < jsonMessage.length && !abortDeploymentAndRestoreBackup; groupCounter++) {
var currentChannelGroup = jsonMessage[groupCounter];

var groupAlreadyPresent = false;
var indexFound = -1;
//Check to see if the group already exists on mirth by iterating through the existing groups
for (existingCounter = 0; existingCounter < existingGroups.size(); existingCounter++) {
if (existingGroups.get(existingCounter).getId().equals(currentChannelGroup.groupId)) {
groupAlreadyPresent = true;
indexFound = existingCounter;
}
}
//If group is already present, then get a copy of it in a variable. If not, then add it to the existing groups list.
var chGroup = null;
if (!groupAlreadyPresent) {
logger.info("Group NOT present. Creating a new one");
chGroup = new Packages.com.mirth.connect.model.ChannelGroup(currentChannelGroup.groupName, "");
} else {
logger.info("Group Already present");
chGroup = existingGroups.get(indexFound);
}

//Parse channels element from json message and iterate through the channels
var channels = currentChannelGroup.channels;
for (channelCounter = 0; channelCounter < channels.length && !abortDeploymentAndRestoreBackup; channelCounter++) {
var decodedChannel = new Packages.java.lang.String(FileUtil.decode(channels[channelCounter]));
logger.debug("Decoded Channel for import:" + decodedChannel);
var channelObject = serializer.deserialize(decodedChannel, Channel);
//Get channel dependency list from decoded channel being imported
var dependentIDS = channelObject.getExportData().getDependentIds().iterator();
var dependencyIDS = channelObject.getExportData().getDependencyIds().iterator();
//Before importing a new channel, try to stop and undeploy existing channel (max 5 times) after waiting for a second each time.
//If deploy fails, abort import with message and move over to next channel
for (retryCount = 0; retryCount < 5; retryCount++) {
if (retryCount >= 1) {
if (ChannelUtil.isChannelDeployed(channelObject.getId())) {
logger.info("Request raised for stopping channel: " + channelObject.getName());
logger.info("Request raised for undeploying channel: " + channelObject.getName());
} else {
logger.info("Channel is undeployed:" + channelObject.getName());

if (ChannelUtil.isChannelDeployed(channelObject.getId())) {
logger.error("Aborting import of channel as it is still deployed:" + channelObject.getName());
abortDeploymentAndRestoreBackup = true;

//Get code template libraries details linked with a channel, iterate through it and import those first before channel.
var chLibraries = channelObject.getExportData().getCodeTemplateLibraries();
for (ch1 = 0; ch1 < chLibraries.size(); ch1++) {
var currentLibrary = chLibraries.get(ch1);
var isLibraryAlreadyPresent = false;
//Find if the code template library is already present in mirth, then overwrite and update that library. Else import a new one.
for (exLibraryCounter = 0; exLibraryCounter < existingLibraries.size(); exLibraryCounter++) {
if (existingLibraries.get(exLibraryCounter).getId().equals(currentLibrary.getId())) {
isLibraryAlreadyPresent = true;
existingLibraries.set(exLibraryCounter, currentLibrary);
//Add new library, if not already present
if (!isLibraryAlreadyPresent) {
//Find list of code templates from the library and update & import them in mirth
var chTemplates = chLibraries.get(ch1).getCodeTemplates();
for (ch2 = 0; ch2 < chTemplates.size(); ch2++) {
var codeT = chTemplates.get(ch2); //serializer.deserialize(chTemplates.get(ch2), Packages.com.mirth.connect.model.codetemplates.CodeTemplate);
codeTemplateController.updateCodeTemplate(codeT, null, true);
codeTemplateController.updateLibraries(existingLibraries, null, true);
//Import the new channel in mirth and add it deployment list
channelController.updateChannel(channelObject, null, true);
var channelAlreadyFound = false;
//Check to see, if channel is already linked to channel group. If so, no need to link it again. If not, then add to channel group.
for (existingChannelCounter = 0; existingChannelCounter < chGroup.getChannels().size(); existingChannelCounter++) {
if (chGroup.getChannels().get(existingChannelCounter).getId().equals(channelObject.getId())) {
logger.debug(channelObject.getId() + " channel id already found. No need to add again to the group. - " + channelObject.getName());
channelAlreadyFound = true;
if (!channelAlreadyFound) {
//Update dependency and dependent Id list (in mirth memory for now)
while (dependentIDS.hasNext()) {
var dependentId = dependentIDS.next();
if (dependentId != null && dependentId !== undefined && !dependentId.equals(channelObject.getId())) {
channelDependencies.add(new Packages.com.mirth.connect.model.ChannelDependency(dependentId, channelObject.getId()));
while (dependencyIDS.hasNext()) {
var dependencyId = dependencyIDS.next();
if (dependencyId != null && dependencyId !== undefined && !dependencyId.equals(channelObject.getId())) {
channelDependencies.add(new Packages.com.mirth.connect.model.ChannelDependency(channelObject.getId(), dependencyId));
//Clear the channel's export data (code template libraries and dependencies) from memory before moving to the next channel in the channel group.
//If import of channels is done successfully, then set channel dependencies and update channel groups and deploy channels
if (!abortDeploymentAndRestoreBackup) {
//Update dependency and dependent Id list (in mirth persistence)

//Convert list of channel groups to be updated to a set and update channel groups
var newGroups = new Packages.java.util.HashSet();
for (i = 0; i < existingGroups.size(); i++) {
channelController.updateChannelGroups(newGroups, null, true);

//Deploy all the channels that were earlier added to deployment list.
for (k = 0; k < toBeDeployedList.size(); k++) {

//Wait for 5 seconds before deployment is verified.
Packages.java.lang.Thread.sleep(5000); // Not mandatory

//Identify if channel is not deployed even after 5 seconds.
for (k = 0; k < toBeDeployedList.size(); k++) {
if (!ChannelUtil.isChannelDeployed(toBeDeployedList.get(k))) {
logger.info(toBeDeployedList.get(k) + " channel is not deployed yet after the import, so recovery process would start now.");
abortDeploymentAndRestoreBackup = true;

if (abortDeploymentAndRestoreBackup) {
//Read json back up file for restoring previous version
var jsonBackupObject = JSON.parse(FileUtil.read(backupFileName));

//Get list of deployed channels from backup file
var deployedChannelIds = jsonBackupObject.deployedChannelIds;
var deserializerRestore = ObjectXMLSerializer.getInstance();

//Get list of channel groups from backup file
var channelGroupSetForRestore = new Packages.java.util.HashSet();
for (i = 0; i < jsonBackupObject.encodedChannelGroups.length; i++) {
channelGroupSetForRestore.add(deserializerRestore.deserialize(decode(jsonBackupObject.encodedChannelGroups[i]), ChannelGroup));

//Get list of code template libraries from backup file
var codeTemplateLibrariesForRestore = new Packages.java.util.ArrayList();
for (i = 0; i < jsonBackupObject.encodedCodeTemplateLibraries.length; i++) {
codeTemplateLibrariesForRestore.add(deserializerRestore.deserialize(decode(jsonBackupObject.encodedCodeTemplateLibraries[i]), CodeTemplateLibrary));

//Get channel tags from backup file
var channelTagSetForRestore = new Packages.java.util.HashSet();
for (i = 0; i < jsonBackupObject.encodedChannelTags.length; i++) {
channelTagSetForRestore.add(deserializerRestore.deserialize(decode(jsonBackupObject.encodedChannelTags[i]), ChannelTag));

//Get channel dependencies from backup file
var channelDependenciesSetForRestore = new Packages.java.util.HashSet();
for (i = 0; i < jsonBackupObject.encodedChannelDependencies.length; i++) {
channelDependenciesSetForRestore.add(deserializerRestore.deserialize(decode(jsonBackupObject.encodedChannelDependencies[i]), ChannelDependency));

//Revert code templates and libraries
for (lb = 0; lb < codeTemplateLibrariesForRestore.size(); lb++) {
var currLibrary = codeTemplateLibrariesForRestore.get(lb);
for (ct = 0; ct < currLibrary.getCodeTemplates().size(); ct++) {
codeTemplateController.updateCodeTemplate(currLibrary.getCodeTemplates().get(ct), null, true);

//Revert channel code
var channelGrpIterator = channelGroupSetForRestore.iterator();
while (channelGrpIterator.hasNext()) {
var channelGroupToBeRestored = channelGrpIterator.next();
for (k = 0; k < channelGroupToBeRestored.getChannels().size(); k++) {
var channelToBeRestored = channelGroupToBeRestored.getChannels().get(k);
channelController.updateChannel(channelToBeRestored, null, true);

//Revert libraries, channel groups, channel tags and channel dependencies by calling mirth classes with values parsed from backup file.
codeTemplateController.updateLibraries(codeTemplateLibrariesForRestore, null, true);
channelController.updateChannelGroups(channelGroupSetForRestore, null, true);

//Deploy old version channels
for (i = 0; i < deployedChannelIds.length; i++) {

If you look at the code closely, you can see that it relies on three helper functions. You can place the functions below either in the code template area or in the transformer itself.

//Function to decode a base64 value and return a string
function decode(value) {
return new Packages.java.lang.String(FileUtil.decode(value));
}

//Function to get an array of base64-encoded xml representations of a collection of objects
function getXml(collection) {
var returnList = [];
var counter = 0;
var backupSerializer = ObjectXMLSerializer.getInstance();
var iterator = collection.iterator();
while (iterator.hasNext()) {
var object = iterator.next();
var writerObject = new Packages.java.io.StringWriter();
backupSerializer.serialize(object, writerObject);
returnList[counter++] = FileUtil.encode(writerObject.toString().getBytes());
}
return returnList;
}

//This function takes backup of entire set of channel groups, code templates/libraries, channel tags, dependencies and list of deployed channels in time.
//backup is kept in json file with content as base64 encoded.
function backup(channelGroups, codeTemplateLibraries, channelTags, restoreChannelDependencies, restoreDeployedChannels) {
if (channelMetaDataMap != null) {
for (i = 0; i < channelGroups.size(); i++) {
var currentChannelGroup = channelGroups.get(i);
for (j = 0; j < currentChannelGroup.getChannels().size(); j++) {
var channelSetId = new Packages.java.util.HashSet();

var currentChannels = channelController.getChannels(channelSetId);
if (currentChannels != null) {
var currentChannel = currentChannels.get(0);
currentChannelGroup.getChannels().set(j, currentChannel);
var restoreDeployedChannelIds = new Packages.java.util.ArrayList();
if (restoreDeployedChannels != null) {
for (i = 0; i < restoreDeployedChannels.size(); i++) {
var encodedChannelGroups = getXml(channelGroups);
var encodedCodeTemplateLibraries = getXml(codeTemplateLibraries);
var encodedChannelTags = getXml(channelTags);
var encodedChannelDependencies = getXml(restoreChannelDependencies);
var backup = {};
backup.encodedChannelGroups = encodedChannelGroups;
backup.encodedChannelTags = encodedChannelTags;
backup.encodedCodeTemplateLibraries = encodedCodeTemplateLibraries;
backup.encodedChannelDependencies = encodedChannelDependencies;
backup.deployedChannelIds = restoreDeployedChannelIds;

var outputBackup = JSON.stringify(backup);
FileUtil.write(backupFileName, false, JsonUtil.prettyPrint(outputBackup));

Yup. That's how you can automate the tooling, which ends up saving a lot of your valuable time. Happy Automating!

Automate Import/Export channels functionality – Part1

This is a weird experiment.

In case you want to automate exporting/importing channels in Mirth, this feature will be very helpful. The user supplies the IDs of the channel groups to be exported from one Mirth server and imported into another, and the operation is performed without any manual intervention.

The Mirth channel (A) on SERVER1 accepts all the channel group IDs as a comma-separated value, exports each entire channel group along with the code templates and dependencies attached to the channels of the group, and generates a JSON document containing all the exported values in base64-encoded form.

The Mirth channel (B) on SERVER2 consumes this JSON data, decodes the encoded strings, automatically imports those channels along with their groups and code templates, and deploys them.

Server 1 – Mirth Channel (A):

This channel consumes the group IDs as a comma-separated value and generates a JSON string out of them. Please copy the code below into the source/destination transformer.


//Define and initialize all controller instances
var configurationController = ConfigurationController.getInstance();
var channelController = ChannelController.getInstance();
var codeTemplateController = CodeTemplateController.getInstance();
var serializer = ObjectXMLSerializer.getInstance();

//Define required variables.
var groupIds = new Packages.java.util.HashSet();

//Parse the group ids passed into an array using comma as a separator
var commaSeparatedGroupIds = connectorMessage.getRaw().getContent();
var arrayGroupIds = commaSeparatedGroupIds.split(",");
for (i = 0; i < arrayGroupIds.length; i++) {
groupIds.add(arrayGroupIds[i]);
}

//Get channel groups and channel metadata and existing libraries.
var channelGroups = channelController.getChannelGroups(groupIds);
var channelMetaDataMap = configurationController.getChannelMetadata();
var libraries = codeTemplateController.getLibraries(null, true);

var output = [];

var newJsonObj = {};
newJsonObj.Manifest = [];
newJsonObj.ChannelExportData = [];

//Iterate through the channel groups (passed as input)
for (i = 0; i < channelGroups.size(); i++) {
var channelGroup = channelGroups.get(i);
var channelIds = new Packages.java.util.HashSet();
var groupNameValue = channelGroup.getName();

var groupNames = {};
groupNames.groupInfo = channelGroup.getName();
groupNames.channelNames = [];

var channelGroupJson = {};
channelGroupJson.groupId = channelGroup.getId();
channelGroupJson.groupName = channelGroup.getName();
channelGroupJson.channels = [];
output[i] = channelGroupJson;

// logger.info("CHANNEL GROUP EXPORTED WITH NUMBER OF CHANNELS: " + channelGroup.getChannels().size());
//Iterate through all the channels in the group and add channel ids of those to a list.
for (channelCounter = 0; channelCounter < channelGroup.getChannels().size(); channelCounter++) {
var currentChannelId = channelGroup.getChannels().get(channelCounter).getId();
channelIds.add(currentChannelId);
}

//Load the channel objects based on channel ids collected previously.
var channels = channelController.getChannels(channelIds);

//Iterate through the channels loaded previously and update following for each channel
//1. Export Data -> Metadata
//2. Export Data -> Code template libraries
//3. Export Data -> Channel Tags
//4. Export Data -> Dependent Ids
//5. Export Data -> Dependency Ids
//Then convert that channel object into xml with base 64 encoding
for (channelCounter = 0; channelCounter < channels.size(); channelCounter++) {
var currentChannelId = channels.get(channelCounter).getId();
var channelDetails = {};
channelDetails.channelName = channels.get(channelCounter).getName();
channelDetails.Library = [];

if (channelMetaDataMap != null) {

for (ctCounter = 0; libraries != null && ctCounter < libraries.size(); ctCounter++) {
var library = libraries.get(ctCounter);
//logger.info("library : " + library.getName())

if (library.getEnabledChannelIds().contains(currentChannelId) ||
(library.isIncludeNewChannels() && !library.getDisabledChannelIds().contains(currentChannelId))) {


var channelTagSet = configurationController.getChannelTags();
var channelTags = null;
if (channelTagSet != null) {
channelTags = channelTagSet.iterator();

while (channelTags.hasNext()) {
var channelTag = channelTags.next();
if (channelTag.getChannelIds().contains(currentChannelId)) {

var channelDependenciesSet = configurationController.getChannelDependencies();
var channelDependencies = null;
if (channelDependenciesSet != null) {
channelDependencies = channelDependenciesSet.iterator();
while (channelDependencies.hasNext()) {
var channelDependency = channelDependencies.next();
if (channelDependency.getDependencyId().equals(currentChannelId)) {
} else if (channelDependency.getDependentId().equals(currentChannelId)) {

var writer = new Packages.java.io.StringWriter();
serializer.serialize(channels.get(channelCounter), writer);

channelGroupJson.channels[channelCounter] = FileUtil.encode(writer.toString().getBytes());

var newJson = JSON.stringify(newJsonObj);

//Write entire channel group and base64 list of its channel xmls into a file at defined location
FileUtil.write("C:/Labs/POC/Import_Export/output.json", false, JsonUtil.prettyPrint(newJson));
channelMap.put("output", JsonUtil.prettyPrint(newJson));
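The comma-split near the top of the transformer above can be sketched in plain Java; the GroupIdParser class and the sample IDs below are illustrative, not part of the Mirth script:

```java
import java.util.HashSet;
import java.util.Set;

public class GroupIdParser {
    // Split a comma-separated list of group IDs into a set, trimming stray whitespace
    public static Set<String> parse(String commaSeparatedGroupIds) {
        Set<String> groupIds = new HashSet<>();
        for (String id : commaSeparatedGroupIds.split(",")) {
            if (!id.trim().isEmpty()) {
                groupIds.add(id.trim());
            }
        }
        return groupIds;
    }

    public static void main(String[] args) {
        System.out.println(GroupIdParser.parse("group-1, group-2,group-3")); // three IDs
    }
}
```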




Create Automated Script for IT deployment

This is a hypothetical scenario:
Imagine a situation where you have developed all the channels required to build the interfaces, now you are going to move your channels to the production or any beta testing environment.

In this scenario you want the channels imported into the Mirth instance of a specific environment. Imagine you don't have access to make this move yourself; only the IT team is permitted to do it. The IT team, in turn, will find it difficult to import the channels, as you cannot expect them to understand Mirth.

The easier way is to hand them a command: they execute it, and everything starts to work fine, i.e., an import command via the Mirth command prompt like this:

import "Your-channel-available-folder\20180312\EAI-Deployment Script Generator.xml" force

But this would again be a manual process for the developers. Imagine that one day you have to send 4 to 5 channels; you would have to create this script by hand each time and then send it. To overcome that, we can write one channel that creates the script for all the channels deployed today.

The logic behind this is that only the channels developed and tested today will be moved to beta testing or prod. Based on that scenario, I have built the code below.

var currentDate = DateUtil.getCurrentDate("yyyy-MM-dd");
var currentYear = currentDate.substring(0, 4);
var currentMonth = currentDate.substring(5, 7);
var currentDay = currentDate.substring(8, 10);
var georgianMonth = parseInt(currentMonth) - 1; // java.util.Calendar months are zero-based
var scriptBuilder = new java.lang.StringBuilder();
var getScriptDate = DateUtil.getCurrentDate("yyyyMMdd");
// Initialize controller
var controller = com.mirth.connect.server.controllers.ControllerFactory.getFactory().createEngineController();
// Get deployed channel IDs
var channels = ChannelUtil.getDeployedChannelIds().toArray();

for each(channel in channels) {


var dashboardStatus = controller.getChannelStatus(channel);
// Get Gregorian calendar field mapping from - https://docs.oracle.com/javase/7/docs/api/constant-values.html#java.util.Calendar.DATE

var fetchLastDeployedDay = dashboardStatus.getDeployedDate().get(5);
var fetchLastDeployedMonth = dashboardStatus.getDeployedDate().get(2);
var fetchLastDeployedYear = dashboardStatus.getDeployedDate().get(1);

if ((fetchLastDeployedYear == currentYear) && (fetchLastDeployedDay == currentDay) && (fetchLastDeployedMonth == georgianMonth)) {

var getDeployedChannelName = dashboardStatus.getName();
var deploymentScript = 'import ' + '"' + $('Eai_qa_path') + getScriptDate + '/' + getDeployedChannelName + '.xml' + '"' + ' force';
var processedScript = deploymentScript.replace(/\//g, "\\");
scriptBuilder.append(processedScript).append("\n");
}
}

FileUtil.write("C:/Labs/POC/Import_Export/test.txt", false, scriptBuilder.toString());

Put this code in a JavaScript Reader and set it to run every 24 hours, i.e., every 24 hours one import script will be generated based on the channels that were deployed and tested that day.
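The zero-based month check in the code above trips many people up: java.util.Calendar numbers months from 0 (January) to 11 (December), so the current month string has to be decremented before comparing. A small plain-Java sketch:

```java
import java.util.Calendar;
import java.util.GregorianCalendar;

public class DeployedDateCheck {
    public static void main(String[] args) {
        // A deployment date of 12 March 2018, as a Calendar would hold it
        Calendar deployed = new GregorianCalendar(2018, Calendar.MARCH, 12);

        System.out.println(deployed.get(Calendar.MONTH));        // 2, not 3: months are zero-based
        System.out.println(deployed.get(Calendar.DAY_OF_MONTH)); // 12
        System.out.println(deployed.get(Calendar.YEAR));         // 2018

        // Comparing against an "MM" date string therefore needs the minus one
        String currentMonth = "03";
        boolean sameMonth = deployed.get(Calendar.MONTH) == Integer.parseInt(currentMonth) - 1;
        System.out.println(sameMonth); // true
    }
}
```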

Happy Automating!!!!

Function for – Fetching Complete System Specification

This blog provides a JavaScript function that fetches the complete system specification of the machine on which Mirth is installed.

This function does not require any input parameter, it will fetch all the statistics of the system in complete real time scenario.

function fetchSystemConfigurations() {

    var systemConfiguration = '';

    function memoryCalc(data) {
        var kb = data / 1024;
        var mb = kb / 1024;
        var gb = mb / 1024;

        var finalGb = Math.round(gb);
        var finalMb = Math.round(mb);
        var finalKb = Math.round(kb);

        var finalValue;

        // Report in the largest unit that does not round down to zero
        if (finalGb > 0) {
            finalValue = finalGb + 'GB';
        } else if (finalMb > 0) {
            finalValue = finalMb + 'MB';
        } else if (finalKb > 0) {
            finalValue = finalKb + 'KB';
        } else {
            finalValue = data + 'Bytes';
        }
        return finalValue;
    }

    // Runtime.getRuntime() and System.getProperty() are static calls; no "new" is needed
    var runtime = java.lang.Runtime.getRuntime();
    var availableProcessors = "Available Processors : " + runtime.availableProcessors();
    var freeMemory = "Free Memory : " + memoryCalc(runtime.freeMemory());
    var osName = "OS Name : " + java.lang.System.getProperty('os.name');
    var maximumMemory = "Maximum Memory : " + memoryCalc(runtime.maxMemory());
    var totalJVMMemory = "Total JVM Memory : " + memoryCalc(runtime.totalMemory());
    var javaVersion = "Java Version : " + java.lang.System.getProperty('java.version');
    var file = new java.io.File('c:/');
    var diskFreeSpace = "Disk Free Space : " + memoryCalc(file.getFreeSpace());
    var diskTotalSpace = "Total Disk Space : " + memoryCalc(file.getTotalSpace());
    var hostNameAndIP = Packages.java.net.InetAddress.getLocalHost().toString();
    var splitData = hostNameAndIP.split("/");
    var hostName = "Host Name : " + splitData[0];
    var IP = "IP : " + splitData[1];
    var processorIdentifier = "Processor Identifier : " + java.lang.System.getenv("PROCESSOR_IDENTIFIER");
    var processorArchitecture = "Processor Architecture : " + java.lang.System.getenv("PROCESSOR_ARCHITECTURE");
    var javaClassPath = "Java Class Path : " + java.lang.System.getProperty("java.class.path");

    systemConfiguration = availableProcessors + "\n" + freeMemory + "\n" + osName + "\n" + maximumMemory + "\n" + totalJVMMemory + "\n" + javaVersion + "\n" + diskFreeSpace + "\n" + diskTotalSpace + "\n" + IP + "\n" + hostName + "\n" + processorIdentifier + "\n" + processorArchitecture;

    return systemConfiguration;
}

Put the above code in a code template library and call the function from anywhere: a transformer, a connector script, or any other scripting context.
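Since memoryCalc is plain JavaScript, its rounding behavior can be sanity-checked outside Mirth. Here is a standalone variant (checking the largest unit first, which is how the rounding is intended to behave):

```javascript
// Standalone variant of the memoryCalc helper for a quick check of the rounding logic.
function memoryCalc(data) {
    var kb = data / 1024;
    var mb = kb / 1024;
    var gb = mb / 1024;
    var finalGb = Math.round(gb);
    var finalMb = Math.round(mb);
    var finalKb = Math.round(kb);
    // Report in the largest unit that does not round down to zero
    if (finalGb > 0) return finalGb + 'GB';
    if (finalMb > 0) return finalMb + 'MB';
    if (finalKb > 0) return finalKb + 'KB';
    return data + 'Bytes';
}

console.log(memoryCalc(5 * 1024 * 1024 * 1024)); // 5GB
console.log(memoryCalc(116 * 1024 * 1024));      // 116MB
console.log(memoryCalc(400));                    // 400Bytes
```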


The output of the code will be as follows:

Available Processors : 4
Free Memory : 116MB
OS Name : Windows 10
Maximum Memory : 228MB
Total JVM Memory : 201MB
Java Version : 1.8.0_151
Disk Free Space : 415GB
Total Disk Space : 465GB
IP :
Host Name : VIBV-BLR-02
Processor Identifier : Intel64 Family 6 Model 142 Stepping 9, GenuineIntel
Processor Architecture : AMD64

Happy Integration ………!!!!!!!!

Perform all File IO operations – Code Templates

In this post, I'm creating code templates that serve as a one-stop solution for the file-based IO problems we face in Mirth.

Imagine a case where you need to move a file from one directory to another via Mirth instead of using the source or destination connector provided by the tool.

In that case:
1. First we need to check whether the file exists.
2. Then we have to use the FileUtil.write function to write that file to the destination location.
3. Finally we have to delete the file from the source location.

The time required to complete this process is high, and we have to catch exceptions in the correct places. To avoid all this trouble I have developed a Mirth code template library that is a one-stop solution for these problems.

This library utilizes the Apache Commons IO FileUtils library. Download it from the Apache Commons IO download page: under binaries, select commons-io-2.6-bin.zip if you are using Windows and commons-io-2.6-bin.tar.gz if you are using a Linux system.

apache commons fileutils library

Once downloaded, navigate to the commons-io-2.6 folder and copy the JAR file named commons-io-2.6.jar alone; you will find other JAR files along with it, which you can ignore. Place this JAR in the custom-lib folder of your Mirth Connect installation directory. Once done, go to the Mirth Settings tab and click on Reload Resource.

Provide the following codes in your code template library:

Copy Directory To Directory:

function copyDirectoryToDirectory(sourceDirectory, destinationDirectory) {

    var srcDirectory = new java.io.File(sourceDirectory);
    var destDirectory = new java.io.File(destinationDirectory);

    try {
        // FileUtils methods are static, so no "new" is needed
        Packages.org.apache.commons.io.FileUtils.copyDirectoryToDirectory(srcDirectory, destDirectory);
    } catch (exp) {
        logger.error("Error copying directory: " + exp);
    }
}

call from Transformer:

copyDirectoryToDirectory("C:/Projects/PROJECTS/TEST/Sample Message/PDF-tests/sourcedirectory", "C:/Projects/PROJECTS/TEST/destinationDirectory");

Move File To Directory:

function moveFileToDirectory(sourceFileName, destinationDirectoryName) {

    var srcFile = new java.io.File(sourceFileName);
    var destDir = new java.io.File(destinationDirectoryName);

    try {
        // Third argument: whether to create the destination directory if it does not exist
        Packages.org.apache.commons.io.FileUtils.moveFileToDirectory(srcFile, destDir, false);
    } catch (exp) {
        logger.error("Error moving file: " + exp);
    }
}

call from Transformer:

moveFileToDirectory("C:/Projects/PROJECTS/TEST/Sample Message/PDF-tests/test2.pdf", "C:/Projects/PROJECTS/TEST/Sample Message/");

Move Directory To Directory:

function moveDirectoryToDirectory(sourceDirectory, destinationDirectory) {

    var srcDirectory = new java.io.File(sourceDirectory);
    var destDirectory = new java.io.File(destinationDirectory);

    try {
        Packages.org.apache.commons.io.FileUtils.moveDirectory(srcDirectory, destDirectory);
    } catch (exp) {
        logger.error("Error moving directory: " + exp);
    }
}

call from Transformer:

moveDirectoryToDirectory("C:/Projects/PROJECTS/TEST/Sample Message/PDF-tests/sourcedirectory", "C:/Projects/PROJECTS/TEST/destinationDirectory");

Copy File To Directory:

function copyFileToDirectory(sourceFileName, destinationDirectoryName) {

    var srcFile = new java.io.File(sourceFileName);
    var destDir = new java.io.File(destinationDirectoryName);

    try {
        Packages.org.apache.commons.io.FileUtils.copyFileToDirectory(srcFile, destDir);
    } catch (exp) {
        logger.error("Error copying file: " + exp);
    }
}


call from Transformer:

copyFileToDirectory("C:/Projects/PROJECTS/TEST/Sample Message/PDF-tests/test.pdf", "C:/Projects/PROJECTS/TEST/Sample Message/");

Integrating – AWS EC2 (MySQL) to Mirth Engine

For this post, I have purchased a personal EC2 instance in the AWS environment (free tier for one year): specifically an Amazon AMI instance with a Fedora-based operating system.

In the remote EC2 system, MySQL is deployed and a database named test is created. Once the DB is created in the EC2 instance, you have to create a table with some sample patient demographic information.

How to Access AWS remote server?


Open the PuTTY client and put the AWS hostname in the Host Name or IP address text box, or put the elastic IP of your AWS instance there. It does not always need to be a complete hostname.

  1. Then select SSH on the left side window of the putty and select on Auth. 
  2. Once you selected Auth on the right window pane click on the Browse button and select the private key you have downloaded from the amazon.
  3. This private key will be a .ppk file. This PPK file will be necessary to establish the SSH connectivity between your putty client and the AWS system.


  1. Once this is done click on the Open button on the bottom.
  2. Once you open it, you will be prompted to enter the username. If you purchased a Linux Ubuntu system, the default username is ubuntu.
  3. If you have purchased a different system (here, I have purchased the Amazon AMI system), the default username is ec2-user.
  4. You can install the MySQL Linux distribution based on the version you use. Amazon AMI Linux is Fedora-based; for Fedora, use the command below to install MySQL. Before executing it, run sudo yum update once.
dnf install mysql-community-server

For the Debian Ubuntu distribution use the below command for the MySQL installation

  • sudo apt-get update
  • sudo apt-get install mysql-server

Once installation is done, log into your MySQL database on the AWS Linux box by typing the below command in the PuTTY client.

mysql -u root -p

Once that is done, you will be at the mysql> prompt. Type show databases; initially you will not have a database of your own, so create one with the command below.

create database test;

This will create a new database. Now we have to use this database and create tables in it. Use the below command to switch to it:

use test;

Here, test is the database name that I'm creating in the box; you can use a name of your choice. Then create a table with the fields shown in the screenshot below.
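The screenshot of the table definition is not reproduced here, but based on the column names used in the reader code later in this post, the table may have looked roughly like this (the column names come from the source; the types are my assumptions):

```sql
-- Hypothetical DDL; names taken from the reader code, types assumed.
CREATE TABLE patient_information (
    pid INT PRIMARY KEY,
    patient_first_name VARCHAR(64),
    patient_last_name VARCHAR(64),
    patient_middle_name VARCHAR(64),
    patient_suffix_name VARCHAR(16),
    patient_date_of_birth VARCHAR(32),
    patient_gender VARCHAR(8),
    patient_age INT,
    patient_address_1 VARCHAR(128),
    patient_address_2 VARCHAR(128),
    patient_emailAddress VARCHAR(128),
    patient_telecom_number VARCHAR(32),
    patient_race VARCHAR(32),
    patient_ethincity VARCHAR(32),
    patient_maritalstatus VARCHAR(16),
    patient_language VARCHAR(32),
    patient_country VARCHAR(64),
    patient_state VARCHAR(64),
    patient_city VARCHAR(64),
    patient_zipcode VARCHAR(16),
    patient_ssn VARCHAR(16),
    patient_driver_license VARCHAR(32)
);
```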


  1. Once you have created it, log on to your AWS web console. And open the port in the security groups on the web console.
  2. Click on Edit button after selecting the security group and then add the port 3306 on TCP.
  3. Only when you do this will the inbound socket of your remote system be open, and only then can your local Mirth system establish communication with the system in the AWS environment.

In case you install an application server like Apache or Tomcat, you will similarly want to open the specific ports configured in httpd.conf or Tomcat's server.xml. Go to the security group on the AWS console and enable those inbound ports; only then will your (IP + port) combination work in the browser. This IP + port combination is technically referred to as a socket.

Create a channel in your local Mirth Connect, set the source to a JavaScript Reader, and choose a polling frequency of your interest to fetch the data from the DB. In the source connector area, provide the following code.

var dbConn;
// AWS MySQL credentials
var mySqlDriver = "com.mysql.jdbc.Driver";
// <server-ip> - public/elastic IP of the EC2 instance (redacted here)
// 3306 - MySQL default port number
// test - database in the remote server
var mySqlConnectionTemplate = "jdbc:mysql://<server-ip>:3306/test";
var mySqlUserName = "root";
var mySqlPassword = ""; // your MySQL password

// Create parent tag <PatientDemographics>
var patientDemographicsXml = new XML('<PatientDemographics></PatientDemographics>');
// Create parent for individual patient information <IndividualPatientInformation>
var individualPatientInfoXml = new XML('<IndividualPatientInformation></IndividualPatientInformation>');

try {
    // MySQL connection
    dbConn = DatabaseConnectionFactory.createDatabaseConnection(mySqlDriver, mySqlConnectionTemplate, mySqlUserName, mySqlPassword);
    // Select statement; patient_information is the table name
    var result = dbConn.executeCachedQuery("select * from patient_information");

    // Loop through the result set
    while (result.next()) {
        individualPatientInfoXml['PatientId'] = result.getInt("pid");
        individualPatientInfoXml['PatientFirstName'] = result.getString("patient_first_name");
        individualPatientInfoXml['PatientLastName'] = result.getString("patient_last_name");
        individualPatientInfoXml['PatientMiddleName'] = result.getString("patient_middle_name");
        individualPatientInfoXml['PatientSuffixName'] = result.getString("patient_suffix_name");
        individualPatientInfoXml['PatientDateOfBirth'] = result.getString("patient_date_of_birth");
        individualPatientInfoXml['PatientGender'] = result.getString("patient_gender");
        individualPatientInfoXml['PatientAge'] = result.getInt("patient_age");
        individualPatientInfoXml['PatientAddress1'] = result.getString("patient_address_1");
        individualPatientInfoXml['PatientAddress2'] = result.getString("patient_address_2");
        individualPatientInfoXml['PatientEmailAddress'] = result.getString("patient_emailAddress");
        individualPatientInfoXml['PatientTelecomNumber'] = result.getString("patient_telecom_number");
        individualPatientInfoXml['PatientRace'] = result.getString("patient_race");
        individualPatientInfoXml['PatientEthincity'] = result.getString("patient_ethincity");
        individualPatientInfoXml['PatientMaritalStatus'] = result.getString("patient_maritalstatus");
        individualPatientInfoXml['PatientLanguage'] = result.getString("patient_language");
        individualPatientInfoXml['PatientCountry'] = result.getString("patient_country");
        individualPatientInfoXml['PatientState'] = result.getString("patient_state");
        individualPatientInfoXml['PatientCity'] = result.getString("patient_city");
        individualPatientInfoXml['PatientZipCode'] = result.getString("patient_zipcode");
        individualPatientInfoXml['PatientSSN'] = result.getString("patient_ssn");
        individualPatientInfoXml['PatientDriverLicense'] = result.getString("patient_driver_license");

        // Append this patient's block under the parent tag
        patientDemographicsXml.appendChild(individualPatientInfoXml);
        // Reset for the next row
        individualPatientInfoXml = new XML('<IndividualPatientInformation></IndividualPatientInformation>');
    }

    msg = patientDemographicsXml;
    return msg;
} finally {
    if (dbConn) {
        dbConn.close();
    }
}
Once the connector code is created, you will be able to fetch all the data available in the database as one batch instead of single-row entries, accumulating the data from each row as shown below:


This data will be accumulated inside Mirth as XML built in a batch, and the output of your Mirth data will be as shown below:

<PatientAddress1>No:8, washington, test drive</PatientAddress1>
<PatientAddress2>Oregan, detroit</PatientAddress2>
<PatientAddress1>4/12 Stevie Street, jj colony</PatientAddress1>
<PatientAddress2>Michigan, detroit</PatientAddress2>

Happy Integrations !!!!!


How to create Zip file with multiple files in Mirth?

var zipFile = "c:\\test.zip";
var sourceFiles = new Array(${arrayOfFileNames});

try {
    var buffer = java.lang.reflect.Array.newInstance(java.lang.Byte.TYPE, 5000);
    var fileOutput = new java.io.FileOutputStream(zipFile);
    var zos = new java.util.zip.ZipOutputStream(fileOutput);

    for (var i = 0; i < sourceFiles.length; i++) {
        var srcFile = new java.io.File(sourceFiles[i]);
        var fis = new java.io.FileInputStream(srcFile);

        // Use the current file's name as the zip entry name
        zos.putNextEntry(new java.util.zip.ZipEntry(srcFile.getName()));
        var length;

        while ((length = fis.read(buffer)) > 0) {
            zos.write(buffer, 0, length);
        }
        zos.closeEntry();
        fis.close();
    }
    zos.close();
} catch (ioe) {
    logger.debug("Error creating zip file: " + ioe);
}
Paste the above code in a destination JavaScript Writer. It does not require any external Java libraries to achieve this task.


Happy Integrating……

Parse responses in response tab

Imagine that we are sending certain data to external sources and we are getting a response from those sources. Now the received response has to be parsed inside the response tab.


Imagine we are receiving an HTML response like this as provided below:

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1"/>
<title>401 - Unauthorized: Access is denied due to invalid credentials.</title>
<style type="text/css">
body{margin:0;font-size:.7em;font-family:Verdana, Arial, Helvetica, sans-serif;background:#EEEEEE;}
fieldset{padding:0 15px 10px 15px;}
h3{font-size:1.2em;margin:10px 0 0 0;color:#000000;}
#header{width:96%;margin:0 0 0 0;padding:6px 2% 6px 2%;font-family:"trebuchet MS", Verdana, sans-serif;color:#FFF;}
#content{margin:0 0 0 2%;position:relative;}

401 – Unauthorized: Access is denied due to invalid credentials.

You do not have permission to view this directory or page using the credentials that you supplied.


We need to get the status we are receiving and build business logic accordingly. For example, in this HTML response we are supposed to parse the status value 401, and if it is 401 we need to perform actions accordingly.
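Library aside, the core task is just pulling the leading three-digit code out of the heading text. A plain-JavaScript sketch of that extraction (the helper name is hypothetical):

```javascript
// Hypothetical helper: grab a leading 3-digit status code from heading text.
function extractStatusCode(text) {
    var match = text.match(/^\s*(\d{3})/);
    return match ? match[1] : null;
}

var heading = "401 - Unauthorized: Access is denied due to invalid credentials.";
console.log(extractStatusCode(heading)); // "401"
```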

This can be achieved using an external library called jsoup, which parses HTML in Java. Download the jsoup JAR library and test it in Java first using any version of Eclipse. Find the Java code below:

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.select.Elements;

public class TestHtml {

    public void htmlDataReader() {
        String htmlData = ""; // your html string
        Document doc = Jsoup.parse(htmlData);
        Elements tds = doc.getElementsByTag("h2");
        String text = tds.text();
        System.out.println("Data : " + text.substring(0, 3));
    }

    public static void main(String[] args) {
        TestHtml val = new TestHtml();
        val.htmlDataReader();
    }
}

Output : Data: 401

The corresponding Mirth code for the above Java code is as follows (with the jsoup JAR placed in the custom-lib folder):

var doc = Packages.org.jsoup.Jsoup.parse(msg);
var tds = doc.getElementsByTag("h2");
var text = tds.text();
var responseStatusValue = text.substring(0, 3);
logger.debug("Data : " + responseStatusValue);
// Business logic of your choice below


If we are going to parse the HTML data through Mirth itself without any external library, it is a little challenging: the tag <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"> is expected in HTML, but E4X will not consider this element an XML tag. So Mirth will not take this data as XML and allow us to parse the message as normal XML.

Make your source listener's data type RAW instead of XML; only then will the message be consumed inside Mirth. In the preprocessor of the channel, write the following code:

var pattern = $gc('pattern');
if (!pattern) {
    pattern = java.util.regex.Pattern.compile('^(\\s*(<\\?xml.*\\?>)?\\s*(<!DOCTYPE[^\\[>]*(\\[\\s*(<![^>]*>\\s*)*\\])?[^>]*>)?\\s*)');
    $gc('pattern', pattern);
}
var prolog = '';
var matcher = pattern.matcher(message);
if (matcher.find()) {
    prolog = matcher.group(1);
    message = message.substring(0, matcher.start(1)) + message.substring(matcher.end(1));
    $c('prolog', prolog);
}
return message;

Inside the source transformer, provide the following code. You can use msg here since the DOCTYPE has been removed in the preprocessor; the preprocessor is the first place your message reaches, before the other stages of the Mirth interface engine.

var responseData = msg['body']['div'][1]['div']['fieldset']['h2'].toString();
var responseDataStatus = responseData.substring(0, 3);
logger.debug("RESPONSE DATA : " + responseDataStatus);
// Your business logic


You can also perform this activity in the channel's summary, as shown in the picture below:

summary javascript

Click on the properties tab and provide the following code. This is simply a replace that removes the incoming DOCTYPE declaration:

message = message.replace('<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">', '');
return message;

As in the scripts tab, we should use message here instead of msg. From version 3.4, Mirth has made this JavaScript properties tab available in the channel summary, and it executes first, even before the preprocessor.

Inside the transformer we can directly use the code from Approach 2; it will work fine.


If we want the same thing achieved in the response tab, we cannot use approach 2 or approach 3. In that case we can use the built-in variable $('responseStatusLine'), which contains the HTTP status line with responses such as 500, 401, or 200.

We can read this variable directly and use it in the response transformer as provided below:

//Your business logic goes here…
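As a sketch (assuming the status line has the usual "HTTP/1.1 401 Unauthorized" shape), the branching might look like this; in Mirth the line would come from $('responseStatusLine'), while here it is a plain string so the parsing can be shown standalone:

```javascript
// Hedged sketch: parse the numeric code out of an HTTP status line.
function parseStatusCode(statusLine) {
    // "HTTP/1.1 401 Unauthorized" -> 401
    return parseInt(statusLine.split(' ')[1], 10);
}

var statusCode = parseStatusCode('HTTP/1.1 401 Unauthorized');
if (statusCode === 401) {
    // re-authenticate, alert, or route to an error channel
}
console.log(statusCode); // 401
```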

Mirth – DB Reader

In Mirth, people often face many issues when connecting to a database, whether fetching data or updating/inserting it. This blog post will help you understand which options are used for what purposes.

Starting with the Database Reader concept, there are multiple misconceptions that need to be clarified. Not everything works the way its name suggests; the tool has its own definition for most of the areas and buttons.

polling settings

The option Poll Once on Start means the Database Reader channel will poll the database once as soon as it is deployed. If No is selected for this option, the channel instead starts fetching data from the database at the specified interval.


The option Process Batch is used when the data arrives in batches and there is a need to split it into separate messages. You may wonder how data can come in a batch from a database; the polling options above make such a query possible. In that case select Yes for Process Batch and split the incoming data. In the usual case, select No.


Please note that the URL for SQL Server will differ from other traditional relational databases. Instead of IP and port in the connection string, we may need to use the SQL Server name or the system name. When you assign a special or custom user (sa in this example) to a specific database, the port number may not be the usual default SQL Server port (i.e. 1433). To check which port you are accessing the data on, use the following query.

USE master;
EXEC xp_readerrorlog 0, 1, N'Server is listening on';

Apart from this, the other major misconception is the Use JavaScript option. This will be either Yes or No, but in either case just remember one thing: this area defines the triggering action performed on the database. Mirth polls the database using the query that we write in the area below.

sql javascript

You can either use this SQL/JavaScript area just to trigger the database with a query and perform the logic inside the transformers, or, as some people prefer, write the logic in the connector area itself by selecting Yes on the Use JavaScript radio button.

But if you decide to write logic in the SQL/JavaScript area in the source, you will face certain limitations, such as the following:

  1. A variable used inside the SQL/JavaScript text area cannot be used in the next JavaScript text area (i.e. the post-process script). So in case you have to update data using the data fetched previously in the SQL/JavaScript area, it will not work well.
  2. You do not have the liberty of using channelMap variables inside the SQL/JavaScript editor area, nor globalMap; in both cases it will throw an error.
  3. That said, the result set value is returned to the source transformer, where it can be used to process the data of our interest.

The biggest misconception of all is how data is fetched (i.e. processed). Mirth fetches rows from the database and processes them one by one; there is no built-in way to process 10 or 15 or 100 rows in a single stretch. It's Mirth, a tool, not the Hulk. 🙂

Hope it clarifies some questions.

Code Templates in Mirth

Mirth is a wonderful tool that provides the option to use generalized functions in the form of code templates. This greatly improves how channels are built and maintained; code re-usability is a major strength of Mirth and is exactly what Code Templates are for.

Please see the example below.

code templates

Navigate to the "Channels" tab in the left pane of the Mirth navigation bar and create a new channel or choose an existing one. When a channel is selected, the options expand on the left wing of the navigation bar; select the "Edit Code Templates" option there.

You will get a screen like the one shown below.

create library

Note: You will not have any libraries listed at first, unless you imported a channel from a previous version of Mirth that brought its libraries along with it.

On the right-hand side you can see the list of channels available in the Mirth tool. Select the channels that need this code template. In real time, not all channels will use the generalized functions; it depends on the business needs behind each channel.

Now create a new library for your channel by clicking "New Library" on the left-hand side navigation bar. This will list a new library on the dashboard. Please add a description for the library you have created; it is good practice to write descriptions wherever possible, including for the channel itself.

Now, staying on the library you have created, select "New Code Template" on the left-hand side of the navigation panel. You will be provided with three code template types:

  1. Function
  2. Drag-and-Drop Code Block
  3. Compiled Code Block

I'm going to explain code template usage here with a simple function. I'm writing a function that takes a field from a segment of the HL7 message and checks whether the string is empty. If the length of the string is 0, it should respond with "Length is irregular". Below goes the code:

function checkStringLength(data) {
    if (data.length == 0) {
        data = "Length is irregular";
    }
    return data;
}

Navigate to Save Changes on the left-hand side of the navigation panel. The generalized function that has to be used again and again is now successfully stored in the code template. The purpose becomes meaningful only if we know how to access this code, right?

  1. To execute this, navigate to the "Channels" tab and double-click on the channel you have created.
  2. Navigate to the Source tab.
  3. Click on Edit Transformer on the left-hand side of the navigation pane.
  4. Now you will see the source transformer dashboard area. Select the type as "JavaScript".
  5. On the right-hand side there will be three tabs at the top: Reference, Message Trees, and Message Template. Select Message Template and place the HL7 sample message provided below in the template area.

MSH|^~\&|EPICADT|DH|LABADT|DH|201301011226||ADT^A01|HL7MSG00001|P|2.3| EVN|A01|201301011223||
PID|||MRN12345^5^M11||^JOHN^A^III||19710101|M||C|1 DATICA STREET^^MADISON^WI^53005-1020|GL|(414)379-1212|(414)271-3434||S||MRN12345001^2^M10|123456789|987654^NC|
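For reference, the PID.5 component that the transformer reads further below can also be located with plain string splitting. This is a rough sketch with a hypothetical helper, not a replacement for Mirth's own HL7 parser:

```javascript
// Rough sketch: pull a component (e.g. PID.5.2, the given name) out of a raw
// HL7 v2 message by splitting on segment, field, and component separators.
function getField(hl7, segmentName, fieldIndex, componentIndex) {
    var segments = hl7.split(/\r?\n|\r/);
    for (var i = 0; i < segments.length; i++) {
        var fields = segments[i].split('|');
        if (fields[0] === segmentName) {
            var components = fields[fieldIndex].split('^');
            return components[componentIndex - 1];
        }
    }
    return null;
}

var sample = 'MSH|^~\\&|EPICADT|DH|LABADT|DH|201301011226||ADT^A01|HL7MSG00001|P|2.3|\n' +
             'PID|||MRN12345^5^M11||^JOHN^A^III||19710101|M||C|1 DATICA STREET^^MADISON^WI^53005-1020|';
console.log(getField(sample, 'PID', 5, 2)); // JOHN
```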

Whatever function we create in the code template becomes available ready-made in the Reference tab. I have named the code template "Check String Length", as shown in the picture below.


So I'm typing "Check String Length" in the search area of the Reference tab of the source transformer. The function you created in the code template area appears there directly, as shown below.

check string length

You can drag and drop the function into your transformer JavaScript area. I have made a simple logging call that takes the value from the PID.5.2 field of the HL7 message, which is the patient's first name, and passes it to the function. The code is below:

logger.info("check String length : " + checkStringLength(msg['PID']['PID.5']['PID.5.2'].toString()));

Now deploy the channel, paste the HL7 message above, and send it via the dashboard channel by right-clicking it. Observe the results in both cases: when the field has a value, it should display the name as-is; when it does not, it should output the message from the else branch.
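Outside Mirth, the behavior of the code-template function can be checked with plain JavaScript (sample values assumed):

```javascript
// Standalone version of the code-template function for a quick check.
function checkStringLength(data) {
    if (data.length == 0) {
        data = "Length is irregular";
    }
    return data;
}

console.log(checkStringLength("JOHN")); // JOHN
console.log(checkStringLength(""));     // Length is irregular
```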

Hope it clarifies the question on how to use Code Templates.

Happy Coding & Integration …….

Don't hesitate to reach out to me if there are any questions on the above.

