Wednesday, January 13, 2010

How to implement Payload Splitting using XSLT in BPEL

This blog covers the implementation details of how to process huge XML payload files in smaller chunks. Out-of-memory errors happen because XSL transformation and translation leave a large memory footprint that grows in proportion to the XML payload size. To avoid that, we parse the XML in smaller chunks. We implemented this by introducing indexes into the XSL transformation: a begin index and an end index are passed dynamically into the XSL, which gives us control over the number of records transformed in each pass.

I will list the important steps we followed in the prototype we developed before implementing it in real-world scenarios.

To start with, we created two BPEL processes: PayloadSplitter and PayloadProcessor.

PayloadSplitter contains the logic for breaking the XML payload into smaller chunks, and PayloadProcessor contains the logic for processing each chunk. I will focus only on PayloadSplitter, as that is the process that takes care of managing the payload.


  1. Initialization

a. Create a variable params of type parameters (refer to Param.xsd below).

b. Create a variable totalCount for storing the number of records in the payload.

c. Create a variable incrementSize for storing the number of records to be processed in each loop.

d. Create a variable currentIndex for storing the end index of the records processed in each loop.

For example, if we have 100 records to process and 5 records are to be processed in each loop:

totalCount = 100 and incrementSize = 5. currentIndex for the first loop will be 5, since it is initialized to incrementSize.

For the second loop, currentIndex will be 10 (currentIndex + incrementSize), and so on.

Code Snippet :: Custom schema - Param.xsd

<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            targetNamespace="http://www.example.org/params"
            xmlns="http://www.example.org/params"
            elementFormDefault="qualified">
  <!-- targetNamespace above is a placeholder; use your project namespace
       (referenced as ns2 in the assign activities below) -->
  <xsd:element name="parameters">
    <xsd:complexType>
      <xsd:sequence>
        <xsd:element name="item" minOccurs="1" maxOccurs="unbounded">
          <xsd:complexType>
            <xsd:sequence>
              <xsd:element name="name" type="xsd:string"/>
              <xsd:element name="value" type="xsd:string"/>
            </xsd:sequence>
          </xsd:complexType>
        </xsd:element>
      </xsd:sequence>
    </xsd:complexType>
  </xsd:element>
</xsd:schema>

  2. Assign the XML fragment below to the params variable.

Code Snippet :: XML fragment in Assign

<bpelx:assign name="Assign_ParamNames">
  <copy>
    <from>
      <!-- The namespace URI was elided in the original post; the item names
           follow the beginIndex/endIndex convention used by the XSL below -->
      <parameters xmlns="">
        <item>
          <name>beginIndex</name>
          <value/>
        </item>
        <item>
          <name>endIndex</name>
          <value/>
        </item>
      </parameters>
    </from>
    <to variable="params" query="/ns2:parameters"/>
  </copy>
</bpelx:assign>


3. Assign 0 as the begin index to /ns2:parameters/ns2:item[1]/ns2:value.

(You will have to manually edit the index value [i] in the XPath tab.)

<copy>
  <from expression="'0'"/>
  <to variable="params" query="/ns2:parameters/ns2:item[1]/ns2:value"/>
</copy>


4. Assign incrementSize as the end index to /ns2:parameters/ns2:item[2]/ns2:value.

(You will have to manually edit the index value [i] in the XPath tab.)

<copy>
  <from variable="incrementSize"/>
  <to variable="params" query="/ns2:parameters/ns2:item[2]/ns2:value"/>
</copy>


  5. Initialize currentIndex to incrementSize (only for the first loop).
  6. Now start the while loop, which will process the records in smaller chunks.

Condition check: totalCount > 0

Inside the while loop we have three main activities:

a) A transform activity that transforms the current chunk of the payload into the format expected by PayloadProcessor.

b) An invoke activity (with a partner link to PayloadProcessor) that calls the payload processor.

c) An assign activity that updates the loop variables for the next iteration.
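The loop described above can be sketched in the .bpel file as follows (the activity names are illustrative, and the condition uses Oracle BPEL 10g / BPEL 1.1-style syntax):

```xml
<while name="While_ProcessChunks"
       condition="bpws:getVariableData('totalCount') &gt; 0">
  <sequence>
    <!-- a) Transform_1: transform the current chunk (see below) -->
    <!-- b) Invoke_PayloadProcessor: call PayloadProcessor via its partner link -->
    <!-- c) Assign_LoopVariables: update totalCount, beginIndex, endIndex, currentIndex -->
  </sequence>
</while>
```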

  7. Have an XSLT transform activity that leverages the ability to pass parameters into XSLT.

Refer to Ram Kumar’s article: Passing BPEL Variable contents into XSLT as Parameters.

We pass the params variable, populated with the beginIndex and endIndex values, into the XSL.

The transformation reads from the payload variable and writes to the input variable of PayloadProcessor.

Code Snippet :: .bpel file

<assign name="Transform_1">
  <copy>
    <!-- The second argument (source variable/part names) is illustrative -->
    <from expression="ora:processXSLT('Transformation_1.xsl',
                        bpws:getVariableData('inputVariable','payload'),
                        bpws:getVariableData('params'))"/>
    <to variable="invokeProcessInputVariable" part="payload"/>
  </copy>
</assign>


Code Snippet :: .xsl file




<xsl:param name="beginIndex"/>
<xsl:param name="endIndex"/>

<xsl:template match="/">
  <!-- The target root element name is illustrative -->
  <ns2:PayloadProcessorProcessRequest>
    <xsl:for-each select="/ns1:PayloadSplitterProcessRequest/ns1:inputData">
      <xsl:if test="(position() &gt; $beginIndex) and (position() &lt;= $endIndex)">
        <!-- copy the selected record into the target structure here -->
        <xsl:copy-of select="."/>
      </xsl:if>
    </xsl:for-each>
  </ns2:PayloadProcessorProcessRequest>
</xsl:template>

  8. Have an invoke activity to call PayloadProcessor. The PayloadProcessor BPEL process should contain the logic to process the payload.
  9. In the assign activity:

a) totalCount = totalCount - incrementSize

b) Assign endIndex to beginIndex for the next loop.

<copy>
  <from variable="params" query="/ns2:parameters/ns2:item[2]/ns2:value"/>
  <to variable="params" query="/ns2:parameters/ns2:item[1]/ns2:value"/>
</copy>

c) currentIndex = currentIndex + incrementSize

d) Assign currentIndex to endIndex.

<copy>
  <from expression="bpws:getVariableData('currentIndex')"/>
  <to variable="params" query="/ns2:parameters/ns2:item[2]/ns2:value"/>
</copy>




That’s pretty much the logic required for creating the payload splitter. Special thanks to Srinivas Kommanaboyina, who worked on developing this prototype.

Thursday, January 7, 2010

Tracking instances from BPEL Console

In one of my earlier blogs I referred to multiple ways of using Java in BPEL and how to use internal BPEL APIs. Here is the next one. Recently I had a chance to work with the support team that was managing one of the projects I had worked on. One of the main issues they faced was tracking instances back to the batch of records each one had transferred. Many legacy systems had no triggering mechanism, so the integrations were scheduled to run every 5 minutes to achieve close to real-time integration. Because of this, thousands of instances appeared in the console, which made it hard for the support team to find a particular instance and debug issues.

As part of our best practices, we had implemented a common message header across all the integrations. One of the fields in this header was a GlobalMessageId, a unique value for each instance. In the final stages of the design phase, this unique GlobalMessageId was chosen as the batch ID for the records transferred in each run. Over time, the GlobalMessageId became one of the critical factors in tracing a transaction.

There were scenarios where we had transferred the records to the end system but had not triggered the downstream procedure properly. To track back on what went wrong with such a batch of records, the ID that came in handy was the unique batch ID, which was the same as the GlobalMessageId of the BPEL process instance that transferred those records.

So one of the ways we helped the support team was to overwrite the title of the BPEL process instance with the GlobalMessageId. That way, given the batch ID of the records that have issues, we can filter the instances with the same ID and figure out what went wrong with the process. The overwriting was done using Java embedding.
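As a sketch, the Java embedding activity could look like the following. The variable, part, and XPath names are assumptions; setTitle() and getVariableData() are the helper methods available inside Oracle BPEL 10g Java embedding:

```xml
<bpelx:exec name="SetInstanceTitle" language="java" version="1.5">
  <![CDATA[
    // Read the GlobalMessageId from the inbound message header
    // (the variable/part/XPath names below are illustrative)
    String globalMessageId = (String) getVariableData(
        "inputVariable", "payload",
        "/ns1:MessageHeader/ns1:GlobalMessageId");
    // Overwrite the instance title shown in the BPEL Console
    setTitle(globalMessageId);
  ]]>
</bpelx:exec>
```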


From a support perspective, small workarounds like this are really helpful. They can make the support team’s life simpler, though not entirely smooth, since they still need to figure out what went wrong with the instance. On this happy note, happy new year :)