Wednesday, 5 December 2018

Change connection in the Integrations: Oracle Integration Cloud

There are situations where we need to replace a connection in our integrations. Integration Cloud doesn't allow changing the connection directly from the integration canvas. So in this blog, we'll demonstrate two ways to solve this problem and easily switch the connection used by an integration.

Before we proceed further, let me explain some use cases where we may need to change the connection in an integration:
  • Suppose we have 10 integrations using a single DB connection named DB_Conn, in which we configured the standard DB schema called apps. Later we realize that 5 of the integrations should not use the standard schema to compile packages or procedures, and should use a custom schema (custom_user) instead. In that case, we have to create another DB connection with a different name, configure custom_user in it, and then use that DB connection in those 5 integrations.
  • We want to clone an integration and use a different connection in the cloned integration.
  • We are migrating an integration that uses a REST connection named REST_Sample, but on the target instance the REST connection has a different name, REST_Conn.
Keep the below limitations in mind before replacing a connection:
  • The integration should not be in a locked or activated state.
  • Only a connection of the same adapter type can be replaced. For example, if an integration uses the REST adapter, we can't replace its connection with one of the SOAP adapter type.
There are two ways to replace the connection in the integrations.
  1. Export the integration and change the connection name in the project.xml file directly and then import
  2. Use Integration Cloud REST APIs to replace the connection
Let's get started and walk through both ways to replace a connection.

For this use case, we are considering the below:
  • We have two connections of the REST adapter type with the trigger role and the below names:
    • REST_Sample
    • REST_Conn
  • We created an integration named Sample_Integration which initially uses the connection REST_Sample. You can see the connection name by hovering over the adapter.

Approach 1) Export the integration, replace the connection name in project.xml, and import it back
  • Export the integration by clicking the hamburger menu next to the integration
  • Open the .iar file directly in 7-Zip
  • Navigate to the \icspackage\project\SAMPLE_INTEGRATION_01.00.0000\PROJECT-INF\ directory
  • Open the project.xml file and locate REST_SAMPLE, which is the REST adapter connection identifier
  • Replace REST_SAMPLE with REST_CONN, the identifier of the other REST adapter connection
  • Save the file and import the integration using the Import button
  • Hover the mouse over the connection and verify that the integration now shows REST_Conn
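The export-edit-import steps above can also be scripted. Below is a minimal sketch, assuming Python is available and noting that replace_connection is a helper name of our own; it relies only on the fact that an .iar file is an ordinary zip archive:

```python
import zipfile

def replace_connection(iar_path, out_path, old_id, new_id):
    """Copy an exported .iar archive, rewriting the connection
    identifier in any project.xml entry along the way."""
    with zipfile.ZipFile(iar_path) as src, \
         zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as dst:
        for item in src.infolist():
            data = src.read(item.filename)
            if item.filename.endswith("project.xml"):
                # swap the adapter connection identifier in place
                data = data.replace(old_id.encode(), new_id.encode())
            dst.writestr(item, data)
```

The rewritten archive can then be imported back through the Import button as described above.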

Approach 2) Using the Oracle Integration Cloud REST API
  • First, we get the details of the integration (Sample_Integration) via the GET REST API, using the Postman tool
  • Notice the "dependencies" element in the response. It shows the connection (REST_CONN) used in the integration
  • Now we'll use the update REST API to replace the connection. Here we'll replace REST_CONN with REST_SAMPLE. Below are the details of the update API:
URL: /ic/api/integration/v1/integrations/{id}
Method: POST
Headers:
Authorization: Basic EncodeBase64({username}:{password})
X-HTTP-Method-Override: PATCH
Content-Type: application/json
Body:
{
    "dependencies": {
        "connections": [
            { "id": "REST_SAMPLE" }
        ]
    }
}
  • Go back to the integration list page, hover the mouse over the connection, and verify that the integration now shows the REST_Sample connection
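Instead of Postman, the same update call can be issued from a script. Below is a hedged sketch: build_patch_request is a helper of our own, and the base URL, credentials, and {name}|{version}-style integration id are placeholders for your instance:

```python
import base64
import json
import urllib.request

def build_patch_request(base_url, integration_id, connection_id, username, password):
    """Build the update request for the Integration Cloud REST API:
    a POST with the X-HTTP-Method-Override: PATCH header and a
    dependencies/connections body carrying the new connection id."""
    body = json.dumps(
        {"dependencies": {"connections": [{"id": connection_id}]}}
    ).encode()
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return urllib.request.Request(
        f"{base_url}/ic/api/integration/v1/integrations/{integration_id}",
        data=body,
        method="POST",
        headers={
            "Authorization": f"Basic {token}",
            "X-HTTP-Method-Override": "PATCH",
            "Content-Type": "application/json",
        },
    )

# send with: urllib.request.urlopen(build_patch_request(...))
```

Sending the request is a plain urllib.request.urlopen call; the helper only assembles the URL, headers, and JSON body shown in the API details above.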

Monday, 3 December 2018

Poll File from Agent server leveraging File Adapter: Integration Cloud

Oracle Integration Cloud provides a File adapter to deal with files that reside on a local server.

The File adapter allows the following operations on an on-premise location, using the Integration Cloud connectivity agent:
  • File polling
  • Write
The in and out directories should be locally accessible from the server where the ICS connectivity agent is installed.

Refer to the blog to understand the difference between the File and FTP adapters.

Use Case: In this article, we'll leverage the File adapter to poll a file from the server where the connectivity agent is installed and transfer the file to an FTP server.

Let's get started and see how to achieve the use case
  • Create an Orchestration process in Integration Cloud Service with name PollFile
  • Drop File connection as a trigger point
  • Enter below information and click the Next button
    • What do you want to call your endpoint? PollFileFromServer
    • Do you want to define a schema for this endpoint? No
  • Enter below information and click the Next button
    • Specify an Input Directory: the directory to read files from
    • Specify a File Name Pattern: the input file name pattern
    • Maximum Files: the number of files to be processed in a single polling operation
    • Polling Frequency: how often the polling operation runs
    • Processing Delay: the delay applied before a polled file is processed
    • Delete Files After Successful Reading: when selected, files are deleted after they are successfully read

  • Click the Done button

  • Drop the FTP connection
  • Enter the endpoint name of your choice and click the Next button
  • Enter below information and click the Next button
    • Select Operation: Write File
    • Select a Transfer Mode: ASCII
    • Specify an Output Directory: /home/opc/tempfiles
    • Specify a File Name Pattern: *
    • Enable PGP security: No
  • Enter below information and click the Next button
    • Select the Do you want to define a schema for the endpoint radio button
    • Select the Select an existing schema from the file system radio button
  • Create an .xsd file with the below element and browse to it
<?xml version = '1.0' encoding = 'UTF-8'?>
<schema targetNamespace="http://xmlns.oracle.com/pcbpel/adapter/opaque/"
        xmlns="http://www.w3.org/2001/XMLSchema">
    <element name="opaqueElement" type="base64Binary" />
</schema>

Note: An opaqueElement element of type base64Binary is created because, when polling a file from the file server, the adapter returns the content as base64Binary only.
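To make the base64Binary typing concrete, here is a small illustration (the sample file content is made up) of how raw file bytes map to the single base64 string that arrives in opaqueElement:

```python
import base64

# The connectivity agent delivers the polled file's content as one
# base64-encoded string, which is why the schema types opaqueElement
# as base64Binary. Decoding recovers the original raw bytes.
raw = b"order_id,amount\n1001,250\n"        # hypothetical file content
opaque = base64.b64encode(raw).decode()     # what arrives in opaqueElement
assert base64.b64decode(opaque) == raw      # round-trips back to the file bytes
```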

  • Click the Done button
  • Edit the mapper and map the below fields:
    • fileName -> fileName
    • element0 -> opaqueElement

  • The configuration is now complete. Close and activate the integration
  • Place three files in the /home/PollDir directory on the agent server
  • Open the Integration Cloud monitoring dashboard and notice that three instances are created, one for each file

  • Open the FTP /home/opc/tempfiles directory and verify that the three files are there

Saturday, 1 December 2018

Implement Pagination in REST Service: Integration Cloud

There are situations in which we need to fetch a large number of records from an on-premise application into Integration Cloud. For example, 100k+ records reside in an on-premise database table, and we need to fetch them and expose them in a REST service response.

In this particular scenario, we are considering a database installed behind a firewall, so the connectivity agent is required to communicate with the on-premise application.

The connectivity agent has a limitation: it can parse a payload of only 10 MB for a single request. Exceeding the limit may cause a stack overflow error in the agent, which may bring the agent down.

So, to handle a large volume of data from the on-premise application, we have to keep this limitation in mind and implement the solution accordingly.

There might be multiple ways to handle such a situation. Here I'm going to handle it by implementing pagination in the REST service, which fetches the data in chunks.

To complete this article, we should have the below:
  • Oracle Database 12c
  • An Oracle Integration Cloud instance
  • The connectivity agent installed
Let's get started step by step. Before we move forward, the below objects must be compiled in the Oracle Database:
  • A table with the name test_table containing the below columns
  • A package with a single procedure. Below are the package specification and body scripts
Package specification

CREATE OR REPLACE PACKAGE test_pagination_pkg AS
  PROCEDURE test_pagination_proc(i_page IN NUMBER, i_limit IN NUMBER,
                                 i_page_size OUT NUMBER, p_data OUT SYS_REFCURSOR);
END test_pagination_pkg;

Package body
CREATE OR REPLACE PACKAGE BODY test_pagination_pkg AS
  PROCEDURE test_pagination_proc(i_page IN NUMBER, i_limit IN NUMBER,
                                 i_page_size OUT NUMBER, p_data OUT SYS_REFCURSOR) AS
    count1 NUMBER;
  BEGIN
    SELECT Count(*) INTO count1 FROM test_table;
    i_page_size := Ceil(count1 / i_limit);  -- total number of pages
    OPEN p_data FOR
      SELECT * FROM test_table
      ORDER BY employee_number
      OFFSET Nvl((i_page - 1), 0) * i_limit ROWS
      FETCH NEXT i_limit ROWS ONLY;
  END test_pagination_proc;
END test_pagination_pkg;

The procedure takes two input parameters:
  • i_page: Specifies which page of data we are looking for
  • i_limit: Specifies the number of records we want to return in a single run
And two output parameters:
  • i_page_size: Gives the total number of pages, depending on the limit we set in the i_limit input parameter. For example, if TEST_TABLE contains 150 records and we limit to 10 records at a time, the procedure will return an i_page_size of 15.
  • p_data: Contains the record set
Let's see how to achieve the pagination and get records in chunks
  • Create an Orchestration process in Integration Cloud Service with name Pagination_Int
  • Drop a REST adapter as a Trigger End point
  • Configure the below properties and click the Next button
    • What do you want to call your endpoint: GetPaginatedData
    • What is the endpoint's relative resource URI?: /employees
    • What action do you want to perform on this endpoint: GET
    • Check the checkbox Add and review parameters for this endpoint
    • Check the checkbox Configure this endpoint to receive the response
  • Add a new parameter with the name page of type integer and click the Next button
  • Select JSON Sample and then click the <<< inline >>> link, then enter below sample json

  • Click the Done button
  • Drop the database adapter just above the mapper
  • Enter the endpoint name as GetData and select Invoke a Stored Procedure from the operation drop down
  • Select the below information and click the Next button
    • Select Schema: Select the schema in which package is compiled. In our case package is compiled under apps schema so selected the apps schema
    • Select Package: Select package name as TEST_PAGINATION_PKG
    • Select Procedure: Select procedure as TEST_PAGINATION_PROC
  • Click the Done button
  • Click the edit icon of the GetData mapper
  • Map the request parameter page to I_PAGE and set the value of I_LIMIT to 100. In this instance, we want to fetch 100 records at a time, which is why we set I_LIMIT to 100
  • Edit the GetPaginatedData mapper and map the response as below
    • Drop I_PAGE_SIZE onto PageSize
    • Set Limit to 100
    • Drop  page onto Page
    • Map FirstName, LastName, and EmailAddress
  • The configuration is complete. Save the integration, close it, and activate.
  • Check the number of records in TEST_TABLE. We have 999 records in the table.
  • Let's try to hit the service from Postman
HIT-1: Set page=1 and hit the run button. Since the total number of records in the DB is 999 and we set the limit to 100, the total number of pages is CEIL(999/100) = 10. Notice the PageSize and Limit values

HIT-2: Set the page=2 and hit the run button
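A consumer of this service can keep incrementing page until PageSize is reached. Below is a sketch of such a client loop; the total_pages and fetch_all helpers are our own, call_service stands in for the GET /employees?page=N invocation, and the "employees"/"PageSize" keys assume the response mapping described above:

```python
import math

def total_pages(record_count, limit):
    # Same computation as i_page_size in the procedure: CEIL(count/limit),
    # e.g. 999 records with a limit of 100 gives 10 pages.
    return math.ceil(record_count / limit)

def fetch_all(call_service):
    """Drain every page of the paginated REST service.
    call_service(page) returns the parsed JSON response for that page."""
    page, records = 1, []
    while True:
        resp = call_service(page)
        records.extend(resp["employees"])   # assumed name of the record array
        if page >= resp["PageSize"]:        # last page reached
            break
        page += 1
    return records
```

Because every call fetches at most i_limit rows, each request stays well under the connectivity agent's 10 MB payload limit regardless of the table size.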

That's how we can achieve pagination in REST services and handle a large number of records from an on-premise application.