Loading data from a flat file coming from SFTP, BI Publisher, or any third-party source is a very common requirement in many integrations. The traditional way to achieve this is to invoke the ATP adapter from OIC, which reads the data in chunks and inserts it into an ATP database table. With a huge volume of data this becomes very time consuming. So in this blog I am going to explain a performance enhancement that can be achieved with OIC and the ATP adapter to insert data from large flat files into ATP databases: the ATP database ships with a cloud package, DBMS_CLOUD, with which this operation can be completed much faster.
Basic/traditional Integration flow structure:
The basic solution is demonstrated in the diagram below: the integration picks up the CSV file from the embedded SFTP server (File Server), which is now part of OIC Gen 2. We use the OIC stage activity to read the entire file in the staging area, then a for-each loop reads each record sequentially and inserts it into the database table. For more information on these steps, please refer to this article.
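To see why this approach is slow, note that each for-each iteration maps one CSV record to a single-row insert, so one adapter round trip is made per record. A minimal sketch of the statement the adapter effectively issues per iteration (the table and columns here are hypothetical examples, not from the original post):

```sql
-- One of these runs per CSV record, each with its own adapter round trip;
-- for a file with millions of rows, this per-row overhead dominates run time.
INSERT INTO emp_stage (emp_id, emp_name, salary)
VALUES (:emp_id, :emp_name, :salary);
```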
Enhanced Integration flow structure:
The enhanced structure is demonstrated in the diagram below: the same CSV file is read from the embedded SFTP server and uploaded into a bucket in Oracle Cloud Infrastructure Object Storage using a REST invoke. We then call the ATP adapter, which invokes a PL/SQL procedure that reads the file from Object Storage and inserts the data into the database table.
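Once uploaded, the file is addressed from the database side by its native Object Storage URI, which is what goes into the file_uri_list parameter later on. The placeholders below (region, namespace, bucket, object name) are values from your own tenancy:

```
https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/<file-name>.csv
```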
Solution Architecture for the Enhanced Process flow:
The steps to build the enhanced integration flow are described below:
- Create an SFTP connection: this will read the file from the embedded SFTP server. For more details on how to create an FTP connection, refer to this article.
- Create an Object Storage REST connection: this will upload the file into a bucket in OCI. For more details on how to create a REST connection, refer to this article.
- Create a PL/SQL procedure: use DBMS_CLOUD.CREATE_CREDENTIAL to create a credential for accessing Object Storage, and DBMS_CLOUD.COPY_DATA to load the data from Object Storage into the database table.
```sql
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => '<credential name>',
    username        => '<username>',
    password        => '<auth token>'
  );
END;
/

BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => '<table name>',
    credential_name => '<credential name>',
    file_uri_list   => '<OCI bucket file list URI>',
    schema_name     => '<schema name>',
    format          => JSON_OBJECT(
                         'type'                 VALUE 'csv',
                         'escape'               VALUE 'true',
                         'delimiter'            VALUE ',',
                         'skipheaders'          VALUE '1',
                         'ignoremissingcolumns' VALUE 'true',
                         'removequotes'         VALUE 'true')
  );
END;
/
```
- Create an ATP connection: use this connection with the Invoke a Stored Procedure operation to call the procedure created in the previous step.
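After the procedure runs, you can verify the load from the ATP side: DBMS_CLOUD.COPY_DATA records every load operation in the USER_LOAD_OPERATIONS view, including the row count and the names of the log/bad-record tables it generated. A quick check, for example:

```sql
-- Most recent COPY_DATA loads: status, rows loaded, and log/bad-file tables
SELECT id, type, status, rows_loaded, logfile_table, badfile_table
FROM   user_load_operations
WHERE  type = 'COPY'
ORDER  BY id DESC;
```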
At the end, the complete end-to-end flow should look like the one shown below.
That’s all for this post. Thank you!
If you like this article, please like, comment, and share, and subscribe to my YouTube channel for more such videos and regular updates.