OBIA: SDE_JDE_AR_Transaction_Fact_Extract Incremental Mapping Has a Performance Issue



Business Intelligence Applications Consumer - Version 7.9.6.3 [AN 1900] and later
Information in this document applies to any platform.
OBIA 7.9.6.3 is implemented against JD Edwards EnterpriseOne 9.1 for Financial and Supply Chain Analytics. In the SDE_JDE_AR_Transaction_Fact_Extract workflow, the query used in the session SDE_JDE_AR_Transaction_Fact_Extract_Invoice_Receipt_Load has no date/time or $$LAST_EXTRACT_JDEDATE parameter filter to restrict it to incremental records, and it points to the SDE_JDE_AR_Transaction_Fact_Extract_Full_Load mapping. Every run therefore extracts the complete set of source records, and the mapping takes a very long time to complete.
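For comparison, an incremental extract for this session would be expected to filter on the last-extract date parameter, along the lines of the simplified fragment below (an illustration only; the complete corrected SQL override is shown in the Solution section):

SELECT F03B11.RPDOC, F03B11.RPDCT, F03B11.RPKCO, F03B11.RPSFX
FROM   F03B11
WHERE  F03B11.RPUPMJ >= $$LAST_EXTRACT_JDEDATE   -- only rows changed since the last extract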
Cause
<Bug 16392156> OBIA7963:JDE9.1 SDE_JDE_AR_TRANSACTION_FACT_EXTRACT_INVOICE_RECEIPT LOAD ISSUE
Solution
Please reapply the SQL override to the SDE_JDE_AR_Transaction_Fact_Extract_Invoice_Receipt_Load session.

Here is the OOTB SQL for that session:
SELECT
               F03B11.RPDOC,
               F03B11.RPDCT,
               F03B11.RPKCO,
               F03B11.RPSFX,
               F03B11.RPAN8,
               F03B11.RPDGJ,
               F03B11.RPDIVJ,
               F03B11.RPICUT,
               F03B11.RPICU,
               F03B11.RPFY,
               F03B11.RPCTRY,
               F03B11.RPCO,
               F03B11.RPAID,
               F03B11.RPPYR,
               F03B11.RPPOST,
               F03B11.RPPST,
               F03B11.RPAG,
               F03B11.RPAAP,
               B.CCCRCD,
               F03B11.RPTXA1,
               F03B11.RPGLBA,
               F03B11.RPMCU,
               F03B11.RPSBLT,
               F03B11.RPSBL,
               F03B11.RPPTC,
               F03B11.RPDDJ,
               F03B11.RPRMK,
               F03B11.RPITM,
               F03B11.RPU,
               F03B11.RPUM,
               F03B11.RPRYIN,
               F03B11.RPUSER,
               F03B11.RPUPMJ,
               F03B11.RPUPMT,
               F4201.SHDOCO,
               F4201.SHAN8,
               F4201.SHSHAN,
               F4201.SHTRDJ,
               F4201.SHVR01,
               (CASE WHEN F03B11.RPACR != 0.0 THEN F03B11.RPACR
                     ELSE F03B11.RPAG END) AS DOC_AMT,
               RECEIPT_NO,
               RECEIPT_ITEM,
               (CASE WHEN F03B11.RPPST = 'P' THEN RECEIPT_DATE ELSE NULL END) AS CLEARED_ON_DT,
               RECEIPT_DATE AS CLEARING_DOC_DT,
               (CASE WHEN F03B11.RPFAP != 0.0 THEN F03B11.RPFAP
                     ELSE F03B11.RPAAP END) AS DOC_AMT_OPEN,
               F03B14_DATE.RZDCTM,
               F03B11.RPCRCD,
               CAST(F03B11.RPDOC as char(53)) AS ACCT_DOC_NUM,
               CAST(F03B11.RPSFX as char(53)) AS ACCT_DOC_ITEM,
               RPPN,
       $$DB_SPECIFIC_NULL AS RZTYIN,
RZTYIN AS CLRNG_DOC,
0 AS RZPYID,
F4201.SHUPMJ,
0 AS RYUPMJ,
F4201.SHCO,
RZKCO,
MULT_CURR_FLG_CHK.CCCRYR,
'N' AS DELETE_FLG,
(CASE WHEN F03B11.RPDCT = 'RB' AND F03B11.RPODOC > 0 THEN 'Y' ELSE 'N' END) AS CGBK_FLG,
F03B11.RPTORG,
F03B11.RPERDJ,
F03B11.RPDCT AS INVOICE_DOC_TYPE,
0 AS X_CUSTOM
FROM   F03B11
       LEFT OUTER JOIN F4201
            ON (F03B11.RPSKCO = F4201.SHKCOO
                AND F03B11.RPSDCT = F4201.SHDCTO
                AND F03B11.RPSDOC = F4201.SHDOCO)
       LEFT OUTER JOIN
            (SELECT RZDOC, RZDCT, RZKCO, MAX(RZDCTM) AS RZDCTM, RZSFX,
                    MAX(RZCKNU) AS RECEIPT_NO, MAX(RZRC5) AS RECEIPT_ITEM,
                    MAX(RZDMTJ) AS RECEIPT_DATE, MAX(RZPYID) AS RZPYID,
                    MAX(RZTYIN) AS RZTYIN, MAX(RZUPMJ) AS RZUPMJ
             FROM   F03B14 X
             WHERE  RZDMTJ IN (SELECT MAX(RZDMTJ)
                               FROM   F03B14 Y
                               WHERE  X.RZDOC = Y.RZDOC
                                      AND X.RZDCT = Y.RZDCT
                                      AND X.RZKCO = Y.RZKCO
                                      AND X.RZSFX = Y.RZSFX)
                    AND RZVDGJ = 0
             GROUP BY RZKCO, RZDCT, RZDOC, RZSFX) F03B14_DATE
            ON (F03B11.RPDOC = F03B14_DATE.RZDOC AND
                F03B11.RPDCT = F03B14_DATE.RZDCT AND
                F03B11.RPKCO = F03B14_DATE.RZKCO AND
                F03B14_DATE.RZSFX = F03B11.RPSFX),
       F0010 B,
       (SELECT CCCRYR FROM F0010 WHERE CCCO = '00000') MULT_CURR_FLG_CHK
WHERE  RPDCT NOT IN ('R1','RU','R5')
AND    B.CCCO = F03B11.RPCO
AND    F03B11.RPISTR <= '0'
AND    RPVDGJ = 0
AND    (F03B11.RPUPMJ >= $$LAST_EXTRACT_JDEDATE
        OR F4201.SHUPMJ >= $$LAST_EXTRACT_JDEDATE
        OR F03B14_DATE.RZUPMJ >= $$LAST_EXTRACT_JDEDATE)
UNION ALL
SELECT
               F03B14.RZDOC,
               F03B14.RZDCTM AS RZDCT,
               F03B14.RZKCO,
               F03B14.RZSFX,
               F03B14.RZAN8,
               F03B14.RZDGJ,
               F03B11.RPDIVJ,
               F03B14.RZICUT,
               F03B14.RZICU,
               F03B14.RZFY,
               F03B14.RZCTRY,
               F03B14.RZCO,
               F03B14.RZAID,
               F03B13.RYPYR AS PYR,
               F03B14.RZPOST,
               F03B11.RPPST,
               (F03B14.RZPAAP+F03B14.RZADSA) AS RZPAAP,
               0.00 AS AAP,
               B.CCCRCD,
               F03B11.RPTXA1,
               F03B11.RPGLBA,
               F03B14.RZMCU,
               F03B14.RZSBLT,
               F03B14.RZSBL,
               F03B11.RPPTC,
               F03B14.RZDDJ,
               F03B14.RZRMK,
               F03B11.RPITM,
               F03B11.RPU,
               F03B11.RPUM,
               F03B11.RPRYIN,
               F03B14.RZUSER,
               F03B14.RZUPMJ,
               F03B14.RZUPMT,
               F4201.SHDOCO,
               F4201.SHAN8,
               F4201.SHSHAN,
               F4201.SHTRDJ,
               F4201.SHVR01,
               (CASE WHEN F03B14.RZTAAP != 0.0 THEN (F03B14.RZTAAP + F03B14.RZTADA)
                     ELSE (F03B14.RZPAAP + F03B14.RZADSA) END) AS DOC_AMT,
               F03B14.RZCKNU,
               F03B14.RZRC5,
               F03B14.RZDMTJ AS CLEARED_ON_DT,
               F03B14.RZDMTJ AS CLEARING_DOC_DT,
               0.0 AS FAP,
               F03B14.RZDCTM,
               (CASE WHEN F03B14.RZTAAP != 0.0 THEN F03B14.RZTCRC
                     ELSE F03B14.RZCRCD END) AS RZCRCD,
               CAST(F03B14.RZCKNU as char(53)) AS ACCT_DOC_NUM,
               CAST(F03B14.RZRC5 as char(53)) AS ACCT_DOC_ITEM,
               RZPN,
               RZTYIN,
               RZTYIN AS CLRNG_DOC,
               RZPYID,
               F4201.SHUPMJ,
               F03B13.RYUPMJ,
               F4201.SHCO,
               RZKCO,
               MULT_CURR_FLG_CHK.CCCRYR,
       (CASE WHEN F03B14.RZVDGJ=0 THEN 'N' ELSE 'Y' END) AS DELETE_FLG,
'N' AS CGBK_FLG,
F03B14.RZTORG,
F03B11.RPERDJ,
F03B14.RZDCT AS INVOICE_DOC_TYPE,
0 AS X_CUSTOM
FROM   (SELECT CCCRYR FROM F0010 WHERE CCCO = '00000') MULT_CURR_FLG_CHK,
       F03B14, F03B13, F0010 B,
       F03B11 LEFT OUTER JOIN F4201
              ON (F03B11.RPSKCO = F4201.SHKCOO AND
                  F03B11.RPSDCT = F4201.SHDCTO AND
                  CAST(F03B11.RPSDOC as char(16)) = CAST(F4201.SHDOCO as char(16)))
WHERE  F03B14.RZDOC = F03B11.RPDOC AND
       F03B14.RZDCT = F03B11.RPDCT AND
       F03B14.RZKCO = F03B11.RPKCO AND
       F03B14.RZSFX = F03B11.RPSFX AND
       B.CCCO = F03B14.RZCO
AND    ((RZPYID = RYPYID AND RZDCTM != 'R1' AND RZDCT LIKE 'R%')
        OR (RZPYID = RYPYID AND RZTYIN = 'A' AND RZDCTM = 'R1' AND RYDDST = '0'))
AND    (F03B14.RZUPMJ >= $$LAST_EXTRACT_JDEDATE OR
        F03B13.RYUPMJ >= $$LAST_EXTRACT_JDEDATE OR
        F03B11.RPUPMJ >= $$LAST_EXTRACT_JDEDATE OR
        F4201.SHUPMJ >= $$LAST_EXTRACT_JDEDATE)

Performance issue with Informatica mapping


Business Intelligence Applications Consumer - Version: 7.9.5.1 [AA 1448] - Release: V7

Information in this document applies to any platform.
The mapping SDE_ORA11510_Adaptor.SDE_ORA_Stage_APTransactionFact_AGGRDerive runs for hours without completing.

The query is:

SELECT
MAX(W_GL_ACCOUNT_D.INTEGRATION_ID),
SUM(W_AP_XACT_F.AP_DOC_AMT),
SUM(W_AP_XACT_F.AP_LOC_AMT),
W_ORA_GLRF_F_TMP.GL_JE_HDR_ID,
W_ORA_GLRF_F_TMP.GL_JE_LINE_NUM,
W_ORA_GLRF_F_TMP.DATASOURCE_NUM_ID
FROM
W_ORA_GLRF_F_TMP, W_AP_XACT_F , W_GL_ACCOUNT_D
WHERE W_ORA_GLRF_F_TMP.JE_SOURCE='Payables'
AND W_ORA_GLRF_F_TMP.IMPORT_REF_ID = W_AP_XACT_F.ACCT_DOC_ID
AND W_ORA_GLRF_F_TMP.DATASOURCE_NUM_ID = W_AP_XACT_F.DATASOURCE_NUM_ID
AND W_ORA_GLRF_F_TMP.FIN_STMT_ITEM_CODE='AP'
AND W_AP_XACT_F.GL_ACCOUNT_WID = W_GL_ACCOUNT_D.ROW_WID
GROUP BY
W_ORA_GLRF_F_TMP.GL_JE_HDR_ID,
W_ORA_GLRF_F_TMP.GL_JE_LINE_NUM,
W_ORA_GLRF_F_TMP.DATASOURCE_NUM_ID

Table counts:

select count(*) from W_ORA_GLRF_F_TMP --682198
select count(*) from W_AP_XACT_F -- 8374771
select count(*) from W_GL_ACCOUNT_D -- 601689

W_ORA_GLRF_F_TMP is unindexed.
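To see how the optimizer accesses these tables before and after the indexes described in the Solution are created, the execution plan of the query can be checked, for example with EXPLAIN PLAN (a generic Oracle sketch):

EXPLAIN PLAN FOR
SELECT
MAX(W_GL_ACCOUNT_D.INTEGRATION_ID),
SUM(W_AP_XACT_F.AP_DOC_AMT),
SUM(W_AP_XACT_F.AP_LOC_AMT),
W_ORA_GLRF_F_TMP.GL_JE_HDR_ID,
W_ORA_GLRF_F_TMP.GL_JE_LINE_NUM,
W_ORA_GLRF_F_TMP.DATASOURCE_NUM_ID
FROM
W_ORA_GLRF_F_TMP, W_AP_XACT_F, W_GL_ACCOUNT_D
WHERE W_ORA_GLRF_F_TMP.JE_SOURCE='Payables'
AND W_ORA_GLRF_F_TMP.IMPORT_REF_ID = W_AP_XACT_F.ACCT_DOC_ID
AND W_ORA_GLRF_F_TMP.DATASOURCE_NUM_ID = W_AP_XACT_F.DATASOURCE_NUM_ID
AND W_ORA_GLRF_F_TMP.FIN_STMT_ITEM_CODE='AP'
AND W_AP_XACT_F.GL_ACCOUNT_WID = W_GL_ACCOUNT_D.ROW_WID
GROUP BY
W_ORA_GLRF_F_TMP.GL_JE_HDR_ID,
W_ORA_GLRF_F_TMP.GL_JE_LINE_NUM,
W_ORA_GLRF_F_TMP.DATASOURCE_NUM_ID;

-- Show the plan that was just explained
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);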
Cause
~) Bug 7348859: NEED TO CREATE THE FOLLOWING ETL INDICES IN 7.9.5.1

~) This BUG contains the following:

These indices will help improve performance on GRF Derive mappings and
AGGRDerive mappings.

1. Composite index on:
W_AR_XACT_F.ACCT_DOC_ID
W_AR_XACT_F.DATASOURCE_NUM_ID
2. Composite index on:
W_AP_XACT_F.ACCT_DOC_ID
W_AP_XACT_F.DATASOURCE_NUM_ID
3. Composite index on:
W_GL_COGS_F.ACCOUNT_DOC_ID
W_GL_COGS_F.DATASOURCE_NUM_ID

These indices have been added to the Data Warehouse in 7.9.5.1 and 7.9.6, and the change has been pushed to all references.

Solution
~) Please find the following DDL. You may need to change this to reference your index tablespace. Also, check to ensure that indexes with these names do not already exist.
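For the existence check, a data-dictionary query such as the following can be used (a generic Oracle sketch; filter on your warehouse schema owner if the tables exist in more than one schema):

SELECT index_name, table_name, column_name, column_position
FROM   all_ind_columns
WHERE  table_name IN ('W_AR_XACT_F', 'W_AP_XACT_F', 'W_GL_COGS_F')
ORDER  BY table_name, index_name, column_position;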

CREATE INDEX
W_AR_XACT_F_M6
ON
W_AR_XACT_F
(
ACCOUNT_DOC_ID Asc
,DATASOURCE_NUM_ID Asc
)
NOLOGGING
;


CREATE INDEX
W_AP_XACT_F_M7
ON
W_AP_XACT_F
(
ACCT_DOC_ID Asc
,DATASOURCE_NUM_ID Asc
)
NOLOGGING
;

CREATE INDEX
W_GL_COGS_F_M2
ON
W_GL_COGS_F
(
ACCOUNT_DOC_ID Asc
,DATASOURCE_NUM_ID Asc
)
NOLOGGING
;
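
Depending on your database version and settings, optimizer statistics may already be collected when the indexes are created; if they are not, they can be gathered explicitly, for example as below (DWH_OWNER is a placeholder for your data warehouse schema):

BEGIN
  -- Replace DWH_OWNER with the actual data warehouse schema owner
  DBMS_STATS.GATHER_INDEX_STATS(ownname => 'DWH_OWNER', indname => 'W_AR_XACT_F_M6');
  DBMS_STATS.GATHER_INDEX_STATS(ownname => 'DWH_OWNER', indname => 'W_AP_XACT_F_M7');
  DBMS_STATS.GATHER_INDEX_STATS(ownname => 'DWH_OWNER', indname => 'W_GL_COGS_F_M2');
END;
/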

~) You will also need to add the index definitions in the DAC.

Adding an Index to the Data Warehouse
Follow this procedure to add a new index to the data warehouse.

1. Add the new index definition into the data warehouse database.
2. In the DAC toolbar, select the appropriate source system container from the drop-down list.
3. In the Menu bar, select Views, then select Design, then select Tables.
4. Query for the table for which you want to import index definitions.
5. Right-click and select Import from Database, then select Import Indices.
6. Choose to import indexes for a selected table or for all the records retrieved in
your query, and click OK.
7. In the Import Indices dialog, select DataWarehouse from the Data Sources
drop-down list.
8. Click Read Indices.
a. In the list of indexes displayed, select the Import check box for the indexes you
want to import.
b. Click Import Indices.
An informational message indicates whether the process was successful.

The Index Usage should be defined as ETL.

OBIA 11g Performance Issue With PLP_PLP_ARINCREMENTALACTIVITYLOAD and PLP_PLP_APINCREMENTALACTIVITYLOAD



Applies to:

Business Intelligence Applications Consumer - Version 11.1.1.8.1 and later
Information in this document applies to any platform.

Symptoms

OBIA 11.1.1.8.1 Data Warehouse
Data source - EBS
OBIA is configured with the following modules:

  • Oracle Financial Analytics
  • Oracle Procurement and Spend Analytics
  • Oracle Project Analytics

When performing a full initial load for the modules listed above, the following two steps have been running for almost 12 hours:

  • PLP_PLP_ARINCREMENTALACTIVITYLOAD
  • PLP_PLP_APINCREMENTALACTIVITYLOAD

Is there a way to obtain better performance?

Cause

Poor out-of-the-box configuration led to poor performance in the ETL:
  • Bug 18952097 PERFORMANCE ISSUE WITH PLP_APINCREMENTALACTIVITYLOAD.W_AP_BALANCE_F_TMP
  • Bug 17509570 RFA - HOW TO CALCULATE THE TEMP TABLESPACE REQUIRED TO RUN A SCENARIO?

These bugs also offer suggestions on how to resolve the issue.
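As a starting point for the temp tablespace sizing discussed in Bug 17509570, the temporary space currently allocated can be checked with a query such as the following (a generic Oracle sketch; the bug note describes the full calculation for a scenario):

SELECT tablespace_name,
       ROUND(SUM(bytes)/1024/1024/1024, 2) AS allocated_gb
FROM   dba_temp_files
GROUP  BY tablespace_name;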

Solution

Solution Steps:
ODI Changes to AP and AR:
1. Create Custom Folder and Modify AP/AR PLP Mappings
For AP:

1) Log in to the ODI repository from ODI Studio and go to “BI Apps Project”.
2) Under the “Mappings” folder, create a new custom adaptor folder for the folder you are customizing:

a. Right-click the Mappings folder and select New Sub-Folder
b. Set Name as CUSTOM_<Original Folder Name>. For example, if this is for PLP, create custom folder CUSTOM_PLP. This represents the custom PLP folder for the original PLP folder.
c. Click the Connect Navigator button in the Designer tab.
d. Select Edit Release Tags.
e. Select the release tag that corresponds to your source. For example, BIA_11.
f. Select the custom PLP folder you created and add it to the release tag.
g. Click Next
h. Click Finish
3) Enable versioning for the preconfigured Task Folder, PLP_APIncrementalActivityLoad, to be customized. The version comment should indicate this is the base version of the task. Subsequent patches applied to this task in the future would require increasing the version in the comment so that it can be compared to the original task to identify any changes.
a. Go to appropriate PLP folder, right-click the Task folder PLP_APIncrementalActivityLoad and select Version > Create Version.
b. Accept the default version number, 1.0.0.0.
c. Add a description indicating that this is the original version of this task.
4) Duplicate the Task folder to be customized (i.e. PLP_APIncrementalActivityLoad) by copying it. Cut and paste the copied task folder to the Custom adaptor folder you have created (e.g. CUSTOM_PLP), and rename it to remove the 'Copy of…' prefix.
5) Using the same method as in step 3, enable versioning of the copied Task folder. The version comment should indicate that this is the original version. This versioning enables comparison of the customized task to a copy of the original version to determine all changes that have been introduced.
6) Create another version of the copied task PLP_APIncrementalActivityLoad. The version comment should indicate this is the customized version. Use the same steps as above.
7) Expand the map in the custom adaptor folder, then expand “Interfaces”; you will find two entries.
8) Double-click the interface PLP_APIncrementalActivityLoad.W_AP_BALANCE_F_TMP_SQ_PLP_AP_INCR_ACTIVITY_LOAD to open its ‘Overview’ page on the right-hand side, then click the ‘Quick-Edit’ tab at the bottom of the page.

9) Go to the Join section and expand it so that the joins are visible. Click the join expression between W_AP_XACT_F and W_DAY_D and change the join condition to the following (an illustrative sketch of the resulting join appears after step 10):
(CASE WHEN W_AP_XACT_F.ACCOUNTING_DT_WID=0
THEN
W_AP_XACT_F.POSTED_ON_DT_WID
ELSE
W_AP_XACT_F.ACCOUNTING_DT_WID END)=W_DAY_D.ROW_WID 

10) Apply and save the interface.
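
For illustration, after this change the join generated between W_AP_XACT_F and W_DAY_D would conceptually take the form below (a simplified sketch only; the real statement is produced by the ODI knowledge module and selects many more columns):

SELECT W_AP_XACT_F.ACCT_DOC_ID,
       W_DAY_D.ROW_WID
FROM   W_AP_XACT_F
       JOIN W_DAY_D
         -- Fall back to the posted-on date WID when the accounting date WID is 0
         ON (CASE WHEN W_AP_XACT_F.ACCOUNTING_DT_WID = 0
                  THEN W_AP_XACT_F.POSTED_ON_DT_WID
                  ELSE W_AP_XACT_F.ACCOUNTING_DT_WID
             END) = W_DAY_D.ROW_WID;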

Next, delete the scenario associated with the original, out-of-the-box task and generate a new scenario for the custom task using the same scenario name as the original. ODI enforces unique scenario names and we want to use the same scenario name for the custom task so that the load plan executes this ETL task rather than the out-of-the-box ETL task.

Delete the out-of-the-box scenario:

  • Navigate to the out-of-the-box Task folder - Packages - <Package Name> - Scenarios - <Scenario Name>
  • Make a note of the Scenario Name (you can double click and use CTRL-C to copy the scenario name)
  • Right click the scenario and select 'Delete'
i) Before regenerating the scenario, go to the original task folder PLP_APIncrementalActivityLoad in the OOTB mapping folder (e.g. PLP) and delete the existing OOTB scenario (e.g. PLP_PLP_APINCREMENTALACTIVITYLOAD).

Generate a scenario for the custom task:

  • Navigate to the custom Task folder - Packages - <Package Name>
  • Right click the package and select 'Generate Scenario'
  • Use the original, out-of-the-box scenario name (CTRL-V to paste the name if you copied it previously)
  • Check the 'Generate scenario as if all underlying objects are materialized' box
  • Click 'OK'
  • Select 'Use All' from the 'Startup Parameters' drop down box
  • Click 'OK'
When you execute the load plan, it will now execute the custom task rather than the original task.

ii) Go back to the modified PLP_APIncrementalActivityLoad in the custom folder, go to Packages, and generate the scenario using the option to generate the scenario as if all underlying objects are materialized. Rename the scenario to use the original out-of-the-box scenario name. For example, PLP_PLP_APINCREMENTALACTIVITYLOAD for PLP.
In the future if you make changes to any of the interfaces or the package, you can either regenerate the existing scenario or generate a new scenario. Unless you need separate scenarios, it is recommended that you regenerate the existing scenario. To do this, right-click the scenario and select Regenerate.

Along with the above changes, add the HINT1 below (hardcode the HINT1 value in the Load Plan component):

For AP:
FULL=/*+ USE_HASH( W_AP_XACT_F W_GL_ACCOUNT_D)*/
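
The USE_HASH hint asks the optimizer to use a hash join between the two tables. For illustration, in a hand-written query the hint would surface roughly as in the sketch below (the actual statement is generated by ODI; the column list here is illustrative):

SELECT /*+ USE_HASH( W_AP_XACT_F W_GL_ACCOUNT_D) */
       W_AP_XACT_F.ACCT_DOC_ID,
       W_GL_ACCOUNT_D.INTEGRATION_ID
FROM   W_AP_XACT_F,
       W_GL_ACCOUNT_D
WHERE  W_AP_XACT_F.GL_ACCOUNT_WID = W_GL_ACCOUNT_D.ROW_WID;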

For AR:

A similar change should be made in the custom adaptor folder:

1) Open the map PLP_ARIncrementalActivityLoad and expand “Interfaces” inside this folder; you will find two entries.
2) Double-click the interface PLP_ARIncrementalActivityLoad.W_AR_BALANCE_F_TMP_SQ_PLP_AR_INCR_ACTIVITY_LOAD to open its ‘Overview’ page on the right-hand side, then click the ‘Quick-Edit’ tab at the bottom of the page.
3) Go to the Join section and expand it so that the joins are visible. Click the join expression between W_AR_XACT_F and W_DAY_D and change the join condition to the following:

(CASE WHEN W_AR_XACT_F.ACCOUNTING_DT_WID=0
THEN
W_AR_XACT_F.POSTED_ON_DT_WID
ELSE
W_AR_XACT_F.ACCOUNTING_DT_WID END)=W_DAY_D.ROW_WID
4) Apply and save the interface.

Next, delete the scenario associated with the original, out-of-the-box task and generate a new scenario for the custom task using the same scenario name as the original. ODI enforces unique scenario names, and we want to use the same scenario name for the custom task so that the load plan executes this ETL task rather than the out-of-the-box ETL task.
Delete the out-of-the-box scenario:
  •  Navigate to the out-of-the-box Task folder - Packages - <Package Name> - Scenarios - <Scenario Name>
  •  Make a note of the Scenario Name (you can double click and use CTRL-C to copy the scenario name)
  •  Right click the scenario and select 'Delete'

i) Before regenerating the scenario, go to the original task folder PLP_ARIncrementalActivityLoad in the OOTB mapping folder (e.g. PLP) and delete the existing OOTB scenario (e.g. PLP_PLP_ARINCREMENTALACTIVITYLOAD).

Generate a scenario for the custom task:

  •  Navigate to the custom Task folder - Packages - <Package Name>
  •  Right click the package and select 'Generate Scenario'
  •  Use the original, out-of-the-box scenario name (CTRL-V to paste the name if you copied it previously)
  •  Check the 'Generate scenario as if all underlying objects are materialized' box
  •  Click 'OK'
  •  Select 'Use All' from the 'Startup Parameters' drop down box
  •  Click 'OK'
When you execute the load plan, it will now execute the custom task rather than the original task.

ii) Go back to the modified PLP_ARIncrementalActivityLoad in the custom folder, go to Packages, and generate the scenario using the option to generate the scenario as if all underlying objects are materialized. Rename the scenario to use the original out-of-the-box scenario name. For example, PLP_PLP_ARINCREMENTALACTIVITYLOAD for PLP.

In the future if you make changes to any of the interfaces or the package, you can either regenerate the existing scenario or generate a new scenario. Unless you need separate scenarios, it is recommended that you regenerate the existing scenario. To do this, right-click the scenario and select Regenerate.

Along with the above changes, add the HINT1 below (hardcode the HINT1 value in the Load Plan component):
For AR:

FULL=/*+ USE_HASH( W_AR_XACT_F W_GL_ACCOUNT_D)*/
Before doing this, you must follow the instructions below.
Instructions:
In the CUSTOM folder:
1) Go to the Flow tab of the yellow interface PLP_ARIncrementalActivityLoad%.
2) Navigate to the properties of the IKM, go to OBI_HINT_NESTED, add #HINT1 in the expression editor, and save.
3) Regenerate the scenario.
4) Select #HINT1 when the checkbox appears during scenario regeneration.
5) Go to the Load Plan component and pass the HINT1 value as mentioned above.