Issues importing large amounts of records in batch job

This forum allows users to post and respond to "How Do I Do ....." questions. The information contained in this forum has not been validated by K-Rise Systems and, as such, K-Rise Systems cannot guarantee the accuracy of the information.
CathyC
Posts: 469
Joined: November 16th, 2021, 11:15 am

Issues importing large amounts of records in batch job

Unread post by CathyC »

I was talking with Matt and we both had situations where we needed to import many records and ran into design issues in the batch job.
I know the recommended approach is to save to db (described in this forum post). Below is a summary of the approaches tried and how they turned out.
Ideally all approaches should have some way to work, though approaches 1 and 3 would not be recommended for data that could change.
Can I get some feedback on a setup for approaches 1 and 2 that would not run into these issues?

Approach #1: RunQuery Paging
Using the paging feature of RunQuery by setting Paging = True and # of records per page = 50.
Then the canvas was set up to loop back to RunQuery until EOF = True.
This approach meant multiple queries were issued against JDE.
Failed: hung at ~30k records and caused the whole batch server to hang.
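To make the failure mode concrete, here is a minimal sketch of the Approach #1 loop in Python. The `run_query` callable and the `eof`/`records` fields are hypothetical stand-ins for the RunQuery service and its Paging/EOF settings; the `max_pages` guard is an assumption of mine, added because a query that never reaches EOF (as described later in this thread) would otherwise loop until the server runs out of resources.

```python
def import_all_pages(run_query, process, page_size=50, max_pages=10000):
    """Pull pages from a query until EOF, with a safeguard against a never-true EOF."""
    page = 0
    while True:
        # One query against the source (e.g. JDE) per page - this is why
        # Approach #1 issues many queries.
        result = run_query(page=page, page_size=page_size)
        process(result["records"])
        if result["eof"]:          # loop back until EOF = True
            break
        page += 1
        if page >= max_pages:      # safeguard: abort instead of hanging the batch server
            raise RuntimeError("EOF never reached; aborting to avoid an infinite loop")
    return page + 1  # number of pages processed
```

The guard does not fix the underlying EOF bug, but it turns an indefinite hang into a visible failure.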

Approach #2: Save to Db
Using the save to db feature of RunQuery and retrieving the records with CreateServiceNodeFromDatabase.

Platform3 App1: pulling 50 records at a time, saving to database, ECC timeout 2400000 ms (40 min), Data Size 9.77 KB.
Tried to pull in over 10k records and it would time out. Eventually the requirements changed and more filters were added so only 2k records were needed, so this was never solved.

Platform3 App2: same variables. 50 records at a time, saving to database, ECC timeout 2400000 ms, data size 9.77 KB.
Tried to pull in 16k records, but got an ECC timeout message:
An error occurred in the ItemQuery RunQuery service: No message received from ECC for application : 100093 (Connector Application : 100053). Timeout: 2400000 ms

Attempt 3: 50 records at a time, data size 1 MB, 10 minute timeout.
Needed to pull in 191k records; even after lowering it to 10k, it still timed out.
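For readers unfamiliar with the save-to-db pattern, here is an illustrative Python sketch of the idea behind Approach #2: land the records in a staging table once, then process them in small chunks from the local database instead of repeatedly querying the source. The `sqlite3` table and function names are my own stand-ins, not the platform's SaveToDB / CreateServiceNodeFromDatabase services.

```python
import sqlite3

def save_to_db(conn, records):
    """Stage all pulled records in a local table (stand-in for SaveToDB)."""
    conn.execute("CREATE TABLE IF NOT EXISTS staged (id INTEGER PRIMARY KEY, payload TEXT)")
    conn.executemany("INSERT INTO staged (payload) VALUES (?)", [(r,) for r in records])
    conn.commit()

def read_chunks(conn, chunk_size=50):
    """Yield staged records chunk by chunk (stand-in for CreateServiceNodeFromDatabase)."""
    offset = 0
    while True:
        rows = conn.execute(
            "SELECT payload FROM staged ORDER BY id LIMIT ? OFFSET ?",
            (chunk_size, offset)).fetchall()
        if not rows:
            break
        yield [r[0] for r in rows]
        offset += len(rows)
```

The key point is that the slow, timeout-prone pull from the source happens once; the chunked loop only ever touches the local staging table.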

Approach #3: Run Batch Job at regular frequency and resume progress
Run for 1k records at a time and let the batch job run every minute, resuming at the last ID.
This means keeping track of the last ID that was processed and querying ID > lastId with a TOP 1000.
This worked.
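A minimal sketch of this resumable pattern (keyset paging), again using `sqlite3` with illustrative table and column names rather than anything from the original jobs. Each scheduled run picks up where the previous one left off by filtering on the persisted high-water-mark ID.

```python
import sqlite3

def run_batch_once(conn, last_id, batch_size=1000):
    """One scheduled run: process up to batch_size records with id > last_id,
    then return the new high-water mark to persist for the next run."""
    rows = conn.execute(
        "SELECT id, payload FROM source WHERE id > ? ORDER BY id LIMIT ?",
        (last_id, batch_size)).fetchall()
    for _id, _payload in rows:
        pass  # process the record here
    # If nothing was returned, the import is caught up; keep the same mark.
    return rows[-1][0] if rows else last_id
```

Because each run is bounded (TOP 1000) and stateless apart from `last_id`, a hang or crash in one run costs at most one batch, and the every-minute schedule drains the backlog incrementally.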
Last edited by CathyC on May 5th, 2023, 11:23 am, edited 4 times in total.

CameronDouthitt
Posts: 15
Joined: February 10th, 2022, 10:16 am

Re: Issues importing large amounts of records in batch job

Unread post by CameronDouthitt »

Recent development from Platform2 on 5.1 classic:

They were using approach #1 for their AddressImport and TopSellingItems batch jobs.

The issue found was that the query being looped on never reached EOF, so the batch job would run indefinitely until the server ran out of resources. This was made worse because both jobs would run at the same time each day.

Solution:
I had to change these over to use approach #2. This is current best practice.

Conclusion:
1. When looping over any set of data that has the potential to get large in a batch job, approach #2 should be the method used.
2. Make sure to set the ECC timeout (if connecting to JDE through ECC) high enough that the connection to JDE stays open long enough to read all data from the query in one go.
3. Make sure to set the data size high enough that it can receive all data coming from the query.
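Points 2 and 3 amount to a sizing calculation. As a back-of-envelope sketch, a small helper like the one below can estimate both settings from the expected record count; the per-record time and byte figures here are purely illustrative assumptions that you would measure for your own query, not platform defaults.

```python
def size_settings(record_count, ms_per_record=10.0, bytes_per_record=500, safety=2.0):
    """Rough estimate of ECC timeout (ms) and data size (bytes) needed to read
    a whole query in one go, with a safety multiplier for variance."""
    timeout_ms = int(record_count * ms_per_record * safety)
    data_size_bytes = int(record_count * bytes_per_record * safety)
    return timeout_ms, data_size_bytes
```

For example, 16k records at these assumed rates would suggest roughly a 320-second timeout and a 16 MB data size; whatever the real per-record numbers are, undersizing either setting reproduces the timeout and transport errors reported in this thread.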
CameronDouthitt
Posts: 15
Joined: February 10th, 2022, 10:16 am

Re: Issues importing large amounts of records in batch job

Unread post by CameronDouthitt »

Successful uses from Platform2 update:

These successful batch jobs were done using approach #2.

Job 1:
- Loop1: pulling 25 records at a time from SaveToDB. Actual count 671. ECC timeout: none (application query). Data Size 97.66 KB.
Job 2:
- Loop1: pulling 50 records at a time from SaveToDB. Actual count ~2000. ECC timeout 600000 ms. Data Size 9.54 MB.
- Loop2: pulling 50 records at a time from SaveToDB. Actual count ~1500. ECC timeout 600000 ms. Data Size 976.56 KB.

All of these loop setups are working successfully to resolve the initial issues found when using approach #1.
SumanPrasad
Posts: 88
Joined: August 26th, 2021, 5:21 pm

Re: Issues importing large amounts of records in batch job

Unread post by SumanPrasad »

Task 10697 : Solve ECC timeout issue
Task 10698 : Solve Batch job loop issues
JustinVanRegenmorter
Posts: 517
Joined: August 26th, 2021, 9:56 am

Re: Issues importing large amounts of records in batch job

Unread post by JustinVanRegenmorter »

Task 10698 has been resolved in RunTime Revision 1065.
darshanP
Posts: 3
Joined: October 31st, 2021, 2:14 pm

Re: Issues importing large amounts of records in batch job

Unread post by darshanP »

Had an issue with the Item import query. I was seeing an error: 'Unable to read data from the transport connection.'
I changed the inquire service data size to 512000 from 10000 and set the timeout to 2400000 ms. That fixed the error, and I was able to retrieve 16220 records.