Loop within Service
Tagged: batch mode, Database, service process
November 25, 2015 at 8:15 am #440
ashutosh (Participant)
Hi Team,
We need to read a database table (Table1) in batch mode and persist the data into another table (Table2) iteratively.
E.g., if I have 1000 rows in Table1, I want to loop 10 times, picking 100 records from Table1 each time and persisting them in Table2. Please find attached a screenshot of the service process we have made. We want to know how we can loop over that transformation and persist.
We tried the loop palette but were not able to use it. The service works well for reading all the data from Table1 and persisting it in Table2.
Appreciate any help on this.
Regards,
Ashutosh
November 25, 2015 at 3:30 pm #443
Ed Shaw (Keymaster)
I’ve looked at what you are trying to accomplish here and I’ve written a post on the subject. I hope you find it helpful. It can be found at:
https://community.nexj.com/featured-posts/2015/11/27/all-about-a-very-simple-etl-activity/
November 26, 2015 at 5:11 pm #445
Ed Shaw (Keymaster)
You are right. The loop step in a service is probably not what you want.
First, a bit on the loop step.
The loop step iterates over an input collection and executes its nested activities on each item. It has the following properties (a rough conceptual sketch follows the list).
caption: Caption for the loop in the service editor
collection: An expression returning a list, collection, or iterator. Defaults to this, e.g. (this'rows)
description: Loop description
name: Step name
variable: The name of the service variable into which the current item will be placed. Defaults to this
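For intuition only, a loop step configured with collection (this'rows) and variable item behaves roughly like the Scheme below. This is a conceptual sketch rather than the service engine's actual implementation, and process-item is a hypothetical placeholder for whatever nested activities the loop contains.
Code snippet:
; Rough conceptual equivalent of a loop step (sketch only, not engine code)
; collection: (this'rows)   variable: item
(for-each
   (lambda (item)
      ; the loop's nested activities run here with item bound
      (process-item item))   ; process-item is a hypothetical placeholder
   (this'rows))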
The reason you probably don't want to use the loop step is that it processes one item at a time, and you said you want to do paging.
There are examples of paging through a collection in services. One of these is the ExternalDataLoad service in Finance. It is more complicated than you will need because it does complex error handling and recovery, and it reads from a file rather than a class.
Another approach is to use for-each-page to dump a page at a time into queues for asynchronous processing by another service.
My preferred approach is to use a service to kick off an ETL Activity. I will briefly describe this now, but will post a detailed blog entry in the near future.
The steps go like this…
1) Create a message that matches your input class
2) Create a transformation that transforms to your output class
3) Create an ETL Activity that does a Class Read, then a Transformation, then a Class Write within a pipeline.
Watch for the blogs on using ETL Activities for this purpose.
November 26, 2015 at 5:40 pm #447
ashutosh (Participant)
Hi Ed,
Yes, maybe the loop concept will not work in a service, as you are saying. But we think we cannot go with ETL either, because we want to do some processing on the records we have fetched.
Below is the logic we are trying to achieve:
Step 1: Read database Table1; suppose it has 1000 rows.
Step 2: Pick 100 rows at a time, give them to my transformation (processing on each record), and persist the results in Table2.
Step 3: Repeat Step 2 until all the data from Table1 is processed.
We are trying to implement some logic so that we can tell Step 2 how many times to run, and it has to be dynamic: today I have 1000 rows in Table1, but tomorrow I may get 100,000 rows, in which case the service should run 100 times, processing batches of 1000 records each (see the sketch below).
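Put another way, the number of iterations we need is simply the total row count divided by the batch size, rounded up. A minimal Scheme sketch; batch-count is only an illustrative name, not something that exists in our service:
Code snippet:
; number of batches needed to cover all rows
(define (batch-count total-rows batch-size)
   (ceiling (/ total-rows batch-size)))

; (batch-count 1000 100)    => 10
; (batch-count 100000 1000) => 100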
I hope you can see what we are trying to implement. We are not sure how ETL will help with this.
Regards,
Ashutosh
November 26, 2015 at 11:22 pm #449
Ed Shaw (Keymaster)
You can run script steps and transformation steps in ETL Activities and do it in a flexible page-by-page manner.
November 27, 2015 at 2:12 pm #451
ashutosh (Participant)
Thanks Ed, waiting for the ETL blog :). That will make our process easier to implement.
We are using an ETL process to truncate the table before we start the processing (invoking the ETL step within the service).
One more question: in our service we have a transform step followed by a persist step. We want to know when the commit happens. Does it happen in the persist step or after the END palette (step)?
If the commit does not happen at the persist step or before the END step, how can we commit after the persist step and before END?
Regards,
Ashutosh
December 8, 2015 at 10:40 am #455
Aravinddokala (Member)
Hi,
Here is one way (a cursor) that the looping can be achieved:
(for-each-page class attributes where order-by max-count page-size lazy? fun): function
(for-each fun arg args ...): function
Code snippet:
(for-each-page TestClassName '() '() '() '() 1000 #f
   (lambda (page)
      (for-each
         (lambda (record)
            ...)   ; per-record processing goes here
         page)     ; end of for-each
   )
)                  ; end of for-each-page
Here, 1000 is the number of records to be processed per page.
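To tie this back to the original Table1 to Table2 requirement, the record lambda is where the per-record processing and persistence would go, and committing once per page keeps each batch independent. The sketch below is hedged: Table1Class, Table2Class, the attribute names, the (: attribute value) creation syntax, and the per-page (commit) call are assumptions about your model, not something confirmed in this thread.
Code snippet:
(for-each-page Table1Class '() '() '() '() 100 #f   ; 100 records per page, as in the original question
   (lambda (page)
      (for-each
         (lambda (record)
            ; hypothetical: map values from the Table1 record onto a new Table2 instance
            (Table2Class'new (: someAttribute (record'someAttribute))))
         page)
      ; hypothetical: commit once per page so each batch of 100 is persisted on its own
      (commit)))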
thanks
Aravind