swiftinitpvtltd I have used stored procs extensively in previous Aware IM projects for large data processing and performance reasons. I want to know if anyone is using a standard Aware IM process to bulk insert hundreds of records of a single BO within 5-8 seconds. When I use a standard process to insert (Process insert), it takes 30 to 40 seconds to insert that many records. The UI application cannot wait that long, and it also needs to quickly confirm to the user that the 300 records are now inserted.
swiftinitpvtltd Yes, there is a counter BO check rule that checks whether the count has reached 300 or 400, and the insert process stops at that point. The counter increments on each BO insert. Once the counter BO reaches 400, the process that inserts into the main BO stops.
tford Not sure how you are collecting the instances to be inserted. Instead of firing a rule upon each insert, could you form a query that includes TAKE BEST 400 so no rules have to fire? Also, are there any rules firing in a referred object?
swiftinitpvtltd Thanks for the inputs. One thing I observed: when I set the Aware IM configuration logs to maximum, the time increases by 10 to 15 seconds, whereas setting them to minimal reduces it significantly. In the production app we will have medium logging. I will test some more today with the TAKE BEST 400 tip and more data, and will update. There are one or two rules firing, like "IS NOT NEW then do this" (so it does not fire, but it checks). Eventually we will have a few rules before going to production, even if there is a bulk insert the first time (IS NOT NEW rules).
swiftinitpvtltd Imagine we have to book tickets for a Metallica/Justin Bieber show in Mumbai and the agents are going crazy: one agent trying to book 400 tickets (a bulk booking he will resell) while 10 other agents try to book 300 each within 15 minutes. According to the article below, when you need tons of records inserted in a short period it is better to send them in a batch to avoid network overhead. https://stackoverflow.com/questions/1793169/which-is-faster-multiple-single-inserts-or-one-multiple-row-insert The approach it suggests: check that auto-commit is off, open a connection, send multiple batches of inserts in a single transaction (a batch size of about 4000-10000 rows), then close the connection. These are just plain inserts and no IS NEW rule is needed. Is there any batch operation for inserting 500 rows versus doing it in a for loop, one row at a time? Also, in this batch I need one column value to be randomly generated.
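Outside of Aware IM, the pattern described in that Stack Overflow answer (auto-commit off, all rows sent in one transaction, one randomly generated column) can be sketched at the SQL driver level roughly as follows. This is only an illustration using Python's built-in sqlite3 module; the table and column names are made up, and a real server-based database would use its own driver's batch API the same way:

```python
import sqlite3
import uuid

# In-memory database stands in for the real server; table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tickets (booking_id TEXT, show TEXT, seat INTEGER)")

# Build all 500 rows up front; uuid4 supplies the randomly generated column value.
rows = [(str(uuid.uuid4()), "Metallica-Mumbai", seat) for seat in range(500)]

# One executemany inside a single transaction: one batch and one commit,
# instead of a per-row insert-and-commit loop, which is where the time goes.
with conn:  # the context manager commits once when the block exits
    conn.executemany("INSERT INTO tickets VALUES (?, ?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM tickets").fetchone()[0]
print(count)  # 500
```

The same idea in JDBC would be `addBatch()`/`executeBatch()` on a `PreparedStatement` with `setAutoCommit(false)`; the key point in either case is one transaction and one round trip per batch rather than per row.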