#1) Why batches of 15000, then batches of 20? I understand that it processes X records and then commits and continues, but why am I having to set this at all? What makes 20 work better than 15000, and how would I know what to set it to?
If you don't set the value, the default of 1000 is used, so strictly speaking you don't have to set it. You do need to think about the batch size if performance or memory is an issue. Aware IM loads all found records in a batch into memory and then performs rule processing in memory. If your batch size is large and each business object has many attributes, this can consume a lot of memory. However, if the further processing is simple, a large batch may be faster than many small ones; if the processing is complicated, a large batch can tie up CPU time (and memory). Most of the time we recommend small batches. Aware IM commits a transaction after every batch.
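The pattern described above (load a batch into memory, process it, commit, move on) can be sketched in a few lines. This is a hypothetical illustration only, not Aware IM's actual code: the `items` table, the `qty` column, and the "add 1" rule are made up, and SQLite stands in for whatever database the system uses.

```python
import sqlite3

def process_in_batches(conn, batch_size=1000):
    """Update every row in `items`, committing after each batch.

    Mirrors the behaviour described above: each batch is loaded fully
    into memory (the memory cost of a large batch_size), processed,
    and then committed as one transaction.
    """
    cur = conn.cursor()
    total = cur.execute("SELECT COUNT(*) FROM items").fetchone()[0]
    committed = 0
    for offset in range(0, total, batch_size):
        # Load one batch into memory; a large batch_size means a large
        # in-memory result set.
        rows = cur.execute(
            "SELECT id, qty FROM items ORDER BY id LIMIT ? OFFSET ?",
            (batch_size, offset),
        ).fetchall()
        for row_id, qty in rows:
            # Stand-in for rule processing: bump qty by 1.
            cur.execute("UPDATE items SET qty = ? WHERE id = ?",
                        (qty + 1, row_id))
        conn.commit()  # one transaction per batch
        committed += len(rows)
    return committed
```

With 50 rows and `batch_size=20` this runs three transactions (20 + 20 + 10 rows); with `batch_size=15000` it would be a single large transaction holding everything in memory at once.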
#2) What is the answer for other update processes that I will be creating? (20, 15000, or maybe 500 or 100000?) Shouldn't the default work, or at the very least not lock up the system and require the Aware IM server to be shut down?
The default is fine when the number of found records is less than 1000, which is the overwhelming majority of cases. If it is greater than 1000, as in your case, change the default to a smaller value such as 20, 50, or 100 (you can experiment to see which value gives the best performance).
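That experiment can be as simple as timing the same full pass at several candidate batch sizes and keeping the fastest. Again a hypothetical sketch with SQLite standing in for the real database; the `items` table and the update are made up for illustration:

```python
import sqlite3
import time

def time_batch_size(conn, batch_size):
    """Time one full pass that updates every row in `items`,
    committing after each batch of `batch_size` rows."""
    cur = conn.cursor()
    total = cur.execute("SELECT COUNT(*) FROM items").fetchone()[0]
    start = time.perf_counter()
    for offset in range(0, total, batch_size):
        ids = cur.execute(
            "SELECT id FROM items ORDER BY id LIMIT ? OFFSET ?",
            (batch_size, offset),
        ).fetchall()
        for (row_id,) in ids:
            cur.execute("UPDATE items SET qty = qty + 1 WHERE id = ?",
                        (row_id,))
        conn.commit()  # one transaction per batch
    return time.perf_counter() - start

def best_batch_size(conn, candidates=(20, 50, 100, 1000)):
    """Return the candidate batch size with the shortest pass time."""
    return min(candidates, key=lambda size: time_batch_size(conn, size))
```

The winner depends on record width, rule complexity, and available memory, which is why there is no single right number to recommend.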
#3) Why doesn't the Active Processes view work? It shows the performance % as ZERO when I should assume something is happening. And why can't I kill the process if it hangs? Not that it should be hanging!
The performance of the process is a function of how many transactions have been committed. If you specify a batch size so large that everything happens within one transaction, you won't be able to see the progress or cancel the process. That is another reason to use small batches when processing a large batch takes a lot of time.
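The point that progress and cancellation are only observable at commit boundaries can be made concrete. The following is a made-up sketch (not Aware IM's implementation): progress is updated only after each commit, and a cancellation request is only noticed between batches, so with one huge batch there is nothing to report and no point at which to stop.

```python
import sqlite3
import threading

def process_with_progress(conn, batch_size, cancel_event, progress):
    """Update every row in `items`, reporting % complete into `progress`
    after each commit and honouring `cancel_event` between batches."""
    cur = conn.cursor()
    total = cur.execute("SELECT COUNT(*) FROM items").fetchone()[0]
    done = 0
    for offset in range(0, total, batch_size):
        # Cancellation is only seen here, between transactions.
        if cancel_event.is_set():
            break
        rows = cur.execute(
            "SELECT id FROM items ORDER BY id LIMIT ? OFFSET ?",
            (batch_size, offset),
        ).fetchall()
        for (row_id,) in rows:
            cur.execute("UPDATE items SET qty = qty + 1 WHERE id = ?",
                        (row_id,))
        conn.commit()
        done += len(rows)
        # Progress only moves at commit time: with a single giant batch
        # this stays at 0% until the very end.
        progress["percent"] = 100.0 * done / total if total else 100.0
    return done
```

A monitor thread reading `progress["percent"]` would see 0% for the entire run if `batch_size` covers all records in one transaction, which matches the behaviour described in the answer above.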
#4) If a VERY SIMPLE batch process takes the CPU to over 98% on a machine with 2 GB of RAM, you've got a major flaw in performance design somewhere.
Updating records en masse is never simple. If you have thousands and thousands of records to process and update, you need to fine-tune your batch size.
#5) Can we change the percentage of CPU used during batch processes, or something? Do reports do the same thing? My users won't be happy if 3 users start some kind of report or batch process and everything comes to a stop.
Reports work differently. Starting a batch process that updates thousands of records is never a good idea while the system is under heavy load. We recommend running such batch processing overnight, or whenever very few users are working with the system.