I have a table of Australian postcodes ranging from 1000 to 9999 (2048 records).
Import brings all 2048 values into the MySQL table, and Utility/Search shows all 2048 records, but my attribute (Job.PostCode) will not let me see or select anything above postcode 3165 (the 1000th record), no matter what value I enter into Import/"Batch Size".
Am I exceeding some limit, or is there a "Commit ALL records" selection for batch size?
Limit on imported records??
Hi Tom:
There are two BOs: Job and PostCode.
Job has several attributes that require a PostCode value (Job.JobPostCode of type PostCode, Job.OwnerPostCode of type PostCode, etc.). These are pulled into Job from BO PostCode via a drop-down box in BO Job (dynamic as the user types, because there are 2000+ postcodes ranging from 0000 to 9999).
Entering and storing any postcode up to 3165 (exactly 1000 records) works great. But if we need to enter a postcode above 3165, e.g. 3166 or higher, it is not in the drop-down at all, so there is no chance of selecting it. Typing it manually simply blanks the drop-down box, as it is obviously not finding any records while we type, say, 4xxx.
Yet the postcodes are all in the MySQL table and will display in a Utility/Search.
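I can't see the query AwareIM actually generates, but the symptom is exactly what a hard row cap on the drop-down query would produce: all records are in the table, yet only the first 1000 in sort order are ever offered. A minimal sketch using sqlite3 in place of MySQL (the table name, column name, and the 1000-row LIMIT are all my assumptions, and the synthetic codes are evenly spaced unlike the real data):

```python
import sqlite3

# Stand-in for the real MySQL table; names are illustrative only,
# not AwareIM's actual schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE postcode (code INTEGER)")

# 2048 synthetic postcodes spread over 1000-9999.
codes = sorted(1000 + round(i * 8999 / 2047) for i in range(2048))
conn.executemany("INSERT INTO postcode VALUES (?)", [(c,) for c in codes])

# All records are present, which matches what Utility/Search reports...
total = conn.execute("SELECT COUNT(*) FROM postcode").fetchone()[0]
print(total)  # 2048

# ...but a query capped at 1000 rows stops partway through the range,
# which is the symptom of a drop-down that cannot reach past the
# 1000th record.
capped = conn.execute(
    "SELECT code FROM postcode ORDER BY code LIMIT 1000"
).fetchall()
print(len(capped))                # 1000
print(capped[-1][0] < codes[-1])  # True: the top of the range is unreachable
```

If something like this is happening inside the fetch-all query, no amount of import or batch-size tuning would change what the drop-down shows.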
geoff
Yes, "Postcodes... will display in a Utility/Search" means that they display via an AwareIM query of all instances in the PostCode BO.
Sorry, I got it the wrong way round re dynamic vs. all at once. This is the situation:
Dynamic as user types: will accept any code from the whole range (0000 to 9999), but displays it with a zero after the decimal point (1111.0) instead of the custom "0000" format I set in BO PostCode. Interestingly, it returns a value of 1111, without the decimal, when saved. (Yes, I could live with it, but my customer doesn't think it looks professional.)
Fetch all records at once: this is where I cannot display or select beyond 3165.
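On the 1111.0 display: that trailing ".0" is the classic signature of an integer being carried as a floating-point number before it is formatted, while the "0000" custom format corresponds to zero-padded integer formatting. A quick pure-Python illustration of the two behaviours (this shows the general float-vs-integer formatting difference, not AwareIM's internals):

```python
code = 1111

# If the value is handled as a float, the default display picks up a
# trailing ".0" -- matching the 1111.0 symptom in dynamic mode.
as_float = str(float(code))
print(as_float)  # 1111.0

# Zero-padded integer formatting (the "0000" style) never shows a
# decimal point, and pads short codes with leading zeros.
as_padded = format(code, "04d")
print(as_padded)          # 1111
print(format(42, "04d"))  # 0042
```

So if the dynamic-fetch path converts the attribute to a floating-point type before applying the display format, you would see exactly this symptom, and the saved value of 1111 would still be correct.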
Sorry you are confused, Tom. Yes, there is a question:
My question was "Is there a limit on import of attributes?" As I continued investigating, it became apparent that my question expands:
Is the problem actually the import batch size? No.
Is the problem due to the format of the attribute? Probably not: ensuring the formats are exactly the same in the BO attributes and the source CSV makes no difference.
Is the problem an anomaly of "Fetch All Records At Once" vs "Fetch Records As User Types"? Yes.
It is conclusive that if I have more than 1000 records in a table, Fetch All Records At Once will not allow the 1001st record or above to be used. I confirmed this using BO StreetName (5000+ records).
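The StreetName test supports that reading: a fixed fetch cap would behave identically regardless of table size. A toy sketch of the pattern (the 1000-row cap is my assumption about what Fetch All Records At Once does internally; the record sets are made up):

```python
FETCH_CAP = 1000  # assumed internal limit for "Fetch All Records At Once"

def dropdown_options(records, cap=FETCH_CAP):
    """What a capped fetch-all drop-down would offer: the first `cap`
    records in sort order, and nothing beyond them."""
    return sorted(records)[:cap]

# Two tables of very different sizes, mimicking PostCode (2048 records)
# and StreetName (5000+ records).
postcodes = list(range(1000, 10000, 4))[:2048]
streets = [f"Street{i:05d}" for i in range(5000)]

# Both hit the same ceiling: only 1000 options are offered, so every
# record from the 1001st onward is impossible to select.
print(len(dropdown_options(postcodes)))               # 1000
print(len(dropdown_options(streets)))                 # 1000
print(max(postcodes) in dropdown_options(postcodes))  # False
```

That both BOs cut off at exactly 1000, despite their different sizes, is what points at a fixed cap rather than anything in the data itself.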
There is no question now that if I use Fetch Records Dynamically I can use any of the records, BUT, for a NUMBER attribute, no matter how it is formatted, I will always get it displaying with a decimal point (e.g. 1111.0 for data of 1111) in the drop-down box.
So my questions have now become:
1) Is there a bug that does not allow more than 1000 records to be used in Fetch All Records At Once?
2) Why is my "0000" custom-formatted number acquiring a decimal point in dynamic fetch, whereas it displays as I specified in Fetch All At Once?
My hope is that someone has seen something similar and has it working.
Support, do you have any guidance?