select all rows in a model even if all lines are not shown

  • Question
  • Updated 2 years ago
  • Answered
I am using the script below to select all rows in one model and create new rows in a different model. However, it only picks up the rows loaded on the initial page load, and I need every row the query would return. Is there a way to do that without setting the "Max number of records" limit to blank?

For example, with the "Max number of records" field set to 20, the script should still pull in all the records the model's conditions would return. If I leave "Max number of records" blank, we run into heap size errors.

Script:
var $ = skuid.$;
var models = skuid.model.map();
var inventoryLocation = models.LocationInventoryPosition;
var cycleCountLines = models.CycleCountLines;

// For each loaded inventory position row, create a matching cycle count line.
$.each(inventoryLocation.data, function(){
    cycleCountLines.createRow({
        additionalConditions: [
            { field: 'SCMC__Item_Master__c', value: this.SCMC__Item_Master__r.Id },
            { field: 'SCMC__Inventory_Location__c', value: this.SCMC__Bin__r.Id },
            { field: 'SCMC__Status__c', value: 'New' },
            { field: 'SAC_Inventory_Position__c', value: this.Id },
            { field: 'SCMC__Inventory_Quantity__c', value: this.QuantityfromAgg }
        ],
        doAppend: true
    });
});
Tami Lust

Posted 2 years ago
Matthias
Hey Tami,
It may not be a "clean" solution, but you can loop the model.loadNextOffsetPage() function while model.canRetrieveMoreRows() returns true. That way you can get all the rows into the model first, before launching your script, without running into the heap size problem on page load. Just keep in mind that this can cause really long loading times depending on the size of your model.
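A sketch of that loop, using a stand-in model object so the control flow is runnable outside Skuid (in a real page you would pass the actual model from skuid.model.map(); the method names here follow the usage described above and the stand-in just simulates paged loading):

```javascript
// Stand-in for a Skuid model that simulates paged loading; in a real
// page you would use the model from skuid.model.map() instead.
function makeFakeModel(totalRows, pageSize) {
    return {
        data: [],
        canRetrieveMoreRows: function () {
            return this.data.length < totalRows;
        },
        loadNextOffsetPage: function (callback) {
            var target = Math.min(this.data.length + pageSize, totalRows);
            while (this.data.length < target) {
                this.data.push({ Id: 'row-' + this.data.length });
            }
            callback();
        }
    };
}

// Pull pages until the model reports no more rows, then call done()
// so dependent work only runs after every row is present.
function loadAllRows(model, done) {
    if (model.canRetrieveMoreRows()) {
        model.loadNextOffsetPage(function () {
            loadAllRows(model, done);
        });
    } else {
        done();
    }
}

var model = makeFakeModel(450, 200);
var finished = false;
loadAllRows(model, function () { finished = true; });
```

Because loadNextOffsetPage takes a callback, the recursion guarantees each page finishes loading before the next one is requested.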
mB Pat Vachon, Champion
This is what I would suggest. Additionally, an option to load rows in groups on page load would be useful, i.e. 200 × 8, or 200 at a time until all are loaded.

The catch is if you plan to edit or create very many rows: you could have problems saving the records if their number approaches 1,000+.
Tami Lust
Thanks for pointing me in this direction. In this article there is a loadAllRemainingRecords() call. I added it to the top of my script, and also tried it as a separate script run before the script that creates the lines in the other model, but it is not working as expected. Individually both scripts work as expected, but not together.

I need all the rows to load first, then select all the rows in the model to be created in a new model.

skuid.$.blockUI({ message: 'Loading all available Accounts...' });
skuid.$M("LocationInventoryPosition").loadAllRemainingRecords({
   stepCallback: function(offsetStart,offsetEnd) {
      skuid.$.blockUI({ message: 'Loading Records ' + offsetStart + ' to ' + offsetEnd + '...' });
   },
   finishCallback: function(totalRecordsRetrieved) {
      skuid.$.blockUI({ message: 'Finished loading all ' + totalRecordsRetrieved + ' Accounts!', timeout: 2000 });
   }
});
var $ = skuid.$;
var models = skuid.model.map();
var inventoryLocation = models.LocationInventoryPosition;
var cycleCountLines = models.CycleCountLines;


$.each(inventoryLocation.data,function(){
    var row = cycleCountLines.createRow({
        additionalConditions: [
            { field: 'SCMC__Item_Master__c', value: this.SCMC__Item_Master__r.Id},
            { field: 'SCMC__Inventory_Location__c', value: this.SCMC__Bin__r.Id},
            { field: 'SCMC__Status__c', value: "New"},
            { field: 'SAC_Inventory_Position__c', value: this.Id},
            { field: 'SCMC__Inventory_Quantity__c', value: this.QuantityfromAgg}
        ], doAppend: true
    });
});
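(A likely cause of the combined script misbehaving: loadAllRemainingRecords is asynchronous, so the row-creation loop above runs before the remaining records have arrived. A sketch of the fix is to move that loop into finishCallback, which fires only once everything is loaded. The shim below stands in for the Skuid model so the ordering pattern is demonstrable outside a page; in a real page you would keep the skuid.$M call as-is.)

```javascript
// Minimal shim standing in for a Skuid model so the ordering pattern
// can run outside a Skuid page. The real API is asynchronous, which is
// exactly why the row-creation work must live inside finishCallback.
var allRows = [{ Id: 'a0' }, { Id: 'a1' }, { Id: 'a2' }];
var inventoryLocation = {
    data: [],
    loadAllRemainingRecords: function (opts) {
        this.data = allRows.slice();           // simulate the load completing
        opts.finishCallback(this.data.length); // then signal completion
    }
};

var createdLines = [];
inventoryLocation.loadAllRemainingRecords({
    finishCallback: function (totalRecordsRetrieved) {
        // Only now are all rows guaranteed to be in the model,
        // so this is where the createRow loop belongs.
        inventoryLocation.data.forEach(function (row) {
            createdLines.push({ SAC_Inventory_Position__c: row.Id });
        });
    }
});
```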
Matthias
Have you tried breaking down both the load and the save into pieces? I assume you may run into saving/loading errors if the data size gets too big. Maybe try loading all the data in steps of 200 records, and once everything is loaded, save into the new model in steps of 200 as well.

Since both scripts work separately, I assume they just can't handle the sheer size.
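A sketch of that batching with a plain helper that splits the rows into groups of at most 200; each batch would then be created and saved before moving on to the next (the save call itself is Skuid-specific and omitted here):

```javascript
// Split an array of rows into batches of at most `size` rows, so each
// save round-trip stays small enough to avoid size-related errors.
function chunkRows(rows, size) {
    var batches = [];
    for (var i = 0; i < rows.length; i += size) {
        batches.push(rows.slice(i, i + size));
    }
    return batches;
}

// Example: 450 rows split into batches of 200 gives sizes 200, 200, 50.
var rows = [];
for (var i = 0; i < 450; i++) {
    rows.push({ Id: 'row-' + i });
}
var batches = chunkRows(rows, 200);
```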
Tami Lust
Thanks! I will give it a try.