How to query an OData model batch-wise and store the full data in another model?

I have an OData model that takes its input from a field on another model. When I query the model's data, either in a snippet or via a model condition, I hit the maximum-URL-length error. So I'm looking for a way to split the query into batches, i.e. send the input model's field values batch by batch and fetch the data batch by batch.

I tried setting a model condition by splitting the input model, but the condition gets reset every time in the snippet instead of the data being added to.

I think there are potentially a few ways to do this. Have you tried changing the query behavior to Get More - Merge in new Rows with old?

Hi Jacob,

Thanks for the reply. Yes, I tried using Get More to merge in new rows, inside a for loop iterating over the set of input values. It is not working as expected: since I set a new model condition on each iteration before getting and merging the data, Skuid overrides the existing data. Any views on this?

Hey Hemanth,

No problem!

I wasn’t able to replicate the issue you are seeing with Get More Rows locally. Which version of Skuid are you on? Also, could you explain how you're setting the condition and looping over the requeries?

Thanks!

Hi Jacob,

We are using version Spark (12.2.14). I've framed a sample of the code below. What it does: it gets the parent field's values into an array, uses that array to calculate noOfLoops so the records can be split into batches of 50, and stores each batch in an array. It then iterates noOfLoops times, changing the model condition to each batch's values and getting more data from the OData model. The condition is defined on the OData model.

-------------------------------------------------------code starts-----------------------------------------

var fieldSet = [];
var ModelToUpdate = [];
var ModelCondition = [];

// fieldValues is the array of input field values from the parent model.
var noOfLoops = Math.floor(fieldValues.length / 50);

console.log('no of loops ' + noOfLoops);

for (let j = 0; j <= noOfLoops; j++) {

    console.log('inside loop ' + j);

    fieldSet[j] = '';

    // This logic splits the records into batches of 50.
    for (let i = j * 50; i < 50 + (50 * j); i++) {
        if (i < fieldValues.length) {
            let thisfield = fieldValues[i];
            fieldSet[j] = fieldSet[j] + thisfield + ';';
        }
    }

    ModelToUpdate[j] = skuid.$M('OdataModel');
    ModelCondition[j] = ModelToUpdate[j].getConditionByName('CondName');
}

console.log(ModelToUpdate);
console.log(ModelCondition);

var output = [];

for (let j = 0; j <= noOfLoops; j++) {

    ModelToUpdate[j].setCondition(ModelCondition[j], fieldSet[j]);
    console.log(ModelCondition[j]);

    ModelToUpdate[j].getMoreData();
    console.log(ModelToUpdate[j].data);

    output.push(ModelToUpdate[j].data);
    console.log(output);

    ModelToUpdate[j].save({callback: function(result){
        if (result.totalsuccess) {
            console.log('inside save block: ');
            console.log(ModelToUpdate[j].data);
        } else {
            console.log('save failed');
        }
    }});
}

------------------------------code ends ---------------------------------------------------------------------

I used a ModelToUpdate[] array instead of a single instance, because the model condition was being overridden with the last iteration's value. But even this way I am facing the same issue. Say I have 5 batches: in the loop I set the model condition multiple times and try to get more data, but in the end I only see the data for the 5th batch.

Hope it is clear. Let me know if you need any further details here.

Hey Hemanth,

That makes sense. I think what is happening, at the core, is that the JavaScript loop runs faster than the server can complete each query and return data. I've actually run into this a few times, and I believe you are close to a working solution. Generally what I recommend is to avoid loops and instead use a function that reruns itself while iterating over an array. That way the next query won't execute until the previous one is complete.

Here is a quick example I made:

-------------------------------------------------------------------------------------code begins----------------------------------------------------------------------------------------------------

var ModelToUpdate = skuid.$M('OdataModel');
var ModelCondition = ModelToUpdate.getConditionByName('CondName');
var p = 0;
var fieldValues = [];

/* building an array of values to iterate through for my testing;
   substitute this with the actual values you'll iterate over to set the condition */
// for (var f = 0; f < 11; f++) {
//     fieldValues.push(f);
// }

/* how many times to run the function */
var noOfLoops = fieldValues.length;

/** function that reruns itself for each value in the array:
 *  - sets the condition to the next unused value in the fieldValues array
 *  - queries the model with getMoreData()
 *  - when the query completes, the callback checks whether more querying is needed
 */
function queryLooper(p){

    ModelToUpdate.setCondition(ModelCondition, fieldValues[p]);

    ModelToUpdate.getMoreData(function(){

        if (p < noOfLoops - 1) {
            p++;
            // wrap in a function so queryLooper isn't invoked immediately
            setTimeout(function(){ queryLooper(p); }, 0);
        } else {
            console.log(p);
            // ModelToUpdate.save({callback: function(){
            //     console.log('saved');
            // }});
        }

    });

}

/* kick off the function */
queryLooper(p);

-------------------------------------------------------------------------------------code end-----------------------------------------------------------------------------------------------------

What I'm doing here is creating a function (queryLooper) that sets the model condition and queries the model. It iterates a variable (p) against the length of the field-value array we use to set the condition. When a query completes, the callback checks p to see if the list has been fully processed. If p has not yet reached the end of the list, the function increments it and runs itself again; once it has, the function is clear to run other functions/processes (like the commented-out save).

Let me know how this works out / if I'm making sense.