Skuid Platform - What are the model limits for other platforms?

  • Question
  • Updated 2 years ago
  • Answered
In Salesforce there is a model limit somewhere in the neighbourhood of 4,000+ records.

What about Dynamics 365, SAP, etc.?

mB Pat Vachon, Champion


Posted 2 years ago


JD Bell, Senior Product Engineer

Skuid does not have "Model limits". Limits are imposed by the source platform and the data source (Salesforce, for example). Skuid does not currently apply its own limits, nor do we have any immediate plans to limit or cap data transfer.

Theoretically, a Skuid model could hold as much data as your browser/OS/hardware can support (probably multiple GBs worth, depending on your RAM, swap file and available system resources). However, download speeds represent a practical limitation.

Note that there's a difference between connecting directly to a data source and using the Skuid proxy. When connecting directly to a data source, there's no "limit" on the amount of data you can query from a third-party data source, whether using Skuid Platform or Skuid on Salesforce.

However, when using Skuid on Salesforce with the Apex proxy, data is transferred via Salesforce's servers, so you must still abide by Salesforce's heap size limits. If the response from your data source is larger than the 6 MB limit, Salesforce will return an error. See, for example: The Microsoft Dynamics™ Data Source Type: Troubleshooting.
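One practical way to live within that proxy limit is to page the query so no single response approaches 6 MB. Here's a minimal back-of-the-envelope sketch (not a Skuid API; the average record size and safety factor are assumptions you'd tune by measuring your own payloads):

```python
# Sketch: choose a page size so each proxied response stays well under
# Salesforce's ~6 MB Apex heap limit. Figures below are assumptions.
HEAP_LIMIT_BYTES = 6 * 1024 * 1024   # 6 MB limit noted above
SAFETY_FACTOR = 0.5                  # headroom for serialization overhead

def max_page_size(avg_record_bytes: int) -> int:
    """Largest number of records per request that keeps the response
    comfortably under the heap limit."""
    return int(HEAP_LIMIT_BYTES * SAFETY_FACTOR) // avg_record_bytes

# Example: records that serialize to roughly 2 KB of JSON each
print(max_page_size(2048))  # -> 1536 records per page
```

With a number like that in hand, you can set a model's record limit (or paginate requests) so the proxy never has to buffer an oversized response.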

For Skuid Platform, we haven't settled on a definite heap size limit. We've set what we think is a reasonable upper bound such that most customers will not have an issue with it, but if you encounter a problem please let us know. We are also discussing solutions for large data queries that will avoid heap size issues (such as streaming with WebSockets).

Regardless of server-imposed limits, for the sake of your users we'd recommend keeping data loads down to the low tens of megabytes; anything more can cause significant delays and serious performance problems for mobile users (where system memory is more constrained).
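To see why the "low tens of megabytes" guidance matters, a quick estimate of transfer time is enough. The bandwidth figure below is an assumption; real mobile throughput varies widely:

```python
# Sketch: back-of-the-envelope download time for a model payload.
# 10 Mbps is an assumed mobile connection speed, not a measured figure.
def download_seconds(payload_mb: float, mbps: float = 10.0) -> float:
    """Seconds to transfer payload_mb megabytes at mbps megabits/second."""
    return payload_mb * 8 / mbps

print(download_seconds(10))   # -> 8.0  (10 MB keeps the page usable)
print(download_seconds(100))  # -> 80.0 (100 MB means a very long wait)
```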

Most users won't want to page through thousands of records anyway; it's much better to download exactly what they need and only what they need. For aggregate models especially, it's generally better to have the server do the heavy lifting, which servers are better resourced to do, than to download a lot of data and have the browser aggregate it.
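The server-side-aggregation point can be illustrated generically. In this sketch, sqlite3 stands in for any SQL-capable data source; the principle is the same one an aggregate model relies on — only the summarized rows cross the wire:

```python
# Sketch: aggregate on the server instead of downloading every raw row.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("East", 100.0), ("East", 250.0), ("West", 75.0)],
)

# Server-side aggregation: one small row per region is transferred...
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # -> [('East', 350.0), ('West', 75.0)]

# ...instead of fetching all raw rows and summing them in the browser.
```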

Salesforce's limits can be a blessing in disguise, encouraging developers to seek solutions that will ultimately be better for their end users in terms of reliability, performance and user experience.