
Committing a large number of save() calls is slow, even using DynamicUpdate

I am currently creating a webapp that processes large amounts of data at once (upwards of 10,000+ new records on a single page load), though each individual record is actually pretty small. I have set the relevant models to useDynamicUpdate to speed up the process, but page loads are still relatively long.

Eventually, this will all be done using AJAX requests so the page at least won't hang, but I am wondering if there is anything I can do to speed this process up further (most of the data will likely come from a single large create pass, with later passes calling save() on mostly unchanged data).
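
For reference, enabling dynamic update in a Phalcon model looks roughly like the sketch below (the Record class name is just an example):

```php
<?php
use Phalcon\Mvc\Model;

class Record extends Model
{
    public function initialize()
    {
        // Only changed fields are included in the generated UPDATE
        // statement, instead of every column on every save().
        $this->useDynamicUpdate(true);
    }
}
```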



1.7k

The ORM is generally heavy. If you want speed, consider not using the ORM and writing raw SQL instead.

If you are inserting a lot of records, a bulk insert will give you much better performance.
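
For example, a single multi-row INSERT through the database service might look like the sketch below. This is an assumption-laden example: the table and column names are made up, loadIncomingRows() is a hypothetical helper, and $di is assumed to be the application's dependency injector with the default 'db' service registered.

```php
<?php
// Hypothetical sketch: one multi-row INSERT plus one transaction,
// instead of thousands of individual save() calls.
$rows = loadIncomingRows(); // e.g. [['name' => 'a', 'value' => 1], ...]

$placeholders = [];
$bind = [];
foreach ($rows as $row) {
    $placeholders[] = '(?, ?)';
    $bind[] = $row['name'];
    $bind[] = $row['value'];
}

$db = $di->get('db');
$db->begin(); // wrap the whole batch in a single transaction
$db->execute(
    'INSERT INTO measurements (name, value) VALUES ' . implode(', ', $placeholders),
    $bind
);
$db->commit();
```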



46.8k

If you don't need the result of that data returned in the response, then you can spawn a process that calls the script via the CLI interface. If you append a trailing '&' to the shell command, it will run in the background, and it will be fine as long as the web server stays up. That works pretty well for smaller jobs; if you grow beyond that, you will need a dedicated queue process with some form of IPC, or a custom shell script that puts the data into the queue process.
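
A minimal sketch of kicking off such a background job from PHP is below; the script path and arguments are hypothetical. Redirecting stdout/stderr is what lets exec() return without waiting for the command to finish.

```php
<?php
// Hypothetical sketch: start a CLI task in the background and return
// immediately. Without the output redirection and trailing '&',
// exec() would block until the script completed.
$payload = escapeshellarg('/tmp/import-batch.json'); // made-up argument
exec("php /var/www/app/cli.php import run {$payload} > /dev/null 2>&1 &");
```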

This is the route I am going to take. Thanks.




46.8k
Accepted answer

I recommend having a robust CLI module with its own bootstrap to complement your web bootstrap.
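
In Phalcon, that CLI bootstrap might look roughly like this. It is a sketch based on the framework's CLI components, not a definitive setup: the tasks directory and task names are placeholders, and the exact class names vary between Phalcon versions.

```php
<?php
// cli.php -- a minimal CLI bootstrap, separate from the web one.
use Phalcon\Di\FactoryDefault\Cli as CliDi;
use Phalcon\Cli\Console;
use Phalcon\Loader;

$di = new CliDi(); // registers CLI-flavoured default services

$loader = new Loader();
$loader->registerDirs([__DIR__ . '/tasks']); // assumed layout
$loader->register();

// Register the same services the web bootstrap uses (e.g. 'db') here.

$console = new Console();
$console->setDI($di);

// Map "php cli.php task action param1 ..." onto a task/action pair.
$arguments = [];
foreach ($argv as $k => $arg) {
    if ($k === 1) {
        $arguments['task'] = $arg;
    } elseif ($k === 2) {
        $arguments['action'] = $arg;
    } elseif ($k >= 3) {
        $arguments['params'][] = $arg;
    }
}

$console->handle($arguments);
```

A task to pair with it, handling the hypothetical "php cli.php import run <file>" invocation from the earlier example:

```php
<?php
// tasks/ImportTask.php
use Phalcon\Cli\Task;

class ImportTask extends Task
{
    public function runAction(array $params = [])
    {
        // Heavy bulk-insert work goes here, off the web request path.
    }
}
```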