Memory overflow while updating large data sets

Hello everyone!

I have a command-line script, run daily from crontab, that updates some data in the DB and can issue a lot of UPDATE queries (about 30k or more in a single run). At some point it just stops with an "allowed memory exhausted" error.

The script itself is pretty simple: it fetches XML from another server, parses it, and then updates records in a loop. Something like:

```php
foreach ($xmls as $xml) {
    $record = Record::findFirst($xml->id); // Phalcon\Mvc\Model
    $record->setName(...)->setAnotherData(...)->save();
}
```

Just find-update-save; records are not stored in any arrays, and only one record is processed at a time.

Does Phalcon internally keep some record/query cache that can grow while querying/updating more and more records? If so, can it be disabled, or at least cleared from the script?


No, Phalcon does not have an internal cache for model instances unless you have implemented one of the documented caching strategies in your application.

Also, try calling gc_collect_cycles() to free some unused memory:
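A minimal sketch of how that could look inside the update loop, assuming the loop from the original post (the `$xmls`, `Record`, and setter names are taken from the question; the batch size of 1000 is an arbitrary choice):

```php
<?php
// Make sure the cycle collector is enabled (it is by default,
// unless zend.enable_gc is turned off in php.ini).
gc_enable();

$i = 0;
foreach ($xmls as $xml) {
    $record = Record::findFirst($xml->id); // Phalcon\Mvc\Model
    $record->setName($xml->name)
           ->setAnotherData($xml->data)
           ->save();

    // Drop our reference to the model and periodically run the
    // garbage collector to reclaim memory held by reference cycles.
    unset($record);
    if (++$i % 1000 === 0) {
        gc_collect_cycles();
        // Optional: log memory usage to see whether it keeps growing.
        // error_log(memory_get_usage(true));
    }
}
```

Calling gc_collect_cycles() on every iteration would add noticeable overhead, which is why the sketch runs it only every 1000 records.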

Thank you for your reply. No, I have not manually defined any caching strategy; everything is on defaults, if any default strategy even exists. I'll try gc_collect_cycles() to see if it helps.

Btw, I've replaced the ORM select/update with a single PHQL UPDATE query instead, and the memory growth disappeared... Maybe there are other MVC-related in-memory caches? The models manager, metadata storage, etc.?
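For reference, a PHQL update like the one described above could look roughly like this; this is a sketch, assuming the `Record` model and a DI container with a registered models manager (the column and placeholder names are illustrative, not from the original script):

```php
<?php
// $di is the application's DI container; modelsManager is the
// standard Phalcon service for executing PHQL.
$modelsManager = $di->get('modelsManager');

foreach ($xmls as $xml) {
    // PHQL operates on models (Record), not raw table names,
    // and uses :name: style named placeholders.
    $modelsManager->executeQuery(
        'UPDATE Record SET name = :name: WHERE id = :id:',
        [
            'name' => $xml->name,
            'id'   => $xml->id,
        ]
    );
}
```

Unlike the find-update-save loop, this path skips hydrating a full model instance per row, which is consistent with the flat memory usage observed.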