
Multiple inserts

Hello.

I am trying to write a PHP script that lets a user upload an XML export from their WordPress site and import their pages, posts & comments into my site - NOT a WordPress site.

I have a custom database schema, so I have to map the data into my own tables.

However - what if the user has more than 1000 posts? How do I insert 1000 posts without hitting the memory limit?

I have tried this, which I got from an older post of mine, but it's not working:

    $phql = "INSERT INTO \Blogmoda\Models\Posts\Posts (blog_id, title, thumbnail, excerpt, content) "
          . "VALUES (:blog_id:, :title:, :thumbnail:, :excerpt:, :content:)";
    $query = $this->modelsManager->createQuery($phql);
    $query->execute(["blog_id" => 48, "title" => 'loooololol', "thumbnail" => 2, "excerpt" => "jsadjsa", "content" => "<p>lol</p>"]);

    foreach ($posts as $post) {
        $query->execute(["blog_id" => 48, "title" => $post['title'], "thumbnail" => 2, "excerpt" => "jsadjsa", "content" => "<p>lol</p>"]);
    }

But nothing is inserted, and there is no error message..
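One thing worth checking: in Phalcon, executing a PHQL INSERT returns a status object, and a failed insert does not throw by itself - you have to ask the status for its messages. A sketch of that check, assuming the `$query` and `$post` variables from the snippet above (the column values are just the placeholders used there):

```php
// execute() on a PHQL INSERT returns a status object; if success() is false,
// getMessages() should say why nothing was inserted (validation, schema, ...).
$status = $query->execute([
    "blog_id"   => 48,
    "title"     => $post['title'],
    "thumbnail" => 2,
    "excerpt"   => "jsadjsa",
    "content"   => "<p>lol</p>",
]);

if ($status->success() === false) {
    foreach ($status->getMessages() as $message) {
        echo $message->getMessage(), PHP_EOL;
    }
}
```

Wrapping the whole loop in a transaction would also make a bulk insert like this noticeably faster.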

Just write the script as you would, but add a "limit" parameter, for example 100 records per run. Then set the script up in a crontab and you are done :)

You can keep a flag like is_processed = 0 / 1, or use a temporary table.
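To make the flag idea concrete, here is a minimal self-contained sketch of the loop. It uses an in-memory SQLite database via PDO purely for illustration - the table name `import_queue` and batch size are assumptions, and the real work would be inserting into your actual posts table where the comment indicates:

```php
<?php
// Sketch of batching with an is_processed flag: each pass picks up at most
// $batchSize unprocessed rows, handles them, and marks them done. A cron job
// re-running this script keeps memory use bounded per run.
$db = new PDO('sqlite::memory:');
$db->exec('CREATE TABLE import_queue (id INTEGER PRIMARY KEY, title TEXT, is_processed INTEGER DEFAULT 0)');

// Seed the queue with 250 fake posts for the demo.
$ins = $db->prepare('INSERT INTO import_queue (title) VALUES (?)');
for ($i = 1; $i <= 250; $i++) {
    $ins->execute(["Post $i"]);
}

$batchSize = 100;
$processed = 0;
while (true) {
    $rows = $db->query("SELECT id, title FROM import_queue WHERE is_processed = 0 LIMIT $batchSize")
               ->fetchAll(PDO::FETCH_ASSOC);
    if (!$rows) {
        break; // nothing left to do
    }
    foreach ($rows as $row) {
        // ... insert the row into the real posts table here ...
        $db->exec("UPDATE import_queue SET is_processed = 1 WHERE id = {$row['id']}");
        $processed++;
    }
}
echo $processed; // 250
```

In a real cron setup you would drop the outer `while (true)` and let each cron run handle one batch.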

Thank you for your answer

Can you provide a little more detail - links, maybe some code?

You could also parse the XML file in batches; then there's no need for a status flag. The class you need to extend is \XmlStreamer.
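The same batched-parsing idea can be shown with PHP's built-in XMLReader instead of the third-party \XmlStreamer class mentioned above. This is a self-contained sketch: the `<item>`/`<title>` element names mimic a WordPress (WXR) export, and the demo XML is generated inline so it runs as-is - only the current item ever sits in memory:

```php
<?php
// Stream an XML export item by item and flush inserts in fixed-size batches,
// so neither the file nor the full post list is held in memory at once.
$xml = '<rss><channel>';
for ($i = 1; $i <= 5; $i++) {
    $xml .= "<item><title>Post $i</title></item>";
}
$xml .= '</channel></rss>';

$reader = new XMLReader();
$reader->XML($xml);

$batch = [];
$batchSize = 2;
$flushed = 0;
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'item') {
        // Materialize only this one <item> and pull out the fields we need.
        $item = simplexml_load_string($reader->readOuterXml());
        $batch[] = (string) $item->title;
        if (count($batch) >= $batchSize) {
            // ... insert $batch into the database here ...
            $flushed += count($batch);
            $batch = [];
        }
    }
}
// Flush whatever is left over after the last full batch.
$flushed += count($batch);
echo $flushed; // 5
```

With a real WordPress export you would point XMLReader at the uploaded file via `$reader->open($path)` rather than an inline string.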

I suggest you use a producer - worker - consumer pattern, to scale and isolate the processes.
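A toy sketch of that split, collapsed into one process with an SplQueue standing in for a real message queue (in practice each stage would be a separate script talking through Beanstalkd, Redis, RabbitMQ or similar - all names here are illustrative):

```php
<?php
// Producer -> worker -> consumer in miniature: the producer parses the upload
// and enqueues jobs, the worker transforms and inserts them, the consumer
// reports on the results. Separating the stages lets each one scale and
// fail independently.
$queue = new SplQueue();

// Producer: parse the uploaded export and enqueue raw post data.
for ($i = 1; $i <= 3; $i++) {
    $queue->enqueue(['title' => "Post $i"]);
}

// Worker: pull jobs off the queue and map them to the custom schema.
$results = [];
while (!$queue->isEmpty()) {
    $job = $queue->dequeue();
    // ... map WordPress fields to your tables and insert here ...
    $results[] = strtoupper($job['title']);
}

// Consumer: report on the processed jobs.
echo count($results); // 3
```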