Phalcon Phalcon\Db\Adapter\Mysql Performance

Hi everyone,

I have a very big MySQL database (more than 10,000,000 records). When I try to fetch 100,000 records I get an error that memory is exhausted:

Allowed memory size of 134217728 bytes exhausted (tried to allocate 2 bytes) in ...

The same SQL works fine with the native PHP PDO library... weird.

Does anybody have a similar issue? Is there any option to fetch all that data using Phalcon\Db\Adapter\Mysql?
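For comparison, this is roughly what the working native-PDO version might look like. The memory difference usually comes down to how the rows reach PHP: with MySQL's client-side buffering disabled and row-by-row fetching, peak memory stays near the size of one row instead of the whole result set. The DSN, credentials, and table name below are placeholders:

```php
<?php
// Hypothetical connection details, for illustration only.
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'secret');

// Disable MySQL's client-side result buffering so rows are streamed
// from the server rather than copied into PHP memory all at once.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->query('SELECT id, name FROM products LIMIT 100000');

// Fetch one row at a time; memory usage stays roughly constant.
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // process $row ...
}
```

Note that Phalcon's PDO adapters expose the underlying PDO handle (via `getInternalHandler()` in Phalcon 2/3), so the same unbuffered-query trick should be applicable there as well.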




Yes, I had this problem. First, in my opinion, if you have that many records you should consider migrating to MongoDB or a more powerful database like PostgreSQL. I worked around the problem by running raw SQL and paginating the result with Phalcon\Paginator\Adapter\NativeArray. My current limit is 10K rows and it is quite fast. So, for example, given this model:

class Products extends \Phalcon\Mvc\Model
{
    // ... code
}

I would do it like this:

    public function getProducts()
    {
        $sql     = "SELECT * FROM products LIMIT 100000";
        $product = new Products();
        $results = new \Phalcon\Mvc\Model\Resultset\Simple(
            null,
            $product,
            $product->getReadConnection()->query($sql)
        );

        if ($results->count() > 0) {
            $output = [];

            foreach ($results as $row) {
                if (1 == $row->getIsActive()) {
                    $output[] = $row->toArray();
                }
            }

            $paginator = new \Phalcon\Paginator\Adapter\NativeArray(
                [
                    'data'  => $output,
                    'limit' => 10,
                    'page'  => 1 // $current_page or some other var
                ]
            );

            return $paginator->getPaginate();
        }
    }
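A variation worth considering (a sketch only; the `is_active` column name is taken from the getter above and the page size is arbitrary): push both the filter and the paging into the SQL itself, so PHP never holds more than one page of rows and the NativeArray paginator is not needed at all.

```php
    public function getProductsPage($page = 1, $perPage = 10)
    {
        // Cast to int before interpolating to keep the SQL safe.
        $perPage = (int) $perPage;
        $offset  = (int) ($page - 1) * $perPage;

        // Filter and page in SQL so only one page of rows reaches PHP.
        $sql = "SELECT * FROM products WHERE is_active = 1 "
             . "LIMIT {$perPage} OFFSET {$offset}";

        $product = new Products();

        return new \Phalcon\Mvc\Model\Resultset\Simple(
            null,
            $product,
            $product->getReadConnection()->query($sql)
        );
    }
```

This also avoids the subtle problem in the loop above: filtering on `is_active` in PHP after fetching 100,000 rows does the heavy lifting client-side, where the database can do it far more cheaply.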



Hi,

that's strange, because the Phalcon MySQL adapter is built on top of PDO...

Is there any option to fetch all that data using Phalcon\Db\Adapter\Mysql?

use pagination

select only the needed columns

...
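Those two suggestions combined might look like this (a sketch; the column names and page size are illustrative, and the `find()` parameters assume a recent Phalcon version):

```php
// Page 3, 100 rows per page, and only the columns actually needed.
$page    = 3;
$perPage = 100;

$products = Products::find(
    [
        'columns' => 'id, name, price', // skip wide text/blob columns
        'limit'   => $perPage,
        'offset'  => ($page - 1) * $perPage,
    ]
);

foreach ($products as $product) {
    // Each iteration hydrates a single lightweight row.
}
```

Narrowing the column list matters as much as the limit: a model hydrated from `SELECT *` drags every column of every row into memory, which is usually where the 128 MB limit gets hit.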

Just a question: what are you expecting to do with a dataset of 100,000 records?