
Performance issue on log generation for REST API

I am building a REST API using Phalcon Micro and I am storing log data in AWS DynamoDB. In my controller I have written code that stores the log entry just before returning the response to the client.

    // mycontroller.php
    function mycontroller()
    {
        // do this
        // do that

        // write the log entry to DynamoDB just before responding
        writetoDynamoDB($logData);

        return $result;
    }

I have observed that the response time has increased because of storing the log data. How can I improve the performance of the API while still storing the log data?

I am working on optimizing the queries and have added Redis caching, but I need suggestions on how to store the log data.

Any help is very much appreciated.




I do not know AWS DynamoDB. Some databases have options for switching off indexes and other tricks to make the actual write faster. Does DynamoDB have a performance page?

The next problem might be a memory limit: the DB may have been running with everything just under the limit of some setting, and your extra data pushed something out of a cache. That will show up in DB-specific measurements. Is there a DynamoDB option for showing memory usage by cache/table/whatever DynamoDB uses?

Adding Redis caching could also use enough memory to push something else out. A before/after measurement of the system might show the point where things slow down.

Well, that only confirms the performance hit we all get with databases (both SQL and NoSQL).

Tip: instead of waiting for the DB to finish its task or trying to optimize writes, you could run it in the background with a message broker service.

Reference: https://olddocs.phalcon.io/en/latest/reference/queue.html

That way, your API will return the response faster, while your audit data is saved transparently by a background process.
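
If it helps, here is a minimal sketch of the controller side of that approach, using Phalcon\Queue\Beanstalk from the linked queue docs. It assumes a beanstalkd daemon on 127.0.0.1:11300 and a tube named api_logs, both of which you would adjust for your setup:

    <?php
    use Phalcon\Queue\Beanstalk;

    // Connect to the local beanstalkd daemon (assumed host/port)
    $queue = new Beanstalk(
        [
            'host' => '127.0.0.1',
            'port' => 11300,
        ]
    );

    // Use a dedicated tube for API log entries
    $queue->choose('api_logs');

    // Example payload; in your controller this would be the data you
    // currently pass to the DynamoDB write
    $logData = ['endpoint' => '/users', 'status' => 200, 'duration_ms' => 12];

    // Enqueue the log payload instead of writing to DynamoDB during the
    // request, then return $result to the client right away
    $queue->put($logData);

The request only pays the cost of a local queue push, which is much cheaper than a synchronous DynamoDB write.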




Thanks for the reply, Peter.

My application uses MySQL as the database and DynamoDB for storing the API log data. Proper indexes are in place in MySQL.

Right now I am only performing write operations on DynamoDB. Later on I will perform reads as well, but for now it is write-only. Also, Redis is being used as a separate service from AWS.

I am trying to find out what the best practices are for writing API log data.

I am trying to find out what the best practices are for writing API log data.

The best practice (i.e. in production) for an API's audit logs is to keep them on a separate instance / elsewhere, not in the application that serves the API.

It could also be done at a lower level - the web server level - before the request even hits your app / API.

But to keep things simple, a message broker service running on the same instance (which dispatches the data to another server) seems best to me.
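
For completeness, here is a rough sketch of the background worker, again using Phalcon\Queue\Beanstalk. Note that writeToDynamoDb() is a hypothetical helper wrapping your existing AWS SDK write code, not something Phalcon provides:

    <?php
    use Phalcon\Queue\Beanstalk;

    $queue = new Beanstalk(
        [
            'host' => '127.0.0.1',
            'port' => 11300,
        ]
    );

    // Listen on the tube the API pushes log entries to
    $queue->watch('api_logs');

    while (($job = $queue->reserve()) !== false) {
        $logData = $job->getBody();

        // Hypothetical helper wrapping the AWS SDK PutItem call
        writeToDynamoDb($logData);

        // Remove the job from the queue once the write has succeeded
        $job->delete();
    }

You would keep this running as a CLI task (e.g. under supervisor), either on the same instance or on a separate one dedicated to logging.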