Sending a large number of messages asynchronously

The scenario is this: I have a web-based service that does regular request processing, persists to the DB, etc. Fast response to the client is crucial, and right now the average request takes about 100 ms, which is great.

The problem is that I need to forward each parsed request I receive to a 3rd-party service on another server. To keep the script fast and the response to the client quick, I have to do that asynchronously: just hand the message off and move on with executing the script.

My current setup is this: PHP and MySQL on a CentOS server, plus a node.js service that listens for messages locally on a given port. When the node.js script receives a message, it spawns a php-cli process, passes the message to that PHP script as an argument, and the script then communicates with the 3rd-party server.

The first limitation is that I'm required to use a PHP lib that provides the interface to the 3rd-party service, which uses SOAP. So all the lib's methods have to be called from PHP; I can't use other languages.

Another problem is that the node service can be quite unstable, and frankly I want to get node as far away from handling this as I can.

I also sometimes hit "too many connections" on the database, because when the response comes back I have to write to the DB that the request is no longer pending and is either closed or failed. So if the 3rd-party service is unavailable or takes too long, the connection just hangs, because it's sitting inside an open transaction.
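For the hanging-transaction part, the pattern I've seen work is to keep every DB write in its own short transaction and never hold one open across the slow network call. A minimal PDO sketch (the `requests` table, its columns, and the status values are made up for illustration):

```php
<?php
// Sketch: two short writes instead of one transaction spanning the SOAP call.

function markPending(PDO $db, int $requestId): void {
    // Short transaction: write "pending" and commit immediately,
    // *before* the 3rd-party call starts.
    $db->beginTransaction();
    $stmt = $db->prepare('UPDATE requests SET status = ? WHERE id = ?');
    $stmt->execute(['pending', $requestId]);
    $db->commit();
}

function markResult(PDO $db, int $requestId, bool $ok): void {
    // Separate, equally short write once the 3rd-party call has
    // returned or timed out -- no transaction spans the round trip.
    $stmt = $db->prepare('UPDATE requests SET status = ? WHERE id = ?');
    $stmt->execute([$ok ? 'closed' : 'failed', $requestId]);
}
```

With this shape, a slow or dead 3rd-party service leaves no connection stuck in a transaction; the worst case is a row that stays "pending" until the timeout fires and `markResult` records the failure.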

So, what would you advise? I'm thinking of replacing node.js with RabbitMQ, or maybe using Redis' pub/sub feature?
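To make the RabbitMQ idea concrete, here is a minimal sketch using php-amqplib (an assumption on my part — any AMQP client would do; the queue name and message shape are illustrative). The web request publishes and returns immediately, and a long-running PHP CLI worker consumes messages and makes the SOAP calls, which takes node out of the path entirely:

```php
<?php
// Sketch, assuming: composer require php-amqplib/php-amqplib

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

// Web-request side: enqueue the parsed request and return to the client.
function publishRequest(AMQPStreamConnection $conn, array $parsed): void {
    $ch = $conn->channel();
    $ch->queue_declare('outbound', false, true, false, false); // durable queue
    $msg = new AMQPMessage(
        json_encode($parsed),
        ['delivery_mode' => AMQPMessage::DELIVERY_MODE_PERSISTENT]
    );
    $ch->basic_publish($msg, '', 'outbound');
    $ch->close();
}

// Worker side: one long-running `php worker.php` process instead of
// node spawning php-cli per message. $handler wraps the SOAP lib call.
function consumeRequests(AMQPStreamConnection $conn, callable $handler): void {
    $ch = $conn->channel();
    $ch->queue_declare('outbound', false, true, false, false);
    $ch->basic_qos(null, 10, null); // at most 10 unacked messages in flight
    $ch->basic_consume('outbound', '', false, false, false, false,
        function (AMQPMessage $msg) use ($handler) {
            $handler(json_decode($msg->getBody(), true));
            $msg->ack(); // on worker crash, unacked messages are redelivered
        });
    while (count($ch->callbacks)) {
        $ch->wait();
    }
}
```

If you go the Redis route instead, note that pub/sub is fire-and-forget: a message published while no subscriber is listening is simply lost. A Redis list used with LPUSH/BRPOP behaves more like a durable work queue.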

Has anyone done anything similar with these technologies?

P.S. The server handles about 3 million requests a day on average.
