Can network micro-outages/latency impact performance on the server side of a web application?


I’m trying to study a case in our web application where it sometimes becomes very slow in processing the transactions it receives. Each transaction consists of multiple database queries, in-memory processing, and I/O streams.

On normal days a transaction takes no more than 5 seconds, but during periods of high network usage or intermittent outages, some transactions take more than 20 seconds. After analyzing it, it seems that between each phase of the transaction there is some blocking time (one thread waiting for another to finish), since some parts of the process are synchronized.
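
To make the blocking I mean concrete, here is a minimal, self-contained Java sketch (not our actual code; the lock, thread count, and timings are made up) showing how a burst of simultaneous requests queues up behind a single synchronized phase, so the last request's latency grows roughly linearly with the size of the burst:

```java
import java.util.concurrent.*;

// Minimal sketch: N "transactions" arrive at the same instant and each must
// pass through a synchronized phase that takes WORK_MS. The last thread to
// acquire the lock waits roughly (N - 1) * WORK_MS, which is how a burst of
// simultaneous requests inflates the tail latency.
public class SynchronizedPhaseDemo {
    private static final Object PHASE_LOCK = new Object();
    private static final long WORK_MS = 200;   // made-up duration of the serialized phase

    public static void main(String[] args) throws Exception {
        int n = 20;                             // made-up burst size
        ExecutorService pool = Executors.newFixedThreadPool(n);
        CountDownLatch start = new CountDownLatch(1);

        for (int i = 0; i < n; i++) {
            final int id = i;
            pool.submit(() -> {
                try {
                    start.await();              // release all requests at the same moment
                    long t0 = System.nanoTime();
                    synchronized (PHASE_LOCK) { // the serialized phase of the transaction
                        Thread.sleep(WORK_MS);
                    }
                    long ms = (System.nanoTime() - t0) / 1_000_000;
                    System.out.printf("request %2d finished the phase after %4d ms%n", id, ms);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
        }
        start.countDown();
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }
}
```

With steady traffic the same 20 requests would each see roughly WORK_MS of latency in that phase; arriving as one burst, the last one sees close to 4 seconds.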

Now I am inclined to think that during these network micro-outages a significant number of requests get held up somewhere in the network and then released all at once, so the server suddenly receives a large number of simultaneous requests, and the later ones end up with much longer processing times.
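
If it helps, this is the kind of instrumentation I was thinking of adding to check it (again only a sketch; the class, method, and transaction names are hypothetical): record how many transactions are already in flight when each one starts, and log that next to its duration, so slow transactions can be correlated with arrival bursts.

```java
import java.time.Instant;
import java.util.concurrent.Callable;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of instrumentation to test the burst hypothesis: wrap the transaction
// entry point, record the number of transactions in flight at the moment each
// one starts, and log it alongside the transaction's duration. If the slow
// (>20 s) transactions cluster at high in-flight counts right after a network
// blip, the hypothesis holds.
public final class InFlightTracker {
    private static final AtomicInteger IN_FLIGHT = new AtomicInteger();

    public static <T> T track(String txId, Callable<T> transaction) throws Exception {
        int concurrent = IN_FLIGHT.incrementAndGet(); // transactions already running, plus this one
        long t0 = System.nanoTime();
        try {
            return transaction.call();
        } finally {
            long ms = (System.nanoTime() - t0) / 1_000_000;
            IN_FLIGHT.decrementAndGet();
            System.out.printf("%s tx=%s inFlightAtStart=%d durationMs=%d%n",
                    Instant.now(), txId, concurrent, ms);
        }
    }
}
```

In use it would wrap the existing handler, e.g. InFlightTracker.track(txId, () -> processTransaction(request)) (where processTransaction stands in for whatever the real entry point is), and the log timestamps could then be lined up against the network monitoring.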

Do you think this is a valid hypothesis worth checking?
