Stopwatch ElapsedMilliseconds / 1000.0 is always 0 inside catch block
I need to measure the time (in seconds) from the moment a call to a server is initiated until the moment an exception is thrown and execution enters the catch block:
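The code that originally followed this sentence is missing, so here is a minimal sketch of the timing pattern being described, under the assumption that a `Stopwatch` is wrapped around a server call that throws. `CallServer` and its exception are hypothetical stand-ins; the comments note the usual reasons `ElapsedMilliseconds` reads 0 in the catch block (the stopwatch was never started with `Start()`/`StartNew()`, or a fresh `Stopwatch` instance was read instead of the one that timed the call).

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class Program
{
    static void Main()
    {
        // StartNew() both creates AND starts the stopwatch. A common cause of
        // ElapsedMilliseconds == 0 is `new Stopwatch()` without calling Start().
        var sw = Stopwatch.StartNew();
        try
        {
            CallServer(); // hypothetical server call that throws
        }
        catch (Exception ex)
        {
            sw.Stop();
            // Elapsed.TotalSeconds is a double and avoids manual division;
            // it reflects the time from StartNew() until Stop() here, i.e.
            // from initiating the call until the exception was caught.
            Console.WriteLine($"Call failed after {sw.Elapsed.TotalSeconds:F3} s: {ex.Message}");
        }
    }

    // Stand-in for the real server call; always throws for illustration.
    static void CallServer()
    {
        Thread.Sleep(50);
        throw new TimeoutException("server did not respond");
    }
}
```

Note that the stopwatch must be declared and started *before* the try block so the instance that timed the call is the same one read inside `catch`.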