I have a web application which issues periodic requests to a web service using AJAX.
I have noticed that the request takes about twice as long as it should. For example, if the ping command shows 91 ms between the web server and a client machine, the request takes about 190 ms.
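For reference, here is a simplified version of how the request is issued and how I measure its duration; the endpoint URL and the polling interval are just placeholders:

```typescript
// Simplified polling code; the endpoint URL and interval are placeholders.
const ENDPOINT = "/api/status";
const INTERVAL_MS = 5000;

function poll(): void {
  const started = performance.now();
  const xhr = new XMLHttpRequest();
  xhr.open("GET", ENDPOINT, true);
  xhr.onload = () => {
    // Total time from send() to the full response, measured on the client.
    const elapsed = Math.round(performance.now() - started);
    console.log(`request finished in ${elapsed} ms`);
  };
  xhr.send();
}

setInterval(poll, INTERVAL_MS);
```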
In Chrome or Firefox the same request takes only about 100 ms, which is normal, since a few milliseconds are spent processing the request on the server side.
So I started investigating what happens and found the following:
There are two "Start" events, and for some reason the first one takes the same amount of time as the ping value. This is not a coincidence: I have performed the same test on other machines with different client/server latencies, and the time taken by this "phantom" event is always equal to the current latency.
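For completeness, this is roughly how I would break the request into phases on the client side using the Resource Timing API (the URL filter below is a placeholder), although it does not explain the duplicated "Start" entry:

```typescript
// Sketch: per-request phase breakdown via the Resource Timing API.
// The URL fragment used to find the entry is a placeholder.
const entry = performance
  .getEntriesByType("resource")
  .filter((e): e is PerformanceResourceTiming => e.name.includes("/api/status"))
  .pop();

if (entry) {
  console.log("DNS lookup:", entry.domainLookupEnd - entry.domainLookupStart, "ms");
  console.log("TCP connect:", entry.connectEnd - entry.connectStart, "ms");
  console.log("Time to first byte:", entry.responseStart - entry.requestStart, "ms");
  console.log("Download:", entry.responseEnd - entry.responseStart, "ms");
}
```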
Can anybody explain why there are two "Start" events, what the second one means, and what could cause the doubled request latency?