
Does packet size influence the response latency?


I am testing my web application locally against an Amazon micro database that's located in North Virginia. I live in Groningen, the Netherlands. When I send a SELECT query over the public internet it takes 2616 ms to return a result set of 42 kilobytes.

[INFO ] 2022-01-17 10:22:02.147 [http-nio-8080-exec-1] http_access_log - method=GET uri="/api/journal/get/virtual-scroll" status-code=206 bytes=42350 duration=2616(ms) client: remote ip=0:0:0:0:0:0:0:1 useragent="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/97.0.4692.71 Safari/537.36"

When I do a simple `SELECT 1 FROM TABLE` query it takes 550 ms to get the response back. So does that mean the latency of the response increases when the response size increases? Or does the result-set-returning query just run slowly? It's a stored procedure call.

Bob
Generally speaking, database-driven applications do not deal well with high latency (in the sense of a high [round trip delay](https://en.wikipedia.org/wiki/Round-trip_delay)) between the database server and the application. Bad database design, bad queries, and bad application design will aggravate that. For performance, ensure that the application and database can communicate at LAN speeds, not over WAN/Internet links.

> When I send a SELECT query over the public internet it takes 2616 ms to return a result set of 42 kilobytes.

Keep in mind that the request needs to reach the server, be processed there, and the reply needs to return to the client. Each transmission consists of serialization plus propagation to the destination, so the time does depend on the data size.

In your case, however, the largest contribution is most likely the processing on the server. Complex queries can require quite a bit of time, depending on the query itself (not necessarily related to the output size) and on server performance.

For completeness:

- serialization delay = data size / bandwidth (42 KB / 100 Mbit/s ≈ 3.5 ms)
- propagation delay: at least 5 ms per 1000 km, depending on connectivity and media (Groningen–Virginia >30 ms per trip)
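The two delay components above can be sketched numerically. The 100 Mbit/s link and the ~7,000 km Groningen-to-Virginia distance are assumptions for illustration, not measured values:

```python
# Rough network delay estimate: serialization + propagation.
# Bandwidth and distance below are illustrative assumptions.

def serialization_delay_ms(size_bytes: int, bandwidth_bps: int) -> float:
    """Time to push the data onto the wire: size / bandwidth."""
    return size_bytes * 8 / bandwidth_bps * 1000

def propagation_delay_ms(distance_km: float, ms_per_1000km: float = 5) -> float:
    """One-way signal travel time; roughly 5 ms per 1000 km in fiber."""
    return distance_km / 1000 * ms_per_1000km

serialization = serialization_delay_ms(42_000, 100_000_000)  # ~3.4 ms
one_way = propagation_delay_ms(7_000)                        # ~35 ms
round_trip = 2 * one_way + serialization                     # ~73 ms
print(serialization, one_way, round_trip)
```

This puts the pure network cost well under 100 ms, consistent with the estimate below; the remaining ~2.5 s of the observed 2616 ms must come from elsewhere.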

So, you could roughly estimate the round trip to take less than 100 ms, while the rest of the delay is due to server processing. This is why it is vital for your server and database to be optimized for your workload. You also need to take the transmission delay into account, so the application should optimize and bundle queries as far as possible.
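To illustrate why bundling queries matters: each separate query pays one full round trip, while a batch pays it once. The RTT and per-query processing times here are assumed round numbers, not measurements:

```python
# Why batching matters over a high-latency link:
# every separate query costs a full network round trip.
RTT_MS = 70    # assumed Groningen <-> Virginia round trip
QUERY_MS = 5   # assumed server-side processing per query

def total_time_ms(n_queries: int, batched: bool) -> float:
    """Sequential queries pay RTT each; a single batch pays it once."""
    round_trips = 1 if batched else n_queries
    return round_trips * RTT_MS + n_queries * QUERY_MS

print(total_time_ms(20, batched=False))  # 20 round trips: 1500 ms
print(total_time_ms(20, batched=True))   # 1 round trip:    170 ms
```

The absolute numbers are made up, but the ratio shows why a chatty application feels fine on a LAN and crawls over a WAN.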

Maurice
`Also, you need to take the transmission delay into account`. What is transmission delay exactly?
Zac67
Definitions vary slightly, but above I'm using it to mean serialization delay plus propagation delay.

