Network throughput refers to the average rate of successful data or message delivery over a specific communications link, measured in bits per second (bps). A common misconception is that timing how long it takes to upload or download a large file gives the maximum throughput of a network. That method does not account for communications overhead such as the TCP receive window size, machine limitations or network latency. Maximum network throughput equals the TCP window size divided by the round-trip time of the communications path.
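As a rough sketch, that relationship can be expressed in a few lines of Python; the function name and parameters below are illustrative only, not part of any standard library.

```python
def max_throughput_bps(window_bytes: int, rtt_seconds: float) -> float:
    """Estimate the theoretical maximum TCP throughput in bits per second.

    window_bytes: TCP receive window size in bytes
    rtt_seconds:  round-trip time of the network path in seconds
    """
    window_bits = window_bytes * 8      # convert bytes to bits
    return window_bits / rtt_seconds    # bits per second
```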
Step 1
Convert the TCP window size from bytes to bits. 64 KB (65,536 bytes) is the default TCP window size for computers running the Windows operating system. To convert the window size to bits, multiply the number of bytes by eight: 65,536 bytes x 8 = 524,288 bits.
Step 2
Divide the TCP window size in bits by the network path latency. For this example, use a round-trip latency of 60 milliseconds (0.060 seconds): 524,288 bits / 0.060 seconds = 8,738,133 bits per second.
Step 3
Convert the result from Step 2 to megabits per second by dividing it by 1,000,000. In this example, the maximum network throughput is roughly 8.738 Mbps, with the main limiting factor being the high latency of the connection.
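Putting the three steps together, a quick check of the arithmetic (assuming the 64 KB window and 60 ms latency used above) might look like this minimal Python sketch:

```python
window_bytes = 64 * 1024    # 64 KB default Windows TCP window
rtt_seconds = 0.060         # 60 ms round-trip time

window_bits = window_bytes * 8                 # Step 1: 524,288 bits
throughput_bps = window_bits / rtt_seconds     # Step 2: ~8,738,133 bits per second
throughput_mbps = throughput_bps / 1_000_000   # Step 3: ~8.738 Mbps

print(f"Maximum throughput: {throughput_mbps:.3f} Mbps")
```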