Tuesday, May 3, 2011

How to test TCP/IP socket delays on a single machine

Problem Statement:

How much delay does a TCP socket introduce when used to communicate between two processes on the same machine?

Solution:

To estimate the delay, build a small server and a small client, then measure the total time required for many echo round trips and divide by the number of iterations.

Server.py

import socket

HOST = '127.0.0.1'  # loopback interface
PORT = 50006        # arbitrary non-privileged port

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.bind((HOST, PORT))
s.listen(1)
conn, addr = s.accept()
print 'Connected by', addr
while True:
    data = conn.recv(1024)
    if not data:
        break
    conn.sendall(data)  # echo the data back; sendall avoids partial sends
conn.close()

 

Client.py

import socket
import time

HOST = '127.0.0.1'
PORT = 50006

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect((HOST, PORT))
t0 = time.time()
for i in range(1000):
    s.sendall('Hello World!')  # send the message
    data = s.recv(1024)        # wait for the echo
delta_time = time.time() - t0
s.close()
print 'Received ', data
print 'Average Time ', delta_time / 1000.0

 

Start Server.py at the command line, then run Client.py. On my machine, the results were:

Received  Hello World!
Average Time  5.5999994278e-05
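On Python 3 the same benchmark can be written as one self-contained script. This is a sketch, not part of the original post: the thread-based server, the port number 50007, and the `time.perf_counter` timer are my additions, and Python 3 sockets work with bytes rather than strings.

```python
import socket
import threading
import time

HOST = '127.0.0.1'
PORT = 50007  # hypothetical free port; differs from the post's 50006
N = 1000

def echo_server(ready):
    # Minimal echo server: accept one connection, echo until EOF.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()  # signal that the server is ready to accept
        conn, addr = srv.accept()
        with conn:
            while True:
                chunk = conn.recv(1024)
                if not chunk:
                    break
                conn.sendall(chunk)

ready = threading.Event()
threading.Thread(target=echo_server, args=(ready,), daemon=True).start()
ready.wait()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
    s.connect((HOST, PORT))
    t0 = time.perf_counter()
    for _ in range(N):
        s.sendall(b'Hello World!')  # one send...
        data = s.recv(1024)         # ...and wait for its echo
    delta = time.perf_counter() - t0

print('Received', data.decode())
print('Average time', delta / N)
```

`time.perf_counter` is used instead of `time.time` because it is a monotonic, high-resolution clock better suited to short interval measurements.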

 

Discussion:

This test creates a reliable byte stream between two independent processes using a SOCK_STREAM socket, which uses TCP as its transport-layer protocol. Each loop iteration is one full round trip (client send, server echo, client receive), so the measured average of roughly 56 microseconds is the round-trip latency per message on this machine.
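For comparison, the same measurement can be repeated over a SOCK_DGRAM (UDP) socket, which skips TCP's connection setup and reliability machinery. The following Python 3 sketch is not part of the original test: the port number 50008 and the threaded echo loop are arbitrary choices, and on a real network (unlike loopback) UDP datagrams may be lost, which this sketch does not handle.

```python
import socket
import threading
import time

HOST = '127.0.0.1'
PORT = 50008  # hypothetical free port
N = 1000

def udp_echo(ready):
    # Echo exactly N datagrams back to their sender.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as srv:
        srv.bind((HOST, PORT))
        ready.set()  # signal that the socket is bound
        for _ in range(N):
            data, addr = srv.recvfrom(1024)
            srv.sendto(data, addr)

ready = threading.Event()
threading.Thread(target=udp_echo, args=(ready,), daemon=True).start()
ready.wait()

with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
    t0 = time.perf_counter()
    for _ in range(N):
        s.sendto(b'Hello World!', (HOST, PORT))
        data, _ = s.recvfrom(1024)
    avg = (time.perf_counter() - t0) / N

print('Average UDP round-trip time', avg)
```

Comparing the two averages gives a rough sense of how much of the loopback latency is attributable to the TCP stream machinery versus the per-message system-call overhead shared by both socket types.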

Test Conditions:

  • Python 2.6
  • Windows 7 with an Intel i7 CPU
This work is licensed under a Creative Commons Attribution By license.
