How is latency measured? Is 7 ms (milliseconds) good or bad for network latency? And is it true that the lower the number, the better?
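
For reference, here is roughly what I understand "measuring latency" to mean: a minimal Python sketch that estimates round-trip time (RTT) by timing a TCP handshake. The host `example.com` and port 443 are just placeholders, and `ping` (ICMP echo) is the usual tool that reports this same kind of round-trip number.

```python
# Rough RTT estimate: time a TCP handshake (about one round trip) to a host.
# example.com / port 443 are placeholders; ping is the usual measurement tool.
import socket
import time

def tcp_rtt_ms(host: str, port: int = 443) -> float:
    """Return one round-trip-time estimate in milliseconds."""
    start = time.perf_counter()
    # connect() returns once the SYN / SYN-ACK exchange completes: ~1 RTT
    with socket.create_connection((host, port), timeout=2):
        pass
    return (time.perf_counter() - start) * 1000

if __name__ == "__main__":
    samples = [tcp_rtt_ms("example.com") for _ in range(5)]
    print("RTT samples (ms):", ", ".join(f"{s:.1f}" for s in samples))
    print(f"average: {sum(samples) / len(samples):.1f} ms")
```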
We have two devices (Device A and Device B) connected by a 1 Gbps link, and both devices send and receive data from each other. My question is: does Device A get 1 Gbps of bandwidth for sending and another 1 Gbps for receiving at the same time, or is the 1 Gbps divided between sent and received data?
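
To make the question concrete, here is a rough bidirectional throughput sketch I would use to test it: one TCP connection with sending and receiving running concurrently in both directions. `HOST`/`PORT` are placeholders and it runs over loopback here (so there is no real NIC limit); pointing `HOST` at Device B would exercise the actual link, and a dedicated tool like iperf3 would give more trustworthy numbers.

```python
# Rough bidirectional throughput sketch: one TCP connection, with send and
# receive running concurrently in BOTH directions at once.
# HOST/PORT are placeholders; loopback has no real NIC limit.
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 50007   # placeholder endpoint
CHUNK = bytes(64 * 1024)
DURATION = 3.0                    # seconds of sending per direction

def sender(sock, totals, key):
    """Send as fast as possible for DURATION seconds, then half-close."""
    sent, deadline = 0, time.perf_counter() + DURATION
    while time.perf_counter() < deadline:
        sent += sock.send(CHUNK)
    sock.shutdown(socket.SHUT_WR)  # tell the peer's receiver we are done
    totals[key] = sent

def receiver(sock, totals, key):
    """Count bytes until the peer half-closes its sending side."""
    received = 0
    while True:
        data = sock.recv(64 * 1024)
        if not data:
            break
        received += len(data)
    totals[key] = received

def run_both(sock, totals, tx_key, rx_key):
    """Drive one endpoint: send and receive at the same time."""
    threads = [threading.Thread(target=sender, args=(sock, totals, tx_key)),
               threading.Thread(target=receiver, args=(sock, totals, rx_key))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    sock.close()

if __name__ == "__main__":
    totals = {}
    server = socket.create_server((HOST, PORT))
    # "Device B" end: accept one connection, then send and receive together.
    b_side = threading.Thread(
        target=lambda: run_both(server.accept()[0], totals, "B_tx", "B_rx"))
    b_side.start()
    # "Device A" end: connect, then send and receive together.
    a_sock = socket.create_connection((HOST, PORT))
    run_both(a_sock, totals, "A_tx", "A_rx")
    b_side.join()
    server.close()
    for key in ("A_tx", "A_rx"):
        gbps = totals[key] * 8 / DURATION / 1e9
        print(f"{key}: {gbps:.2f} Gbit/s")
```

If the link were full duplex, I would expect both of Device A's directions to approach line rate simultaneously rather than splitting it between them.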