Question
I need a detailed answer if possible.
Assume an Internet phone application generates packets only during talk spurts. During a talk spurt the sender generates bytes at a rate of 500 bytes per second, and every 40 msecs the sender gathers bytes into chunks. Assume that RTP is used that will add a header to each chunk. In addition UDP and IP will be used. Suppose all headers (including RTP, UDP and IP) have a total length of h and an IP datagram is emitted every 20 msecs. Find the transmission rate in bits per second for the datagram generated by one side of the application.
Explanation / Answer
Background. UDP (User Datagram Protocol) is connectionless, so packets can be sent without first establishing a connection; each UDP segment is carried in one IP datagram. RTP runs on top of UDP and adds a header containing, among other fields, a sequence number, a timestamp, and an SSRC identifier; the sequence number and timestamp let the receiver reorder packets that arrive out of order and play the audio out at the right times. An RTP session is set up for each media stream being sent.

Calculation. During a talk spurt the application generates audio at 500 bytes per second. One IP datagram is emitted every 20 msecs, so each datagram carries the bytes generated in that interval:

payload per datagram = 500 bytes/s × 0.020 s = 10 bytes

Adding the combined RTP, UDP and IP headers of h bytes, each datagram is (h + 10) bytes, or 8(h + 10) bits. Since one datagram leaves every 20 msecs, the transmission rate is

rate = 8(h + 10) bits / 0.020 s = 400(h + 10) = (400h + 4000) bits per second

Note that the 4000 bps term is exactly the raw audio rate (500 bytes/s × 8 bits/byte); the remaining 400h bps is header overhead. Although bytes are gathered into chunks every 40 msecs, only the byte rate and the datagram emission interval matter for the transmission rate.

Finally, the transmission rate is distinct from the end-to-end delivery time: delivery time for a packet is its transmission time (packet size / bit rate) plus the propagation delay, and a round-trip time adds the return path and any processing delays on top of that. Those quantities matter for delay and throughput analysis, but they do not enter the transmission-rate calculation asked for here.
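The calculation above can be sketched as a small function. The default parameters come from the problem statement (500 bytes/s, one datagram every 20 msecs); the example value h = 40 is an assumption for illustration, corresponding to a typical uncompressed 12-byte RTP + 8-byte UDP + 20-byte IPv4 header stack.

```python
def transmission_rate_bps(h: int,
                          byte_rate: int = 500,       # audio bytes per second
                          interval_s: float = 0.020   # datagram emission interval
                          ) -> float:
    """Transmission rate in bits per second for one side of the call."""
    payload_bytes = byte_rate * interval_s            # bytes of audio per datagram
    datagram_bits = 8 * (h + payload_bytes)           # headers + payload, in bits
    return datagram_bits / interval_s                 # one datagram per interval

# Example (assumed header stack: 12 RTP + 8 UDP + 20 IPv4 = 40 bytes):
print(transmission_rate_bps(40))   # 400 * (40 + 10) = 20000.0 bps
```

Setting h = 0 returns 4000.0 bps, confirming that the audio payload alone accounts for the 4000 bps term in the formula.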