Suppose an application generates chunks of 20 bytes of data every 20 msec, and each chunk gets encapsulated in a TCP segment and then an IP datagram. What percentage of each datagram will be overhead?

Answer:

66.67% (about 67%)

Explanation:

TCP Header = 20 bytes

IP Header = 20 bytes

Data chunk = 20 bytes

Adding the 20-byte data chunk to the 40 bytes of TCP/IP headers makes 60 bytes in total.

Total = 60 bytes

Overhead is 40 bytes of the 60,

i.e. (40/60) × 100 ≈ 66.67%
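The arithmetic above can be checked with a short Python snippet (the header sizes assume the minimum 20-byte TCP and IP headers, i.e. no options):

```python
# Overhead percentage for a 20-byte data chunk carried in a TCP segment
# inside an IP datagram, assuming minimum-size headers (no options).
TCP_HEADER = 20   # bytes
IP_HEADER = 20    # bytes
DATA_CHUNK = 20   # bytes

overhead = TCP_HEADER + IP_HEADER      # 40 bytes of headers
total = overhead + DATA_CHUNK          # 60 bytes per datagram
percentage = overhead / total * 100    # fraction of the datagram that is overhead

print(f"{percentage:.2f}%")  # 66.67%
```

Note that the 20 ms generation interval affects throughput, not the per-datagram overhead ratio.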

The percentage of the datagram that is overhead is 66.67%.

We know that the TCP and IP headers are 20 bytes each

Data is 20 bytes

If we add the 40 bytes of headers to the 20 bytes of data, we get 40 + 20 = 60 bytes

The percentage of the datagram that is overhead is,

[tex]\frac{40}{60} \times 100 = 66.67\%[/tex]

More Explanation:

"The TCP header takes up 20 bytes (or more if options are used); the IP header also uses 20 or more bytes. A minimum of 40 bytes is therefore needed for headers, all of which is non-data, also known as “overhead”. These numbers hold regardless of the size of the data."

Learn More: https://brainly.com/question/13745311