Nagle's algorithm is enabled on a TCP sender to control when transmissions are triggered. Suppose a full TCP segment was sent to the receiver just before time t0, emptying the send buffer. Data then arrive from the application layer at a constant rate of 160 kbps (i.e., 20 KB/s). Every IP header and TCP header is 20 bytes, and the sender is on a network whose MTU is 640 bytes. The RTT is 20 ms and does not change during the TCP session. Assume the send buffer is large enough, and that the sending window remains at SW = MSS/2 throughout. How many data bytes are sent during the period [t0, t0+15 ms]? How many during the period (t0+15 ms, t0+30 ms]?
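One way to check an answer is to simulate the scenario tick by tick. The sketch below is a simplified model, not a full TCP implementation: it assumes 1 ms time steps, that the application writes 20 bytes per tick, that an ACK for a segment sent at time t arrives exactly at t + RTT, and that Nagle's rule is "transmit only if a full MSS is buffered or there is no unacknowledged data in flight," with each send capped by the window SW. All variable names are chosen for illustration.

```python
MSS = 640 - 20 - 20   # MTU minus IP and TCP headers = 600 bytes
SW = MSS // 2         # sending window: 300 bytes
RATE = 20             # application data rate: 20 bytes per ms
RTT = 20              # round-trip time in ms

buf = 0               # bytes waiting in the send buffer
unacked = True        # a full segment was sent just before t0
ack_time = RTT        # its ACK arrives at t0 + RTT
sent = []             # (time_ms, bytes) records of each transmission

for t in range(1, 31):            # simulate (t0, t0+30 ms]
    buf += RATE                   # application delivers 20 B this ms
    if unacked and t >= ack_time:
        unacked = False           # ACK arrives, no data in flight
    # Nagle's rule: send if a full MSS is buffered, or if nothing
    # is unacknowledged; the window caps each send at SW bytes.
    if buf > 0 and (buf >= MSS or not unacked):
        size = min(buf, SW)
        sent.append((t, size))
        buf -= size
        unacked = True
        ack_time = t + RTT

first = sum(b for t, b in sent if t <= 15)        # bytes in [t0, t0+15 ms]
second = sum(b for t, b in sent if 15 < t <= 30)  # bytes in (t0+15, t0+30 ms]
print(first, second)
```

Under these assumptions, the model shows nothing is sent in the first interval (the buffer never reaches one MSS while the earlier segment is still unacknowledged), and one window-limited burst goes out when the ACK arrives at t0+20 ms.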