NGINX serving a large file does not send all the data in some cases

I have a simple movie file (mkv) that I'm serving over nginx.

My nginx config is the following:

user  nginx;
worker_processes  auto;

worker_rlimit_nofile 300000;
events {
    worker_connections  16000;
    use epoll;
    accept_mutex on;
}

http {

    include       mime.types;
    default_type  application/octet-stream;

    sendfile           on;
    tcp_nopush         on;
    tcp_nodelay        on;
    gzip off;

    access_log off;
    keepalive_timeout 10;
    client_max_body_size 0;

    server {
        listen 5050;
        index index.html index.htm;
        root /var/www/;
        server_tokens off;
        chunked_transfer_encoding off;

        if ( $request_method !~ ^(GET|POST)$ ) {
            return 200;
        }

    }
}

If I try to download the movie over a fast connection (for example with wget from a server), it downloads successfully.
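
The fast fetch was essentially this (hostname and path are placeholders for my setup):

    wget -O /dev/null http://example-server:5050/movie.mkv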

If I try to fetch the movie over a slower connection, for example using ffmpeg with the -re argument (read input at native frame rate), it only downloads about half of the movie. The same thing happens, of course, if I use wget and limit the download speed to much less than the normal rate.
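
The slow-client cases look roughly like this (same placeholder URL; the --limit-rate value is just an example well below the normal speed):

    ffmpeg -re -i http://example-server:5050/movie.mkv -c copy -f null -
    wget --limit-rate=100k -O /dev/null http://example-server:5050/movie.mkv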

The connection from nginx closes after some time without the whole file having been sent, and I don't understand why this happens, or why it doesn't happen over a fast connection.
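
To figure out which side is closing the connection, I assume raising the error log verbosity in the http block should show the reason if nginx is the one timing out the slow client (the log path is just an example):

    error_log /var/log/nginx/error.log info;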

EDIT

I edited my post entirely because, after running some tests, I found out that the issue comes from NGINX and not from PHP: the connection was still closing when I fetched the movie directly from nginx.
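
The direct-from-nginx test was roughly this (curl's --limit-rate simulates the slow client; the URL is illustrative), and the transfer still stopped partway through:

    curl --limit-rate 100k -o /dev/null http://127.0.0.1:5050/movie.mkv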
