PHP file_get_contents sending a large POST

I have the following method for sending requests with file_get_contents:

protected function query($page, $post, $timeout = false, $debug = false, $enableWarnings = true) {
    $this->correctHostIfNeeded(self::HTTP_CONNECTION_TYPE);
    $page = $this->host . (substr($page, 0, 1) == '?' ? '' : '?') . $page;
    $opts = array('http' => array(
        'method' => 'POST',
        'header' => 'Content-type: application/x-www-form-urlencoded',
    ));
    if (!empty($post)) {
        $postdata = http_build_query($post);
        $opts['http']['content'] = $postdata;
    }
    if ($timeout > 0)
        $opts['http']['timeout'] = $timeout;
    $context = stream_context_create($opts);
    $result = $enableWarnings ? file_get_contents($page, false, $context) : @file_get_contents($page, false, $context);
    return $result;
}

It usually works fine, and more reliably than my cURL version (which occasionally failed to execute properly, regardless of the POST data). Unfortunately, if I send a really big POST using file_get_contents (for example, an array with 100k elements), it fails. Sometimes the target server saves part of the data, but it never receives all of it.

I know the internet connection between the servers is not the problem (both servers are in my own datacenters, and the speed between them is stable at about 100 Mb). The code on both servers also seems fine, because it works with smaller data, and if I switch to cURL, big payloads are received properly (unfortunately cURL sometimes fails, and I have read that such behavior is not unusual for it).

Try reading the data in parts and merging the results afterwards. In file_get_contents you can specify the $offset and $length arguments.
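
A minimal sketch of that idea, assuming the payload lives in a local file (the filename and chunk size here are made up for illustration). Note that $offset only works on seekable streams such as plain files, not on HTTP wrappers:

    <?php
    // Read a large file in fixed-size chunks via the $offset and $length
    // parameters of file_get_contents, then merge the parts afterwards.
    $file = 'large_payload.txt';   // assumed input file
    $chunkSize = 1024 * 1024;      // 1 MiB per read (arbitrary choice)
    $offset = 0;
    $parts = array();
    while (($chunk = file_get_contents($file, false, null, $offset, $chunkSize)) !== false
            && $chunk !== '') {
        $parts[] = $chunk;
        $offset += strlen($chunk);
    }
    $data = implode('', $parts);   // merged result

You could then send each chunk as its own POST request and reassemble them on the target server, so no single request carries the whole payload.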

Increase the execution time of the page; write this at the top:

ini_set('max_execution_time', 300);
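
Raising the script's execution limit alone may not help if the HTTP stream itself times out first. A sketch combining both settings, reusing the context options from the question (the 300-second values are assumptions to tune for your setup):

    <?php
    // Raise the PHP execution limit and pass a matching stream timeout
    // in the context, so neither side gives up early on a large POST.
    ini_set('max_execution_time', 300);  // seconds the script may run

    $opts = array('http' => array(
        'method'  => 'POST',
        'header'  => 'Content-type: application/x-www-form-urlencoded',
        'content' => http_build_query($post),
        'timeout' => 300,                // stream read timeout, in seconds
    ));
    $context = stream_context_create($opts);
    $result = file_get_contents($url, false, $context);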