PHP file_get_contents() timeout?

I am in the early stages of building a PHP application, part of which involves using file_get_contents() to fetch large files from a remote server and transfer them to a user. Let's say, for example, the file being fetched is 200 MB.

  • Will this process time out if downloading to the server takes too long?
  • If so, is there a way to extend this timeout?
  • Can this file that is being downloaded also be transferred to a user simultaneously, or does the file have to be saved on the server then manually fetched by the user once download has completed?

I am just trying to make sure that I know what my options and limitations are before I go much further.

Thank you for your time.

Yes, you can use set_time_limit(0) or the max_execution_time directive to lift the time limit imposed by PHP.
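As a minimal sketch (note that the web server may impose its own timeout on top of PHP's, so this only removes PHP's limit):

```php
<?php
// Remove PHP's script execution time limit (0 means no limit).
set_time_limit(0);

// Equivalent effect via the ini directive at runtime:
ini_set('max_execution_time', '0');
```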

You can open a stream to the file and transfer it to the user as it downloads, rather than saving it to disk first.
Read about fopen()
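A minimal sketch of relaying a remote file to the user in fixed-size chunks, so the whole 200 MB never has to sit in memory at once (the URL and filename here are placeholders):

```php
<?php
set_time_limit(0); // a large transfer may exceed the default time limit

$url = 'http://example.com/large-file.zip'; // hypothetical remote file

// Open a read stream to the remote file via the HTTP stream wrapper.
$src = fopen($url, 'rb');
if ($src === false) {
    http_response_code(502);
    exit('Could not open remote file.');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="large-file.zip"');

// Relay the file 8 KB at a time instead of buffering it all.
while (!feof($src)) {
    echo fread($src, 8192);
    flush();
}
fclose($src);
```

Because only one chunk is held in memory at a time, this also sidesteps the memory ceiling that file_get_contents() would hit on a 200 MB file.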

Even if you avoid a timeout, you may well run into memory issues depending on how your PHP is configured, since file_get_contents() reads the entire file into memory. You can adjust many of these settings at runtime through code without much difficulty.

http://php.net/manual/en/function.ini-set.php

ini_set('memory_limit', '256M');