By my understanding, when PHP encounters include (or require, or their _once versions), it looks up the file in the filesystem and parses it exactly as it would if the code were written in place of the include call (with the exception of a return statement in the included file).
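For example (a minimal sketch; config.php is just a made-up name), a file can hand a value back to the include expression via return:

    <?php
    // config.php (made-up name): a return statement hands a value back to
    // the include expression instead of just "falling through" inline.
    return array(
        'db_host' => 'localhost',
        'db_name' => 'app',
    );

    <?php
    // caller: the include expression itself evaluates to the returned array.
    $config = include 'config.php';
    echo $config['db_host']; // localhost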
So, on a busy server, it is easily conceivable that many people will be hitting included files (for instance, the file that connects to a database, or defines global functions) repeatedly. Could this have a noticeable impact on performance?
Would it be advantageous to "compile" includes by dumping the contents of the file into the relevant places?
My understanding of include(), require(), and the like is that they work a lot like the C preprocessor's #include directive: all of that code effectively runs as if it were inline in the current file at that position, as you believe.
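To make that concrete (a quick sketch with made-up file names), the included file even shares the variable scope of the line that includes it, just as inline code would:

    <?php
    // greeting.php (made-up name): uses $name from whatever scope includes it.
    echo "Hello, $name\n";

    <?php
    // main.php
    $name = 'world';
    include 'greeting.php'; // prints "Hello, world", as if the echo were written right here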
As some of the comments above have said, if those files are used frequently (e.g. constantly pulled in via include()), they are likely sitting in RAM, or at least in the operating system's disk cache.
It's worth noting that PHP files are essentially compiled to opcodes on the fly, and with an opcode cache installed those compiled opcodes are reused between requests, so you shouldn't notice a performance hit either way.
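If you want to verify that an opcode cache is actually in play, a rough check might look like this (this sketch assumes the OPcache extension; older setups used APC instead):

    <?php
    // Rough check for an opcode cache (assumes the OPcache extension; older
    // setups used APC). Without one, every request re-reads and re-compiles
    // each included file; with one, the compiled version is reused.
    if (function_exists('opcache_get_status') && opcache_get_status(false) !== false) {
        echo "Opcode cache active: compiled scripts are reused between requests.\n";
    } else {
        echo "No opcode cache detected: includes are recompiled on every request.\n";
    }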
Also, as a side note: include_once() and require_once() do have significant overhead compared to include() and require() (on every call they must check whether the file has already been included), so if speed is a factor, try to avoid those calls.
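One way to sidestep them entirely (a sketch, assuming a single front-controller entry point; index.php and the required file names are hypothetical) is to do all the shared includes once, in one place, with plain require:

    <?php
    // index.php (hypothetical front controller): every shared file is pulled in
    // exactly once, in one place, so plain require is sufficient and the
    // per-call "have I already loaded this?" check of the _once variants
    // never happens.
    require 'config.php';
    require 'db.php';
    require 'functions.php';

    // ... dispatch to the requested page from here ...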