This seems like it should be an obvious, quick solution, but that's not turning out to be the case.
I have a local file that does some database work and outputs HTML; call it file1. I have another file, file2, that needs some of this information but shouldn't have to generate it itself.
My plan was to pull the already-parsed PHP (at that point, HTML) into file2, but it pulls in the raw PHP code instead. This could obviously be handled with eval(), but I'd rather not. Something along the lines of wget might do the trick, but I'd also rather not shell out from my PHP script if I can avoid it.
Is there a solution while avoiding the two options above or do I have to just bite the bullet and use one of them?
Quick clarification: file2 doesn't use the entirety of file1. It chops out only the parts it needs, so simply outputting all of the resulting HTML from file1 isn't viable.
file_get_contents("http://Full.URL");
-- Edit --
When you call file_get_contents() on a URL, PHP requests that URL just like good old wget does, so the server executes the script and you receive its output. You can also use this to trigger scripts.
See http://www.php.net/manual/en/function.file-get-contents.php for more info.
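A minimal sketch of that approach, assuming file1 is reachable over HTTP (the URL here is a stand-in for your real one):

```php
<?php
// Hypothetical sketch: fetch file1's rendered output over HTTP.
// 'http://example.com/file1.php' stands in for the real URL of file1.
$html = file_get_contents('http://example.com/file1.php');

if ($html === false) {
    // The request failed (bad URL, allow_url_fopen disabled, etc.)
    die('Could not fetch file1');
}

// $html now holds the HTML file1 echoed, already executed by the server.
echo $html;
```

Note that this requires `allow_url_fopen` to be enabled in php.ini, and it goes through the web server, so file1 runs in a fresh request rather than sharing state with file2.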
You could try something like the following to buffer the output and save it into a variable:
ob_start();
include "file1.php";
$content = ob_get_contents();
ob_end_clean();
Now $content will contain the 'echoed' output of your source file.
The ob_* functions are the output-buffering family: they temporarily stop data from being written to STDOUT and instead store the echoed content in a buffer. See the documentation.
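Since file2 only needs parts of file1's output, you can combine the buffering trick above with a little HTML parsing. A minimal self-contained sketch, where render_stats() and the 'stats' id are hypothetical stand-ins (in the real case you would `include "file1.php";` inside the buffer instead):

```php
<?php
// render_stats() stands in for file1.php's output for this example.
function render_stats(): void {
    echo '<div id="stats"><p>42 rows</p></div><div id="footer">end</div>';
}

ob_start();
render_stats();           // nothing reaches the browser yet
$html = ob_get_clean();   // returns the buffer contents and discards the buffer

// Chop out only the fragment file2 actually needs:
$doc = new DOMDocument();
@$doc->loadHTML($html);   // @ suppresses warnings from imperfect HTML
$stats = $doc->getElementById('stats');
echo $stats ? $doc->saveHTML($stats) : '';
```

ob_get_clean() is simply ob_get_contents() followed by ob_end_clean() in one call. Unlike the URL approach, include runs file1 in the same request, so it shares variables, sessions, and database connections with file2.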