Exporting a MySQL database to CSV programmatically with PHP/Joomla

Using PHP/Joomla, I am trying to export a database backup to a CSV/SQL file for download. Some tables have a huge number of records. When I start the export, it fails with the following error: Fatal error: Maximum execution time of 30 seconds exceeded

Note: I don't want to use 'max_execution_time', 'set_time_limit()' or any php.ini settings, because this script will run on different client servers and I can't ask each client to change their settings. I need a PHP/Joomla script, not an extension.

Please help me.

Maybe you even have to split the records of each table into different files.

Since some tables have a huge amount of data, the export will take quite some time. As a result the connection will time out and the export will fail.
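
If a pure PHP export really is required, one way to stay under the limit is to dump each table in chunks spread across several short requests, writing one CSV file per table. Below is a minimal sketch (the function, parameter and file names are illustrative), assuming Joomla's database API:

// Sketch: dump one chunk of a table to CSV. Call this once per request
// (e.g. via AJAX or repeated cron runs), passing back the returned offset,
// so no single request runs anywhere near the 30 second limit.
function exportTableChunk($tableName, $offset, $chunkSize, $csvFile)
{
    $db = JFactory::getDbo();

    $query = $db->getQuery(true)
        ->select('*')
        ->from($db->quoteName($tableName));

    // Third and fourth arguments are the offset and row limit for this chunk
    $db->setQuery($query, $offset, $chunkSize);
    $rows = $db->loadRowList();

    $handle = fopen($csvFile, 'a'); // append, so each chunk extends the file

    foreach ($rows as $row)
    {
        fputcsv($handle, $row);
    }

    fclose($handle);

    // Store this and pass it in as $offset on the next request
    return $offset + count($rows);
}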

A better approach would be to use the command line to dump the data as SQL. You might want to look at the mysqldump command.

You can also invoke the mysqldump command from a PHP script like this -

<?php
# Database Settings
$tmpDir = "/home/<username>/tmp/"; // Temp location the user has access to
$user = "<username>_******"; // MySQL Username for the database
$password = "*******"; // MySQL Password
$dbName = "<username>_******"; // Database name, usually in the pattern: <username>_<dbname>
$dbHost = "localhost"; // Mysql server hostname, usually localhost

$sqlFile = $tmpDir . $dbName . date('Y_m_d') . ".sql"; // The dumped SQL File
$attachment = $tmpDir . $dbName . "_" . date('Y_m_d') . ".tgz"; // TGZed file

$createBackup = "mysqldump -h '" . $dbHost . "' -u '" . $user . "' --password='" . $password . "' '" . $dbName . "' > '" . $sqlFile . "'"; // Full dump command
$createZip = "tar cvzf $attachment $sqlFile"; // Full Command

// Execute 'em
system($createBackup);
system($createZip);

This should create a SQL dump and create an archive out of it.

The above code is part of a tool I built to email me daily backups of my databases. You can check it out on GitHub and modify it to suit your purposes - https://github.com/masnun/mysql-backup-to-email
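
One thing to watch with shell commands built this way: the credentials and file names are interpolated straight into the command string, so if any of them can contain spaces or shell metacharacters it is safer to wrap each value with escapeshellarg(), along these lines:

// Safer variant of the two commands above: every value is shell-escaped
$createBackup = "mysqldump -h " . escapeshellarg($dbHost)
    . " -u " . escapeshellarg($user)
    . " --password=" . escapeshellarg($password)
    . " " . escapeshellarg($dbName)
    . " > " . escapeshellarg($sqlFile);

$createZip = "tar cvzf " . escapeshellarg($attachment) . " " . escapeshellarg($sqlFile);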

I would suggest that instead of re-inventing the wheel you use Akeeba Backup, as it will allow you to create automated backups of your entire site, or just the files or specific directories, or just the database or specific tables, or any combination you can think of.

It's also free.

It has won awards year after year, is highly rated on the Joomla Extension Directory, and we use it on all of our sites and our clients' sites.

If you get the Pro version, you can even back up to Amazon S3 or other useful systems.

I have no affiliation with Akeeba other than I love using their products — they let me sleep at night.

Given your comments on my previous answer, I would suggest you look at creating a Joomla! CLI application (have a look in the /cli directory of your Joomla 2.5+ installation).

We've done something similar to what you're after for the Pro version of our EasyStaging extension, which we developed for internal use to manage client websites with particular content approval processes. The internal version had to deal with our larger clients' requirements.

Effectively, we created a class that inherits from JApplicationCli and can be initiated by cron or from our component's user interface, e.g.

class EasyStaging_PlanRunner extends JApplicationCli
{
    /**
     * Entry point for the plan funner (yes it's more fun)
     *
     * @return  void
     *
     * @since   2.5
     */
    public function doExecute()
    {
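        // Main dispatch loop: walk the plan's tables and export each one in batches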
    }
}
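
The class above needs a small entry script to bootstrap the framework and then run it. A sketch follows (file names are illustrative); the exact defines/require block varies between Joomla versions, so copy it from one of the stock scripts in your installation's /cli directory:

<?php
// cli/planrunner.php - illustrative entry script

// Mark this as a valid Joomla entry point
const _JEXEC = 1;

// Framework bootstrap for a typical Joomla 3.x layout; copy the equivalent
// block from a stock /cli script for your own version.
if (!defined('_JDEFINES'))
{
    define('JPATH_BASE', dirname(__DIR__));
    require_once JPATH_BASE . '/includes/defines.php';
}

require_once JPATH_LIBRARIES . '/import.legacy.php';
require_once JPATH_LIBRARIES . '/cms.php';

// Load the class shown above, then run it; execute() ends up calling doExecute()
require_once __DIR__ . '/planrunner_app.php';

JApplicationCli::getInstance('EasyStaging_PlanRunner')->execute();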

If it's launched by the user from within Joomla we use a utility method to make sure the system forks it off correctly:

/**
 * Runs the script in the background by scheduling it with the `at` daemon and returns the result
 *
 * @param   string  $pathToScript  A path to the script to run e.g. "/path/to/my/cli/app.php"
 *
 * @return  string|null  The output from the shell (captured via 2>&1)
 */
private function _runScriptInBackground($pathToScript)
{
    $cmdPath = "/usr/bin/php -q $pathToScript";

    // We need '2>&1' so we have something to pass back
    $cmd     = 'echo "' . $cmdPath . '" | at now 2>&1';
    $result = shell_exec($cmd);

    return $result;
}

Since this is effectively running as a command line script, a lot of the limits are much looser, but they still won't get you around a very large table. (Some of our clients have many thousands of articles or, worse, K2 content items.)

To get around this we wrote an export method, called from our doExecute()'s main dispatch loop, which retrieves and exports table rows in reasonable batches. For us, "reasonable" was determined by the remote database's max_allowed_packet value: we create an SQL file whose statements fit within that limit, with a small allowance for various overheads. You may not have these issues, but even so you should probably break up your SQL into reasonably sized INSERT statements.
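
The EasyStaging batching code itself isn't reproduced here, but a rough sketch of the idea looks something like the following, with illustrative names and a byte budget taken from max_allowed_packet:

// Sketch: export one table as multi-row INSERT statements, each kept under a
// byte budget derived from the remote server's max_allowed_packet value.
// NULL handling and oversized single rows are omitted for brevity.
function exportTableInBatches($tableName, $sqlFile, $maxPacketBytes)
{
    $db     = JFactory::getDbo();
    $budget = (int) ($maxPacketBytes * 0.8); // small allowance for overheads
    $handle = fopen($sqlFile, 'w');

    $query = $db->getQuery(true)
        ->select('*')
        ->from($db->quoteName($tableName));

    $prefix    = 'INSERT INTO ' . $db->quoteName($tableName) . ' VALUES ';
    $pending   = array();           // value tuples queued for the current statement
    $length    = strlen($prefix);   // running size of the statement being built
    $offset    = 0;
    $chunkSize = 500;               // rows fetched from the source per round trip

    do
    {
        $db->setQuery($query, $offset, $chunkSize);
        $rows = $db->loadRowList();

        foreach ($rows as $row)
        {
            $tuple = '(' . implode(',', array_map(array($db, 'quote'), $row)) . ')';

            // Flush the current statement before it would exceed the budget
            if ($pending && ($length + strlen($tuple) + 2) > $budget)
            {
                fwrite($handle, $prefix . implode(',', $pending) . ";\n");
                $pending = array();
                $length  = strlen($prefix);
            }

            $pending[] = $tuple;
            $length   += strlen($tuple) + 1;
        }

        $offset += $chunkSize;
    }
    while (count($rows) === $chunkSize);

    // Write out whatever is left over
    if ($pending)
    {
        fwrite($handle, $prefix . implode(',', $pending) . ";\n");
    }

    fclose($handle);
}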

Once the plan has finished running, we zip up all the files and send them off to be archived.
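
The archiving step at the end can be as simple as PHP's ZipArchive; a sketch with illustrative path variables (not the EasyStaging code):

// Bundle all of the exported .sql files into a single archive
$zip = new ZipArchive();

if ($zip->open($archivePath, ZipArchive::CREATE) === true)
{
    foreach (glob($exportDir . '/*.sql') as $file)
    {
        $zip->addFile($file, basename($file));
    }

    $zip->close();
}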