Is there an easy way to run a MySQL query from the Linux command line and output the results in CSV format?
Here's what I'm doing now:
mysql -u uid -ppwd -D dbname << EOQ | sed -e 's/\t/,/g' | tee list.csv
select id, concat("\"",name,"\"") as name
from students
EOQ
It gets messy when there are a lot of columns that need to be surrounded by quotes, or if there are quotes in the results that need to be escaped.
As an alternative to the answer above, you can have a MySQL table that uses the CSV storage engine.
Then you will have a file on the server's disk that is always in CSV format, which you can simply copy without any processing.
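A minimal sketch of the idea (credentials, table, and column names are illustrative; note that CSV-engine tables cannot contain nullable columns or indexes):
# Create a CSV-engine copy of an existing table; the storage engine
# keeps the table's data on disk as a plain CSV file at all times.
# The selected columns must be NOT NULL, or the CREATE will fail.
mysql -u uid -ppwd -D dbname -e \
  "CREATE TABLE students_csv ENGINE=CSV AS
     SELECT id, name FROM students"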
mysql --batch, -B
Print results using tab as the column separator, with each row on a new line. With this option, mysql does not use the history file. Batch mode results in non-tabular output format and escaping of special characters. Escaping may be disabled by using raw mode; see the description for the --raw option.
This will give you a tab-separated file. Since commas (and strings containing commas) are not escaped, it is not straightforward to change the delimiter to a comma.
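For instance, a sketch of such an invocation (credentials and names are placeholders):
# --batch prints tab-separated rows, one row per line; --raw turns
# off the backslash escaping of special characters mentioned above.
mysql --batch --raw -u uid -ppwd -D dbname \
  -e 'SELECT id, name FROM students' > students.tsv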
$ mysql your_database --password=foo < my_requests.sql > out.csv
The output is tab-separated. Pipe it like this to get a true CSV (thanks @therefromhere):
... .sql | sed 's/\t/,/g' > out.csv
The OUTFILE solution given by Paul Tomblin causes a file to be written on the MySQL server itself, so this will work only if you have FILE access, as well as login access or other means for retrieving the file from that box.
If you don't have such access, and tab-delimited output is a reasonable substitute for CSV (e.g., if your end goal is to import to Excel), then Serbaut's solution (using mysql --batch and optionally --raw) is the way to go.
Here's a fairly gnarly way of doing it. I found it somewhere and can't take any credit.
mysql --user=wibble --password wobble -B -e "select * from vehicle_categories;" | sed "s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g" > vehicle_categories.csv
Works pretty well. Once again, though, a regex proves write-only.
Regex Explanation:
Putting it all together, the sed expressions do the following:
s/'/\'/ replace ' with \'
s/\t/\",\"/g replace all \t (tab) with ","
s/^/\"/ at the beginning of the line place a "
s/$/\"/ at the end of the line place a "
s/\n//g replace all \n (newline) with nothing
Here's what I do:
echo $QUERY | \
mysql -B $MYSQL_OPTS | \
perl -F"\t" -lane 'print join ",", map {s/"/""/g; /^[\d.]+$/ ? $_ : qq("$_")} @F ' | \
mail -s 'report' person@address
The Perl script (snipped from elsewhere) does a nice job of converting the tab-separated fields to CSV.
How about:
mysql your_database -p < my_requests.sql | awk '{print $1","$2}' > out.csv
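That only keeps the first two whitespace-separated fields. A variation that keeps every column (still a sketch, and still without quoting embedded commas) is to split on the output's tab delimiter instead:
# Split rows on tabs and rebuild them with commas as the output
# field separator (reassigning $1 forces awk to rebuild the record).
mysql your_database -p < my_requests.sql | \
  awk -F'\t' -v OFS=',' '{$1=$1; print}' > out.csv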
This is simple, and it works on anything without needing batch mode or output files:
select concat_ws(',',
concat('"', replace(field1, '"', '""'), '"'),
concat('"', replace(field2, '"', '""'), '"'),
concat('"', replace(field3, '"', '""'), '"'))
from your_table where etc;
Explanation: replace() doubles any double quote characters inside each field, concat() wraps each field in double quotes, and concat_ws() joins the quoted fields with commas.
That's it!
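A sketch of running it from the shell (the --skip-column-names flag keeps the header out of the data; credentials and field names are placeholders from the query above):
# The query emits one already-quoted CSV line per row, so no
# post-processing is needed; embedded newlines will still break rows.
mysql -B --skip-column-names -u uid -ppwd -D dbname \
  -e "select concat_ws(',',
        concat('\"', replace(field1, '\"', '\"\"'), '\"'),
        concat('\"', replace(field2, '\"', '\"\"'), '\"'))
      from your_table" > out.csv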
If PHP is installed on the machine you are using, you can write a PHP script to do that. It requires that the PHP installation have the MySQL extension installed.
You can call the PHP interpreter from the command line like so:
php --php-ini path/to/php.ini your-script.php
I am including the --php-ini switch because you may need to use your own PHP configuration that enables the MySQL extension. On PHP 5.3.0+ that extension is enabled by default, so using the configuration to enable it is no longer necessary.
Then you can write your export script like any normal PHP script:
<?php
mysql_connect("localhost", "username", "password") or die(mysql_error());
mysql_select_db("mydb") or die(mysql_error());

$result = mysql_query("SELECT * FROM table_with_the_data p WHERE p.type = $typeiwant");
$result || die(mysql_error());

while ($row = mysql_fetch_row($result)) {
    $comma = false;
    foreach ($row as $item) {
        # Make it comma separated
        if ($comma) {
            echo ',';
        } else {
            $comma = true;
        }
        # Quote the quotes
        $quoted = str_replace("\"", "\"\"", $item);
        # Quote the string
        echo "\"$quoted\"";
    }
    echo "\n";
}
?>
The advantage of this method is that it has no problems with varchar and text fields containing newlines. Those fields are correctly quoted, and a CSV reader will interpret the newlines inside them as part of the text, not as record separators. That is something that is hard to correct afterwards with sed or the like.
MySQL Workbench can export recordsets to CSV, and it seems to handle commas in fields very well. The CSV opens up in OpenOffice fine.
Using the solution posted by Tim, I created this bash script to facilitate the process (root password is requested, but you can modify the script easily to ask for any other user):
#!/bin/bash
if [ "$1" == "" ];then
echo "Usage: $0 DATABASE TABLE [MYSQL EXTRA COMMANDS]"
exit
fi
DBNAME=$1
TABLE=$2
FNAME=$1.$2.csv
MCOMM=$3
echo "MySQL password:"
stty -echo
read PASS
stty echo
mysql -uroot -p"$PASS" $MCOMM "$DBNAME" -B -e "SELECT * FROM $TABLE;" | sed "s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g" > "$FNAME"
It will create a file named: database.table.csv
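Assuming you saved the script as dump_table.sh (the name is arbitrary), usage looks like:
chmod +x dump_table.sh
./dump_table.sh mydatabase students
# -> writes mydatabase.students.csv in the current directory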
All of the solutions here to date, except the MySQL Workbench one, are incorrect and quite possibly unsafe (i.e., security issues) for at least some possible content in the MySQL database.
MySQL Workbench (and similarly phpMyAdmin) provide a formally correct solution, but they are designed for downloading the output to a user's location. They're not so useful for things like automating data export.
It is not possible to generate reliably correct CSV from the output of mysql -B -e 'SELECT ...' because that cannot encode carriage returns and white space in fields. The '-s' flag to mysql does do backslash escaping, and might lead to a correct solution. However, using a scripting language (one with decent internal data structures, that is, not bash), and libraries where the encoding issues have already been carefully worked out, is far safer.
I thought about writing a script for this, but as soon as I thought about what I'd call it, it occurred to me to search for pre-existing work by the same name. While I haven't gone over it thoroughly, the solution at https://github.com/robmiller/mysql2csv looks promising. Depending on your application, the YAML approach to specifying the SQL commands might or might not appeal though. I'm also not thrilled with the requirement for a more recent version of Ruby than comes as standard with my Ubuntu 12.04 laptop or Debian Squeeze servers. Yes, I know I could use RVM, but I'd rather not maintain that for such a simple purpose.
Hopefully someone will point out a suitable tool, that's had a bit of testing. Otherwise I'll probably update this when I find or write one.
CREATE TABLE my_csv_table ENGINE=CSV AS SELECT data FROM other_table;
When you create a CSV table, the server creates a table format file in the database directory. The file begins with the table name and has an .frm extension. The storage engine also creates a data file. Its name begins with the table name and has a .CSV extension. The data file is a plain text file. When you store data into the table, the storage engine saves it into the data file in comma-separated values format.
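A sketch of retrieving that data file (this assumes shell access to the database host; the database name mydb and the table name are illustrative, and the data directory is looked up rather than hard-coded):
# Ask the server for its data directory, then copy the CSV engine's
# data file out of the database's subdirectory.
# @@datadir usually includes a trailing slash, e.g. /var/lib/mysql/.
DATADIR=$(mysql -N -B -e 'SELECT @@datadir')
cp "${DATADIR}mydb/my_csv_table.CSV" /tmp/my_csv_table.csv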
Not exactly CSV format, but the tee command from the MySQL client can be used to save the output into a local file:
tee foobar.txt
SELECT foo FROM bar;
You can disable it using notee.
The problem with SELECT … INTO OUTFILE …; is that it requires permission to write files on the server.
Try this code:
SELECT 'Column1', 'Column2', 'Column3', 'Column4', 'Column5'
UNION ALL
SELECT column1, column2,
column3 , column4, column5 FROM demo
INTO OUTFILE '/tmp/demo.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
For more information: http://dev.mysql.com/doc/refman/5.1/en/select-into.html
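Note that on many installations the secure_file_priv server variable restricts the directories INTO OUTFILE may write to (or disables it entirely), so it's worth checking first:
mysql -e "SHOW VARIABLES LIKE 'secure_file_priv'"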
To expand on previous answers, the following one-liner exports a single table as a tab-separated file. It's suitable for automation, exporting the database every day or so.
mysql -B -D mydatabase -e 'select * from mytable' > mytable.tsv
Conveniently, we can use the same technique to list out MySQL's tables, and to describe the fields on a single table:
mysql -B -D mydatabase -e 'show tables'
mysql -B -D mydatabase -e 'desc users'
Field Type Null Key Default Extra
id int(11) NO PRI NULL auto_increment
email varchar(128) NO UNI NULL
lastName varchar(100) YES NULL
title varchar(128) YES UNI NULL
userName varchar(128) YES UNI NULL
firstName varchar(100) YES NULL
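For the daily-export use case, the same one-liner can go straight into cron; a sketch, with the schedule, output path, and database names as placeholders (credentials are better kept in ~/.my.cnf than in the crontab):
# Run at 03:00 every day and overwrite yesterday's export.
0 3 * * * mysql -B -D mydatabase -e 'select * from mytable' > /backups/mytable.tsv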
Building on user7610's answer, here is the best way to do it. With mysql's INTO OUTFILE there were 60 minutes of file-ownership and overwriting problems. It's not cool, but it worked in 5 minutes.
php csvdump.php localhost root password database tablename > whatever-you-like.csv
<?php
$server = $argv[1];
$user = $argv[2];
$password = $argv[3];
$db = $argv[4];
$table = $argv[5];
mysql_connect($server, $user, $password) or die(mysql_error());
mysql_select_db($db) or die(mysql_error());
// fetch the data
$rows = mysql_query('SELECT * FROM ' . $table);
$rows || die(mysql_error());
// create a file pointer connected to the output stream
$output = fopen('php://output', 'w');
// output the column headings
$fields = [];
for ($i = 0; $i < mysql_num_fields($rows); $i++) {
    $field_info = mysql_fetch_field($rows, $i);
    $fields[] = $field_info->name;
}
fputcsv($output, $fields);

// loop over the rows, outputting them
while ($row = mysql_fetch_assoc($rows)) fputcsv($output, $row);
?>
From your command line, you can do this:
mysql -h <hostname> -P <port number> --database=<database_name> -u <username> -p -e "<your SQL query>" | sed 's/\t/","/g;s/^/"/;s/$/"/;s/\n//g' > <output_file_name>.csv
This saved me a couple of times. Fast and it works!
Example:
sudo mysql -udemo_user -p -h127.0.0.1 --port=3306 \
--default-character-set=utf8 --database=demo_database \
--batch --raw < /var/demo_sql_query.sql > /var/demo_csv_export.csv
Many of the answers on this page are weak because they don't handle the general case of what can occur in CSV format, e.g., commas and quotes embedded in fields and other conditions that always come up eventually. We need a general solution that works for all valid CSV input data.
Here's a simple and robust solution in Python:
#!/usr/bin/env python
import csv
import sys
tab_in = csv.reader(sys.stdin, dialect=csv.excel_tab)
comma_out = csv.writer(sys.stdout, dialect=csv.excel)
for row in tab_in:
    comma_out.writerow(row)
Name that file tab2csv, put it on your path, give it execute permissions, then use it like this:
mysql OTHER_OPTIONS --batch --execute='select * from whatever;' | tab2csv >outfile.csv
The Python CSV-handling functions cover corner cases for CSV input format(s).
This could be improved to handle very large files via a streaming approach.
This answer uses Python and a popular third-party library, PyMySQL. I'm adding it because Python's csv library is powerful enough to correctly handle many different flavors of .csv, and no other answers are using Python code to interact with the database.
import contextlib
import csv
import datetime
import os
# https://github.com/PyMySQL/PyMySQL
import pymysql
SQL_QUERY = """
SELECT * FROM my_table WHERE my_attribute = 'my_attribute';
"""
# embedding passwords in code gets nasty when you use version control
# the environment is not much better, but this is an example
# http://stackoverflow.com/questions/12461484/is-it-secure-to-store-passwords-as-environment-variables-rather-than-as-plain-t
SQL_USER = os.environ['SQL_USER']
SQL_PASS = os.environ['SQL_PASS']
connection = pymysql.connect(host='localhost',
                             user=SQL_USER,
                             password=SQL_PASS,
                             db='dbname')

with contextlib.closing(connection):
    with connection.cursor() as cursor:
        cursor.execute(SQL_QUERY)
        # Hope you have enough memory :)
        results = cursor.fetchall()

output_file = 'my_query-{}.csv'.format(datetime.datetime.today().strftime('%Y-%m-%d'))
with open(output_file, 'w', newline='') as csvfile:
    # http://stackoverflow.com/a/17725590/2958070 about lineterminator
    csv_writer = csv.writer(csvfile, lineterminator='\n')
    csv_writer.writerows(results)
Also, if you're performing the query on the Bash command line, I believe the tr command can be used to substitute the default tabs with an arbitrary delimiter:
$ echo "SELECT * FROM Table123" | mysql Database456 | tr "\t" ,
Tiny bash script for doing simple query-to-CSV dumps, inspired by https://stackoverflow.com/a/5395421/2841607.
#!/bin/bash
# $1 = query to execute
# $2 = outfile
# $3 = mysql database name
# $4 = mysql username
if [ -z "$1" ]; then
echo "Query not given"
exit 1
fi
if [ -z "$2" ]; then
echo "Outfile not given"
exit 1
fi
MYSQL_DB=""
MYSQL_USER="root"
if [ ! -z "$3" ]; then
MYSQL_DB=$3
fi
if [ ! -z "$4" ]; then
MYSQL_USER=$4
fi
if [ -z "$MYSQL_DB" ]; then
echo "Database name not given"
exit 1
fi
if [ -z "$MYSQL_USER" ]; then
echo "Database user not given"
exit 1
fi
mysql -u "$MYSQL_USER" -p -D "$MYSQL_DB" -B -s -e "$1" | sed "s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g" > "$2"
echo "Written to $2"
If you have PHP set up on the server, you can use mysql2csv to export an (actually valid) CSV file for an arbitrary MySQL query. See my answer at MySQL - SELECT * INTO OUTFILE LOCAL ? for a little more context/info.
I tried to maintain the option names from mysql, so it should be sufficient to provide the --file and --query options:
./mysql2csv --file="/tmp/result.csv" --query='SELECT 1 as foo, 2 as bar;' --user="username" --password="password"
"Install" mysql2csv
via
wget https://gist.githubusercontent.com/paslandau/37bf787eab1b84fc7ae679d1823cf401/raw/29a48bb0a43f6750858e1ddec054d3552f3cbc45/mysql2csv -O mysql2csv -q && (sha256sum mysql2csv | cmp <(echo "b109535b29733bd596ecc8608e008732e617e97906f119c66dd7cf6ab2865a65 mysql2csv") || (echo "ERROR comparing hash, Found:" ;sha256sum mysql2csv) ) && chmod +x mysql2csv
(download content of the gist, check checksum and make it executable).
You can use the following command from the location of your SQL file:
mysql -h (hostname/IP) -u (username) -p(password) (databasename) < (query.sql) > (outputFile.txt/.xls)
e.g. hostname: x.x.x.x
username: username
password: password
DB name: employeeDB
query file: employee.sql
output file: outputFile.xls
mysql -hx.x.x.x -uusername -ppassword employeeDB < employee.sql > outputFile.xls
Make sure you execute the command from the directory where the SQL file is located, or give the full path to the SQL file in the command above.
What worked for me:
SELECT *
FROM students
WHERE foo = 'bar'
LIMIT 0,1200000
INTO OUTFILE './students-1200000.csv'
FIELDS TERMINATED BY ',' ESCAPED BY '"'
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';
None of the solutions in this thread worked for my particular case: I had pretty-printed JSON data inside one of the columns, which would get mangled in my CSV output. For those with a similar problem, try LINES TERMINATED BY '\r\n' instead.
Another problem, for those trying to open the CSV with Microsoft Excel: keep in mind there is a limit of 32,767 characters that a single cell can hold; above that, the content overflows into the rows below. To identify which records in a column have the issue, use the query below. You can then truncate those records or handle them as you'd like.
SELECT id,name,CHAR_LENGTH(json_student_description) AS 'character length'
FROM students
WHERE CHAR_LENGTH(json_student_description)>32767;