2008-12-10 15:59

How can I output MySQL query results in CSV format?


Is there an easy way to run a MySQL query from the Linux command line and output the results in CSV format?

Here's what I'm doing now:

mysql -u uid -ppwd -D dbname << EOQ | sed -e 's/\t/,/g' | tee list.csv
select id, concat("\"",name,"\"") as name
from students
EOQ
It gets messy when there are a lot of columns that need to be surrounded by quotes, or if there are quotes in the results that need to be escaped.



  • csdnceshi65 larry*wei 13 years ago

    From http://www.tech-recipes.com/rx/1475/save-mysql-query-results-into-a-text-or-csv-file/

    SELECT order_id,product_name,qty
    FROM orders
    WHERE foo = 'bar'
    INTO OUTFILE '/var/lib/mysql-files/orders.csv'
    FIELDS TERMINATED BY ','
    ENCLOSED BY '"'
    LINES TERMINATED BY '\n'

    Note that column names are not exported by this command.

    Also note that /var/lib/mysql-files/orders.csv will be on the server that is running MySQL. The user that the MySQL process is running under must have permissions to write to the directory chosen, or the command will fail.

    If you want to write output to your local machine from a remote server (especially a hosted or virtualized machine such as Heroku or Amazon RDS), this solution is not suitable.

    23 upvotes
  • csdnceshi65 larry*wei 10 years ago

    Here's a fairly gnarly way of doing it. Found it somewhere, can't take any credit

    mysql --user=wibble --password wobble -B -e "select * from vehicle_categories;" | sed "s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g" > vehicle_categories.csv

    It works pretty well. Once again, though, a regex proves write-only.

    Regex Explanation:

    • s/// means substitute what's between the first // with what's between the second //
    • the "g" at the end is a modifier that means "all instances, not just the first"
    • ^ (in this context) means beginning of line
    • $ (in this context) means end of line

    So, putting it all together:

    s/'/\'/          replace ' with \'
    s/\t/\",\"/g     replace all \t (tab) with ","
    s/^/\"/          at the beginning of the line place a "
    s/$/\"/          at the end of the line place a "
    s/\n//g          replace all \n (newline) with nothing
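    For reference, the same per-line transformation can be sketched in Python; this is only an illustration, and tsv_line_to_csv is a made-up name:

    ```python
    def tsv_line_to_csv(line):
        """Mimic the sed program's effect on one tab-separated line."""
        # s/'/\'/ is effectively a no-op: in sed's replacement text, \' is just '
        line = line.rstrip("\n")          # s/\n//g (sed operates per line anyway)
        line = line.replace("\t", '","')  # s/\t/\",\"/g
        return '"' + line + '"'           # s/^/\"/ and s/$/\"/

    print(tsv_line_to_csv("1\tJohn Smith\n"))  # "1","John Smith"
    ```

    Like the sed version, this does not handle quotes or tabs embedded in the data.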
    24 upvotes
  • csdnceshi63 elliott.david 12 years ago

    mysql --batch, -B

    Print results using tab as the column separator, with each row on a new line. With this option, mysql does not use the history file. Batch mode results in non-tabular output format and escaping of special characters. Escaping may be disabled by using raw mode; see the description for the --raw option.

    This will give you a tab-separated file. Since commas (and strings containing commas) are not escaped, it is not straightforward to change the delimiter to a comma.

    22 upvotes
  • csdnceshi51 旧行李 11 years ago
    $ mysql your_database --password=foo < my_requests.sql > out.csv

    Which is tab separated. Pipe it like that to get a true CSV (thanks @therefromhere):

    ... .sql | sed 's/\t/,/g' > out.csv
    22 upvotes
  • csdnceshi52 妄徒之命 6 years ago

    From your command line, you can do this:

    mysql -h *hostname* -P *port number* --database=*database_name* -u *username* -p -e *your SQL query* | sed 's/\t/","/g;s/^/"/;s/$/"/;s/\n//g' > *output_file_name.csv*

    Credits: Exporting table from Amazon RDS into a csv file

    9 upvotes
  • weixin_41568131 10.24 10 years ago

    Here's what I do:

    echo $QUERY | \
      mysql -B  $MYSQL_OPTS | \
      perl -F"\t" -lane 'print join ",", map {s/"/""/g; /^[\d.]+$/ ? $_ : qq("$_")} @F ' | \
      mail -s 'report' person@address

    The Perl one-liner (snipped from elsewhere) does a nice job of converting the tab-separated fields to CSV.

    9 upvotes
  • weixin_41568126 乱世@小熊 13 years ago

    As an alternative to the answer above, you can use a MySQL table that uses the CSV storage engine.

    Then you will have a file on your hard disk that will always be in a CSV format which you could just copy without processing it.

    8 upvotes
  • csdnceshi60 ℡Wang Yan 5 years ago

    Many of the answers on this page are weak because they don't handle the general case of what can occur in CSV format. e.g. commas and quotes embedded in fields and other conditions that always come up eventually. We need a general solution that works for all valid CSV input data.

    Here's a simple and strong solution in Python:

    #!/usr/bin/env python
    import csv
    import sys
    tab_in = csv.reader(sys.stdin, dialect=csv.excel_tab)
    comma_out = csv.writer(sys.stdout, dialect=csv.excel)
    for row in tab_in:
        comma_out.writerow(row)

    Name that file tab2csv, put it on your path, give it execute permissions, then use it like this:

    mysql OTHER_OPTIONS --batch --execute='select * from whatever;' | tab2csv >outfile.csv

    The Python CSV-handling functions cover corner cases for CSV input format(s).

    This could be improved to handle very large files via a streaming approach.

    7 upvotes
  • csdnceshi69 YaoRaoLov 3 years ago

    The following bash script works for me. It optionally also gets the schema for the requested tables.

    #!/bin/bash
    # export mysql data to CSV
    #ansi colors
    blue='\033[0;34m'
    red='\033[0;31m'
    green='\033[0;32m' # '\e[1;32m' is too bright for white bg.
    endColor='\033[0m'
    # a colored message 
    #   params:
    #     1: l_color - the color of the message
    #     2: l_msg - the message to display
    color_msg() {
      local l_color="$1"
      local l_msg="$2"
      echo -e "${l_color}$l_msg${endColor}"
    }
    # error
    # show the given error message on stderr and exit
    #   params:
    #     1: l_msg - the error message to display
    error() {
      local l_msg="$1"
      # use ansi red for error
      color_msg $red "Error:" 1>&2
      color_msg $red "\t$l_msg" 1>&2
      usage
    }
    # display usage 
    usage() {
      echo "usage: $0 [-h|--help]" 1>&2
      echo "               -o  | --output      csvdirectory"    1>&2
      echo "               -d  | --database    database"   1>&2
      echo "               -t  | --tables      tables"     1>&2
      echo "               -p  | --password    password"   1>&2
      echo "               -u  | --user        user"       1>&2
      echo "               -hs | --host        host"       1>&2
      echo "               -gs | --get-schema"             1>&2
      echo "" 1>&2
      echo "     output: output csv directory to export mysql data into" 1>&2
      echo "" 1>&2
      echo "         user: mysql user" 1>&2
      echo "     password: mysql password" 1>&2
      echo "" 1>&2
      echo "     database: target database" 1>&2
      echo "       tables: tables to export" 1>&2
      echo "         host: host of target database" 1>&2
      echo "" 1>&2
      echo "  -h|--help: show help" 1>&2
      exit 1
    }
    # show help 
    help() {
      echo "$0 Help" 1>&2
      echo "===========" 1>&2
      echo "$0 exports a csv file from a mysql database optionally limiting to a list of tables" 1>&2
      echo "   example: $0 --database=cms --user=scott --password=tiger --tables=person --output person.csv" 1>&2
      echo "" 1>&2
      exit 1
    }
    domysql() {
      mysql --host $host -u$user --password=$password $database
    }
    getcolumns() {
      local l_table="$1"
      echo "describe $l_table" | domysql | cut -f1 | grep -v "Field" | grep -v "Warning" | paste -sd "," - 2>/dev/null
    }
    host=localhost
    mysqlfiles=/var/lib/mysql-files/
    # parse command line options
    while true; do
      #echo "option $1"
      case "$1" in
        # options without arguments
        -h|--help) usage;;
        -d|--database)     database="$2" ; shift ;;
        -t|--tables)       tables="$2" ; shift ;;
        -o|--output)       csvoutput="$2" ; shift ;;
        -u|--user)         user="$2" ; shift ;;
        -hs|--host)        host="$2" ; shift ;;
        -p|--password)     password="$2" ; shift ;;
        -gs|--get-schema)  option="getschema";; 
        (--) shift; break;;
        (-*) echo "$0: error - unrecognized option $1" 1>&2; usage;;
        (*) break;;
      esac
      shift
    done
    # checks
    if [ "$csvoutput" == "" ]
    then
      error "output csv directory not set"
    fi
    if [ "$database" == "" ]
    then
      error "mysql database not set"
    fi
    if [ "$user" == "" ]
    then
      error "mysql user not set"
    fi
    if [ "$password" == "" ]
    then
      error "mysql password not set"
    fi
    color_msg $blue "exporting tables of database $database"
    if [ "$tables" = "" ]
    then
      tables=$(echo "show tables" | domysql)
    fi
    case $option in
      getschema)
        rm -f $csvoutput$database.schema
        for table in $tables
        do
          color_msg $blue "getting schema for $table"
          echo -n "$table:" >> $csvoutput$database.schema
          getcolumns $table >> $csvoutput$database.schema
        done
        ;;
    esac
    for table in $tables
    do
      color_msg $blue "exporting table $table"
      cols=$(grep "$table:" $csvoutput$database.schema | cut -f2 -d:)
      if [ "$cols" = "" ]
      then
        cols=$(getcolumns $table)
      fi
      ssh $host rm $mysqlfiles/$table.csv
    cat <<EOF | mysql --host $host -u$user --password=$password $database
    SELECT $cols FROM $table INTO OUTFILE '$mysqlfiles$table.csv'
      FIELDS TERMINATED BY ',' ENCLOSED BY '"'
      LINES TERMINATED BY '\n';
    EOF
      scp $host:$mysqlfiles/$table.csv $csvoutput$table.csv.raw
      (echo "$cols"; cat $csvoutput$table.csv.raw) > $csvoutput$table.csv
      rm $csvoutput$table.csv.raw
    done
    7 upvotes
  • csdnceshi65 larry*wei 9 years ago

    MySQL Workbench can export recordsets to CSV, and it seems to handle commas in fields very well. The CSV opens up in OpenOffice fine.

    7 upvotes
  • csdnceshi53 Lotus@ 3 years ago

    Tiny bash script for doing simple query to CSV dumps, inspired by https://stackoverflow.com/a/5395421/2841607.

    #!/bin/bash
    # $1 = query to execute
    # $2 = outfile
    # $3 = mysql database name
    # $4 = mysql username (defaults to the current user)
    if [ -z "$1" ]; then
        echo "Query not given"
        exit 1
    fi
    if [ -z "$2" ]; then
        echo "Outfile not given"
        exit 1
    fi
    MYSQL_DB=""
    MYSQL_USER=$(whoami)
    if [ ! -z "$3" ]; then
        MYSQL_DB=$3
    fi
    if [ ! -z "$4" ]; then
        MYSQL_USER=$4
    fi
    if [ -z "$MYSQL_DB" ]; then
        echo "Database name not given"
        exit 1
    fi
    if [ -z "$MYSQL_USER" ]; then
        echo "Database user not given"
        exit 1
    fi
    mysql -u $MYSQL_USER -p -D $MYSQL_DB -B -s -e "$1" | sed "s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g" > $2
    echo "Written to $2"
    7 upvotes
  • csdnceshi50 三生石@ 3 years ago

    If you have PHP set up on the server, you can use mysql2csv to export an (actually valid) CSV file for an arbitrary MySQL query. See my answer at MySQL - SELECT * INTO OUTFILE LOCAL ? for a little more context/info.

    I tried to maintain the option names from mysql so it should be sufficient to provide the --file and --query options:

    ./mysql2csv --file="/tmp/result.csv" --query='SELECT 1 as foo, 2 as bar;' --user="username" --password="password"

    "Install" mysql2csv via

    wget https://gist.githubusercontent.com/paslandau/37bf787eab1b84fc7ae679d1823cf401/raw/29a48bb0a43f6750858e1ddec054d3552f3cbc45/mysql2csv -O mysql2csv -q && (sha256sum mysql2csv | cmp <(echo "b109535b29733bd596ecc8608e008732e617e97906f119c66dd7cf6ab2865a65  mysql2csv") || (echo "ERROR comparing hash, Found:" ;sha256sum mysql2csv) ) && chmod +x mysql2csv

    (download content of the gist, check checksum and make it executable).

    6 upvotes
  • csdnceshi55 ~Onlooker 11 years ago

    The OUTFILE solution given by Paul Tomblin causes a file to be written on the MySQL server itself, so this will work only if you have FILE access, as well as login access or other means for retrieving the file from that box.

    If you don't have such access, and tab-delimited output is a reasonable substitute for CSV (e.g., if your end goal is to import to Excel), then Serbaut's solution (using mysql --batch and optionally --raw) is the way to go.

    6 upvotes
  • csdnceshi58 Didn"t forge 9 years ago

    Unix/Cygwin only, pipe it through 'tr':

    mysql <database> -e "<query here>" | tr '\t' ',' > data.csv

    N.B.: This handles neither embedded commas, nor embedded tabs.
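    A small sketch (illustration only) of why that caveat matters: a field containing a comma survives Python's csv module but not a blind character substitution:

    ```python
    import csv
    import io

    row = ["1", "Smith, John", "Ontario"]

    # Blind substitution (what tr does): the embedded comma becomes a fake delimiter.
    naive = "\t".join(row).replace("\t", ",")
    print(naive.split(","))  # 4 apparent fields instead of 3

    # The csv module quotes the field that contains the delimiter.
    buf = io.StringIO()
    csv.writer(buf).writerow(row)
    print(buf.getvalue().strip())  # 1,"Smith, John",Ontario
    ```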

    6 upvotes
  • weixin_41568126 乱世@小熊 8 years ago

    All of the solutions here to date, except the MySQL Workbench one, are incorrect and quite possibly unsafe (i.e., security issues) for at least some possible content in the MySQL db.

    MySQL Workbench (and similarly phpMyAdmin) provide a formally correct solution, but they are designed for downloading the output to a user's location. They're not so useful for things like automating data export.

    It is not possible to generate reliably correct CSV from the output of mysql -B -e 'SELECT ...' because that cannot encode carriage returns and white space in fields. The '-s' flag to mysql does do backslash escaping, and might lead to a correct solution. However, using a scripting language (one with decent internal data structures, that is, not bash) and libraries where the encoding issues have already been carefully worked out is far safer.

    I thought about writing a script for this, but as soon as I thought about what I'd call it, it occurred to me to search for pre-existing work by the same name. While I haven't gone over it thoroughly, the solution at https://github.com/robmiller/mysql2csv looks promising. Depending on your application, the yaml approach to specifying the SQL commands might or might not appeal though. I'm also not thrilled with the requirement for a more recent version of ruby than comes as standard with my Ubuntu 12.04 laptop or Debian Squeeze servers. Yes I know I could use RVM, but I'd rather not maintain that for such a simple purpose.
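    As a sketch of that last idea, assuming the default --batch escaping (tab, newline, NUL and backslash inside values become \t, \n, \0 and \\ per the MySQL manual; unescape and batch_to_csv are hypothetical names):

    ```python
    import csv

    # mysql --batch escapes special characters inside values, so real tabs only
    # separate columns and real newlines only separate rows; decoding per line is safe.
    DECODE = {"t": "\t", "n": "\n", "0": "\0", "\\": "\\"}

    def unescape(field):
        out, chars = [], iter(field)
        for ch in chars:
            if ch == "\\":
                nxt = next(chars, "")
                out.append(DECODE.get(nxt, nxt))
            else:
                out.append(ch)
        return "".join(out)

    def batch_to_csv(src, dst):
        """Read mysql --batch output from src, write properly quoted CSV to dst."""
        writer = csv.writer(dst)
        for line in src:
            writer.writerow([unescape(f) for f in line.rstrip("\n").split("\t")])
    ```

    A script built around batch_to_csv(sys.stdin, sys.stdout) could then sit at the end of a mysql -B -e '...' pipeline.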

    Hopefully someone will point out a suitable tool, that's had a bit of testing. Otherwise I'll probably update this when I find or write one.

    5 upvotes
  • csdnceshi53 Lotus@ 10 years ago

    How about:

    mysql your_database -p < my_requests.sql | awk '{print $1","$2}' > out.csv
    5 upvotes
  • csdnceshi78 程序go 3 years ago

    Also, if you're performing the query on the Bash command line, I believe the tr command can be used to substitute an arbitrary delimiter for the default tabs.

    $ echo "SELECT * FROM Table123" | mysql Database456 | tr "\t" ,

    4 upvotes
  • csdnceshi62 csdnceshi62 4 years ago

    This answer uses Python and a popular third party library, PyMySQL. I'm adding it because Python's csv library is powerful enough to correctly handle many different flavors of .csv and no other answers are using Python code to interact with the database.

    import contextlib
    import csv
    import datetime
    import os

    # https://github.com/PyMySQL/PyMySQL
    import pymysql

    SQL_QUERY = """
    SELECT * FROM my_table WHERE my_attribute = 'my_attribute';
    """

    # embedding passwords in code gets nasty when you use version control
    # the environment is not much better, but this is an example
    # http://stackoverflow.com/questions/12461484/is-it-secure-to-store-passwords-as-environment-variables-rather-than-as-plain-t
    SQL_USER = os.environ['SQL_USER']
    SQL_PASS = os.environ['SQL_PASS']

    connection = pymysql.connect(host='localhost',
                                 user=SQL_USER,
                                 password=SQL_PASS,
                                 db='my_database')

    with contextlib.closing(connection):
        with connection.cursor() as cursor:
            cursor.execute(SQL_QUERY)
            # Hope you have enough memory :)
            results = cursor.fetchall()

    output_file = 'my_query-{}.csv'.format(datetime.datetime.today().strftime('%Y-%m-%d'))
    with open(output_file, 'w', newline='') as csvfile:
        # http://stackoverflow.com/a/17725590/2958070 about lineterminator
        csv_writer = csv.writer(csvfile, lineterminator='\n')
        csv_writer.writerows(results)
    4 upvotes
  • csdnceshi70 笑故挽风 6 years ago

    Building on user7610's answer, here is the best way I found to do it. With mysql INTO OUTFILE I had 60 minutes of file-ownership and overwriting problems.

    It's not cool, but it worked in 5 minutes.

    php csvdump.php localhost root password database tablename > whatever-you-like.csv

    <?php

    $server = $argv[1];
    $user = $argv[2];
    $password = $argv[3];
    $db = $argv[4];
    $table = $argv[5];

    mysql_connect($server, $user, $password) or die(mysql_error());
    mysql_select_db($db) or die(mysql_error());

    // fetch the data
    $rows = mysql_query('SELECT * FROM ' . $table);
    $rows || die(mysql_error());

    // create a file pointer connected to the output stream
    $output = fopen('php://output', 'w');

    // output the column headings
    $fields = [];
    for ($i = 0; $i < mysql_num_fields($rows); $i++) {
        $field_info = mysql_fetch_field($rows, $i);
        $fields[] = $field_info->name;
    }
    fputcsv($output, $fields);

    // loop over the rows, outputting them
    while ($row = mysql_fetch_assoc($rows)) fputcsv($output, $row);
    4 upvotes
  • csdnceshi75 衫裤跑路 5 years ago

    This saved me a couple of times. Fast, and it works!

    sudo mysql -udemo_user -p -h127.0.0.1 --port=3306 \
       --default-character-set=utf8 --database=demo_database \
       --batch --raw < /var/demo_sql_query.sql > /var/demo_csv_export.csv
    3 upvotes
  • csdnceshi78 程序go 7 years ago

    Try this code:

    SELECT 'Column1', 'Column2', 'Column3', 'Column4', 'Column5'
    UNION ALL
    SELECT column1, column2,
    column3, column4, column5 FROM demo
    INTO OUTFILE '/tmp/demo.csv'
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'

    For more information: http://dev.mysql.com/doc/refman/5.1/en/select-into.html

    2 upvotes
  • csdnceshi61 derek5. 7 years ago

    Not exactly a CSV format, but the tee command from the MySQL client can be used to save the output into a local file:

    tee foobar.txt
    SELECT foo FROM bar;

    You can disable it using notee.

    The problem with SELECT … INTO OUTFILE …; is that it requires permission to write files at the server.

    2 upvotes
  • csdnceshi59 ℙℕℤℝ 9 years ago

    If PHP is installed on the machine you are using, you can write a PHP script to do that. It requires that the PHP installation have the MySQL extension.

    You can call the PHP interpreter from the command line like so:

    php --php-ini path/to/php.ini your-script.php

    I am including the --php-ini switch because you may need to use your own PHP configuration that enables the MySQL extension. On PHP 5.3.0+ that extension is enabled by default, so it is no longer necessary to use the configuration to enable it.

    Then you can write your export script like any normal PHP script:

    <?php

    #mysql_connect("localhost", "username", "password") or die(mysql_error());
    mysql_select_db("mydb") or die(mysql_error());

    $result = mysql_query("SELECT * FROM table_with_the_data p WHERE p.type = $typeiwant");
    $result || die(mysql_error());

    while ($row = mysql_fetch_row($result)) {
      $comma = false;
      foreach ($row as $item) {
        # Make it comma separated
        if ($comma) {
          echo ',';
        } else {
          $comma = true;
        }
        # Quote the quotes
        $quoted = str_replace("\"", "\"\"", $item);
        # Quote the string
        echo "\"$quoted\"";
      }
      echo "\n";
    }
    The advantage of this method is that it has no problems with varchar and text fields containing newlines. Those fields are correctly quoted, and the newlines in them will be interpreted by a CSV reader as part of the text, not as record separators. That is something that is hard to correct afterwards with sed or the like.
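    To see that behaviour concretely, here is a quick sketch with Python's csv reader (any correct CSV library should behave the same way):

    ```python
    import csv
    import io

    # A quoted CSV field may contain a literal newline; a CSV reader
    # keeps it as part of the field instead of starting a new record.
    data = '"id","note"\n"1","line one\nline two"\n'
    rows = list(csv.reader(io.StringIO(data)))
    print(rows)  # [['id', 'note'], ['1', 'line one\nline two']]
    ```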

    2 upvotes
  • csdnceshi80 胖鸭 7 years ago
    1. Logic:

    CREATE TABLE csv_table ENGINE=CSV AS SELECT data FROM other_table;

    (csv_table is a placeholder; the table name was elided in the original statement. Note that the CSV storage engine does not support NULL columns.)

    When you create a CSV table, the server creates a table format file in the database directory. The file begins with the table name and has an .frm extension. The storage engine also creates a data file. Its name begins with the table name and has a .CSV extension. The data file is a plain text file. When you store data into the table, the storage engine saves it into the data file in comma-separated values format.

    2 upvotes
  • csdnceshi71 Memor.の 8 years ago

    Using the solution posted by Tim, I created this bash script to facilitate the process (root password is requested, but you can modify the script easily to ask for any other user):

    #!/bin/bash
    if [ "$1" == "" ]; then
        echo "Usage: $0 DATABASE TABLE [EXTRA MYSQL OPTIONS]"
        exit 1
    fi
    DBNAME=$1
    TABLE=$2
    FNAME=$DBNAME.$TABLE.csv
    MCOMM=$3
    echo "MySQL password:"
    stty -echo
    read PASS
    stty echo
    mysql -uroot -p$PASS $MCOMM $DBNAME -B -e "SELECT * FROM $TABLE;" | sed "s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g" > $FNAME

    It will create a file named: database.table.csv

    2 upvotes
  • csdnceshi75 衫裤跑路 7 years ago

    To expand on previous answers, the following one-liner exports a single table as a tab-separated file. It's suitable for automation, exporting the database every day or so.

    mysql -B -D mydatabase -e 'select * from mytable'

    Conveniently, we can use the same technique to list out MySQL's tables, and to describe the fields on a single table:

    mysql -B -D mydatabase -e 'show tables'
    mysql -B -D mydatabase -e 'desc users'
    Field   Type    Null    Key Default Extra
    id  int(11) NO  PRI NULL    auto_increment
    email   varchar(128)    NO  UNI NULL    
    lastName    varchar(100)    YES     NULL    
    title   varchar(128)    YES UNI NULL    
    userName    varchar(128)    YES UNI NULL    
    firstName   varchar(100)    YES     NULL    
    1 upvote
  • csdnceshi74 7*4 10 years ago

    This is simple, and it works on anything without needing batch mode or output files:

    select concat_ws(',',
        concat('"', replace(field1, '"', '""'), '"'),
        concat('"', replace(field2, '"', '""'), '"'),
        concat('"', replace(field3, '"', '""'), '"'))
    from your_table where etc;


    1. Replace " with "" in each field --> replace(field1, '"', '""')
    2. Surround each result in quotation marks --> concat('"', result1, '"')
    3. Place a comma between each quoted result --> concat_ws(',', quoted1, quoted2, ...)

    That's it!
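    The same quoting steps can be mirrored in Python to sanity-check a row; csv_field and csv_row are made-up helper names:

    ```python
    def csv_field(value):
        # steps 1 and 2: double any embedded quotes, then wrap in quotes
        return '"' + str(value).replace('"', '""') + '"'

    def csv_row(*fields):
        # step 3: join the quoted results with commas, like concat_ws(',')
        return ",".join(csv_field(f) for f in fields)

    print(csv_row('say "hi"', 42))  # "say ""hi""","42"
    ```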

    1 upvote
  • csdnceshi80 胖鸭 3 years ago

    You can run the following from the directory where your .sql file is located:

    mysql -h <hostname/IP> -u<username> -p<password> <databasename> < <queryFile>.sql > <outputFile>

    For example:

    hostname - x.x.x.x

    username - username

    password - password

    database - employeeDB

    query file - employee.sql

    output file - outputFile.xls

    mysql -hx.x.x.x -uusername -ppassword employeeDB < employee.sql > outputFile.xls

    Make sure you execute the command from the directory where the .sql file is located, or give its full path in the command above.

  • weixin_41568184 叼花硬汉 3 years ago

    What worked for me:

    SELECT *
    FROM students
    WHERE foo = 'bar'
    LIMIT 0,1200000
    INTO OUTFILE './students-1200000.csv'
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'

    None of the solutions in this thread worked for my particular case: I had pretty-printed JSON data inside one of the columns, which would get messed up in the CSV output. For those with a similar problem, try LINES TERMINATED BY '\r\n' instead.

    Another problem for those trying to open the CSV with Microsoft Excel: keep in mind there is a limit of 32,767 characters that a single cell can hold; above that, the content overflows into the rows below. To identify which records in a column have the issue, use the query below. You can then truncate those records or handle them as you'd like.

    SELECT id,name,CHAR_LENGTH(json_student_description) AS 'character length'
    FROM students
    WHERE CHAR_LENGTH(json_student_description)>32767;
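    If the export has already happened, the same check can be run on the CSV file itself; a sketch (oversized_cells is a made-up name):

    ```python
    import csv
    import io

    EXCEL_CELL_LIMIT = 32767

    def oversized_cells(csv_text):
        """Yield (row_number, column_index) for cells Excel would overflow."""
        for r, row in enumerate(csv.reader(io.StringIO(csv_text)), start=1):
            for c, cell in enumerate(row):
                if len(cell) > EXCEL_CELL_LIMIT:
                    yield (r, c)

    sample = "id,json_student_description\n1,short\n2," + "x" * 40000 + "\n"
    print(list(oversized_cells(sample)))  # [(3, 1)]
    ```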