duankuang7928 2012-10-19 15:00
Views: 30
Accepted

Dumping a large database to JSON with PHP

I have a slight problem with an application I am working on. The application is a developer tool that dumps tables from a MySQL database to a JSON file, which the devs grab using the Unix curl command. So far the tables we've been using have been relatively small (2GB or less), but recently we've moved into another stage of testing that uses fully populated tables (40GB+), and my simple PHP script breaks. Here's my script:

<?php

$database = $_GET['db'];

ini_set('display_errors', 'On');
error_reporting(E_ALL);

# Connect
mysql_connect('localhost', 'root', 'root') or die('Could not connect: ' . mysql_error());

# Choose a database
mysql_select_db('user_recording') or die('Could not select database');

# Perform database query
$query = "SELECT * from `".$database."`";
$result = mysql_query($query) or die('Query failed: ' . mysql_error());

while ($row = mysql_fetch_object($result)) {
   echo json_encode($row);
   echo ",";
}

?>

My question is: what can I do to make this script handle larger database dumps better?


3 answers

  • dsdvr06648 2012-10-19 15:21

    This is what I think the problem is:

    You are using mysql_query. mysql_query buffers the whole result set in memory, and mysql_fetch_object then just fetches rows from that buffer. For very large tables you simply don't have enough memory (most likely you are pulling all 40GB of rows into that one call).

    Use mysql_unbuffered_query instead. There is more info on the MySQL Performance Blog, where you can find some other possible causes for this behavior.
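
    For illustration, here is a minimal sketch of what the fetch loop could look like with mysql_unbuffered_query. It assumes the same connection and mysql_select_db() calls as in your script; the opening/closing brackets and the flush() call are just one way to stream valid JSON out to curl as rows arrive, not the only way to do it.

    <?php
    # Sketch only: assumes mysql_connect() and mysql_select_db() from the
    # original script have already run, and that $database is set.
    $query = "SELECT * FROM `" . $database . "`";

    # Unbuffered query: rows stream from the MySQL server one at a time
    # instead of the whole result set being copied into PHP memory first.
    $result = mysql_unbuffered_query($query) or die('Query failed: ' . mysql_error());

    echo "[";            # open a JSON array so the output is valid JSON
    $first = true;
    while ($row = mysql_fetch_object($result)) {
        if (!$first) {
            echo ",";
        }
        echo json_encode($row);
        $first = false;
        flush();         # push each row straight out to the curl client
    }
    echo "]";
    ?>

    Note that with an unbuffered result you must read all rows before issuing another query on the same connection, and mysql_num_rows() is not available until the result has been fully fetched.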

    This answer was accepted by the asker.
