I have a slight problem with an application I am working on. The application is a developer tool that dumps tables from a MySQL database to a JSON file, which the devs then grab using the Unix curl command. So far the tables we've been working with have been relatively small (2 GB or less), but we've recently moved into another stage of testing that uses fully populated tables (40 GB+), and my simple PHP script breaks. Here's my script:
<?php
$database = $_GET['db'];
ini_set('display_errors', 'On');
error_reporting(E_ALL);
# Connect
mysql_connect('localhost', 'root', 'root') or die('Could not connect: ' . mysql_error());
# Choose a database
mysql_select_db('user_recording') or die('Could not select database');
# Perform database query
$query = "SELECT * from `" . $database . "`";
$result = mysql_query($query) or die('Query failed: ' . mysql_error());
while ($row = mysql_fetch_object($result)) {
    echo json_encode($row);
    echo ",";
}
?>
My question is: what can I do to make this script handle larger database dumps more gracefully?
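For reference, here is an untested sketch of the direction I've been considering: using mysql_unbuffered_query() so the whole result set isn't pulled into PHP's memory at once, and flushing output as each row is written. I haven't tried this against the 40 GB tables yet, so treat it as a rough idea rather than a working fix.

<?php
$database = $_GET['db'];
ini_set('display_errors', 'On');
error_reporting(E_ALL);
# Connect
mysql_connect('localhost', 'root', 'root') or die('Could not connect: ' . mysql_error());
# Choose a database
mysql_select_db('user_recording') or die('Could not select database');
# Unbuffered query: rows are streamed from the server instead of
# being buffered into PHP's memory all at once
$result = mysql_unbuffered_query("SELECT * from `" . $database . "`")
    or die('Query failed: ' . mysql_error());
echo "[";
$first = true;
while ($row = mysql_fetch_object($result)) {
    if (!$first) {
        echo ",";
    }
    echo json_encode($row);
    $first = false;
    flush(); # push output to the client as rows are produced
}
echo "]";
?>

The [ and ] wrappers and the comma handling are only there so the output is a single valid JSON array rather than a bare list of comma-separated objects; the core change is the unbuffered query plus flushing as the rows go out.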