I'm trying to import huge CSV files (> 1 GB each) into a MySQL database:
/**
 * @param string $file_path
 */
private function importFileContents($file_path)
{
    $query = sprintf("LOAD DATA LOCAL INFILE '%s'
        INTO TABLE file_import_contents
        FIELDS TERMINATED BY ';'
        LINES TERMINATED BY '\n'
        IGNORE 1 LINES
        (@col1, @col2, @col3, @col4, @col5, @col6, @col7, @col8, @col9, @col10, @col11, @col12, @col13, @col14, @col15, @col16, @col17, @col18, @col19, @col20, @col21)
        SET add_params=@col1, available=@col2, category_id=@col3, currency_id=@col4, delivery=@col5, description=@col6, model=@col8, modified_time=@col9, name=@col10, oldprice=@col11, param=@col12, pickup=@col13, picture=@col14, price=@col15, sales_notes=@col16, type=@col17, type_prefix=@col18, url=@col19, vendor=@col20, vendor_code=@col21
    ", addslashes($file_path));

    $em = $this->getContainer()->get('doctrine.orm.default_entity_manager');
    $connection = $em->getConnection();
    $statement = $connection->prepare($query);
    $statement->execute();
}
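For what it's worth, right after $statement->execute() I could also ask MySQL what it thinks happened; this is only a rough diagnostic sketch, and it assumes rowCount() reflects the LOAD DATA affected-row count on the PDO MySQL driver and that fetchAll() is available on the DBAL connection (2.x):

// Rough diagnostic sketch, placed right after $statement->execute() above:
// ask MySQL how many rows it actually loaded and whether it logged warnings
// (truncated values, skipped lines, etc.) for this statement.
$loadedRows = $statement->rowCount();
$warnings   = $connection->fetchAll('SHOW WARNINGS');
printf("MySQL reports %d rows loaded and %d warnings\n", $loadedRows, count($warnings));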
The problem is that when I run a count in the database after the import:
SELECT COUNT(*) FROM file_import_contents;
it returns 1 648 990 rows, but when I run this command in the terminal to count the contents of all the CSV files:
find ./ -type f -name "*csv*" -exec wc -w {} +
it reports 76 319 426 in total...
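(I realize wc -w counts words rather than lines, so that baseline may be off; as a rough cross-check I could count lines from PHP instead, roughly like this, where ./csv/*.csv is just a placeholder for wherever the files actually live:)

// Rough cross-check: count lines across the CSV files directly from PHP.
// The ./csv/*.csv glob is only a placeholder for the real file locations.
$totalLines = 0;
foreach (glob('./csv/*.csv') as $csvFile) {
    $handle = fopen($csvFile, 'rb');
    while (fgets($handle) !== false) {
        $totalLines++;
    }
    fclose($handle);
}
echo $totalLines, PHP_EOL;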
My PHP settings are:

upload_max_filesize = 32000M
max_execution_time = 300000
max_input_time = 600000
memory_limit = 1024M
...and in the script itself: ini_set('memory_limit', '-1');
Why is this happening, and how can I import these files in their entirety? Thanks for any advice.