I have a series of txt files holding info for around 200 people. This info is generated and exported 5 or 6 times a day, and each txt file has around 800 lines on average.
I set up a cron job that calls (from the PHP command line) a CodeIgniter controller that performs this process (sketched in code below):
- the constructor loads the model
- a method gets the txt files from a folder, removes blanks and special characters from each filename, and renames the file
- it returns the files' paths stored in an array
- another method loops through the files array and calls $this->process($file)
- process() reads each line of the file
- it ignores blank lines and splits each line into an array of values: array_filter(preg_split('/\s+/', $line));
- finally it calls model->insert_line($line)
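Here is a minimal sketch of what the controller does (simplified for the question; the class and model names Import and line_model, the cron entry point run(), and the folder path are placeholders, the rest follows the steps above):

```php
<?php
// Simplified sketch of the controller; the real code follows this shape.
class Import extends CI_Controller {

    public function __construct()
    {
        parent::__construct();
        $this->load->model('line_model'); // constructor loads the model
    }

    // Entry point called by cron, e.g.: php index.php import run
    public function run()
    {
        foreach ($this->get_files() as $file) {
            $this->process($file);
        }
    }

    // Collects the txt files, strips blanks/special chars from each
    // filename, renames the file, and returns the new paths.
    private function get_files()
    {
        $paths = array();
        foreach (glob('/path/to/exports/*.txt') as $path) { // path is illustrative
            $clean = preg_replace('/[^A-Za-z0-9._-]/', '', basename($path));
            $new   = dirname($path) . '/' . $clean;
            rename($path, $new);
            $paths[] = $new;
        }
        return $paths;
    }

    // Reads a file line by line, skips blanks, splits each line on
    // whitespace, and inserts the values through the model.
    private function process($file)
    {
        foreach (file($file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
            $values = array_filter(preg_split('/\s+/', $line));
            if (!empty($values)) {
                $this->line_model->insert_line($values); // one INSERT per line
            }
        }
    }
}
```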
How could I:
1- optimize the code so I can lower the 2 min (avg) execution time of each cron call? Each execution processes 5 or 6 txt files of about 800 lines each.
2- set up the MySQL table so it can hold a very large quantity of records without trouble? Only 2 fields are stored: "code" int(2) and "fecha" timestamp, combined in a unique index (code, fecha).
I have a fast PC, and the table is set to InnoDB; its current definition is sketched below.
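For reference, this is roughly how the table is defined (a minimal sketch; the table name registros and creating it through CodeIgniter's $this->db->query() are assumptions, while the column types, the unique index, and the InnoDB engine are as stated above):

```php
<?php
// Rough sketch of the current table definition (run once, not per cron call).
// Table name "registros" is illustrative; columns and index are as described.
$this->db->query("
    CREATE TABLE registros (
        code  INT(2)    NOT NULL,
        fecha TIMESTAMP NOT NULL,
        UNIQUE KEY uq_code_fecha (code, fecha)
    ) ENGINE=InnoDB
");
```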