I have a long-running script (anywhere from 30 to 160 seconds, depending on settings) that requests data from an API using NuSOAP and, based on that data, builds one big insert query of ~1,000-4,000 rows. It then truncates a table and runs the insert.
When I run the script twice in close succession, it loses data. I want to prevent the script from being run twice simultaneously.
In the future this script will also be run every ~5-10 minutes via cron/Task Scheduler.
Currently I block simultaneous runs by checking whether a lock file exists:
<?php
header('Content-Type: application/json');
ignore_user_abort(true);
if (!file_exists('lock.txt')) {
    $lock = fopen('lock.txt', 'w');
    fclose($lock);
    // ~450 API requests using NuSOAP.
    // TRUNCATE `table`
    // INSERT ~1000-4000 rows into `table`
    $jsonArray = array(utf8_encode('script') => utf8_encode('finished'));
    unlink('lock.txt');
} else {
    $jsonArray = array(utf8_encode('script') => utf8_encode('locked'));
}
echo json_encode($jsonArray);
?>
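One alternative I've read about is flock(), which makes the check and the acquisition a single atomic step instead of the separate file_exists()/fopen() calls above. A minimal sketch of what I have in mind (the lock-file path and the acquire_lock() helper are just placeholders, not my real code):

```php
<?php
// Sketch of an flock()-based lock. flock() checks and acquires the lock
// atomically, so two requests can't both pass a "no lock yet" test.
function acquire_lock(string $path)
{
    // Mode 'c' creates the file if missing without truncating it.
    $fp = fopen($path, 'c');
    if ($fp === false) {
        return false;
    }
    // LOCK_NB makes the call non-blocking: fail at once if already locked.
    if (!flock($fp, LOCK_EX | LOCK_NB)) {
        fclose($fp);
        return false; // another instance is running
    }
    return $fp; // keep the handle open; the lock dies with the process
}

$lock = acquire_lock(sys_get_temp_dir() . '/import.lock');
if ($lock === false) {
    echo json_encode(array('script' => 'locked'));
    exit;
}
// ... the ~450 API requests and the TRUNCATE/INSERT would go here ...
flock($lock, LOCK_UN);
fclose($lock);
echo json_encode(array('script' => 'finished'));
```

One thing I like about this is that the OS releases the lock automatically if the process dies, so a crash mid-run would not leave a stale lock.txt behind the way my current code can.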
Is this a secure way of blocking a script from being run twice simultaneously? Is it better to check whether a MySQL column contains 'true' or 'false' instead of using a file?
Is there a better way?