We have a Laravel API backed by a MariaDB database, and concurrent requests are creating duplicate records.
We assumed that implementing a queue would solve this by handling one request at a time, but we still get duplicate records.
The application checks for an existing record before inserting, but that check-then-insert pattern is not atomic, so two concurrent requests can both pass the check and then both insert the same record.
We are unable to use a unique constraint at the database level, partly because of Laravel's soft-delete mechanism, and partly because of business logic that must allow duplicates when one of the fields matches a certain set of values.
We use Redis as the queue driver.
Are we wrong to think that the queue should prevent these issues? Or did we make a mistake somewhere in implementing the queue?
This is a simplified version of the job that the controller dispatches:
class CreateRecordJob extends Job implements SelfHandling
{
    /** @var array */
    protected $data;

    public function __construct(array $data)
    {
        $this->data = $data;
    }

    public function handle()
    {
        $data = $this->data;

        // Check-then-insert: only create the record if no duplicate exists yet.
        if (!$this->hasDuplicate($data)) {
            $this->createRecord($data);
        }
    }
}
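To illustrate the race we suspect (the names and the array-backed "table" below are ours, not the real code): the duplicate check and the insert are two separate steps, so if two workers, or a worker and a web request, interleave between them, both pass the check before either has inserted.

```php
<?php
// Illustrative only: $store stands in for the database table, and the two
// "workers" A and B are interleaved by hand to show the race window.
$store = [];

function hasDuplicate(array $store, array $data): bool
{
    return in_array($data['key'], array_column($store, 'key'), true);
}

// Both workers run the duplicate check first...
$aSeesDuplicate = hasDuplicate($store, ['key' => 'x']); // false: table is empty
$bSeesDuplicate = hasDuplicate($store, ['key' => 'x']); // false: A has not inserted yet

// ...and only then insert, so the table ends up with two copies.
if (!$aSeesDuplicate) { $store[] = ['key' => 'x']; }
if (!$bSeesDuplicate) { $store[] = ['key' => 'x']; }

echo count($store), PHP_EOL; // prints 2
```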
In config/queue.php we have set Redis as the default queue driver:
'default' => env('QUEUE_DRIVER', 'redis'),
And also in config/queue.php we have this for the redis connection:
'redis' => [
'driver' => 'redis',
'connection' => 'default',
'queue' => 'default',
'expire' => 60,
],
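For context, a Laravel 5.1 worker for this connection is typically started like this; if more than one such process runs (for example, several spawned under Supervisor), jobs on the same queue are processed in parallel rather than one at a time:

```shell
php artisan queue:work redis --daemon --queue=default
```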