I have a PHP app which receives information from an external source, and saves data to MySQL.
The algorithm is very simple. The data being sent is a string, and it works this way:
- Receive the data in $data. Ex: $data = "info"
- Check in the database whether "info" already exists. Ex: select count(*) from table where data="info"
- If "info" exists, exit.
- Otherwise, insert "info" into the database and exit.
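The steps above can be sketched roughly as follows. This is only an illustration of the flow described in the question, not a fix: it uses an in-memory SQLite database via PDO purely so the snippet is self-contained, but the queries have the same shape as the MySQL ones above, and the table name `t` and the `saveOnce()` helper are made up for the example.

```php
<?php
// Self-contained sketch of the check-then-insert flow from the question.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE t (data TEXT)');

function saveOnce(PDO $pdo, string $data): bool {
    // Step 2: check whether the value already exists.
    $stmt = $pdo->prepare('SELECT COUNT(*) FROM t WHERE data = ?');
    $stmt->execute([$data]);
    if ((int)$stmt->fetchColumn() > 0) {
        return false; // Step 3: already present, exit.
    }
    // <-- Race window: a second parallel request running the same
    //     check right now also sees COUNT(*) = 0 and will insert
    //     a duplicate before this request's INSERT commits.
    // Step 4: insert the new value.
    $ins = $pdo->prepare('INSERT INTO t (data) VALUES (?)');
    $ins->execute([$data]);
    return true;
}

var_dump(saveOnce($pdo, 'info')); // first call inserts: bool(true)
var_dump(saveOnce($pdo, 'info')); // sequential repeat is a no-op: bool(false)
```

Run sequentially this behaves as intended; the duplicates only appear when two requests hit the marked race window at the same time.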
When there are several parallel requests to the page, some duplicates appear. The data can be sent twice or more, but we need to save it just once.
The reason is that there is a time window between the check and the insert, during which the data has not been inserted yet, so two parallel instances can both pass the check and insert the same value.
What's the correct way to handle this in PHP?
(The idea is not to add a unique index in MySQL. I know it would work, but it throws an error on duplicate inserts. Also, the data being sent is very long, and MySQL index keys have a length limit.)