I have a PHP app which receives information from an external source and saves the data to MySQL.
The algorithm is very simple. The data being sent is a string, and it works this way:

1. The script receives the string.
2. It checks whether the string is already in the database.
3. If it isn't, it inserts it.
When there are several parallel requests to the page, duplicates appear: the data can be sent twice or more, but we need to save it just once.
The reason is that there is a time lapse between steps 2 and 3, when the data has not yet been inserted, so two instances can insert it at the same time.
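
For illustration, the vulnerable pattern looks something like this (the table name, column name, connection details, and the `$_POST` source are placeholders, not my actual code):

<?php
// Hypothetical setup: $data arrives from the external source.
$pdo  = new PDO('mysql:host=localhost;dbname=yourDb;charset=utf8mb4', 'user', 'pass');
$data = $_POST['data'];

// Step 2: check whether the string already exists.
$stmt = $pdo->prepare('SELECT 1 FROM yourTable WHERE columnName = :data');
$stmt->execute([':data' => $data]);

if ($stmt->fetchColumn() === false) {
    // <-- race window: a parallel request can insert the same $data right here
    // Step 3: insert it.
    $ins = $pdo->prepare('INSERT INTO yourTable (columnName) VALUES (:data)');
    $ins->execute([':data' => $data]);
}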
What's the correct way to handle this in PHP?
(The idea is not to add a unique index in MySQL. I know it would work, but it would throw an error on duplicate inserts. Also, the data being sent is very long, and index keys have a length limit.)
You can do the existence test and the INSERT in a single query:
INSERT INTO yourTable (columnName)
SELECT :data
FROM DUAL
WHERE NOT EXISTS (SELECT 1 FROM yourTable WHERE columnName = :data)
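
In PHP this could be executed with PDO along these lines (a sketch; the DSN, credentials, and names are placeholders). Note that the second occurrence of the value is bound to a separate placeholder, `:check`, because PDO rejects a repeated named placeholder when prepare emulation is turned off:

<?php
// Hypothetical connection; adjust the DSN and credentials to your setup.
$pdo = new PDO('mysql:host=localhost;dbname=yourDb;charset=utf8mb4', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$sql = 'INSERT INTO yourTable (columnName)
        SELECT :data
        FROM DUAL
        WHERE NOT EXISTS (SELECT 1 FROM yourTable WHERE columnName = :check)';

$stmt = $pdo->prepare($sql);
// Bind the same value to both placeholders.
$stmt->execute([':data' => $data, ':check' => $data]);

// rowCount() is 1 if the row was inserted, 0 if it already existed.
$inserted = $stmt->rowCount() > 0;

Because the existence test and the insert happen in a single statement on the server, there is no gap between steps 2 and 3 in the application code for a parallel request to slip into.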