
I'm writing a script that shares data between two websites.

The stress test written by the other site's developer sends ~400 files to me in roughly 12 seconds. The issue is that when I look at the data being received, it looks as though I've been sent duplicates. I'm fairly confident he's sending the correct data, so I can only assume the problem is on my end.
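
To be concrete about what I mean by "duplicates": a quick check (illustrative only, not my exact code, and the log path is arbitrary) is to hash every payload as it arrives and watch for the same hash showing up for requests that should contain different data:

    // Illustrative duplicate check: log the tmp path and a hash of each payload;
    // repeated hashes across different requests point at duplicated data.
    $hash = md5_file($_FILES['file']['tmp_name']);
    file_put_contents(
        '/tmp/received.log',
        date('c') . ' ' . $_FILES['file']['tmp_name'] . ' ' . $hash . "\n",
        FILE_APPEND | LOCK_EX
    );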

The first thing I do after validating that the connection is from an authorized user is read the data from the upload using file_get_contents($_FILES['file']['tmp_name']).
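
Stripped down, the receiving endpoint looks roughly like this (the auth check and the storage step are omitted; this is a sketch of the flow, not the full script):

    <?php
    // Sketch of the receiving endpoint (auth and storage details omitted).

    // ... validate that the request comes from an authorized user ...

    if (!isset($_FILES['file']) || $_FILES['file']['error'] !== UPLOAD_ERR_OK) {
        http_response_code(400);
        exit;
    }

    $tmpPath = $_FILES['file']['tmp_name'];

    // Make sure this path really is the upload belonging to this request.
    if (!is_uploaded_file($tmpPath)) {
        http_response_code(400);
        exit;
    }

    $data = file_get_contents($tmpPath);

    // ... process/store $data ...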

I also wrote my own stress test. It sends ~400 files over about 22 seconds, and every request is processed correctly without reading the wrong tmp file's data. So my assumption is that the problem is timing-related: the faster the requests arrive, the more likely the issue is to appear.
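
My test is essentially a loop like the following (the endpoint URL and file paths are placeholders, and the real test spaces the requests out to hit ~400 files over ~22 seconds):

    <?php
    // Sketch of my stress-test sender: each iteration POSTs one file
    // as multipart/form-data to the receiving endpoint.
    $endpoint = 'https://example.com/receive.php'; // placeholder URL

    for ($i = 0; $i < 400; $i++) {
        $ch = curl_init($endpoint);
        curl_setopt_array($ch, [
            CURLOPT_POST           => true,
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_POSTFIELDS     => [
                'file' => new CURLFile("/path/to/payloads/payload_{$i}.json"), // placeholder path
            ],
        ]);
        curl_exec($ch);
        curl_close($ch);
    }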

What could be causing my script to read in data from a different tmp file?