From my Windows client, I have to send my data to the server as JSON like this:
[{"score":"MathJun: 90","shenase":"2981051261"},
{"score":"MathJun: 80","shenase":"2981021192"},
{"score":"ChemJun: 90","shenase":"2981027931"},
{"score":"MathFeb: 90","shenase":"2981060775"},
{"score":"MathJun: 90","shenase":"2981010824"},
{"score":"MathJun: 00","shenase":"2981017039"},
{"score":"ChemJun: 10","shenase":"3120292011"}]
The number of JSON objects is between 1 and 40. In my PHP file, a for loop inserts one record into my database for each JSON object, so the JSON string can get quite long. Someone told me to divide it into 5 parts and send it in 5 GET requests. Does that really make a difference?
What is the best solution for this job? Does the length of the JSON string cause problems, and how should I fix it? Does executing 40 queries in a for loop cause problems?
GET can cause problems here, because the data goes into the URL, which has practical length limits and no standardized maximum. If you want to send big data, it is better to use POST. You can refer to the link below for more details about GET and POST: http://www.w3schools.com/tags/ref_httpmethods.asp
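As a minimal sketch of the receiving side, assuming the client puts the JSON array in the body of a POST request (the 1-to-40 limit comes from the question; the error message is just an illustration):

```php
<?php
// Read the raw POST body and decode the JSON array sent by the client.
$body = file_get_contents('php://input');
$rows = json_decode($body, true);

// Reject anything that is not an array of 1 to 40 records.
if (!is_array($rows) || count($rows) < 1 || count($rows) > 40) {
    http_response_code(400);
    exit('Expected a JSON array of 1 to 40 records');
}
```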
And coming to your problem: if you send multiple requests, the server just stays busy handling them. It is better to send the data in a single request, and instead of running one INSERT per row in a for loop, insert all rows with a single multi-row statement, like this:
insert into table1 (First,Last) values ('Fred','Smith'), ('John','Smith'), ('Michael','Smith'), ('Robert','Smith');
I suggest you use POST, send the data in a single request, and build one query string,
$str="insert into table1 (First,Last) values ('Fred','Smith'), ('John','Smith'), ('Michael','Smith'), ('Robert','Smith')";
and execute that one query. It should be faster than 40 separate INSERTs.
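One caveat: concatenating the values straight into the SQL string opens you up to SQL injection. A safer variant is to build the multi-row INSERT with placeholders and bind the values. A minimal sketch, assuming a table `scores(shenase, score)` and an already-open PDO connection `$pdo` (both names are illustrative):

```php
<?php
// Build one multi-row INSERT with bound placeholders instead of
// concatenating values into the SQL string (avoids SQL injection).
// Using a literal JSON string here in place of the real POST body.
$json = '[{"score":"MathJun: 90","shenase":"2981051261"},
          {"score":"MathJun: 80","shenase":"2981021192"}]';
$rows = json_decode($json, true);

// One "(?, ?)" group per record, e.g. "(?, ?),(?, ?)" for two records.
$placeholders = implode(',', array_fill(0, count($rows), '(?, ?)'));
$sql = "INSERT INTO scores (shenase, score) VALUES $placeholders";

// Flatten the values in the same order as the placeholders.
$params = [];
foreach ($rows as $row) {
    $params[] = $row['shenase'];
    $params[] = $row['score'];
}

// $pdo is an assumed, already-open PDO connection:
// $stmt = $pdo->prepare($sql);
// $stmt->execute($params);
```

This still sends one statement to the database, so you keep the speed benefit of the single multi-row INSERT while letting the driver escape the values for you.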