This is a theoretical question. I know exactly what's going wrong; I just don't see a way around it.
I'm calling PHP page B from PHP page A in an AJAX request, and page B runs a query. I'm trying to get the results back to page A as a json_encoded array. The problem is that the encoded string grows so large that it maxes out PHP's memory_limit.
As a result I'm getting an error on page A:
[object Object] parsererror SyntaxError: JSON.parse: unexpected character
If I didn't need the data back as a json_encoded array for display, I would just echo each row as I fetched it rather than accumulate everything in one variable. But since I do need it as a json_encoded array, I have to store it all before encoding.
The only solution I can think of is to simply echo the data out on page B and change the way I'm handling the data on page A (i.e. not expecting json as the dataType).
Is there any other possible way (barring limiting the rows returned from the query)?
EDIT: I also don't want to change any settings in php.ini. There has to be a clever solution that doesn't involve mucking around with settings.
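For reference, page B boils down to something like this (a simplified sketch; PDO and the items table are stand-ins, since the real query doesn't matter here):

<?php
$pdo = new PDO("mysql:host=localhost;dbname=mydb", "user", "pass");

// the full result set gets pulled into memory at once...
$rows = $pdo->query("SELECT * FROM items")->fetchAll(PDO::FETCH_ASSOC);

// ...and json_encode() then builds one giant string on top of it;
// once memory_limit is hit, PHP emits an error into the output,
// which is what breaks JSON.parse on page A
echo json_encode($rows);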
Use pagination to request the data from the server. Essentially, you'll request n results at a time until you reach a page that returns 0 results. So instead of requesting 20k records at once, you'll request 1000 records 20 times in succession (or whatever number you page your data at). On the client, that looks like this:
var resultArr = [];
var url = "foo.php";

// called once every page has been fetched
function handleData(data) {
    // do stuff with the complete result set here
    console.log(data);
}

function getData(url, page) {
    $.ajax(url, { data: { page: page }, type: "GET", dataType: "json" })
        .done(function (data) {
            if (data.length !== 0) {
                // this page had rows: store them and fetch the next page
                resultArr = resultArr.concat(data);
                getData(url, page + 1);
            } else {
                // an empty page means we've reached the end
                handleData(resultArr);
            }
        });
}

getData(url, 0);
And on page B, limit your query to n rows per request, starting at row $_GET["page"]*n+1 (i.e. LIMIT n OFFSET $_GET["page"]*n in SQL).
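A minimal sketch of page B under that scheme, assuming PDO, a hypothetical items table, and a page size of 1000 (the table name and connection details are placeholders):

<?php
$pageSize = 1000; // n; must match what the client pages at
$page = isset($_GET["page"]) ? (int) $_GET["page"] : 0;

$pdo = new PDO("mysql:host=localhost;dbname=mydb", "user", "pass");
$stmt = $pdo->prepare("SELECT * FROM items LIMIT :offset, :limit");
$stmt->bindValue(":offset", $page * $pageSize, PDO::PARAM_INT);
$stmt->bindValue(":limit", $pageSize, PDO::PARAM_INT);
$stmt->execute();

// each response now holds at most $pageSize rows, so the
// json_encode() string stays well under the memory_limit
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

A request past the last page returns [], which is exactly what the client-side length check keys off to stop recursing.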