I've made a function that should do long polling and fetch live data that is being "pushed" to me. Right now I'm testing against a JSON object that is formatted the way the real data will look once I receive it. It seems to be working accurately so far. I was merely wondering what you think about it? Would you refactor it somehow, or do it entirely another way?
var url = '../../path_to_script/respondents.json';

function fetchData() {
    $.ajax({
        url: url,
        method: 'GET',
        dataType: 'json',
        contentType: "application/json; charset=utf-8",
        cache: false,
        success: function (data) {
            //parseData(data);
            setTimeout(function () { fetchData() }, 5000);
            console.log(data);
        },
        error: function (data) {
            setTimeout(function () { fetchData() }, 5000)
        }
    });
}
Regards
I would make a couple of changes.

Change method to type; method isn't a valid parameter for $.ajax, so as written this is an error.

Drop contentType; having dataType: 'json' is enough to get those values.

I would also make the error handler report what went wrong:
error: function (xhr, status, errorThrown) {
    alert("There was an error processing your request. Please try again. Status: " + status);
}
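Putting both suggestions together, the call could look roughly like this (a sketch of your original fetchData with type swapped in, contentType removed, and the error handler from above; the 5-second retry is kept from your code):

var url = '../../path_to_script/respondents.json';

function fetchData() {
    $.ajax({
        url: url,
        type: 'GET',              // 'type' instead of 'method'
        dataType: 'json',         // contentType dropped; dataType is enough here
        cache: false,
        success: function (data) {
            console.log(data);
            setTimeout(function () { fetchData(); }, 5000);  // schedule the next poll
        },
        error: function (xhr, status, errorThrown) {
            alert("There was an error processing your request. Please try again. Status: " + status);
            setTimeout(function () { fetchData(); }, 5000);  // keep polling after a failure too
        }
    });
}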
Hope this helps. Cheers
This works as expected. Since you've wisely chosen to fire the setTimeout
once the request has returned, there can't be "overlapping" requests. That is a good thing.
Anyway, you could use jQuery's "new" deferred AJAX objects, which is probably a little more convenient.
(function _poll() {
    $.getJSON( url ).always(function( data ) {
        console.log( data );
        _poll();
    });
}());
Note: .always()
is brand new (jQuery 1.6).
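If you want to keep the 5-second pause from the original function rather than polling again immediately, the same deferred pattern works with a delay (a small sketch; the 5000 ms interval is taken from the question's code):

(function _poll() {
    $.getJSON( url ).always(function( data ) {
        console.log( data );
        // wait 5 seconds before firing the next request
        setTimeout( _poll, 5000 );
    });
}());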
Edit
Example: http://jsfiddle.net/rjgwW/6/
I suggest changing the events to:
success: function (data) {
    console.log(data);
},
complete: function () {
    setTimeout(function () { fetchData() }, 5000)
}
The complete callback is always called after success and error, so you only need the setTimeout line once, which is better.
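Put together, the polling function could look like this (a sketch of fetchData with the single setTimeout moved into complete; GET is the default, so type is omitted):

function fetchData() {
    $.ajax({
        url: url,
        dataType: 'json',
        cache: false,
        success: function (data) {
            console.log(data);
        },
        complete: function () {
            // complete fires after both success and error,
            // so the next poll is always scheduled exactly once
            setTimeout(function () { fetchData(); }, 5000);
        }
    });
}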