Let's say I have a simple jQuery POST script:
$(document).ready(function () {
    $.post("post-data.php")
        .done(function (data) {
            // Done function
        }).fail(function () {
            // Fail function
        });
});
Let's say I have this HTML:
<div id="page">
    <a href="link.php">Some link</a>
    <div id="jquery-data-response"></div>
</div>
If post-data.php takes 10 seconds to return data and I click on "Some link" within those 10 seconds, link.php only loads after the jQuery POST request has completed, so the user may have to wait up to 10 seconds.
This applies to jQuery GET requests as well.
Now I have two questions.
Where is the problem?
The issue at hand is how you manage interactions independently of the request. If you don't want the user to be able to click a link before a request has been fulfilled, then you have to manage that state yourself until the request completes.
For example, declare a flag in your code that determines whether or not your request has been resolved. It defaults to false. Then run your AJAX query; upon success, it will set requestComplete to true.
let requestComplete = false;

$.post("post-data.php")
    .done(function (data) {
        // this will prompt your listener to allow native execution
        requestComplete = true;
    })
    .fail(function () { ... });
Bind a listener to your link(s) that only allows interaction if the flag is true. The return value of an event handler determines whether or not the default browser behaviour should take place; in the case of clicking a link, that behaviour is following the link.
$(document).on('click', '.link', function (event) {
    return requestComplete;
});
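To see why returning the flag works, here is a plain-JavaScript sketch of the same gating logic with jQuery and the DOM stripped away (the handler name onLinkClick is just illustrative); the handler's return value stands in for jQuery's rule that returning false from a handler blocks the default action:

```javascript
// Flag starts false: the request has not resolved yet
let requestComplete = false;

// Stand-in for the click handler: returning false would block
// the browser's default action (following the link)
function onLinkClick() {
    return requestComplete;
}

// While the request is pending, clicks are swallowed
console.log(onLinkClick()); // false

// Once the done() callback flips the flag, clicks work again
requestComplete = true;
console.log(onLinkClick()); // true
```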
And so your markup will look something like this:
<div id="page">
    <a href="link.php" class="link">Some link</a>
    <div id="jquery-data-response"></div>
</div>
To demonstrate the blocking of a browser's default behaviour, explore this fiddle.
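Putting the pieces together, the whole flow can also be simulated without jQuery or a DOM. Note that fakePost below is a hypothetical stand-in for $.post (it responds synchronously for simplicity), and that depending on your requirements you may want to flip the flag in the fail handler as well, so that a failed request does not block the link forever:

```javascript
let requestComplete = false;

// Hypothetical stand-in for $.post: calls onDone when the
// "server" responds (synchronously here, for simplicity)
function fakePost(url, onDone, onFail) {
    try {
        onDone("some response");
    } catch (e) {
        onFail();
    }
}

// Click handler logic: false blocks the default navigation
function clickLink() {
    return requestComplete;
}

const blockedBefore = !clickLink();   // true: request still pending

fakePost("post-data.php", function (data) {
    requestComplete = true;           // done(): release the link
}, function () {
    requestComplete = true;           // fail(): also release, so the
                                      // user is never locked out
});

const allowedAfter = clickLink();     // true: link works again
console.log(blockedBefore, allowedAfter); // true true
```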