I need to make multiple HTTP requests to an external server via $.ajax. As far as I understand, these calls are asynchronous, meaning each request should fire without waiting for the preceding request to complete. In practice, however, that does not seem to hold. I have this function:
function gethttp(i, v) {
    // Build a multipart payload with the raw value for this item
    var data = new FormData();
    data.append("raw", v.raw);
    $.ajax({
        // Math.random() in the query string busts any intermediate caches
        url: "gethttp.php?cache=" + Math.random(),
        type: "POST",
        data: data,
        cache: false,
        dataType: "json",
        // Required so jQuery sends the FormData object untouched
        processData: false,
        contentType: false,
        success: function(data, textStatus, jqXHR) {
            renderdata(i, data);
        },
        error: function(jqXHR, textStatus, errorThrown) {
            // Render an empty result and flag the row on failure
            renderdata(i, {found: "0", bibs: ""});
            apply(i, "danger", false);
            console.log(jqXHR);
        }
    });
}
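For what it's worth, my mental model of the async behavior comes from a minimal test like the following (echo.php is just a stand-in endpoint, not part of my app). If $.ajax is truly asynchronous, both "sent" lines should log before either "done" line:

// Sanity check: two back-to-back requests should both be issued
// before either response comes back.
$.ajax({ url: "echo.php?n=1" }).done(function() { console.log("done 1"); });
console.log("sent 1");
$.ajax({ url: "echo.php?n=2" }).done(function() { console.log("done 2"); });
console.log("sent 2");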
I then use $.each to iterate through an array requests, which passes the variables i and v to gethttp() for each HTTP request:
$(document).ready(function() {
    $.each(requests, function(i, v) {
        gethttp(i, v);
    });
});
However, this process can take up to 5 minutes depending on the size of my requests array, which can range anywhere from 5 to 500 items. My expectation was that the whole batch should take roughly 2 seconds, since each HTTP request on its own takes about that long, but when I loop through the array as above the requests appear to execute sequentially and the entire process takes forever.
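One way I can check whether the requests really run one after another is with jQuery's global ajax events, logging a timestamp when each request is sent and when it completes. This is just a diagnostic sketch, not part of the app; if the requests are sequential, each "sent" line should only appear after the previous "completed" line:

// Diagnostic: timestamp every ajax request as it is sent and completed.
$(document).ajaxSend(function(event, jqXHR, settings) {
    console.log(performance.now().toFixed(0) + " ms  sent      " + settings.url);
});
$(document).ajaxComplete(function(event, jqXHR, settings) {
    console.log(performance.now().toFixed(0) + " ms  completed " + settings.url);
});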
Do I need to deploy this with Node.js in order to get parallel processing? If so, I would really appreciate a good tutorial on the subject. Or can I do this with jQuery itself, without resorting to Node.js?
Thanks