I am trying to write a Node.js application that accepts incoming requests from clients and then makes a call to a web service on a remote server to retrieve data.
const express = require('express')
const request = require('request')
const moment = require('moment')
const app = express()
app.get('/', (req, res) => {
  request('http://localhost/sleep.php', (error, response, body) => {
    res.send('get data at ' + moment().format())
  })
})
app.listen(3000)
The remote service is written in PHP:
<?php
sleep(10);
echo 'Return something!';
The problem is that if a new request comes in, Node.js seems to be blocked until the last callback has finished. How can I fix this? Any ideas?
Update:
I actually made two requests at the same time in Firefox, and the second request took almost 20 seconds.
Here's a quick demonstration that concurrent requests for the same URL are not dispatched in parallel by the browser, while requests for different URLs generally are. Adding a distinct value to the query string, e.g. localhost:3000/?1517849200341 using Date.now(), is one technique to work around this.
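For instance, a cache-busting request from the browser could look like the following (a minimal sketch, assuming the Node server from the question is listening on port 3000):

// Append Date.now() so each request has a distinct URL and the browser
// doesn't hold it back behind an identical in-flight request.
fetch('http://localhost:3000/?' + Date.now())
  .then(response => response.text())
  .then(body => console.log(body));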
(Broadly speaking, HTTP/1.1 pipelining is disabled in browsers; instead they open multiple TCP connections to the same origin. HTTP/2 multiplexes requests over a single connection, up to a maximum number of concurrent streams. The exact behaviour varies by browser, so click the buttons below and compare the timestamps to see how yours handles each case.)
async function log(fn) {
  console.log(Date());
  await Promise.all(fn());
  console.log(Date());
}
const req1 = 'https://httpbin.org/delay/1';
const req2 = 'https://nghttp2.org/httpbin/delay/1';
const req3 = 'https://httpbin.org/delay/1?a';
const req4 = 'https://httpbin.org/delay/1?b';
const req5 = 'https://httpbin.org/delay/1?c';
const req6 = 'https://httpbin.org/delay/1?d';
const req7 = 'https://nghttp2.org/httpbin/delay/1?a';
const req8 = 'https://nghttp2.org/httpbin/delay/1?b';
const req9 = 'https://nghttp2.org/httpbin/delay/1?c';
const req10 = 'https://nghttp2.org/httpbin/delay/1?d';
btn1.addEventListener('click', () => log(() => [
  fetch(req1),
  fetch(req1),
  fetch(req1),
  fetch(req1),
  fetch(req1),
]));
btn2.addEventListener('click', () => log(() => [
  fetch(req2),
  fetch(req2),
  fetch(req2),
  fetch(req2),
  fetch(req2),
]));
btn3.addEventListener('click', () => log(() => [
  fetch(req1),
  fetch(req3),
  fetch(req4),
  fetch(req5),
  fetch(req6),
]));
btn4.addEventListener('click', () => log(() => [
  fetch(req2),
  fetch(req7),
  fetch(req8),
  fetch(req9),
  fetch(req10),
]));
<button id=btn1>HTTP/1.1, same URLs</button>
<button id=btn2>HTTP/2, same URLs</button>
<button id=btn3>HTTP/1.1, different URLs</button>
<button id=btn4>HTTP/2, different URLs</button>
The Chrome cache is to blame. Open the Chrome dev tools in each tab, tick 'Disable cache' on the Network panel, then refresh each tab; you'll see the responses coming back concurrently. I assume Firefox has a similar cache setting somewhere. Or use Postman to make multiple requests...
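Another way to check, sketched below, is to fire the two requests from a separate Node script instead of the browser, so no browser cache or request coalescing is involved; both should finish after roughly 10 seconds rather than 10 and 20 (assuming the server from the question is listening on port 3000):

const http = require('http')

// Fire two requests at the same time and log when each one completes.
const started = Date.now()
for (let i = 1; i <= 2; i++) {
  http.get('http://localhost:3000/', res => {
    res.resume() // drain the response body
    res.on('end', () => {
      console.log('request ' + i + ' finished after ' + (Date.now() - started) + ' ms')
    })
  })
}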
Or, if you really want to see this working in multiple tabs, I guess you could also disable caching from the Node server (I don't recommend it for anything other than a proof of concept):
...
const nocache = require('nocache')
const app = express()
app.use(nocache())
...
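If you'd rather not add another dependency, a minimal sketch of the same idea is to set the cache header by hand in a small middleware (nocache itself sets a few more headers, such as Pragma and Expires); the rest of the server stays as in the question:

const express = require('express')
const app = express()

// Mark every response as non-cacheable so the browser won't hold a
// second identical request back while the first one is still in flight.
app.use((req, res, next) => {
  res.set('Cache-Control', 'no-store, no-cache, must-revalidate')
  next()
})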