I have the following scenario:
A local PC receives data samples via Bluetooth at 50,000 bits/sec. The data is sent via UDP to a server. The server in turn distributes the data via a web page/JavaScript and WebSockets to connected browsers, where the data is processed. Eventually, the results arriving from the browsers are passed back via UDP to the local PC.
So far I'm experimenting with a strictly local setup, i.e. everything runs on one machine with a four-core CPU. I've written server code in both Node.js and Go. In both cases there is significant data loss, i.e. not every sample sent via UDP is successfully received by the server, even when only one WebSocket client is connected.
Where is the bottleneck causing the loss? Is it the fact that everything runs on a local machine? Could it be that the WebSocket bandwidth is too small? Would I be better off with WebRTC? Or is it something else entirely?
It is hard to say exactly where the bottleneck is in your case.
But UDP is an unreliable protocol (it can lose data), while WebSockets (which run over TCP) are not. This means the messages are probably being lost by the process that reads or writes the UDP data. Such packet loss might occur, for example, because these applications are simply too slow to keep up with the incoming data, or because the socket buffers are too small to absorb fluctuations in reading/writing speed caused by process scheduling or similar.
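One thing worth trying on the Go side is to enlarge the UDP socket's receive buffer and make sure the loop that reads from the socket does nothing but read, handing packets off to another goroutine. Below is a minimal sketch; the port and buffer sizes are made-up values, and the OS may cap `SetReadBuffer` (e.g. `net.core.rmem_max` on Linux):

```go
package main

import (
	"log"
	"net"
)

func main() {
	// Hypothetical port; use whatever your local PC actually sends to.
	conn, err := net.ListenUDP("udp", &net.UDPAddr{IP: net.IPv4zero, Port: 9999})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	// Ask the kernel for a larger receive buffer so short stalls in the
	// reader don't immediately drop packets. The OS may silently cap this.
	if err := conn.SetReadBuffer(4 * 1024 * 1024); err != nil {
		log.Println("SetReadBuffer:", err)
	}

	// Hand packets off to a buffered channel so the read loop never blocks
	// on downstream work (e.g. broadcasting to WebSocket clients).
	packets := make(chan []byte, 1024)
	go func() {
		for p := range packets {
			_ = p // forward to WebSocket clients here
		}
	}()

	buf := make([]byte, 64*1024)
	for {
		n, _, err := conn.ReadFromUDP(buf)
		if err != nil {
			log.Println("read:", err)
			continue
		}
		msg := make([]byte, n)
		copy(msg, buf[:n])
		select {
		case packets <- msg:
		default:
			// Channel full: drop deliberately instead of blocking the reader.
		}
	}
}
```

Node.js has a similar knob: `dgram.createSocket` accepts a `recvBufferSize` option. If the loss disappears once the buffer is larger, the issue is bursty processing on the receiving side rather than raw bandwidth.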