There are several ways to handle requests that require heavy processing. I will base my answer on the assumption that you are using Node.js as the front-end server.
First you will have to decide whether the request that requires processing will be synchronous (the user waits for the server's response) or asynchronous (the user sends the request and forgets about it; at some point the server finishes processing and sends a notification to the client, via websocket or long polling).
Based on the answer to that question, the architecture of your back end will be different. If you choose synchronous, you have to ensure that the response does not take too long to generate, so I think the best option in this case is to build a compiled library, wrap it for Node.js, and call it directly from Node (a sketch of this follows). If you choose asynchronous, you can use queues to distribute the processing load to other processes and leave the Node.js server free to handle other requests.
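For the synchronous route, a minimal sketch could look like the following, assuming an Express server and a hypothetical compiled addon (the `heavy` module and its `process` function are illustrative names, not a real library):

```js
const express = require('express');                  // assumed web framework
const heavy = require('./build/Release/heavy.node');  // hypothetical compiled addon (e.g. built with node-gyp)

const app = express();

app.post('/process', express.json(), (req, res) => {
  // The CPU-bound work runs in compiled code. Note that a synchronous addon
  // call still blocks the event loop while it runs, which is why the response
  // must not take too long to generate.
  const result = heavy.process(req.body);            // hypothetical addon function
  res.json({ result });
});

app.listen(3000);
```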
When using a queue, Node.js would push messages containing the information needed for processing onto the queue, and other processes (workers) would monitor the queue, pick up those messages, and do the actual processing.
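As a rough sketch of this producer/worker pattern with RabbitMQ (the queue name and message shape are made up for the example; `amqplib` is the usual Node.js client):

```js
const amqp = require('amqplib');

// Producer: runs inside the Node.js web server, just enqueues the job.
async function enqueueJob(payload) {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertQueue('heavy-jobs', { durable: true });
  ch.sendToQueue('heavy-jobs', Buffer.from(JSON.stringify(payload)), { persistent: true });
  await ch.close();
  await conn.close();
}

// Worker: a separate process that does the actual heavy processing.
async function startWorker() {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertQueue('heavy-jobs', { durable: true });
  ch.consume('heavy-jobs', (msg) => {
    const job = JSON.parse(msg.content.toString());
    // ... do the heavy processing here ...
    ch.ack(msg); // acknowledge so the message is removed from the queue
  });
}
```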
You can also use queues with synchronous requests, but then you will have to ensure that there are always enough workers for the queue to drain quickly.
Some examples of queues are ZeroMQ, RabbitMQ, IronMQ, and SQS (AWS). There are many more; you have to see which one best fits your requirements.
Some advantages of the queuing approach are:
- Decoupling between modules
- Easy to scale
- Most queues have an HTTP interface, which makes them easy to use from any language (ZeroMQ is an exception because it is a library rather than a standalone broker, but it has bindings for many languages, so the effect is the same).
- Queues cushion the impact of request peaks.
As for the synchronous vs. asynchronous part, I will go into a little more depth to clear up any doubts.
There are two levels at which the call can be asynchronous or synchronous. The first level is the client's request to the server: will the client wait for the response, or will it make the request and go do other things while the server processes it? The second level is how the server handles the operation it must perform when it receives a request: in the synchronous case the server can attend to one request per process/thread, while in the asynchronous case it can serve multiple requests per thread.
In the case of Node, because it is JavaScript and has only one thread executing your code, the platform forces you to perform server operations that require I/O asynchronously. It turns out that this model can serve more requests than the synchronous one-request-per-thread model.
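A small illustration of what that asynchronous I/O looks like in Node (the file name is arbitrary):

```js
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  fs.readFile('./data.json', (err, data) => {   // non-blocking I/O
    if (err) { res.writeHead(500); return res.end(); }
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(data);
  });
  // Execution continues here immediately; while the file is being read,
  // the single Node thread can accept other requests.
}).listen(3000);
```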
When I asked whether the request is asynchronous or synchronous in your case, I was referring to the first level rather than the second. Since you are using Node, all your internal server operations involving I/O will be asynchronous. What you have to decide is whether your user will receive the result right away, in the response to the HTTP request they make, or through one of these three options (a sketch of the polling option follows the list):
- Polling the server to find out whether the operation has already completed.
- Receiving a notification via websocket.
- When refreshing the page.
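For the first option (polling), a minimal sketch could look like this, assuming Express and an in-memory job store; the route names and the `jobs` map are illustrative only:

```js
const express = require('express');
const crypto = require('crypto');

const app = express();
const jobs = new Map(); // jobId -> { status, result }

app.post('/jobs', express.json(), (req, res) => {
  const id = crypto.randomUUID();
  jobs.set(id, { status: 'pending', result: null });
  // This is where the message would be pushed onto the queue for a worker.
  res.status(202).json({ id });              // client keeps the id and "forgets" the request
});

app.get('/jobs/:id', (req, res) => {
  const job = jobs.get(req.params.id);
  if (!job) return res.sendStatus(404);
  res.json(job);                             // client polls until status === 'done'
});

app.listen(3000);
```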