
I have a Node.js HTTP REST API server and a Node.js WebSocket (WS) server. I separated them into two entry points so they can be debugged and run independently of each other, but now the REST API needs to produce results inside the WS server: it must send specific data to whichever WS clients are currently connected. I want to push data from the REST server to the WS server without polling or inefficient DB queries. A vague idea formed in my mind: IPC. Is this the best way to do it? I have never done it before. I deploy both server programs on a Linux server machine, and I want debugging to stay as easy as it has been up to this point.

asked Aug 3, 2021 at 5:13
  • A common solution is a message bus / message queue like Kafka, RabbitMQ, NATS, etc. Otherwise you can create a REST endpoint on your WS server that can only be called by your REST API. Commented Aug 3, 2021 at 5:53
  • @Darem I solved the issue by creating a REST endpoint on my WS server (sketched just below these comments). You can post an answer if you wish to. Thank you. Commented Aug 6, 2021 at 12:40
  • I think there are more than enough articles out there that explain the topic very nicely, so an answer would not be enough to cover all the necessary details. For example dzone.com/articles/communicating-between-microservices Commented Aug 9, 2021 at 12:40
  • It seems like splitting them apart created extra complication. Have you considered unsplitting them? Commented Sep 14, 2022 at 20:08
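A minimal sketch of the approach from the comments: the WS server exposes a small internal HTTP endpoint bound to the loopback interface, and the REST server calls it whenever it has data for a connected client. The route /internal/notify, port 9000, and the userId-keyed client map are illustrative assumptions, not anything from the question; the sketch assumes Express and the ws package (v8+).

    // ws-server.js: public WebSocket endpoint plus a loopback-only internal HTTP endpoint
    const express = require('express');
    const { WebSocketServer } = require('ws');   // ws v8+; older versions export WebSocket.Server

    const wss = new WebSocketServer({ port: 8080 });   // public WS endpoint
    const clientsByUser = new Map();                   // hypothetical: one socket per userId

    wss.on('connection', (socket, req) => {
      const userId = new URL(req.url, 'http://localhost').searchParams.get('userId');
      clientsByUser.set(userId, socket);
      socket.on('close', () => clientsByUser.delete(userId));
    });

    // internal HTTP endpoint, bound to loopback only so it is unreachable from outside
    const internal = express();
    internal.use(express.json());
    internal.post('/internal/notify', (req, res) => {
      const { userId, payload } = req.body;
      const socket = clientsByUser.get(userId);
      if (socket) socket.send(JSON.stringify(payload));
      res.json({ delivered: Boolean(socket) });
    });
    internal.listen(9000, '127.0.0.1');

    // rest-server.js: push data to the WS server after handling a REST request
    const http = require('http');

    function notifyWsServer(userId, payload) {
      const body = JSON.stringify({ userId, payload });
      const req = http.request(
        { host: '127.0.0.1', port: 9000, path: '/internal/notify', method: 'POST',
          headers: { 'Content-Type': 'application/json', 'Content-Length': Buffer.byteLength(body) } },
        res => res.resume()                            // drain the response
      );
      req.on('error', err => console.error('WS server unreachable:', err.message));
      req.end(body);
    }

Both processes can still be started and debugged independently; if the WS server happens to be down, the notify call simply fails and can be logged or retried.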

1 Answer


There are multiple solutions for that:

  1. Create a REST API on your WS server and allow connections to it from the HTTP server on the same machine. You can even use a UNIX socket to expose this service only to the local machine, and configure the user/group of the socket file for better security. This approach is good if you need a response from the WS server for each call, i.e. for synchronous communication (see the first sketch after this list).
  2. Run a message queue on your local machine and connect the REST server as a producer and the WS server as a consumer. This is good if you need reliable message delivery: for example, if you are deploying a new version of the WS server at the moment a request arrives, the message queue is responsible for delivering the message after the restart. You can also run the message queue on multiple instances for better availability. Communication here is asynchronous, and the REST server won't get immediate responses (see the producer/consumer sketch after this list).
  3. Local communication on the same machine (IPC, shared memory, signals, etc.). This can be much faster than the first two options, but you won't be able to split the servers onto different machines later while keeping the communication channel (see the fork-based sketch after this list).
  4. Database. If you don't need an immediate reaction from the WS server, you can put the data to be processed into the database and pull it periodically from the WS process (see the polling sketch after this list).
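For option 1, the same internal-endpoint idea from the earlier sketch can listen on a UNIX domain socket instead of a TCP port, so only processes on this machine with the right file permissions can reach it. The socket path /tmp/ws-internal.sock and the /notify route are made-up names for illustration.

    // ws-server.js: expose the internal control endpoint on a UNIX domain socket
    const http = require('http');
    const fs = require('fs');

    const SOCKET_PATH = '/tmp/ws-internal.sock';
    if (fs.existsSync(SOCKET_PATH)) fs.unlinkSync(SOCKET_PATH); // remove a stale socket file

    const server = http.createServer((req, res) => {
      if (req.method === 'POST' && req.url === '/notify') {
        let body = '';
        req.on('data', chunk => (body += chunk));
        req.on('end', () => {
          const { userId, payload } = JSON.parse(body);
          // look up the user's WebSocket connection here and forward `payload`
          res.end('ok');
        });
      } else {
        res.statusCode = 404;
        res.end();
      }
    });

    server.listen(SOCKET_PATH, () => {
      fs.chmodSync(SOCKET_PATH, 0o600); // only the owning user may connect
    });

    // rest-server.js: call the WS server through the socket file instead of TCP
    const http = require('http');
    const SOCKET_PATH = '/tmp/ws-internal.sock';

    function notifyWsServer(userId, payload) {
      const body = JSON.stringify({ userId, payload });
      const req = http.request(
        { socketPath: SOCKET_PATH, path: '/notify', method: 'POST',
          headers: { 'Content-Type': 'application/json' } },
        res => res.resume()
      );
      req.on('error', err => console.error('WS server unreachable:', err.message));
      req.end(body);
    }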
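For option 2, a sketch using RabbitMQ (one of the brokers named in the comments) with the amqplib package. The queue name ws-events is an illustrative assumption, and in a real setup you would keep one long-lived connection and channel rather than opening one per message.

    // Assumes a RabbitMQ broker on localhost and the `amqplib` package; queue name is illustrative.
    const amqp = require('amqplib');

    // REST server side: publish an event for the WS server
    async function publishWsEvent(userId, payload) {
      const conn = await amqp.connect('amqp://localhost');
      const ch = await conn.createChannel();
      await ch.assertQueue('ws-events', { durable: true });
      ch.sendToQueue('ws-events',
        Buffer.from(JSON.stringify({ userId, payload })),
        { persistent: true });                 // message survives a broker restart
      await ch.close();
      await conn.close();
    }

    // WS server side: consume events and hand them to connected clients
    async function consumeWsEvents(deliverToClient) {
      const conn = await amqp.connect('amqp://localhost');
      const ch = await conn.createChannel();
      await ch.assertQueue('ws-events', { durable: true });
      ch.consume('ws-events', msg => {
        if (msg === null) return;              // consumer was cancelled
        const { userId, payload } = JSON.parse(msg.content.toString());
        deliverToClient(userId, payload);      // e.g. look up the socket and send
        ch.ack(msg);                           // acknowledge only after handling
      });
    }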
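For option 3, Node's built-in IPC channel is the simplest variant, but it assumes one process spawns the other with child_process.fork, which couples their lifecycles (unlike the two independent entry points in the question).

    // rest-server.js: spawn the WS server as a child and get an IPC channel for free
    const { fork } = require('child_process');

    const wsServer = fork('./ws-server.js');   // path is illustrative

    function notifyWsServer(userId, payload) {
      wsServer.send({ userId, payload });      // serialized and delivered over the IPC channel
    }

    // ws-server.js: receive messages from the parent process
    process.on('message', ({ userId, payload }) => {
      // look up the user's WebSocket connection here and forward `payload`
    });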
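For option 4, the WS server just polls the database on an interval; db here is a hypothetical data-access object standing in for whatever driver and queries you actually use.

    // WS server side: periodically pull undelivered events from the database
    function startPolling(db, deliverToClient, intervalMs = 5000) {
      setInterval(async () => {
        const events = await db.fetchPendingEvents();     // e.g. rows WHERE delivered = false
        for (const event of events) {
          deliverToClient(event.userId, event.payload);   // push over the matching WebSocket
          await db.markDelivered(event.id);
        }
      }, intervalMs);
    }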
answered Oct 12, 2021 at 6:30
