
I'm having a problem setting up inter-process communication for my Python application. I have two Python scripts, let's say A and B. A opens a huge file, keeps it in memory, and does some processing that MySQL can't do, and B is a process that queries A very often.

Since the file A needs to read is really large, I hope to read it once and have A hang there waiting for my B processes to query it.

What I do now is use CherryPy to build an HTTP server. However, that feels awkward, since what I'm trying to do is entirely local. So I'm wondering: are there other, more natural ways to achieve this?

I don't know much about TCP/sockets etc. If possible, toy examples would be appreciated (please include the part that reads the file).

Charles
asked Jun 14, 2013 at 23:35

1 Answer


Python has good support for ZeroMQ, which is much easier and more robust than using raw sockets.

The ZeroMQ site treats Python as one of its primary languages and offers copious Python examples in its documentation. Indeed, the example in "Learn the Basics" is written in Python.
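To make this concrete, here is a minimal REQ/REP sketch of the setup the question describes: script A loads the file once and answers queries, and script B connects and queries it. This is my own illustration, not from the ZeroMQ docs; it assumes `pyzmq` is installed (`pip install pyzmq`), and the filename, port, and line-number query protocol are placeholders. For the demo, A runs in a thread so both sides fit in one runnable snippet; in practice A and B would be separate scripts.

```python
import threading
import zmq

def serve(path, ctx, addr, n_requests):
    # Script A: read the big file once and keep it in memory.
    with open(path) as f:
        lines = f.readlines()
    sock = ctx.socket(zmq.REP)
    sock.bind(addr)
    # Answer a fixed number of requests so the demo terminates;
    # a real server would loop forever.
    for _ in range(n_requests):
        lineno = int(sock.recv().decode())            # query: a line number
        sock.send(lines[lineno].rstrip("\n").encode())  # reply: that line
    sock.close()

ctx = zmq.Context()
with open("bigfile.txt", "w") as f:  # stand-in for the huge file
    f.write("alpha\nbeta\ngamma\n")

addr = "tcp://127.0.0.1:5555"  # placeholder port
t = threading.Thread(target=serve, args=("bigfile.txt", ctx, addr, 2))
t.start()

# Script B: connect and query repeatedly.
req = ctx.socket(zmq.REQ)
req.connect(addr)
req.send(b"1")
reply1 = req.recv().decode()
print(reply1)  # -> beta
req.send(b"2")
reply2 = req.recv().decode()
print(reply2)  # -> gamma
req.close()
t.join()
ctx.term()
```

The REQ/REP pair enforces a strict send/receive alternation, which matches the "B queries A" pattern here; for purely local use you could also swap the `tcp://` address for `ipc:///tmp/myapp.sock` on Unix.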

answered Jun 14, 2013 at 23:41

1 Comment

Thx! I tried it out, and it works like a charm. Actually it's so simple that I just checked this page: nichol.as/zeromq-an-introduction
