I am trying to use Autobahn|Python as a WebSocket client for the following use case:
I have a remote server which uses WebSockets to carry out JSON RPC commands.
A WebSocket client can send a JSON RPC command (with a command id) and the server asynchronously responds to that command with the same id.
The server can also asynchronously publish events on the same WebSocket connection, which the client code is expected to interpret as appropriate.
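For concreteness, the message shapes look roughly like this (field names are illustrative only, not my server’s actual schema):

```python
# Illustrative message shapes only -- field names are made up, not my server's schema.
command  = {"id": 17, "method": "get_status", "params": {"device": 4}}   # client -> server
response = {"id": 17, "result": {"status": "ok"}}                        # server -> client, matched by id
event    = {"event": "status_changed", "data": {"device": 4}}            # server-pushed, no id
```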
The problem I am facing is that I want the protocol handling to be detached from the main thread of my Python application. The event loop achieves that, but I also need synchronization between my main thread and the WebSocket event loop. For example, here’s a use case that I have not been able to find examples of in the community: send a JSON RPC command to the server and wait synchronously for the response to arrive.
How do I achieve this in Autobahn|Python? Are there any examples that are a bit fancier than a simple echo example?
This will depend on your event-loop. If you’re writing all-new code, I’d avoid threads entirely.
However, you can use https://pypi.org/project/crochet/ to call Twisted code from synchronous code. I’m not sure if there’s an asyncio equivalent, but loop.run_until_complete may work for your use-case.
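I haven’t tested this against Autobahn specifically, but with asyncio the usual pattern for blocking a non-loop thread on a coroutine is asyncio.run_coroutine_threadsafe, roughly:

```python
import asyncio
import threading

# Untested sketch: run the event loop in a background thread, then block the
# calling thread on a coroutine's result. send_command() is a stand-in for
# "send the JSON RPC message and await the matching response".

loop = asyncio.new_event_loop()
threading.Thread(target=loop.run_forever, daemon=True).start()

async def send_command():
    await asyncio.sleep(0.1)          # pretend network round-trip
    return {"id": 1, "result": "ok"}

future = asyncio.run_coroutine_threadsafe(send_command(), loop)
print(future.result(timeout=5))       # blocks this thread only, not the loop
```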
I may misunderstand your use-case, but if you’re doing what I think you’re doing, one possible solution would be to run two processes rather than threads and run an event loop in each process … then synchronise the processes via a Queue object or similar. The “multiprocessing” library pretty much makes this approach as easy as using threads … the benefit here is that you effectively get access to two discrete processes, i.e. twice the processing potential … (you “can” run an event loop per thread, but I’m not sure it’s “recommended”)
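Something like this, untested and deliberately simplified (in real code you’d hand the blocking queue reads off to an executor so they don’t stall the loop):

```python
import asyncio
from multiprocessing import Process, Queue

# Untested sketch: the worker process runs its own event loop for the
# WebSocket side; the main process sends commands and blocks on replies
# via multiprocessing Queues.

def ws_worker(commands: Queue, responses: Queue) -> None:
    async def run():
        while True:
            cmd = commands.get()      # blocks this loop -- fine for a sketch
            if cmd is None:
                break
            # ... send the JSON RPC command over the WebSocket here ...
            responses.put({"id": cmd["id"], "result": "ok"})
    asyncio.run(run())

if __name__ == "__main__":
    commands, responses = Queue(), Queue()
    worker = Process(target=ws_worker, args=(commands, responses))
    worker.start()
    commands.put({"id": 1, "method": "get_status"})
    print(responses.get())            # main process waits for the reply
    commands.put(None)                # tell the worker to exit
    worker.join()
```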
@oddjobz, thanks for the response. I was finally able to make autobahn work nicely with my app. As to your comment:
… (you “can” run an event loop per thread, but I’m not sure it’s “recommended”)
In my use case, the two threads (one for communication with the WebSocket client event loop and the other for handling the incoming events) are both IO-bound. I thought that in IO-bound situations, multiple threads yield good performance.
While multiple processes are a good idea, I am running all of this inside a Docker container, where the general recommendation seems to be not to run too many processes.
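For reference, this is the rough shape of the two-thread layout I ended up with (heavily simplified, identifiers are mine, not Autobahn’s):

```python
import asyncio
import queue
import threading

# Heavily simplified sketch of the layout: one thread runs the asyncio /
# WebSocket event loop, another consumes server-pushed events from a
# thread-safe queue. Both spend most of their time waiting on IO.

events: "queue.Queue[dict]" = queue.Queue()

def run_loop(loop: asyncio.AbstractEventLoop) -> None:
    asyncio.set_event_loop(loop)
    loop.run_forever()                # the Autobahn client protocol lives here

def handle_events() -> None:
    while True:
        event = events.get()          # blocks until the loop thread pushes an event
        if event is None:
            break
        print("handling event:", event)

loop = asyncio.new_event_loop()
threading.Thread(target=run_loop, args=(loop,), daemon=True).start()

handler = threading.Thread(target=handle_events)
handler.start()

# In the real code the protocol's onMessage() forwards server events; fake one here.
events.put({"event": "status_changed", "data": {"device": 4}})
events.put(None)                      # shut the handler down for this demo
handler.join()
```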
If you have IO-bound threads then I would expect better performance than with a single thread. However (!) the performance increase may be marginal, depending on the implementation of the IO drivers you are using. If the driver in question does not release the GIL (!) then it will prevent the other thread(s) from getting any air-time. Beyond that, you will still only have Python running on one thread at any given time. If on the other hand you use processes, you are more likely to get double the speed, as there is no GIL overlap and Python can run in both processes at the same time.
I’ve heard many people say that if you’re using asyncio, the benefits of using threads are very limited … and I wouldn’t disagree. Threads, on the other hand, are quite handy for solving blocking issues with asyncio, for example if you need to call a sync IO driver from your asyncio code and you don’t want to block your event loop.
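For that last case, loop.run_in_executor is the usual tool, something like this (untested):

```python
import asyncio
import time

# Sketch: push a blocking call onto a worker thread so the event loop stays free.
# blocking_read() stands in for a synchronous IO driver call.

def blocking_read() -> str:
    time.sleep(1)                     # pretend this is slow, blocking IO
    return "data"

async def main() -> None:
    loop = asyncio.get_running_loop()
    result = await loop.run_in_executor(None, blocking_read)   # None = default thread pool
    print(result)

asyncio.run(main())
```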