Is this performance normal?

Hello, I am using the RPC feature of Crossbar. My callee is as follows:

# -*- coding: utf-8 -*-
import datetime

from autobahn.asyncio.component import Component, run

component = Component(
    transports="ws://127.0.0.1:8080/ws",
    realm="realm1",
)

passwords = ["jack@123"]


# Register the procedure under "com.signature.proxy"; it returns a UTC
# timestamp if the password is known, otherwise the string "error".
@component.register("com.signature.proxy")
def proxy(password):
    if password in passwords:
        return datetime.datetime.utcnow().strftime("%Y-%m-%dT%H:%M:%SZ")
    return "error"


if __name__ == '__main__':
    run([component])

I call the procedure through Crossbar's HTTP bridge:

# -*- coding: utf-8 -*-
import time

import requests


class SignatureInvoke:

    def __init__(self):
        self.invoke_url = 'http://127.0.0.1:8080/invoke'

    def invoke(self):
        # One blocking HTTP POST to the bridge per call.
        payload = {
            'procedure': "com.signature.proxy",
            "args": ["jack@123"],
        }
        response = requests.post(self.invoke_url, json=payload)
        print(response.text)


if __name__ == '__main__':
    signature_invoke = SignatureInvoke()

    # Time 200 sequential calls.
    start = time.time()
    for index in range(200):
        signature_invoke.invoke()
    end = time.time()

    print(end - start)

The problem is: when I run the test locally, it is very fast: 200 requests complete in about 0.8 seconds. But when I deploy Crossbar on my server, it is slow: 200 requests take about 4 seconds to complete. My server configuration is as follows:

model name      : Intel(R) Xeon(R) Platinum 8269CY CPU @ 2.50GHz (2 cores)

MemTotal:        3892516 kB

Current bandwidth: 2 Mbps

Is all of this due to my server's performance and bandwidth? I don't know whether this performance is normal. :grinning:
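
For reference, this is roughly how a single call could be timed to compare the local and remote round trip (same URL and payload as above, just a rough sketch, not numbers I have posted here):

# -*- coding: utf-8 -*-
# Rough sketch: time one call to the HTTP bridge, to compare the per-request
# round trip against localhost vs. against the remote server.
import time

import requests

invoke_url = 'http://127.0.0.1:8080/invoke'
payload = {
    'procedure': "com.signature.proxy",
    "args": ["jack@123"],
}

start = time.perf_counter()
response = requests.post(invoke_url, json=payload)
wall = time.perf_counter() - start

print(response.text)
print("wall clock: %.1f ms" % (wall * 1000))
print("requests elapsed: %.1f ms" % (response.elapsed.total_seconds() * 1000))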

Hi there,
from a quick look, yes, this is what I would expect: compare the times reported by "ping localhost" and "ping yourserver". That is the factor of slowdown to expect from the network latency to the server. The actual problem is: your test is bogus ;) If you want to measure throughput, you need to issue many HTTP requests from your load client concurrently … to hide the non-zero latency.
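
As a back-of-the-envelope check: if the round trip to your server is, say, 15 ms (an assumed figure, check with ping), 200 strictly sequential requests already spend about 200 × 15 ms ≈ 3 s just waiting on the network, which is roughly the extra time you see. Below is a minimal sketch of a concurrent load client using a thread pool (URL, procedure and payload taken from your snippet; the worker count of 20 is just an example):

# -*- coding: utf-8 -*-
# Sketch of a concurrent load client: 200 calls issued through a thread pool
# so the network round trips overlap instead of adding up one after another.
import time
from concurrent.futures import ThreadPoolExecutor

import requests

invoke_url = 'http://127.0.0.1:8080/invoke'
payload = {
    'procedure': "com.signature.proxy",
    "args": ["jack@123"],
}


def invoke(_):
    return requests.post(invoke_url, json=payload).text


start = time.time()
with ThreadPoolExecutor(max_workers=20) as pool:
    results = list(pool.map(invoke, range(200)))
end = time.time()

print("%d calls in %.2f s" % (len(results), end - start))

With the requests overlapping like this, the total time should reflect the router's throughput rather than the sum of the round trips.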
Hope this helps,
Cheers,
/Tobias
