RejectedSoftware Forums

Using Parallel with vibe.d

I have to do 3 requests; how can I do them in parallel?

auto category = redis.getDatabase().send("HGETALL category:1");
auto user = redis.getDatabase().send("HGETALL user:1");
auto test = redis.getDatabase().send("HGETALL test:3");

I tried async from vibe.core.concurrency, but I got worse performance.
I also tried .parallel, but it crashes since the threads get messed up.

How are you guys doing parallel requests? Something like Promise.all.

E.g. JavaScript promises:

Promise.all([p1, p2, p3]).then(values => { 
  console.log(values); // [3, 1337, "foo"] 
});

Re: Using Parallel with vibe.d

On Wed, 12 Oct 2016 15:51:31 GMT, JFaianca wrote:

I have to do 3 requests; how can I do them in parallel?

auto category = redis.getDatabase().send("HGETALL category:1");
auto user = redis.getDatabase().send("HGETALL user:1");
auto test = redis.getDatabase().send("HGETALL test:3");

I tried async from vibe.core.concurrency, but I got worse performance.
I also tried .parallel, but it crashes since the threads get messed up.

How are you guys doing parallel requests? Something like Promise.all.

E.g. JavaScript promises:

Promise.all([p1, p2, p3]).then(values => { 
  console.log(values); // [3, 1337, "foo"] 
});

It would be interesting to see why async made it slower. Generally, any kind of parallelism introduces some amount of overhead, so slightly worse overall throughput should be expected (except in cluster configurations). The latency of a single request, on the other hand, should ideally improve as long as there is any network latency involved: with, say, 1 ms of round-trip time per Redis request, three sequential requests pay roughly 3 ms of latency, while three parallel ones pay roughly 1 ms plus the scheduling overhead.

For vibe.d, the canonical tool would be runTask:

Response category, user, test;
// Spawn one task (fiber) per request and wait for all of them to finish.
auto t1 = runTask({ category = redis.getDatabase().send("HGETALL category:1"); });
auto t2 = runTask({ user = redis.getDatabase().send("HGETALL user:1"); });
auto t3 = runTask({ test = redis.getDatabase().send("HGETALL test:3"); });
t1.join();
t2.join();
t3.join();

async should result in more or less the same operations, but it has some overhead because it needs to do a dynamic memory allocation. It would be interesting to implement a scoped variant of this to avoid that. But if the above version is also considerably slower (in terms of throughput), I can try to investigate further.
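
For reference, the async-based variant presumably looks roughly like this (just a sketch, assuming vibe.core.concurrency's async/Future API with getResult() and the same redis handle as above):

import vibe.core.concurrency : async;

// Start the three requests; each future is computed in its own task.
auto fCategory = async({ return redis.getDatabase().send("HGETALL category:1"); });
auto fUser     = async({ return redis.getDatabase().send("HGETALL user:1"); });
auto fTest     = async({ return redis.getDatabase().send("HGETALL test:3"); });

// getResult() suspends the calling task (not the thread) until the value is ready.
auto category = fCategory.getResult();
auto user     = fUser.getResult();
auto test     = fTest.getResult();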

A vibe.core.concurrency.parallel() function could also be handy (this could also avoid the creation of heap delegates by guaranteeing that the delegates don't leave the function scope); a rough user-side sketch follows after the example below:

Response category, user, test;
parallel(
  { category = redis.getDatabase().send("HGETALL category:1"); },
  { user = redis.getDatabase().send("HGETALL user:1"); },
  { test = redis.getDatabase().send("HGETALL test:3"); }
);
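
Until something like that exists, a rough user-side approximation can be built on top of runTask (a hypothetical sketch, not part of vibe.d; since it still goes through runTask it does not avoid the heap delegates, it merely demonstrates the intended call shape):

import vibe.core.core : runTask;
import vibe.core.task : Task;

// Hypothetical helper: run each delegate in its own task and wait for all of them.
void parallelAll(void delegate()[] dgs...)
{
    auto tasks = new Task[](dgs.length);
    foreach (i, dg; dgs)
        tasks[i] = runTask(dg);
    foreach (t; tasks)
        t.join();
}

It would be called the same way as the proposed parallel() above.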

Re: Using Parallel with vibe.d

On Fri, 14 Oct 2016 16:50:32 GMT, Sönke Ludwig wrote:

For vibe.d, the canonical tool would be runTask:

Response category, user, test;
// Spawn one task (fiber) per request and wait for all of them to finish.
auto t1 = runTask({ category = redis.getDatabase().send("HGETALL category:1"); });
auto t2 = runTask({ user = redis.getDatabase().send("HGETALL user:1"); });
auto t3 = runTask({ test = redis.getDatabase().send("HGETALL test:3"); });
t1.join();
t2.join();
t3.join();

[...] But if the above version is also considerably slower (in terms of throughput), I can try to investigate further.

Performance comparison between the two (the runTask version first, then the sequential one):

Response category, user;

auto t1 = runTask({ category = redis.getDatabase().send("HGETALL category:1"); });
auto t2 = runTask({ user = redis.getDatabase().send("HGETALL user:1"); });
t1.join();
t2.join();

wrk -t12 -c400 -d5s http://127.0.0.1:8080/posts/

Running 5s test @ http://127.0.0.1:8080/posts/
  12 threads and 400 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   296.27ms  220.68ms   1.99s    91.42%
    Req/Sec   103.84     80.75   666.00     67.97%
  5483 requests in 5.10s, 6.35MB read
  Socket errors: connect 0, read 0, write 0, timeout 74
Requests/sec:   1075.68
Transfer/sec:      1.25MB

Response category, user;

category = redis.getDatabase().send("HGETALL category:1");
user = redis.getDatabase().send("HGETALL user:1");

wrk -t12 -c400 -d5s http://127.0.0.1:8080/posts/

Running 5s test @ http://127.0.0.1:8080/posts/
  12 threads and 400 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   108.04ms   79.12ms 974.15ms   95.19%
    Req/Sec   306.96    135.38     1.42k    75.43%
  17694 requests in 5.09s, 20.50MB read
Requests/sec:   3476.40
Transfer/sec:      4.03MB

The last one is 3 times faster, which just seems odd to me.

Re: Using Parallel with vibe.d

To make it clear: it is the same test in both cases.

The async (runTask) version is 3x slower than the synchronous one.

It looks a bit odd; I can add a full example on GitHub if needed.