Redisson opening too many connections to add data to Redis queue

Redis: 4.9.103
Redisson: 3.7.5



I have a Spring boot application that uses Redisson to access Redis.



I have been struggling with a connection leak issue that ultimately causes the application to fail with the following "Unable to get connection" error:


[ERROR] – Servlet.service() for servlet [dispatcherServlet] in context with path threw exception [Request processing failed; nested exception is org.redisson.client.RedisTimeoutException: Unable to get connection! Node source: NodeSource [slot=null, addr=null, redisClient=null, redirect=null, entry=org.redisson.connection.SingleEntry@30bc754c], command: (RPUSH), command params: [PQ5, PooledUnsafeDirectByteBuf(ridx: 0, widx: 72, cap: 256)] after 3 retry attempts] with root cause
org.redisson.client.RedisTimeoutException: Unable to get connection! Node source: NodeSource [slot=null, addr=null, redisClient=null, redirect=null, entry=org.redisson.connection.SingleEntry@30bc754c], command: (RPUSH), command params: [PQ5, PooledUnsafeDirectByteBuf(ridx: 0, widx: 72, cap: 256)] after 3 retry attempts
at org.redisson.command.CommandAsyncService$8.run(CommandAsyncService.java:532) ~[redisson-3.6.5.jar!/:?]
at io.netty.util.HashedWheelTimer$HashedWheelTimeout.expire(HashedWheelTimer.java:668) ~[netty-common-4.1.24.Final.jar!/:4.1.24.Final]
at io.netty.util.HashedWheelTimer$HashedWheelBucket.expireTimeouts(HashedWheelTimer.java:743) ~[netty-common-4.1.24.Final.jar!/:4.1.24.Final]
at io.netty.util.HashedWheelTimer$Worker.run(HashedWheelTimer.java:471) ~[netty-common-4.1.24.Final.jar!/:4.1.24.Final]
at java.lang.Thread.run(Thread.java:745) [?:1.8.0_121]



The primary activity of the application is to get data from the customer and post it on a blocking queue.



To see exactly what happens to the connection count when data is added to the queue, I ran the following test program:


public static void main(String[] args) {
    RedissonClient redisson = Redisson.create();
    RBlockingQueue<String> pq = redisson.getBlockingQueue("PQString");
    for (int i = 0; i < 1_000; i++) {
        pq.add("x");
        try {
            Thread.sleep(100);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
    redisson.shutdown();
}



Before starting this program, there are 3 connections to Redis:


->watch -n 0.5 'lsof -i tcp:6379 | wc -l'

Every 0.5s: lsof -i tcp:6379 | wc -l
3



When the program runs, my connection count jumps to 69:


Every 0.5s: lsof -i tcp:6379 | wc -l
69



I was not expecting the connection count to go above 67 (64 default pool connections plus the 3 that existed to begin with). I am trying to find out what is using the additional 2 connections, and whether that explains why connections increase so rapidly in the actual application, which adds data to many such queues. There, connections keep increasing until they run out.
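For clarity, here is the arithmetic behind that expectation as a quick sanity check. The numbers are taken from the observations above (64 is Redisson's default single-server `connectionPoolSize`; 3 is the baseline `lsof` count before the program starts); nothing here is Redisson-specific:

```java
public class ConnectionCount {
    public static void main(String[] args) {
        int baseline = 3;          // connections open before the test program starts
        int defaultPoolSize = 64;  // Redisson default connectionPoolSize (single server)
        int expectedMax = baseline + defaultPoolSize;
        int observed = 69;         // lsof count while the test program runs

        System.out.println("expected max: " + expectedMax);              // 67
        System.out.println("unaccounted: " + (observed - expectedMax));  // 2
    }
}
```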



My Redisson configuration in the actual application is as follows:


Config config = new Config();
config.useSingleServer()
.setIdleConnectionTimeout(20_000)
.setPingTimeout(1000)
.setConnectTimeout(20_000)
.setTimeout(30_000)
.setRetryAttempts(3)
.setRetryInterval(1500)
.setReconnectionTimeout(3000)
.setFailedAttempts(3)
.setAddress("http://SERVER:PORT")
.setConnectionMinimumIdleSize(10)
.setConnectionPoolSize(64)
.setSubscriptionsPerConnection(5);



Why does the queue's add method open this many connections? What am I missing that is causing connections to leak?
