Help: CLOSE_WAIT connections piling up on a Netty 4.x server

1. Summary: I recently built an app server with Netty 4.x. After deploying it, a large number of TCP connections end up in CLOSE_WAIT and the server stalls and can no longer accept new connections. The problem does not reproduce when I test locally.

2. Details:
1) When the server stalls, netstat -ano still shows the connections as ESTABLISHED, and a packet capture shows the client still sending data normally, while the server only replies with an ACK (at this point the server is already stuck: nothing appears on the console and nothing is logged).

(screenshot of the packet capture)

The packet at the bottom is the heartbeat.
As long as the client keeps the connection open, the state stays ESTABLISHED; only after the client disconnects does it change to CLOSE_WAIT. Only once did the server recover from this "stuck" state and actually log something (e.g. "user disconnected").
At first I assumed some step was blocking and causing this, so I took a thread dump with jstack; see the next item.
2) The jstack output:
(screenshot of the jstack output)

3) The official reference and "Netty in Action" both mention that inbound/outbound messages have to be released manually with ReferenceCountUtil.release(). My codecs, however, extend ByteToMessageDecoder and MessageToByteEncoder, and from their source both already take care of releasing the ByteBuf,
so the problem probably isn't there...
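
For reference, my understanding of the release pattern those sources describe is roughly the following; it only matters for a handler that receives a reference-counted message and does not pass it further down the pipeline (the class name MyBusinessHandler is made up purely for illustration):

    import io.netty.buffer.ByteBuf;
    import io.netty.channel.ChannelHandlerContext;
    import io.netty.channel.ChannelInboundHandlerAdapter;
    import io.netty.util.ReferenceCountUtil;

    // Hypothetical terminal handler: it consumes the message itself, so it must release it.
    public class MyBusinessHandler extends ChannelInboundHandlerAdapter {
        @Override
        public void channelRead(ChannelHandlerContext ctx, Object msg) {
            try {
                if (msg instanceof ByteBuf) {
                    ByteBuf buf = (ByteBuf) msg;
                    // ... process buf ...
                }
            } finally {
                // release whatever reference-counted object we received and did not forward
                ReferenceCountUtil.release(msg);
            }
        }
    }

(SimpleChannelInboundHandler does this release automatically, and since my decoder emits a plain YingHeMessage object rather than a ByteBuf, the handlers behind it should have nothing to release anyway.)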
Here is part of the codec code:

encoder

    @Override
    protected void encode(ChannelHandlerContext ctx, YingHeMessage msg, ByteBuf out) throws Exception {
        checkMsg(msg); // not null
        int type = msg.getProtoId();
        int contentLength = msg.getContentLength();
        String body = msg.getBody();
        out.writeInt(type);
        out.writeInt(contentLength);
        out.writeBytes(body.getBytes(Charset.forName("UTF-8")));
    }
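
One thing I am not sure about (getContentLength() is not shown here): if contentLength is computed from the String's character count rather than from the UTF-8-encoded byte count, any multi-byte character would make the header disagree with the payload and break the framing. A sketch of the same encode() that derives the length from the encoded bytes instead, just to illustrate the idea:

    // uses java.nio.charset.StandardCharsets
    @Override
    protected void encode(ChannelHandlerContext ctx, YingHeMessage msg, ByteBuf out) throws Exception {
        checkMsg(msg); // not null
        byte[] bodyBytes = msg.getBody().getBytes(StandardCharsets.UTF_8);
        out.writeInt(msg.getProtoId());
        out.writeInt(bodyBytes.length); // length of the encoded bytes, not the character count
        out.writeBytes(bodyBytes);
    }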

decoder

    // header = int (type) + int (contentLength)
    private static final int HEADER_SIZE = 8;
    private static final int LEAST_SIZE = 4;
    private static final Logger LOG = LoggerFactory.getLogger(YingHeMessageDecoder.class);

    @Override
    protected void decode(ChannelHandlerContext ctx, ByteBuf in, List<Object> out) throws Exception {

        in.markReaderIndex(); // mark the start of the frame
        int readable = in.readableBytes();
        LOG.info("check:{}", readable < HEADER_SIZE);
        LOG.info("readable:{}", readable);
        if (readable < HEADER_SIZE) { // not enough bytes for the header: roll back and wait for more data
            LOG.warn(">>>readable bytes are fewer than the header size!");
            LOG.info("before reset:{}", in.readerIndex());
            in.resetReaderIndex();
            LOG.info("after reset:{}", in.readerIndex());
            return;
        }
        // read the header
        int type = in.readInt();
        int contentLength = in.readInt();
        LOG.info("type:{},contentLength:{}", type, contentLength);
        int readable2 = in.readableBytes();
        if (readable2 < contentLength) {
            // content not fully arrived yet: roll back to the mark set at the frame start
            // (marking again here would overwrite that mark and leave the header consumed,
            // corrupting the next decode attempt)
            LOG.error("incomplete content! length=" + contentLength);
            in.resetReaderIndex();
            LOG.info("reset, current readerIndex:" + in.readerIndex());
            return;
        }
        // read the content straight into a byte array; readBytes(int) would allocate a new
        // ByteBuf that has to be released manually, which would otherwise leak
        byte[] content = new byte[contentLength];
        in.readBytes(content);
        String body = new String(content, "UTF-8");
        YingHeMessage message = new YingHeMessage(type, contentLength, body);
        out.add(message);
    }
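
Since a LengthFieldBasedFrameDecoder sits in front of this decoder (see the bootstrap below), each decode() call should already receive exactly one complete frame, provided the MAX_LENGTH / offset / adjustment constants match this header layout. Under that assumption the mark/reset bookkeeping could be dropped entirely; a simplified sketch:

    // assumes the upstream LengthFieldBasedFrameDecoder always delivers whole frames
    @Override
    protected void decode(ChannelHandlerContext ctx, ByteBuf in, List<Object> out) throws Exception {
        if (in.readableBytes() < HEADER_SIZE) {
            return; // defensive only; should not happen with a frame decoder upstream
        }
        int type = in.readInt();
        int contentLength = in.readInt();
        byte[] content = new byte[contentLength];
        in.readBytes(content);
        out.add(new YingHeMessage(type, contentLength, new String(content, StandardCharsets.UTF_8)));
    }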

Here is the server bootstrap configuration:

  public void run() throws Exception {

        EventLoopGroup boss = new NioEventLoopGroup();
        EventLoopGroup worker = new NioEventLoopGroup(5);

        try {
            ServerBootstrap b = new ServerBootstrap();
            b.group(boss, worker)
                    .channel(NioServerSocketChannel.class)
                    .option(ChannelOption.SO_BACKLOG, 1024)
                    .option(ChannelOption.SO_REUSEADDR, true)
                    .childOption(ChannelOption.TCP_NODELAY, true)
                    .childOption(ChannelOption.SO_KEEPALIVE, true)
                    .handler(new LoggingHandler(LogLevel.DEBUG))
                    .childHandler(new ChannelInitializer<SocketChannel>() {
                        @Override
                        protected void initChannel(SocketChannel ch) throws Exception {
                            ch.pipeline()
                                    .addLast(new LengthFieldBasedFrameDecoder(MAX_LENGTH,
                                            LENGTH_FIELD_OFFSET,
                                            LENGTH_FIELD_LENGTH,
                                            LENGTH_ADJUSTMENT,
                                            INITIAL_BYTES_TO_STRIP))
                                    .addLast(new ReadTimeoutHandler(60))
                                    .addLast(new YingHeMessageDecoder())
                                    .addLast(new YingHeMessageEncoder())
                                    .addLast(new ServerHandlerInitializer())
                                    .addLast(new Zenith());
                        }
                    });
            Properties properties = new Properties();
            InputStream in = YingHeServer.class.getClassLoader().getResourceAsStream("net.properties");
            properties.load(in);
            Integer port = Integer.valueOf(properties.getProperty("port"));
            ChannelFuture f = b.bind(port).sync();
            LOG.info("服务器启动,绑定端口:" + port);
            DiscardProcessorUtil.init();
            System.out.println(">>>flush all:" + RedisConnector.getConnector().flushAll());
            LOG.info(">>>redis connect test:ping---received:{}", RedisConnector.getConnector().ping());
            f.channel().closeFuture().sync();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            // cleanup
            LOG.info("Shutting down gracefully...");
            boss.shutdownGracefully();
            worker.shutdownGracefully();
            ChannelGroups.clear();
        }
    }
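
One more thought on the "stuck event loop" symptom itself: any blocking call made inside a channel handler (Redis, the database through c3p0, synchronous logging, ...) runs on the NIO event loop and can stall every connection served by that loop. Below is a sketch of moving the business handlers onto a separate executor group; DefaultEventExecutorGroup is Netty's own io.netty.util.concurrent class, and the thread count of 16 is an arbitrary choice:

    // in the server class: a dedicated executor group for potentially blocking handlers
    private final EventExecutorGroup businessGroup = new DefaultEventExecutorGroup(16);

    // possible variant of initChannel(): handlers registered with an EventExecutorGroup
    // are invoked on that group's threads instead of the channel's NIO event loop
    @Override
    protected void initChannel(SocketChannel ch) throws Exception {
        ch.pipeline()
                .addLast(new LengthFieldBasedFrameDecoder(MAX_LENGTH, LENGTH_FIELD_OFFSET,
                        LENGTH_FIELD_LENGTH, LENGTH_ADJUSTMENT, INITIAL_BYTES_TO_STRIP))
                .addLast(new ReadTimeoutHandler(60))
                .addLast(new YingHeMessageDecoder())
                .addLast(new YingHeMessageEncoder())
                .addLast(businessGroup, new ServerHandlerInitializer())
                .addLast(businessGroup, new Zenith());
    }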

4) Additional notes:
The server runs Windows Server 2012 R2;
the client is written in C#;
the server side uses Netty 4.1.26.Final, MyBatis, Spring, fastjson, Redis (cache) and c3p0 (connection pool);
the problem never occurs in local testing!

40 C-coins offered. Any advice from the experts would be greatly appreciated!

1 answer

Does the server side ever close() the connection to the client? Is there any failure information?

u011022489
liulovehong replying to CelLew: Hi, how did you solve this in the end? I've hit the same problem, and it is still there even after removing the LoggingHandler.
replied more than a year ago
CelLew
CelLew Found the problem... and it was a silly one: the cause was the logging. Having output configured to both the console and a file at the same time caused the blocking (facepalm -_-|||)... Thanks anyway, the 40c is yours.
replied more than a year ago
CelLew
CelLew Until the client closes its connection the TCP state looks perfectly normal; the server is simply stuck there, unable to receive or process client requests.
replied more than a year ago
CelLew
CelLew There are no failure messages on the server side; it just stops responding, so it never gets to close the connection to the client either. Server memory is also normal.
replied more than a year ago
oyljerry
oyljerry Also try shortening the close interval, and check whether the server is low on memory.
replied more than a year ago
redis.clients.jedis.util.Pool.getResource(Pool.java:50) at redis.clients.jedis.JedisPool.getResource(JedisPool.java:234) at com.game.redis.RedisManager.getString(RedisManager.java:219) at com.game.handler.CapitalHandler.hander(CapitalHandler.java:36) at com.game.netty.TcpServerHanler.channelRead(TcpServerHanler.java:107) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:310) at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:284) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1334) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:926) at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:134) at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:644) at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:579) at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:496) at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:458) at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858) at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138) at java.lang.Thread.run(Thread.java:745) "nioEventLoopGroup-4-9" #48 prio=10 os_prio=0 tid=0x00007fe464015000 nid=0x68c7 runnable [0x00007fe4e81af000] java.lang.Thread.State: RUNNABLE at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method) at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269) at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:93) at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86) - locked <0x0000000080036788> (a io.netty.channel.nio.SelectedSelectionKeySet) - locked <0x0000000080036778> (a java.util.Collections$UnmodifiableSet) - locked <0x0000000080036730> (a sun.nio.ch.EPollSelectorImpl) at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97) at io.netty.channel.nio.SelectedSelectionKeySetSelector.select(SelectedSelectionKeySetSelector.java:62) at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:752) at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:408) at 
io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858) at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138) at java.lang.Thread.run(Thread.java:745)

There are parts of the thread dump I haven't fully understood yet. Working by elimination, I plan to look at the thread pools and the database connection pool first, but I haven't decided how exactly to go about it.
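One thing that can be read directly from the dump above: the worker threads nioEventLoopGroup-4-13, 4-12 and 4-10 are all WAITING inside redis.clients.jedis.JedisPool.getResource, called from com.game.handler.CapitalHandler.hander via TcpServerHanler.channelRead. In other words the Netty event-loop threads themselves are parked waiting for a Redis connection from the pool; while they are parked they cannot read, write or close any channel they own, which would explain the "stuck" ESTABLISHED connections and the CLOSE_WAIT sockets that pile up once clients give up. A minimal sketch of one way to keep blocking work off the event loop, assuming the blocking Redis/business calls live in TcpServerHanler (the executor group size and the try-with-resources usage are illustrative, not taken from the original code):

```
// Run the blocking business handler on a separate executor group so the
// NioEventLoop threads are never parked inside JedisPool.getResource.
EventExecutorGroup businessGroup = new DefaultEventExecutorGroup(16);

// inside ChannelInitializer.initChannel(...)
ch.pipeline()
        // ... frame decoder / codec handlers as before ...
        // a handler registered with an EventExecutorGroup is invoked on that
        // group instead of the channel's event loop
        .addLast(businessGroup, "business", new TcpServerHanler());

// Inside the business code, always give the Jedis instance back to the pool,
// otherwise the pool runs dry and every later getResource() blocks forever:
try (Jedis jedis = jedisPool.getResource()) {
    String value = jedis.get(key);
    // ... business logic ...
}
```

Also worth checking: whether the JedisPool is sized for the 5 worker threads under load, and whether any code path returns without closing the borrowed Jedis, since a leaked connection is the usual way the pool ends up permanently exhausted.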
Netty 5.0: the client sends a message but the server fails to receive it
```
package com.vc.netty.simplechat.client;

import io.netty.bootstrap.Bootstrap;
import io.netty.channel.Channel;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.nio.NioSocketChannel;

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class SimpleChatClient {

    public static void main(String[] args) throws Exception {
        new SimpleChatClient("127.0.0.1", 9833).run();
    }

    private final String host;
    private final int port;

    public SimpleChatClient(String host, int port) {
        this.host = host;
        this.port = port;
    }

    public void run() throws Exception {
        EventLoopGroup group = new NioEventLoopGroup();
        try {
            Bootstrap bootstrap = new Bootstrap()
                    .group(group)
                    .channel(NioSocketChannel.class)
                    .handler(new SimpleChatClientInitializer());
            Channel channel = bootstrap.connect(host, port).sync().channel();
            BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
            System.out.println("[" + in.readLine() + "]");
            while (true) {
                channel.writeAndFlush(in.readLine() + "\r\n");
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            group.shutdownGracefully();
        }
    }
}
```

The above is the client code.

**********************************************************

```
/**
 * @author vc DateTime 2015年3月30日 下午2:06:01
 * @version 1.0
 */
package com.vc.netty.simplechat.client;

import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.SimpleChannelInboundHandler;

public class SimpleChatClientHandler extends SimpleChannelInboundHandler<String> {

//    @Override
//    protected void channelRead0(ChannelHandlerContext ctx, String s) throws Exception {
//        System.out.println(s);
//    }

    /*@Override
    public void channelRead(ChannelHandlerContext ctx, Object obj) throws Exception {
        System.out.println(ctx.toString());
        System.out.println(obj.toString());
    };*/

    @Override
    protected void messageReceived(ChannelHandlerContext ctx, String msg) throws Exception {
        System.out.println(msg);
    }
}
```

The above is the clientHandler code.

**************************************************************

```
package com.vc.netty.simplechat.server;

import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelOption;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.nio.NioServerSocketChannel;

public class SimpleChatServer {

    private int port;

    public SimpleChatServer(int port) {
        this.port = port;
    }

    public void run() throws Exception {
        EventLoopGroup bossGroup = new NioEventLoopGroup(); // (1)
        EventLoopGroup workerGroup = new NioEventLoopGroup();
        try {
            ServerBootstrap b = new ServerBootstrap(); // (2)
            b.group(bossGroup, workerGroup)
                    .channel(NioServerSocketChannel.class) // (3)
                    .childHandler(new SimpleChatServerInitializer()) // (4)
                    .option(ChannelOption.SO_BACKLOG, 128) // (5)
                    .childOption(ChannelOption.SO_KEEPALIVE, true); // (6)

            System.out.println("SimpleChatServer 启动了");

            // bind the port and start accepting incoming connections
            ChannelFuture f = b.bind(port).sync(); // (7)

            // wait until the server socket is closed;
            // in this example that never happens, but this is how the server can be shut down gracefully
            f.channel().closeFuture().sync();
        } finally {
            workerGroup.shutdownGracefully();
            bossGroup.shutdownGracefully();
            System.out.println("SimpleChatServer 关闭了");
        }
    }

    public static void main(String[] args) throws Exception {
        int port;
        if (args.length > 0) {
            port = Integer.parseInt(args[0]);
        } else {
            port = 9833;
        }
        new SimpleChatServer(port).run();
    }
}
```

The above is the server code.

******************************************************************

```
package com.vc.netty.simplechat.server;

import io.netty.channel.Channel;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.SimpleChannelInboundHandler;
import io.netty.channel.group.ChannelGroup;
import io.netty.channel.group.DefaultChannelGroup;
import io.netty.util.concurrent.GlobalEventExecutor;

public class SimpleChatServerHandler extends SimpleChannelInboundHandler<String> { // (1)

    public static ChannelGroup channels = new DefaultChannelGroup(GlobalEventExecutor.INSTANCE);

    @Override
    public void handlerAdded(ChannelHandlerContext ctx) throws Exception { // (2)
        Channel incoming = ctx.channel();
        for (Channel channel : channels) {
            channel.writeAndFlush("[SERVER] - " + incoming.remoteAddress() + " 加入\n");
        }
        channels.add(ctx.channel());
    }

    @Override
    public void handlerRemoved(ChannelHandlerContext ctx) throws Exception { // (3)
        Channel incoming = ctx.channel();
        for (Channel channel : channels) {
            channel.writeAndFlush("[SERVER] - " + incoming.remoteAddress() + " 离开\n");
        }
        channels.remove(ctx.channel());
    }

//    @Override
//    protected void channelRead0(ChannelHandlerContext ctx, String s) throws Exception { // (4)
//        Channel incoming = ctx.channel();
//        for (Channel channel : channels) {
//            if (channel != incoming) {
//                channel.writeAndFlush("[" + incoming.remoteAddress() + "]" + s + "\n");
//            } else {
//                channel.writeAndFlush("[you]" + s + "\n");
//            }
//        }
//    }

    @Override
    public void channelActive(ChannelHandlerContext ctx) throws Exception { // (5)
        Channel incoming = ctx.channel();
        System.out.println("SimpleChatClient:" + incoming.remoteAddress() + "在线");
    }

    @Override
    public void channelInactive(ChannelHandlerContext ctx) throws Exception { // (6)
        Channel incoming = ctx.channel();
        System.out.println("SimpleChatClient:" + incoming.remoteAddress() + "掉线");
    }

    @Override
    public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) { // (7)
        Channel incoming = ctx.channel();
        System.out.println("SimpleChatClient:" + incoming.remoteAddress() + "异常");
        // close the connection when an exception occurs
        cause.printStackTrace();
        ctx.close();
    }

    @Override
    protected void messageReceived(ChannelHandlerContext ctx, String msg) throws Exception {
        Channel incoming = ctx.channel();
        for (Channel channel : channels) {
            if (channel != incoming) {
                channel.writeAndFlush("[" + incoming.remoteAddress() + "]" + msg + "\n");
            } else {
                channel.writeAndFlush("[you]" + msg + "\n");
            }
        }
    }
}
```

The above is the serverHandler code.
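The two ChannelInitializer classes (SimpleChatClientInitializer / SimpleChatServerInitializer) are not shown, and with SimpleChannelInboundHandler<String> on both sides that is exactly where this usually goes wrong: `channel.writeAndFlush(in.readLine() + "\r\n")` writes a plain String, and unless a StringEncoder sits in the client pipeline that write is rejected as an unsupported message type, so the server never receives anything; likewise the server needs a frame decoder plus StringDecoder before its handler ever sees a String. A sketch of what the client initializer would typically contain in this kind of chat example (the server side is symmetric); this is an assumption about the missing classes, not code from the post:

```
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelPipeline;
import io.netty.channel.socket.SocketChannel;
import io.netty.handler.codec.DelimiterBasedFrameDecoder;
import io.netty.handler.codec.Delimiters;
import io.netty.handler.codec.string.StringDecoder;
import io.netty.handler.codec.string.StringEncoder;

public class SimpleChatClientInitializer extends ChannelInitializer<SocketChannel> {
    @Override
    public void initChannel(SocketChannel ch) throws Exception {
        ChannelPipeline pipeline = ch.pipeline();
        // split the inbound byte stream on "\r\n" / "\n" so one chat line = one message
        pipeline.addLast("framer", new DelimiterBasedFrameDecoder(8192, Delimiters.lineDelimiter()));
        // bytes <-> String; without the encoder, writeAndFlush(String) never reaches the wire
        pipeline.addLast("decoder", new StringDecoder());
        pipeline.addLast("encoder", new StringEncoder());
        pipeline.addLast("handler", new SimpleChatClientHandler());
    }
}
```

Two further things visible in the posted code: the first line typed into the console is consumed by `System.out.println("[" + in.readLine() + "]")` and never sent, and the `messageReceived(...)` override only exists in the Netty 5.0 alpha API, so it is worth confirming which Netty jar is actually on the classpath.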
The server accumulates a large number of CLOSE_WAIT connections that never close
I'm using Netty to implement WebSocket and want the connection to be closed when the web page is closed. It's an HTML5 page opened in the WeChat browser. On iOS, when the user leaves the page, the original connection on the server goes to CLOSE_WAIT and never closes; on Android this does not happen. Roughly the situation looks like this: ![图片说明](https://img-ask.csdn.net/upload/201809/13/1536839036_917725.jpg) The code looks like this; there is basically no business logic in the handler itself, the business work is all handed off to a thread pool. Packet capture analysis: ![图片说明](https://img-ask.csdn.net/upload/201809/13/1536840275_123782.jpg) This is 15570 -> port 80, and port 80 is the server's port. Here the client sends a FIN to the server and the server immediately returns an ACK, but the server's own FIN never arrives. This capture was taken on the server's NIC with tcpdump; capturing locally with Wireshark shows roughly the same thing. The server sits behind nginx; even if I stop the service outright the connection is not closed, and only reloading the nginx configuration or restarting nginx moves the CLOSE_WAIT state on to LAST_ACK. One more point: for now I'm the only one using this server and it is not open to the public, so there shouldn't be any blocking issue either.
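A CLOSE_WAIT that never goes away means the application owning the socket has received the peer's FIN but never calls close() itself, which is exactly what the capture shows (client FIN, immediate ACK, no FIN back). Since reloading nginx is what finally moves the state to LAST_ACK, it is worth first checking with `netstat -anp` whether those CLOSE_WAIT sockets belong to the nginx worker (the client-to-nginx side of the proxy) or to the Netty process. On the Netty side, because an iOS/WeChat page can disappear without ever sending a clean WebSocket close frame, the server has to detect the dead peer and close the channel itself; a common way is an IdleStateHandler early in the pipeline. A sketch, where the 60-second threshold and the handler placement are assumptions rather than something from the posted code:

```
// in the WebSocket ChannelInitializer, before the business handlers
pipeline.addLast(new IdleStateHandler(60, 0, 0, TimeUnit.SECONDS));
pipeline.addLast(new ChannelInboundHandlerAdapter() {
    @Override
    public void userEventTriggered(ChannelHandlerContext ctx, Object evt) throws Exception {
        if (evt instanceof IdleStateEvent
                && ((IdleStateEvent) evt).state() == IdleState.READER_IDLE) {
            // nothing read from the browser for 60s: close our side so the
            // socket does not sit in CLOSE_WAIT forever after the client's FIN
            ctx.close();
        } else {
            super.userEventTriggered(ctx, evt);
        }
    }
});
```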
Storm 0.10.0: writing to ES produces a large number of Netty connection-failure errors
WARNING: An exception was thrown by TimerTask. java.lang.RuntimeException: Giving up to scheduleConnect to Netty-Client-wyzjd02/172.18.2.5:6704 after 41 failed attempts. 221 messages were lost at backtype.storm.messaging.netty.Client$Connect.run(Client.java:511) at org.apache.storm.shade.org.jboss.netty.util.HashedWheelTimer$HashedWheelTimeout.expire(HashedWheelTimer.java:546) at org.apache.storm.shade.org.jboss.netty.util.HashedWheelTimer$Worker.notifyExpiredTimeouts(HashedWheelTimer.java:446) at org.apache.storm.shade.org.jboss.netty.util.HashedWheelTimer$Worker.run(HashedWheelTimer.java:395) at org.apache.storm.shade.org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) at java.lang.Thread.run(Thread.java:745) 2016-09-07 10:16:40.260 b.s.m.n.Client [ERROR] connection attempt 2 to Netty-Client-wyfx03/172.18.2.3:6700 failed: java.net.ConnectException: Connection refused: wyfx03/172.18.2.3:6700 2016-09-07 10:16:40.461 b.s.m.n.Client [ERROR] connection attempt 3 to Netty-Client-wyfx03/172.18.2.3:6700 failed: java.net.ConnectException: Connection refused: wyfx03/172.18.2.3:6700 2016-09-07 10:16:40.661 b.s.m.n.Client [ERROR] connection attempt 4 to Netty-Client-wyfx03/172.18.2.3:6700 failed: java.net.ConnectException: Connection refused: wyfx03/172.18.2.3:6700 2016-09-07 10:16:40.860 b.s.m.n.Client [ERROR] connection attempt 5 to Netty-Client-wyfx03/172.18.2.3:6700 failed: java.net.ConnectException: Connection refused: wyfx03/172.18.2.3:6700 2016-09-07 10:16:41.061 b.s.m.n.Client [ERROR] connection attempt 6 to Netty-Client-wyfx03/172.18.2.3:6700 failed: java.net.ConnectException: Connection refused: wyfx03/172.18.2.3:6700 2016-09-07 10:16:41.361 b.s.m.n.Client [ERROR] connection attempt 7 to Netty-Client-wyfx03/172.18.2.3:6700 failed: java.net.ConnectException: Connection refused: wyfx03/172.18.2.3:6700 2016-09-07 10:16:41.660 b.s.m.n.Client [ERROR] connection attempt 8 to Netty-Client-wyfx03/172.18.2.3:6700 failed: java.net.ConnectException: Connection refused: wyfx03/172.18.2.3:6700 2016-09-07 10:16:42.060 b.s.m.n.Client [ERROR] connection attempt 9 to Netty-Client-wyfx03/172.18.2.3:6700 failed: java.net.ConnectException: Connection refused: wyfx03/172.18.2.3:6700 2016-09-07 10:16:42.463 b.s.m.n.Client [ERROR] connection attempt 10 to Netty-Client-wyfx03/172.18.2.3:6700 failed: java.net.ConnectException: Connection refused: wyfx03/172.18.2.3:6700 2016-09-07 10:16:35.559 b.s.m.n.StormClientHandler [INFO] Connection failed Netty-Client-wyzjd02/172.18.2.5:6704 java.io.IOException: Connection reset by peer at sun.nio.ch.FileDispatcherImpl.read0(Native Method) ~[?:1.7.0_80] at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:39) ~[?:1.7.0_80] at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223) ~[?:1.7.0_80] at sun.nio.ch.IOUtil.read(IOUtil.java:192) ~[?:1.7.0_80] at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:384) ~[?:1.7.0_80] at org.apache.storm.shade.org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:64) [storm-core-0.10.0.jar:0.10.0] at org.apache.storm.shade.org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108) [storm-core-0.10.0.jar:0.10.0] at org.apache.storm.shade.org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318) [storm-core-0.10.0.jar:0.10.0] at org.apache.storm.shade.org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) [storm-core-0.10.0.jar:0.10.0] at 
org.apache.storm.shade.org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) [storm-core-0.10.0.jar:0.10.0] at org.apache.storm.shade.org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) [storm-core-0.10.0.jar:0.10.0] at org.apache.storm.shade.org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) [storm-core-0.10.0.jar:0.10.0] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [?:1.7.0_80] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [?:1.7.0_80] at java.lang.Thread.run(Thread.java:745) [?:1.7.0_80] 2016-09-07 10:16:35.560 b.s.m.n.Client [ERROR] failed to send 1 messages to Netty-Client-wyzjd02/172.18.2.5:6704: java.nio.channels.ClosedChannelException 2016-09-07 10:16:35.561 b.s.m.n.Client [ERROR] failed to send 1 messages to Netty-Client-wyzjd02/172.18.2.5:6704: java.nio.channels.ClosedChannelException 2016-09-07 10:16:35.561 b.s.m.n.Client [ERROR] failed to send 1 messages to Netty-Client-wyzjd02/172.18.2.5:6704: java.nio.channels.ClosedChannelException 2016-09-07 10:16:35.561 b.s.m.n.Client [ERROR] failed to send 1 messages to Netty-Client-wyzjd02/172.18.2.5:6704: java.nio.channels.ClosedChannelException 2016-09-07 10:16:35.561 b.s.m.n.StormClientHandler [INFO] Connection failed Netty-Client-wyzjd02/172.18.2.5:6704 java.nio.channels.ClosedChannelException at org.apache.storm.shade.org.jboss.netty.channel.socket.nio.AbstractNioWorker.cleanUpWriteBuffer(AbstractNioWorker.java:433) [storm-core-0.10.0.jar:0.10.0] at org.apache.storm.shade.org.jboss.netty.channel.socket.nio.AbstractNioWorker.close(AbstractNioWorker.java:373) [storm-core-0.10.0.jar:0.10.0] at org.apache.storm.shade.org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:93) [storm-core-0.10.0.jar:0.10.0] at org.apache.storm.shade.org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108) [storm-core-0.10.0.jar:0.10.0] at org.apache.storm.shade.org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318) [storm-core-0.10.0.jar:0.10.0] at org.apache.storm.shade.org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) [storm-core-0.10.0.jar:0.10.0] at org.apache.storm.shade.org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) [storm-core-0.10.0.jar:0.10.0] at org.apache.storm.shade.org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) [storm-core-0.10.0.jar:0.10.0] at org.apache.storm.shade.org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) [storm-core-0.10.0.jar:0.10.0] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [?:1.7.0_80] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [?:1.7.0_80] at java.lang.Thread.run(Thread.java:745) [?:1.7.0_80] 2016-09-07 10:16:35.660 b.s.m.n.Client [ERROR] connection attempt 33 to Netty-Client-wyzjd02/172.18.2.5:6704 failed: java.net.ConnectException: Connection refused: wyzjd02/172.18.2.5:6704 2016-09-07 10:16:36.160 b.s.m.n.Client [ERROR] connection attempt 34 to Netty-Client-wyzjd02/172.18.2.5:6704 failed: java.net.ConnectException: Connection refused: wyzjd02/172.18.2.5:6704 2016-09-07 10:16:36.661 b.s.m.n.Client [ERROR] connection attempt 35 to Netty-Client-wyzjd02/172.18.2.5:6704 failed: java.net.ConnectException: Connection refused: wyzjd02/172.18.2.5:6704 2016-09-07 10:16:37.160 b.s.m.n.Client [ERROR] 
connection attempt 36 to Netty-Client-wyzjd02/172.18.2.5:6704 failed: java.net.ConnectException: Connection refused: wyzjd02/172.18.2.5:6704 2016-09-07 10:16:37.661 b.s.m.n.Client [ERROR] connection attempt 37 to Netty-Client-wyzjd02/172.18.2.5:6704 failed: java.net.ConnectException: Connection refused: wyzjd02/172.18.2.5:6704
Elasticsearch 6.3.1: JDBC connection reports Original type was [request [/_xpack/sql] contains unrecognized parameter: [mode]].
I added the following to the Maven pom:
```
<dependency>
    <groupId>org.elasticsearch.plugin</groupId>
    <artifactId>jdbc</artifactId>
    <version>6.3.1</version>
</dependency>
```
```
<repositories>
    <repository>
        <id>elastic.co</id>
        <url>https://artifacts.elastic.co/maven</url>
    </repository>
</repositories>
```
I also downloaded the jar: https://artifacts.elastic.co/maven/org/elasticsearch/plugin/jdbc/6.3.1/jdbc-6.3.1.jar and put it in the corresponding Maven location.
Next I try to connect to ES over JDBC. The code is as follows.
Connecting to ES:
```
public static void startPool(String driveName, String url, String sql) throws Exception {
    Class.forName(driveName);
    Connection connection = DriverManager.getConnection(url, connectionProperties);
    Statement statement = connection.createStatement();
    ResultSet resultSet = statement.executeQuery(sql);
    int num = 0;
    while (resultSet.next()) {
        num = 1;
    }
    if (num == 1) {
        System.out.println("es可以连接");
    } else {
        System.out.println("es不可以连接,请重试");
    }
}
```
Invoking it:
```
public void initDBSource() throws Exception {
    SQLUtil.startPool(
            "org.elasticsearch.xpack.sql.jdbc.jdbc.JdbcDriver", // ES JDBC driver
            "jdbc:es://http://192.168.38.12:9200",              // ES connection string
            "SHOW tables like \\u0027data%\\u0027"              // validation SQL for the data source
    );
}
```
Result:
```
Exception in thread "main" java.sql.SQLException: Server sent bad type [illegal_argument_exception]. Original type was [request [/_xpack/sql] contains unrecognized parameter: [mode]]. [java.lang.IllegalArgumentException: request [/_xpack/sql] contains unrecognized parameter: [mode] at org.elasticsearch.rest.BaseRestHandler.handleRequest(BaseRestHandler.java:103) at org.elasticsearch.xpack.security.rest.SecurityRestFilter.handleRequest(SecurityRestFilter.java:87) at org.elasticsearch.rest.RestController.dispatchRequest(RestController.java:240) at org.elasticsearch.rest.RestController.tryAllHandlers(RestController.java:336) at org.elasticsearch.rest.RestController.dispatchRequest(RestController.java:174) at org.elasticsearch.http.netty4.Netty4HttpServerTransport.dispatchRequest(Netty4HttpServerTransport.java:551) at org.elasticsearch.http.netty4.Netty4HttpRequestHandler.channelRead0(Netty4HttpRequestHandler.java:137) at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) at org.elasticsearch.http.netty4.pipelining.HttpPipeliningHandler.channelRead(HttpPipeliningHandler.java:68) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) at org.elasticsearch.http.netty4.cors.Netty4CorsHandler.channelRead(Netty4CorsHandler.java:86) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102) at io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111) at
io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:323) at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:297) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1434) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:965) at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163) at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:656) at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:556) at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:510) at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:470) at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:909) at java.lang.Thread.run(Thread.java:748) ] at org.elasticsearch.xpack.sql.client.shared.JreHttpUrlConnection.parserError(JreHttpUrlConnection.java:181) at 
org.elasticsearch.xpack.sql.client.shared.JreHttpUrlConnection.request(JreHttpUrlConnection.java:158) at org.elasticsearch.xpack.sql.client.HttpClient.lambda$post$0(HttpClient.java:101) at org.elasticsearch.xpack.sql.client.shared.JreHttpUrlConnection.http(JreHttpUrlConnection.java:62) at org.elasticsearch.xpack.sql.client.HttpClient.lambda$post$1(HttpClient.java:100) at java.security.AccessController.doPrivileged(Native Method) at org.elasticsearch.xpack.sql.client.HttpClient.post(HttpClient.java:99) at org.elasticsearch.xpack.sql.client.HttpClient.query(HttpClient.java:77) at org.elasticsearch.xpack.sql.jdbc.net.client.JdbcHttpClient.query(JdbcHttpClient.java:51) at org.elasticsearch.xpack.sql.jdbc.jdbc.JdbcStatement.initResultSet(JdbcStatement.java:162) at org.elasticsearch.xpack.sql.jdbc.jdbc.JdbcStatement.execute(JdbcStatement.java:153) at org.elasticsearch.xpack.sql.jdbc.jdbc.JdbcStatement.executeQuery(JdbcStatement.java:42) at com.rh.dome.util.ESJDBCTest2.main(ESJDBCTest2.java:32) Process finished with exit code 1
```
I had no idea what was going wrong, and I couldn't find this kind of error on Baidu or in the official docs. After quite a bit of effort I finally found the cause: the JDBC driver jar did not match the ES version running on the server. They must be identical; even a very close version can still hit driver compatibility problems!
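For reference, a version of the same smoke test with the JDBC resources closed via try-with-resources; the driver class name and URL are the ones from the post, the class name EsJdbcSmokeTest is hypothetical, and the only substantive fix remains the one described above: the JDBC driver artifact version must be identical to the Elasticsearch server version (6.3.1 on both sides here).

```
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class EsJdbcSmokeTest {
    public static void main(String[] args) throws Exception {
        // the x-pack sql jdbc driver version must match the ES server version exactly
        Class.forName("org.elasticsearch.xpack.sql.jdbc.jdbc.JdbcDriver");
        String url = "jdbc:es://http://192.168.38.12:9200";
        try (Connection conn = DriverManager.getConnection(url);
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SHOW TABLES")) {
            while (rs.next()) {
                System.out.println(rs.getString(1)); // index name
            }
        } // connection, statement and result set are always closed here
    }
}
```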
Netty: high concurrency with fast responses
The business requirement is to accept HTTP POST requests and return within 15 milliseconds. My implementation idea:

1. Netty finishes reading the data, validates the Content-Type and other headers, then protobuf-deserializes the POST body into an object (mandatory).
2. The object is passed into a certain method.
3. That method must finish within 10 milliseconds; if it hasn't finished, we immediately return null plus the request headers.

The concrete implementation: the Netty I/O thread finishes reading the data, turns it into a request object and hands it to thread pool 1. Pool 1 validates and deserializes the object, passes it to thread pool 2, waits 10 milliseconds, then takes whatever pool 2 has produced: an object if there is one, otherwise null, and writes the response back on the channel. Pool 2 does the actual business processing and produces the object for pool 1 to use.

Environment:
```
Linux version 2.6.32-431.11.25.el6.ucloud.x86_64
gcc version 4.4.7 20120313 (Red Hat 4.4.7-4)
Mem: 16G
CPU(s): 8
```

sysctl.conf:
```
net.ipv4.ip_forward = 0
net.ipv4.conf.default.rp_filter = 1
net.ipv4.conf.default.accept_source_route = 0
kernel.core_uses_pid = 1
net.ipv4.tcp_syncookies = 1
net.bridge.bridge-nf-call-ip6tables = 0
net.bridge.bridge-nf-call-iptables = 0
net.bridge.bridge-nf-call-arptables = 0
kernel.msgmnb = 65536
kernel.msgmax = 65536
kernel.shmmax = 68719476736
kernel.shmall = 4294967296
net.netfilter.nf_conntrack_max = 1000000
kernel.unknown_nmi_panic = 0
kernel.sysrq = 1
fs.file-max = 1000000
vm.swappiness = 10
fs.inotify.max_user_watches = 10000000
net.core.wmem_max = 67108864
net.core.rmem_max = 67108864
net.ipv4.conf.all.send_redirects = 0
net.ipv4.conf.default.send_redirects = 0
net.ipv4.conf.all.secure_redirects = 0
net.ipv4.conf.default.secure_redirects = 0
net.ipv4.conf.all.accept_redirects = 0
net.ipv4.conf.default.accept_redirects = 0
fs.notify.max_queued_events = 3276792
net.ipv4.neigh.default.gc_thresh1 = 2048
net.ipv4.neigh.default.gc_thresh2 = 4096
net.ipv4.neigh.default.gc_thresh3 = 8192
net.ipv6.conf.all.disable_ipv6 = 1
##############
net.ipv4.tcp_max_tw_buckets = 6000
net.ipv4.ip_local_port_range = 1024 65000
net.ipv4.tcp_tw_recycle = 1
net.ipv4.tcp_tw_reuse = 1
net.ipv4.tcp_syncookies = 1
net.core.somaxconn = 262144
net.core.netdev_max_backlog = 262144
net.ipv4.tcp_max_orphans = 262144
net.ipv4.tcp_max_syn_backlog = 262144
net.ipv4.tcp_synack_retries = 1
net.ipv4.tcp_syn_retries = 1
net.ipv4.tcp_fin_timeout = 1
net.ipv4.tcp_keepalive_time = 30
####
net.ipv4.tcp_rmem = 4096 87380 67108864
net.ipv4.tcp_wmem = 4096 65536 67108864
net.core.netdev_max_backlog = 250000
net.ipv4.tcp_mtu_probing=1
net.ipv4.tcp_congestion_control=hybla
```

JDK 1.8. With the same machine configuration Tomcat does 4000 qps with a 4% timeout rate, while Netty does 1000 qps with a 5% timeout rate. With Netty, CPU usage is around 10%.

```
java -Xms10g -Xmx10g -XX:NewSize=7098m -XX:SurvivorRatio=16 -XX:+DisableExplicitGC -XX:+CMSScavengeBeforeRemark -XX:+UseConcMarkSweepGC -XX:+UParNewGC
```

GC stats:
```
S0C S1C S0U S1U EC EU OC OU MC MU CCSC CCSU YGC YGCT FGC FGCT GCT
403776.0 403776.0 25637.9 0.0 6460800.0 5273672.7 3217408.0 0.0 27928.0 25782.3 3352.0 2883.6 2 0.042 2 1.411 1.453
```

```
netstat -apn | grep TIME_WAIT | wc -l
5733
```

I now need to get above 10,000 qps. I'm hoping for an answer or a demo, thank you. PS: I've considered akka, but I feel that hand-writing futures myself would come out about the same.
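With this design the throughput ceiling is set by the blocking hand-off: every request parks a pool-1 thread for up to 10 ms, so pool 1 can complete at most roughly 100 requests per thread per second (for example 100 threads ≈ 10,000 qps as a hard upper bound, before GC pauses and context switches eat into it). One way to remove the parked threads entirely is to make the hand-off asynchronous and let a scheduler enforce the 10 ms budget. A sketch on Java 8; businessPool, businessService, timeoutScheduler and the two response helpers are placeholders for the poster's pool-2 logic, not real APIs:

```
// Hand the request to the business pool and schedule the 10 ms deadline,
// instead of wait()-ing: no thread in pool 1 is ever parked per request.
CompletableFuture<Response> result =
        CompletableFuture.supplyAsync(() -> businessService.handle(request), businessPool);

// if the business pool has not answered within 10 ms, complete with null
timeoutScheduler.schedule(() -> result.complete(null), 10, TimeUnit.MILLISECONDS);

result.whenComplete((resp, err) -> {
    FullHttpResponse http = (resp == null || err != null)
            ? emptyResponseWithRequestHeaders(request)   // placeholder helper
            : encode(resp);                              // placeholder helper
    // writeAndFlush is safe from any thread; Netty hops to the channel's event loop
    ctx.writeAndFlush(http);
});
```

Also worth re-checking before chasing raw qps: the jstat snippet above already shows 2 full GCs totalling about 1.4 s on the 10 GB CMS heap, and a single full GC of that length blows the 15 ms budget for every request in flight at that moment.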
No provider available from registry ip for service com.zys.store.service.api.HelloServiceApi:1.0.0 on consumer ip use dubbo version 2.6.2
Spring Boot integrated with ZooKeeper and Dubbo reports an error. ZooKeeper on the Linux box is 3.4.14, the ZK client jar in the project is 3.4.9, Dubbo is 2.6.2, Spring Boot is 2.1.6. I start the store module first from main, then the order module. When I then open http://localhost:8088/hello?name=zys in a browser, the console reports "No provider available from registry". Nothing is reported at startup; the error only appears when the browser request comes in. The ZK log says the node for the HelloServiceApi class already exists, but I don't think that matters much: a node is created on every run, so running it more than once is bound to report that the node exists. My Linux box is a VM and the firewall is already turned off, so unreachable ports should not be the issue.

1. The project is a Maven multi-module (aggregator) project ![图片说明](https://img-ask.csdn.net/upload/201911/20/1574220333_973372.png)

2. The interface is defined in zys-store-api:
```
package com.zys.store.service.api;

public interface HelloServiceApi {
    String sayHello(String name);
}
```

3. The interface is exposed in zys-store:
```
package com.zys.store.service.provider;

import com.alibaba.dubbo.config.annotation.Service;
import com.zys.store.service.api.HelloServiceApi;

@Service(
        version = "1.0.0",
        application = "zys-store",
        protocol = "dubbo",
        registry = "zys-store-registry"
)
public class HelloServiceProvider implements HelloServiceApi {
    @Override
    public String sayHello(String name) {
        return "hello" + name;
    }
}
```
Below is the yml configuration of zys-store:
```
dubbo:
  application:
    id: zys-store
    name: zys-store
    qos-port: 22212
    qos-enable: true
  scan:
    base-packages: com.zys.store.*
  protocol:
    id: dubbo
    name: dubbo
    port: 12343
  registry:
    id: zys-store-registry
    address: zookeeper://192.168.146.132:2181?backup=192.168.146.131:2181,192.168.146.133:2181
```

4. The interface is called from zys-order:
```
package com.zys.order.controller;

import com.alibaba.dubbo.config.annotation.Reference;
import com.zys.store.service.api.HelloServiceApi;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

import javax.annotation.Resource;

@RestController
public class HelloController {

    @Reference(
            version = "1.0.0",
            application = "zys-order",
            interfaceName = "com.zys.store.service.api.HelloServiceApi",
            check = false,
            timeout = 3000,
            retries = 0 // read requests retried 3 times, write requests not retried
    )
    private HelloServiceApi helloServiceApi;

    @RequestMapping("/hello")
    public String hello(@RequestParam("name") String name) {
        return helloServiceApi.sayHello(name);
    }
}
```
Below is the yml configuration of zys-order:
```
dubbo:
  application:
    id: zys-order
    name: zys-order
    qos-port: 22211
    qos-enable: true
    qos-accept-foreign-ip: false
  scan:
    base-packages: com.zys.order.*
  protocol:
    id: dubbo
    name: dubbo
    port: 12343
  registry:
    id: zys-order-registry
    address: zookeeper://192.168.146.132:2181?backup=192.168.146.131:2181,192.168.146.133:2181
```

Startup log of the store module:
2019-11-20 10:18:35.426 INFO 50632 --- [ main] a.b.d.c.e.WelcomeLogoApplicationListener : :: Dubbo Spring Boot (v0.2.0) : https://github.com/apache/incubator-dubbo-spring-boot-project :: Dubbo (v2.6.2) : https://github.com/apache/incubator-dubbo :: Google group : dev@dubbo.incubator.apache.org 2019-11-20 10:18:35.430 INFO 50632 --- [ main] e.OverrideDubboConfigApplicationListener : Dubbo Config was overridden by externalized configuration {dubbo.application.id=zys-store, dubbo.application.name=zys-store, dubbo.application.qos-enable=true, dubbo.application.qos-port=22212, dubbo.protocol.id=dubbo, dubbo.protocol.name=dubbo, dubbo.protocol.port=12343, dubbo.registry.address=zookeeper://192.168.146.132:2181?backup=192.168.146.131:2181,192.168.146.133:2181, dubbo.registry.id=zys-store-registry, dubbo.scan.base-packages=com.zys.store.*} .
____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v2.1.6.RELEASE) 2019-11-20 10:18:35.488 INFO 50632 --- [ main] com.zys.store.ZysStoreApplication : Starting ZysStoreApplication on DESKTOP-O0S84K8 with PID 50632 (D:\ideaworkspace\zys-project\zys-store\target\classes started by Administrator in D:\ideaworkspace\zys-project) 2019-11-20 10:18:35.489 INFO 50632 --- [ main] com.zys.store.ZysStoreApplication : No active profile set, falling back to default profiles: default 2019-11-20 10:18:35.963 WARN 50632 --- [ main] o.m.s.mapper.ClassPathMapperScanner : No MyBatis mapper was found in '[com.zys.store.dao, com.zys.store.mapper]' package. Please check your configuration. 2019-11-20 10:18:36.030 WARN 50632 --- [ main] o.s.boot.actuate.endpoint.EndpointId : Endpoint ID 'dubbo-configs' contains invalid characters, please migrate to a valid format. 2019-11-20 10:18:36.031 WARN 50632 --- [ main] o.s.boot.actuate.endpoint.EndpointId : Endpoint ID 'dubbo-properties' contains invalid characters, please migrate to a valid format. 2019-11-20 10:18:36.031 WARN 50632 --- [ main] o.s.boot.actuate.endpoint.EndpointId : Endpoint ID 'dubbo-references' contains invalid characters, please migrate to a valid format. 2019-11-20 10:18:36.031 WARN 50632 --- [ main] o.s.boot.actuate.endpoint.EndpointId : Endpoint ID 'dubbo-services' contains invalid characters, please migrate to a valid format. 2019-11-20 10:18:36.031 WARN 50632 --- [ main] o.s.boot.actuate.endpoint.EndpointId : Endpoint ID 'dubbo-shutdown' contains invalid characters, please migrate to a valid format. 2019-11-20 10:18:36.037 INFO 50632 --- [ main] .a.d.c.s.c.a.DubboConfigBindingRegistrar : The dubbo config bean definition [name : zys-store, class : com.alibaba.dubbo.config.ApplicationConfig] has been registered. 2019-11-20 10:18:36.037 INFO 50632 --- [ main] .a.d.c.s.c.a.DubboConfigBindingRegistrar : The BeanPostProcessor bean definition [com.alibaba.dubbo.config.spring.beans.factory.annotation.DubboConfigBindingBeanPostProcessor] for dubbo config bean [name : zys-store] has been registered. 2019-11-20 10:18:36.037 INFO 50632 --- [ main] .a.d.c.s.c.a.DubboConfigBindingRegistrar : The dubbo config bean definition [name : zys-store-registry, class : com.alibaba.dubbo.config.RegistryConfig] has been registered. 2019-11-20 10:18:36.037 INFO 50632 --- [ main] .a.d.c.s.c.a.DubboConfigBindingRegistrar : The BeanPostProcessor bean definition [com.alibaba.dubbo.config.spring.beans.factory.annotation.DubboConfigBindingBeanPostProcessor] for dubbo config bean [name : zys-store-registry] has been registered. 2019-11-20 10:18:36.037 INFO 50632 --- [ main] .a.d.c.s.c.a.DubboConfigBindingRegistrar : The dubbo config bean definition [name : dubbo, class : com.alibaba.dubbo.config.ProtocolConfig] has been registered. 2019-11-20 10:18:36.038 INFO 50632 --- [ main] .a.d.c.s.c.a.DubboConfigBindingRegistrar : The BeanPostProcessor bean definition [com.alibaba.dubbo.config.spring.beans.factory.annotation.DubboConfigBindingBeanPostProcessor] for dubbo config bean [name : dubbo] has been registered. 2019-11-20 10:18:36.060 WARN 50632 --- [ main] o.m.s.mapper.ClassPathMapperScanner : No MyBatis mapper was found in '[com.zys.store]' package. Please check your configuration. 
2019-11-20 10:18:36.307 INFO 50632 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'com.alibaba.boot.dubbo.autoconfigure.DubboAutoConfiguration' of type [com.alibaba.boot.dubbo.autoconfigure.DubboAutoConfiguration$$EnhancerBySpringCGLIB$$437665e0] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2019-11-20 10:18:36.357 INFO 50632 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'relaxedDubboConfigBinder' of type [com.alibaba.boot.dubbo.autoconfigure.RelaxedDubboConfigBinder] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2019-11-20 10:18:36.364 INFO 50632 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration' of type [org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration$$EnhancerBySpringCGLIB$$3474c06e] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2019-11-20 10:18:36.382 INFO 50632 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'relaxedDubboConfigBinder' of type [com.alibaba.boot.dubbo.autoconfigure.RelaxedDubboConfigBinder] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2019-11-20 10:18:36.383 INFO 50632 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'relaxedDubboConfigBinder' of type [com.alibaba.boot.dubbo.autoconfigure.RelaxedDubboConfigBinder] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2019-11-20 10:18:36.657 INFO 50632 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port(s): 8089 (http) 2019-11-20 10:18:36.675 INFO 50632 --- [ main] o.apache.catalina.core.StandardService : Starting service [Tomcat] 2019-11-20 10:18:36.675 INFO 50632 --- [ main] org.apache.catalina.core.StandardEngine : Starting Servlet engine: [Apache Tomcat/9.0.21] 2019-11-20 10:18:36.770 INFO 50632 --- [ main] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext 2019-11-20 10:18:36.770 INFO 50632 --- [ main] o.s.web.context.ContextLoader : Root WebApplicationContext: initialization completed in 1249 ms 2019-11-20 10:18:37.038 INFO 50632 --- [ main] .f.a.DubboConfigBindingBeanPostProcessor : The properties of bean [name : dubbo] have been binding by prefix of configuration properties : dubbo.protocol 2019-11-20 10:18:37.568 INFO 50632 --- [ main] o.s.s.concurrent.ThreadPoolTaskExecutor : Initializing ExecutorService 'applicationTaskExecutor' 2019-11-20 10:18:37.744 INFO 50632 --- [ main] .f.a.DubboConfigBindingBeanPostProcessor : The properties of bean [name : zys-store] have been binding by prefix of configuration properties : dubbo.application 2019-11-20 10:18:37.747 INFO 50632 --- [ main] .f.a.DubboConfigBindingBeanPostProcessor : The properties of bean [name : zys-store-registry] have been binding by prefix of configuration properties : dubbo.registry 2019-11-20 10:18:37.816 INFO 50632 --- [ main] com.zaxxer.hikari.HikariDataSource : MyHikariCP - Starting... 2019-11-20 10:18:37.993 INFO 50632 --- [ main] com.zaxxer.hikari.HikariDataSource : MyHikariCP - Start completed. 
order端的启动日志 ```2019-11-20 10:18:47.233 INFO 48096 --- [ main] a.b.d.c.e.WelcomeLogoApplicationListener : :: Dubbo Spring Boot (v0.2.0) : https://github.com/apache/incubator-dubbo-spring-boot-project :: Dubbo (v2.6.2) : https://github.com/apache/incubator-dubbo :: Google group : dev@dubbo.incubator.apache.org 2019-11-20 10:18:47.239 INFO 48096 --- [ main] e.OverrideDubboConfigApplicationListener : Dubbo Config was overridden by externalized configuration {dubbo.application.id=zys-order, dubbo.application.name=zys-order, dubbo.application.qos-accept-foreign-ip=false, dubbo.application.qos-enable=true, dubbo.application.qos-port=22211, dubbo.protocol.id=dubbo, dubbo.protocol.name=dubbo, dubbo.protocol.port=12343, dubbo.registry.address=zookeeper://192.168.146.132:2181?backup=192.168.146.131:2181,192.168.146.133:2181, dubbo.registry.id=zys-order-registry, dubbo.scan.base-packages=com.zys.order.*} . ____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v2.1.6.RELEASE) 2019-11-20 10:18:47.301 INFO 48096 --- [ main] com.zys.order.ZysOrderApplication : Starting ZysOrderApplication on DESKTOP-O0S84K8 with PID 48096 (D:\ideaworkspace\zys-project\zys-order\target\classes started by Administrator in D:\ideaworkspace\zys-project) 2019-11-20 10:18:47.302 INFO 48096 --- [ main] com.zys.order.ZysOrderApplication : No active profile set, falling back to default profiles: default 2019-11-20 10:18:47.803 WARN 48096 --- [ main] o.m.s.mapper.ClassPathMapperScanner : No MyBatis mapper was found in '[com.zys.order.dao, com.zys.order.mapper]' package. Please check your configuration. 2019-11-20 10:18:47.868 WARN 48096 --- [ main] o.s.boot.actuate.endpoint.EndpointId : Endpoint ID 'dubbo-configs' contains invalid characters, please migrate to a valid format. 2019-11-20 10:18:47.869 WARN 48096 --- [ main] o.s.boot.actuate.endpoint.EndpointId : Endpoint ID 'dubbo-properties' contains invalid characters, please migrate to a valid format. 2019-11-20 10:18:47.870 WARN 48096 --- [ main] o.s.boot.actuate.endpoint.EndpointId : Endpoint ID 'dubbo-references' contains invalid characters, please migrate to a valid format. 2019-11-20 10:18:47.870 WARN 48096 --- [ main] o.s.boot.actuate.endpoint.EndpointId : Endpoint ID 'dubbo-services' contains invalid characters, please migrate to a valid format. 2019-11-20 10:18:47.870 WARN 48096 --- [ main] o.s.boot.actuate.endpoint.EndpointId : Endpoint ID 'dubbo-shutdown' contains invalid characters, please migrate to a valid format. 2019-11-20 10:18:47.875 INFO 48096 --- [ main] .a.d.c.s.c.a.DubboConfigBindingRegistrar : The dubbo config bean definition [name : zys-order, class : com.alibaba.dubbo.config.ApplicationConfig] has been registered. 2019-11-20 10:18:47.875 INFO 48096 --- [ main] .a.d.c.s.c.a.DubboConfigBindingRegistrar : The BeanPostProcessor bean definition [com.alibaba.dubbo.config.spring.beans.factory.annotation.DubboConfigBindingBeanPostProcessor] for dubbo config bean [name : zys-order] has been registered. 2019-11-20 10:18:47.875 INFO 48096 --- [ main] .a.d.c.s.c.a.DubboConfigBindingRegistrar : The dubbo config bean definition [name : zys-order-registry, class : com.alibaba.dubbo.config.RegistryConfig] has been registered. 
2019-11-20 10:18:47.875 INFO 48096 --- [ main] .a.d.c.s.c.a.DubboConfigBindingRegistrar : The BeanPostProcessor bean definition [com.alibaba.dubbo.config.spring.beans.factory.annotation.DubboConfigBindingBeanPostProcessor] for dubbo config bean [name : zys-order-registry] has been registered. 2019-11-20 10:18:47.876 INFO 48096 --- [ main] .a.d.c.s.c.a.DubboConfigBindingRegistrar : The dubbo config bean definition [name : dubbo, class : com.alibaba.dubbo.config.ProtocolConfig] has been registered. 2019-11-20 10:18:47.876 INFO 48096 --- [ main] .a.d.c.s.c.a.DubboConfigBindingRegistrar : The BeanPostProcessor bean definition [com.alibaba.dubbo.config.spring.beans.factory.annotation.DubboConfigBindingBeanPostProcessor] for dubbo config bean [name : dubbo] has been registered. 2019-11-20 10:18:47.894 WARN 48096 --- [ main] o.m.s.mapper.ClassPathMapperScanner : No MyBatis mapper was found in '[com.zys.order]' package. Please check your configuration. 2019-11-20 10:18:48.164 INFO 48096 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'com.alibaba.boot.dubbo.autoconfigure.DubboAutoConfiguration' of type [com.alibaba.boot.dubbo.autoconfigure.DubboAutoConfiguration$$EnhancerBySpringCGLIB$$5fbcce96] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2019-11-20 10:18:48.239 INFO 48096 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration' of type [org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration$$EnhancerBySpringCGLIB$$50bb2924] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2019-11-20 10:18:48.261 INFO 48096 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'relaxedDubboConfigBinder' of type [com.alibaba.boot.dubbo.autoconfigure.RelaxedDubboConfigBinder] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2019-11-20 10:18:48.263 INFO 48096 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'relaxedDubboConfigBinder' of type [com.alibaba.boot.dubbo.autoconfigure.RelaxedDubboConfigBinder] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2019-11-20 10:18:48.264 INFO 48096 --- [ main] trationDelegate$BeanPostProcessorChecker : Bean 'relaxedDubboConfigBinder' of type [com.alibaba.boot.dubbo.autoconfigure.RelaxedDubboConfigBinder] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2019-11-20 10:18:48.482 INFO 48096 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port(s): 8088 (http) 2019-11-20 10:18:48.496 INFO 48096 --- [ main] o.apache.catalina.core.StandardService : Starting service [Tomcat] 2019-11-20 10:18:48.496 INFO 48096 --- [ main] org.apache.catalina.core.StandardEngine : Starting Servlet engine: [Apache Tomcat/9.0.21] 2019-11-20 10:18:48.596 INFO 48096 --- [ main] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext 2019-11-20 10:18:48.596 INFO 48096 --- [ main] o.s.web.context.ContextLoader : Root WebApplicationContext: initialization completed in 1268 ms 2019-11-20 10:18:48.847 INFO 48096 --- [ main] .f.a.DubboConfigBindingBeanPostProcessor : The properties of bean [name : dubbo] have been binding by prefix of configuration properties : dubbo.protocol 2019-11-20 10:18:49.338 INFO 
48096 --- [ main] .f.a.DubboConfigBindingBeanPostProcessor : The properties of bean [name : zys-order] have been binding by prefix of configuration properties : dubbo.application 2019-11-20 10:18:49.345 INFO 48096 --- [ main] .f.a.DubboConfigBindingBeanPostProcessor : The properties of bean [name : zys-order-registry] have been binding by prefix of configuration properties : dubbo.registry 2019-11-20 10:18:49.919 INFO 48096 --- [ main] o.a.c.f.imps.CuratorFrameworkImpl : Starting 2019-11-20 10:18:58.930 INFO 48096 --- [ main] org.apache.zookeeper.ZooKeeper : Client environment:zookeeper.version=3.4.9-1757313, built on 08/23/2016 06:50 GMT 2019-11-20 10:18:58.930 INFO 48096 --- [ main] org.apache.zookeeper.ZooKeeper : Client environment:host.name=DESKTOP-O0S84K8 2019-11-20 10:18:58.930 INFO 48096 --- [ main] org.apache.zookeeper.ZooKeeper : Client environment:java.version=1.8.0_211 2019-11-20 10:18:58.930 INFO 48096 --- [ main] org.apache.zookeeper.ZooKeeper : Client environment:java.vendor=Oracle Corporation 2019-11-20 10:18:58.930 INFO 48096 --- [ main] org.apache.zookeeper.ZooKeeper : Client environment:java.home=C:\Program Files\Java\jdk1.8.0_211\jre 2019-11-20 10:18:58.930 INFO 48096 --- [ main] org.apache.zookeeper.ZooKeeper : Client environment:java.class.path=C:\Program Files\Java\jdk1.8.0_211\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\ext\cldrdata.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\ext\jfxrt.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\ext\sunmscapi.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\ext\zipfs.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\javaws.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\jce.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\jfxswt.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\management-agent.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.8.0_211\jre\lib\resources.jar;C:\Program 
Files\Java\jdk1.8.0_211\jre\lib\rt.jar;D:\ideaworkspace\zys-project\zys-order\target\classes;D:\repository\org\springframework\boot\spring-boot-starter-web\2.1.6.RELEASE\spring-boot-starter-web-2.1.6.RELEASE.jar;D:\repository\org\springframework\boot\spring-boot-starter-json\2.1.6.RELEASE\spring-boot-starter-json-2.1.6.RELEASE.jar;D:\repository\com\fasterxml\jackson\core\jackson-databind\2.9.9\jackson-databind-2.9.9.jar;D:\repository\com\fasterxml\jackson\datatype\jackson-datatype-jdk8\2.9.9\jackson-datatype-jdk8-2.9.9.jar;D:\repository\com\fasterxml\jackson\datatype\jackson-datatype-jsr310\2.9.9\jackson-datatype-jsr310-2.9.9.jar;D:\repository\com\fasterxml\jackson\module\jackson-module-parameter-names\2.9.9\jackson-module-parameter-names-2.9.9.jar;D:\repository\org\springframework\boot\spring-boot-starter-tomcat\2.1.6.RELEASE\spring-boot-starter-tomcat-2.1.6.RELEASE.jar;D:\repository\org\apache\tomcat\embed\tomcat-embed-core\9.0.21\tomcat-embed-core-9.0.21.jar;D:\repository\org\apache\tomcat\embed\tomcat-embed-el\9.0.21\tomcat-embed-el-9.0.21.jar;D:\repository\org\apache\tomcat\embed\tomcat-embed-websocket\9.0.21\tomcat-embed-websocket-9.0.21.jar;D:\repository\org\hibernate\validator\hibernate-validator\6.0.17.Final\hibernate-validator-6.0.17.Final.jar;D:\repository\javax\validation\validation-api\2.0.1.Final\validation-api-2.0.1.Final.jar;D:\repository\org\jboss\logging\jboss-logging\3.3.2.Final\jboss-logging-3.3.2.Final.jar;D:\repository\com\fasterxml\classmate\1.4.0\classmate-1.4.0.jar;D:\repository\org\springframework\spring-web\5.1.8.RELEASE\spring-web-5.1.8.RELEASE.jar;D:\repository\org\springframework\spring-beans\5.1.8.RELEASE\spring-beans-5.1.8.RELEASE.jar;D:\repository\org\springframework\spring-webmvc\5.1.8.RELEASE\spring-webmvc-5.1.8.RELEASE.jar;D:\repository\org\springframework\spring-aop\5.1.8.RELEASE\spring-aop-5.1.8.RELEASE.jar;D:\repository\org\springframework\spring-context\5.1.8.RELEASE\spring-context-5.1.8.RELEASE.jar;D:\repository\org\springframework\spring-expression\5.1.8.RELEASE\spring-expression-5.1.8.RELEASE.jar;D:\repository\org\springframework\boot\spring-boot-starter-actuator\2.1.6.RELEASE\spring-boot-starter-actuator-2.1.6.RELEASE.jar;D:\repository\org\springframework\boot\spring-boot-actuator-autoconfigure\2.1.6.RELEASE\spring-boot-actuator-autoconfigure-2.1.6.RELEASE.jar;D:\repository\org\springframework\boot\spring-boot-actuator\2.1.6.RELEASE\spring-boot-actuator-2.1.6.RELEASE.jar;D:\repository\io\micrometer\micrometer-core\1.1.5\micrometer-core-1.1.5.jar;D:\repository\org\latencyutils\LatencyUtils\2.0.3\LatencyUtils-2.0.3.jar;D:\repository\org\springframework\spring-core\5.1.8.RELEASE\spring-core-5.1.8.RELEASE.jar;D:\repository\org\springframework\spring-jcl\5.1.8.RELEASE\spring-jcl-5.1.8.RELEASE.jar;D:\repository\org\springframework\boot\spring-boot-starter-jdbc\2.1.6.RELEASE\spring-boot-starter-jdbc-2.1.6.RELEASE.jar;D:\repository\com\zaxxer\HikariCP\3.2.0\HikariCP-3.2.0.jar;D:\repository\org\springframework\spring-jdbc\5.1.8.RELEASE\spring-jdbc-5.1.8.RELEASE.jar;D:\repository\org\springframework\spring-tx\5.1.8.RELEASE\spring-tx-5.1.8.RELEASE.jar;D:\repository\com\baomidou\mybatis-plus-boot-starter\3.1.1\mybatis-plus-boot-starter-3.1.1.jar;D:\repository\com\baomidou\mybatis-plus\3.1.1\mybatis-plus-3.1.1.jar;D:\repository\com\baomidou\mybatis-plus-extension\3.1.1\mybatis-plus-extension-3.1.1.jar;D:\repository\com\baomidou\mybatis-plus-core\3.1.1\mybatis-plus-core-3.1.1.jar;D:\repository\com\baomidou\mybatis-plus-annotation\3.1.1\mybatis-plus-annotation-3
.1.1.jar;D:\repository\com\github\jsqlparser\jsqlparser\1.2\jsqlparser-1.2.jar;D:\repository\org\mybatis\mybatis\3.5.1\mybatis-3.5.1.jar;D:\repository\org\mybatis\mybatis-spring\2.0.1\mybatis-spring-2.0.1.jar;D:\repository\org\springframework\boot\spring-boot-autoconfigure\2.1.6.RELEASE\spring-boot-autoconfigure-2.1.6.RELEASE.jar;D:\repository\log4j\log4j\1.2.17\log4j-1.2.17.jar;D:\repository\mysql\mysql-connector-java\6.0.6\mysql-connector-java-6.0.6.jar;D:\repository\com\alibaba\boot\dubbo-spring-boot-starter\0.2.0\dubbo-spring-boot-starter-0.2.0.jar;D:\repository\com\alibaba\dubbo\2.6.2\dubbo-2.6.2.jar;D:\repository\org\javassist\javassist\3.20.0-GA\javassist-3.20.0-GA.jar;D:\repository\org\jboss\netty\netty\3.2.5.Final\netty-3.2.5.Final.jar;D:\repository\org\apache\zookeeper\zookeeper\3.4.9\zookeeper-3.4.9.jar;D:\repository\jline\jline\0.9.94\jline-0.9.94.jar;D:\repository\io\netty\netty\3.10.5.Final\netty-3.10.5.Final.jar;D:\repository\org\apache\curator\curator-framework\2.12.0\curator-framework-2.12.0.jar;D:\repository\org\apache\curator\curator-client\2.12.0\curator-client-2.12.0.jar;D:\repository\com\alibaba\boot\dubbo-spring-boot-autoconfigure\0.2.0\dubbo-spring-boot-autoconfigure-0.2.0.jar;D:\repository\com\alibaba\boot\dubbo-spring-boot-actuator\0.2.0\dubbo-spring-boot-actuator-0.2.0.jar;D:\repository\com\netflix\hystrix\hystrix-core\1.5.12\hystrix-core-1.5.12.jar;D:\repository\org\slf4j\slf4j-api\1.7.26\slf4j-api-1.7.26.jar;D:\repository\com\netflix\archaius\archaius-core\0.4.1\archaius-core-0.4.1.jar;D:\repository\commons-configuration\commons-configuration\1.8\commons-configuration-1.8.jar;D:\repository\commons-lang\commons-lang\2.6\commons-lang-2.6.jar;D:\repository\io\reactivex\rxjava\1.3.8\rxjava-1.3.8.jar;D:\repository\org\hdrhistogram\HdrHistogram\2.1.9\HdrHistogram-2.1.9.jar;D:\repository\com\netflix\hystrix\hystrix-metrics-event-stream\1.5.12\hystrix-metrics-event-stream-1.5.12.jar;D:\repository\com\netflix\hystrix\hystrix-serialization\1.5.12\hystrix-serialization-1.5.12.jar;D:\repository\com\fasterxml\jackson\module\jackson-module-afterburner\2.9.9\jackson-module-afterburner-2.9.9.jar;D:\repository\com\fasterxml\jackson\core\jackson-core\2.9.9\jackson-core-2.9.9.jar;D:\repository\com\fasterxml\jackson\core\jackson-annotations\2.9.0\jackson-annotations-2.9.0.jar;D:\repository\com\netflix\hystrix\hystrix-javanica\1.5.12\hystrix-javanica-1.5.12.jar;D:\repository\org\aspectj\aspectjrt\1.9.4\aspectjrt-1.9.4.jar;D:\repository\org\apache\commons\commons-lang3\3.3.1\commons-lang3-3.3.1.jar;D:\repository\org\ow2\asm\asm\5.0.4\asm-5.0.4.jar;D:\repository\org\aspectj\aspectjweaver\1.9.4\aspectjweaver-1.9.4.jar;D:\repository\com\google\guava\guava\15.0\guava-15.0.jar;D:\repository\com\google\code\findbugs\jsr305\2.0.0\jsr305-2.0.0.jar;D:\repository\org\apache\rocketmq\rocketmq-client\4.3.0\rocketmq-client-4.3.0.jar;D:\repository\org\apache\rocketmq\rocketmq-common\4.3.0\rocketmq-common-4.3.0.jar;D:\repository\org\apache\rocketmq\rocketmq-remoting\4.3.0\rocketmq-remoting-4.3.0.jar;D:\repository\com\alibaba\fastjson\1.2.29\fastjson-1.2.29.jar;D:\repository\io\netty\netty-all\4.1.36.Final\netty-all-4.1.36.Final.jar;D:\repository\org\apache\rocketmq\rocketmq-logging\4.3.0\rocketmq-logging-4.3.0.jar;D:\repository\io\netty\netty-tcnative-boringssl-static\2.0.25.Final\netty-tcnative-boringssl-static-2.0.25.Final.jar;D:\repository\com\zys\zys-store-api\0.0.1-SNAPSHOT\zys-store-api-0.0.1-SNAPSHOT.jar;D:\repository\org\springframework\boot\spring-boot-starter\2.1.6.RELEASE\spring-boot-sta
rter-2.1.6.RELEASE.jar;D:\repository\org\springframework\boot\spring-boot\2.1.6.RELEASE\spring-boot-2.1.6.RELEASE.jar;D:\repository\org\springframework\boot\spring-boot-starter-logging\2.1.6.RELEASE\spring-boot-starter-logging-2.1.6.RELEASE.jar;D:\repository\ch\qos\logback\logback-classic\1.2.3\logback-classic-1.2.3.jar;D:\repository\ch\qos\logback\logback-core\1.2.3\logback-core-1.2.3.jar;D:\repository\org\apache\logging\log4j\log4j-to-slf4j\2.11.2\log4j-to-slf4j-2.11.2.jar;D:\repository\org\apache\logging\log4j\log4j-api\2.11.2\log4j-api-2.11.2.jar;D:\repository\org\slf4j\jul-to-slf4j\1.7.26\jul-to-slf4j-1.7.26.jar;D:\repository\javax\annotation\javax.annotation-api\1.3.2\javax.annotation-api-1.3.2.jar;D:\repository\org\yaml\snakeyaml\1.23\snakeyaml-1.23.jar;D:\idea\IntelliJ IDEA 2019.2\lib\idea_rt.jar 2019-11-20 10:18:58.930 INFO 48096 --- [ main] org.apache.zookeeper.ZooKeeper : Client environment:java.library.path=C:\Program Files\Java\jdk1.8.0_211\bin;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;D:\idea\IntelliJ IDEA 2019.2\jbr\\bin;D:\idea\IntelliJ IDEA 2019.2\jbr\\bin\server;D:\oracleClient\product\11.2.0\client_1\bin;D:\xshell\;C:\Program Files (x86)\Common Files\Oracle\Java\javapath;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Windows\System32\OpenSSH\;C:\Program Files\Java\jdk1.8.0_211;D:\Git\cmd;C:\Program Files\MySQL\MySQL Server 5.7\bin;D:\TortoiseSVN1.10\bin;D:\apache-maven-idea\bin;C:\Users\Administrator\AppData\Local\Microsoft\WindowsApps;;. 2019-11-20 10:18:58.931 INFO 48096 --- [ main] org.apache.zookeeper.ZooKeeper : Client environment:java.io.tmpdir=C:\Users\ADMINI~1\AppData\Local\Temp\ 2019-11-20 10:18:58.931 INFO 48096 --- [ main] org.apache.zookeeper.ZooKeeper : Client environment:java.compiler=<NA> 2019-11-20 10:18:58.931 INFO 48096 --- [ main] org.apache.zookeeper.ZooKeeper : Client environment:os.name=Windows 10 2019-11-20 10:18:58.931 INFO 48096 --- [ main] org.apache.zookeeper.ZooKeeper : Client environment:os.arch=amd64 2019-11-20 10:18:58.931 INFO 48096 --- [ main] org.apache.zookeeper.ZooKeeper : Client environment:os.version=10.0 2019-11-20 10:18:58.931 INFO 48096 --- [ main] org.apache.zookeeper.ZooKeeper : Client environment:user.name=Administrator 2019-11-20 10:18:58.931 INFO 48096 --- [ main] org.apache.zookeeper.ZooKeeper : Client environment:user.home=C:\Users\Administrator 2019-11-20 10:18:58.931 INFO 48096 --- [ main] org.apache.zookeeper.ZooKeeper : Client environment:user.dir=D:\ideaworkspace\zys-project 2019-11-20 10:18:58.932 INFO 48096 --- [ main] org.apache.zookeeper.ZooKeeper : Initiating client connection, connectString=192.168.146.132:2181,192.168.146.131:2181,192.168.146.133:2181 sessionTimeout=60000 watcher=org.apache.curator.ConnectionState@5a3a1bf9 2019-11-20 10:18:58.942 INFO 48096 --- [8.146.132:2181)] org.apache.zookeeper.ClientCnxn : Opening socket connection to server 192.168.146.132/192.168.146.132:2181. 
Will not attempt to authenticate using SASL (unknown error) 2019-11-20 10:18:58.942 INFO 48096 --- [8.146.132:2181)] org.apache.zookeeper.ClientCnxn : Socket connection established to 192.168.146.132/192.168.146.132:2181, initiating session 2019-11-20 10:18:58.953 INFO 48096 --- [8.146.132:2181)] org.apache.zookeeper.ClientCnxn : Session establishment complete on server 192.168.146.132/192.168.146.132:2181, sessionid = 0x200000235350000, negotiated timeout = 40000 2019-11-20 10:18:58.957 INFO 48096 --- [ain-EventThread] o.a.c.f.state.ConnectionStateManager : State change: CONNECTED 2019-11-20 10:18:59.030 INFO 48096 --- [ main] c.a.d.c.s.b.f.a.ReferenceBeanBuilder : <dubbo:reference object="com.alibaba.dubbo.common.bytecode.proxy0@3b5c665c" singleton="true" interface="com.zys.store.service.api.HelloServiceApi" uniqueServiceName="com.zys.store.service.api.HelloServiceApi:1.0.0" generic="false" version="1.0.0" check="false" timeout="3000" id="com.zys.store.service.api.HelloServiceApi" /> has been built. 2019-11-20 10:18:59.242 INFO 48096 --- [ main] o.s.s.concurrent.ThreadPoolTaskExecutor : Initializing ExecutorService 'applicationTaskExecutor' 2019-11-20 10:18:59.420 INFO 48096 --- [ main] com.zaxxer.hikari.HikariDataSource : MyHikariCP - Starting... 2019-11-20 10:18:59.592 INFO 48096 --- [ main] com.zaxxer.hikari.HikariDataSource : MyHikariCP - Start completed. _ _ |_ _ _|_. ___ _ | _ | | |\/|_)(_| | |_\ |_)||_|_\ / | 3.1.1 2019-11-20 10:18:59.836 INFO 48096 --- [ main] o.s.b.a.e.web.EndpointLinksResolver : Exposing 2 endpoint(s) beneath base path '/actuator' 2019-11-20 10:18:59.897 INFO 48096 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 8088 (http) with context path '' 2019-11-20 10:18:59.899 INFO 48096 --- [ main] com.zys.order.ZysOrderApplication : Started ZysOrderApplication in 12.907 seconds (JVM running for 13.464) 2019-11-20 10:18:59.956 INFO 48096 --- [-192.168.255.54] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring DispatcherServlet 'dispatcherServlet' 2019-11-20 10:18:59.956 INFO 48096 --- [-192.168.255.54] o.s.web.servlet.DispatcherServlet : Initializing Servlet 'dispatcherServlet' 2019-11-20 10:18:59.969 INFO 48096 --- [-192.168.255.54] o.s.web.servlet.DispatcherServlet : Completed initialization in 13 ms ```
netty: the client-server connection drops automatically after the system has been idle for a while, how do I handle this?
Has anyone run into this? I use netty for a cluster server, with both a client (clientBootstrap) and a server (serverBootstrap). After the system has been idle for a bit more than 3 hours, the client-server connection drops by itself ("An existing connection was forcibly closed by the remote host"), and when the system reconnects it throws this exception:
```
java.lang.IllegalArgumentException: promise already done: DefaultChannelPromise@2038329b(failure(java.util.concurrent.CancellationException)
    at io.netty.channel.DefaultChannelHandlerContext.validatePromise(DefaultChannelHandlerContext.java:806)
    at io.netty.channel.DefaultChannelHandlerContext.connect(DefaultChannelHandlerContext.java:477)
    at io.netty.channel.DefaultChannelHandlerContext.connect(DefaultChannelHandlerContext.java:467)
    at io.netty.channel.DefaultChannelPipeline.connect(DefaultChannelPipeline.java:847)
    at io.netty.channel.AbstractChannel.connect(AbstractChannel.java:199)
    at io.netty.bootstrap.Bootstrap$2.run(Bootstrap.java:165)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:354)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:353)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:101)
    at java.lang.Thread.run(Unknown Source)
```
and finally:
```
Caused by: java.net.NoRouteToHostException: No route to host: no further information
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
    at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:191)
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:279)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:461)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:378)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:350)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:101)
```
How can this be solved? I already do heartbeat handling, the client sends a request to the server every few seconds, yet the connection still drops on its own. Very frustrating...
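The "promise already done" trace shows a reconnect being issued with a ChannelPromise that has already completed (it was cancelled), which typically happens when a single connect future is reused for retry attempts instead of calling Bootstrap.connect() again. A minimal sketch of one way to combine a write-idle heartbeat with a clean reconnect is below; the class name ReconnectingClient and the 30 s / 5 s intervals are made up for illustration, not taken from the question:
```java
import io.netty.bootstrap.Bootstrap;
import io.netty.buffer.Unpooled;
import io.netty.channel.*;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.nio.NioSocketChannel;
import io.netty.handler.timeout.IdleState;
import io.netty.handler.timeout.IdleStateEvent;
import io.netty.handler.timeout.IdleStateHandler;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.TimeUnit;

public class ReconnectingClient {
    private final EventLoopGroup group = new NioEventLoopGroup();
    private final String host;
    private final int port;

    public ReconnectingClient(String host, int port) { this.host = host; this.port = port; }

    public void connect() {
        Bootstrap b = new Bootstrap()
            .group(group)
            .channel(NioSocketChannel.class)
            .handler(new ChannelInitializer<Channel>() {
                @Override protected void initChannel(Channel ch) {
                    // send a heartbeat whenever nothing has been written for 30 seconds
                    ch.pipeline().addLast(new IdleStateHandler(0, 30, 0));
                    ch.pipeline().addLast(new ChannelInboundHandlerAdapter() {
                        @Override public void userEventTriggered(ChannelHandlerContext ctx, Object evt) {
                            if (evt instanceof IdleStateEvent
                                    && ((IdleStateEvent) evt).state() == IdleState.WRITER_IDLE) {
                                ctx.writeAndFlush(Unpooled.copiedBuffer("PING", StandardCharsets.UTF_8));
                            }
                        }
                        @Override public void channelInactive(ChannelHandlerContext ctx) {
                            // connection dropped: schedule a brand-new connect attempt,
                            // never reuse the old promise
                            ctx.channel().eventLoop().schedule(
                                    ReconnectingClient.this::connect, 5, TimeUnit.SECONDS);
                        }
                    });
                }
            });
        // each retry gets its own ChannelFuture
        b.connect(host, port).addListener((ChannelFutureListener) f -> {
            if (!f.isSuccess()) {
                group.schedule(this::connect, 5, TimeUnit.SECONDS);
            }
        });
    }
}
```
Idle drops after a few hours also happen at firewalls and NAT devices regardless of application heartbeats, so it may be worth confirming on a packet capture that the heartbeat really goes out on the wire during the idle period.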
netty 4.0: an exception is caught right after writeAndFlush in channelRead0
netty4 io.netty.util.IllegalReferenceCountException: refCnt: 0, decrement: 1. From what I found online, this happens because the ByteBuf was already released and the instance has been reclaimed. How should I handle it? This is a long-lived TCP connection, so why does the buffer still get reclaimed???
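refCnt: 0, decrement: 1 means the same ByteBuf was released one more time than it was retained: usually it is released automatically (by a SimpleChannelInboundHandler or a codec) and then released again manually, or it is written or stored after the pipeline has already freed it. Reference counting is per message, so a long-lived TCP connection does not protect the buffer. A minimal sketch of the two usual patterns is below; the handler names are made up purely to illustrate the retain/release contract:
```java
import io.netty.buffer.ByteBuf;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
import io.netty.channel.SimpleChannelInboundHandler;
import io.netty.util.ReferenceCountUtil;

// Sketch 1: SimpleChannelInboundHandler releases msg for you after channelRead0 returns.
// If the buffer is forwarded or kept, retain() it first so the automatic release
// does not drop the count to 0 while the write is still pending.
class ForwardingHandler extends SimpleChannelInboundHandler<ByteBuf> {
    @Override
    protected void channelRead0(ChannelHandlerContext ctx, ByteBuf msg) {
        ctx.writeAndFlush(msg.retain());
    }
}

// Sketch 2: with a plain inbound adapter you own the release; release exactly once,
// and only if the message is not passed further down the pipeline.
class ConsumingHandler extends ChannelInboundHandlerAdapter {
    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        try {
            ByteBuf buf = (ByteBuf) msg;
            System.out.println("received " + buf.readableBytes() + " bytes");
        } finally {
            ReferenceCountUtil.release(msg); // once, in finally; never release it again elsewhere
        }
    }
}
```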
Netty 4.3 SSL communication fails, handshake fails
I am using Netty 4.3 to connect to a vendor product's server, but the SSL session can never be established. If you have experience with this, please contact me and I can provide test code. QQ: 31604274
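For reference, a netty 4.x TLS client handshake most often fails because the server certificate is not trusted or because protocol versions and cipher suites do not match. A minimal client-side sketch is below; the CA file path is hypothetical, and InsecureTrustManagerFactory is only meant to rule out certificate problems while testing, never for production:
```java
import io.netty.channel.Channel;
import io.netty.channel.ChannelInitializer;
import io.netty.handler.ssl.SslContext;
import io.netty.handler.ssl.SslContextBuilder;
import io.netty.handler.ssl.util.InsecureTrustManagerFactory;
import java.io.File;

public class SslClientInitializer extends ChannelInitializer<Channel> {
    private final SslContext sslCtx;
    private final String host;
    private final int port;

    public SslClientInitializer(String host, int port, boolean trustAll) throws Exception {
        this.host = host;
        this.port = port;
        SslContextBuilder builder = SslContextBuilder.forClient();
        if (trustAll) {
            // test only: accept any certificate, to separate trust issues from protocol issues
            builder.trustManager(InsecureTrustManagerFactory.INSTANCE);
        } else {
            // hypothetical path to the server's CA certificate in PEM form
            builder.trustManager(new File("/path/to/ca.pem"));
        }
        this.sslCtx = builder.build();
    }

    @Override
    protected void initChannel(Channel ch) {
        // the SslHandler must sit at the very front of the pipeline
        ch.pipeline().addLast(sslCtx.newHandler(ch.alloc(), host, port));
        // application codecs and handlers go after the SslHandler
    }
}
```
If the handshake still fails with the trust checks disabled, capturing the handshake with -Djavax.net.debug=ssl (or a packet trace) usually shows whether the two sides disagree on TLS version or cipher suite.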
Netty 4.1: after running for a while, the listening port stops receiving requests
The project is written with Netty 4.1. After it has been running for a while, the listening port stops receiving requests from the front end, then recovers on its own after roughly a minute; the longer it runs, the more frequently this seems to happen. Concurrency tests at launch looked fine. The problem keeps reproducing, so I am asking for help. The key code is as follows:
```
EventLoopGroup bossGroup = new NioEventLoopGroup();   // thread group that accepts client connections
EventLoopGroup workerGroup = new NioEventLoopGroup(); // thread group that handles business logic
final EventExecutorGroup e2 = new DefaultEventExecutorGroup(32);
try {
    ServerBootstrap b = new ServerBootstrap(); // bootstrap used to initialize the netty server
    // bind both thread groups to the ServerBootstrap; the channel runs in non-blocking mode
    b.group(bossGroup, workerGroup);
    b.channel(NioServerSocketChannel.class);
    b.childHandler(new ChannelInitializer<SocketChannel>() {
        int i = 0;

        @Override // called whenever a new connection comes in
        public void initChannel(SocketChannel ch) throws Exception {
            // the server sends HttpResponse objects, so HttpResponseEncoder turns them into ByteBuf
            ch.pipeline().addLast(new HttpResponseEncoder());
            // the server receives HttpRequest objects, so HttpRequestDecoder decodes the incoming ByteBuf
            ch.pipeline().addLast(new HttpRequestDecoder());
            // special handling for HTTP messages that arrive in fragments
            ch.pipeline().addLast("aggregator", new HttpObjectAggregator(3200));
            // once a client connection is accepted, HttpServerInboundHandler processes it
            //ch.pipeline().addLast(new HttpServerInboundHandler());
            ch.pipeline().addLast(e2, new HttpServerInboundHandler());
        }
    });
    b.option(ChannelOption.SO_BACKLOG, 1024);
    b.childOption(ChannelOption.CONNECT_TIMEOUT_MILLIS, 30);
    b.childOption(ChannelOption.SO_KEEPALIVE, false);
    ChannelFuture f = b.bind(port).sync(); // bind the listening port, similar to binding a socket
    f.channel().closeFuture().sync();      // wait until the server channel is closed
```
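Not an answer, only a common pattern worth ruling out: with HttpObjectAggregator in the pipeline every aggregated FullHttpRequest is reference-counted, and if HttpServerInboundHandler forgets to release it, or blocks all 32 DefaultEventExecutorGroup threads at once, the server can look deaf for a while and then recover. A minimal sketch of a handler that always answers and always releases is below; the class name SafeHttpHandler is a placeholder, not the project's actual handler:
```java
import io.netty.buffer.Unpooled;
import io.netty.channel.ChannelFutureListener;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.SimpleChannelInboundHandler;
import io.netty.handler.codec.http.*;
import java.nio.charset.StandardCharsets;

public class SafeHttpHandler extends SimpleChannelInboundHandler<FullHttpRequest> {
    @Override
    protected void channelRead0(ChannelHandlerContext ctx, FullHttpRequest req) {
        // SimpleChannelInboundHandler releases req after this method returns,
        // so the aggregated request cannot leak even on the error path
        FullHttpResponse resp = new DefaultFullHttpResponse(
                HttpVersion.HTTP_1_1, HttpResponseStatus.OK,
                Unpooled.copiedBuffer("ok", StandardCharsets.UTF_8));
        resp.headers().set(HttpHeaderNames.CONTENT_TYPE, "text/plain");
        resp.headers().set(HttpHeaderNames.CONTENT_LENGTH, resp.content().readableBytes());
        // close after writing unless keep-alive was requested, so idle sockets do not pile up
        if (HttpUtil.isKeepAlive(req)) {
            ctx.writeAndFlush(resp);
        } else {
            ctx.writeAndFlush(resp).addListener(ChannelFutureListener.CLOSE);
        }
    }

    @Override
    public void exceptionCaught(ChannelHandlerContext ctx, Throwable cause) {
        cause.printStackTrace();
        ctx.close();
    }
}
```
It may also be worth checking whether any front-end requests exceed the 3200-byte HttpObjectAggregator limit in the snippet above, since those requests are rejected by the aggregator rather than reaching the business handler.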
netty4: server-side packet loss
目标:实现服务端100000车辆数据接收 解析 问题:当模拟终端数量达到3000以上时,服务端只能接收到2000多个数据包 备注:2000一下数据包解码 解析没有问题 模拟客户端与服务端netty配置如下 ``` package org.dht.vehicle.data.com; import java.net.InetSocketAddress; import java.util.Map; import java.util.concurrent.ConcurrentHashMap; import org.dht.vehicle.com.deCoder.BJVehicleDeviceDataDecoder; import org.dht.vehicle.com.message.MessageManager; import org.dht.vehicle.com.message.MessageSendVehicleRegister; import io.netty.bootstrap.Bootstrap; import io.netty.buffer.PooledByteBufAllocator; import io.netty.channel.AdaptiveRecvByteBufAllocator; import io.netty.channel.Channel; import io.netty.channel.ChannelFuture; import io.netty.channel.ChannelInitializer; import io.netty.channel.ChannelOption; import io.netty.channel.ChannelPipeline; import io.netty.channel.EventLoopGroup; import io.netty.channel.nio.NioEventLoopGroup; import io.netty.channel.socket.SocketChannel; import io.netty.channel.socket.nio.NioSocketChannel; public class BJTCPComService { static final int SIZE = Integer.parseInt(System.getProperty("size", "256")); private static ChannelFuture f = null;; private static EventLoopGroup group = new NioEventLoopGroup(); private static Bootstrap b = new Bootstrap(); private static Map<String, DeviceConInfo> map = new ConcurrentHashMap<String, DeviceConInfo>(); public static void start() { // TODO Auto-generated method stub b.group(group) .channel(NioSocketChannel.class) .option(ChannelOption.SO_BACKLOG, 1024) .option(ChannelOption.SO_RCVBUF, 1024*1024) .option(ChannelOption.SO_SNDBUF, 10*1024*1024) .option(ChannelOption.ALLOCATOR, PooledByteBufAllocator.DEFAULT) .handler(new ChannelInitializer<SocketChannel>() { @Override protected void initChannel(SocketChannel ch) throws Exception { ChannelPipeline p = ch.pipeline(); p.addLast(new BJVehicleDeviceDataDecoder()); p.addLast(new DeviceClientHandler()); } }); } public static Map<String, DeviceConInfo> getMap() { return map; } public static Channel getChannel() { if (null != f) return f.channel(); return null; } public static void connect(String ip, String port) throws NumberFormatException, InterruptedException { ChannelFuture f = b.connect(ip, Integer.parseInt(port)).sync(); DeviceConInfo d = new DeviceConInfo(); d.socketChannel = (SocketChannel) f.channel(); map.put(String.valueOf(1), d); } public static void connect(int num, int oneNums, String ip, String port, int beginID) { for (int i = 0; i < num; i++) { try { ChannelFuture f = b.connect(ip, Integer.parseInt(port)).sync(); System.out.println("====" + MessageManager.getMessageManger() + "====" + f.channel()); DeviceConInfo d = new DeviceConInfo(); String strID = String.format("%07d", i + 1 + beginID); String identiCode = "abcdefghij" + strID; d.socketChannel = (SocketChannel) f.channel(); d.identiCode = identiCode; d.onState = BJProtocolConst.CONNECTED; map.put(identiCode, d); MessageSendVehicleRegister messagePack = new MessageSendVehicleRegister( null, 0, d); MessageManager.getMessageManger().addSocketMessage(messagePack); Thread.sleep(5); } catch (InterruptedException e) { // TODO Auto-generated catch block e.printStackTrace(); } } } public static void connectNums() { System.out.println("======client nums:" + map.size() + "====="); } public static int getOnLineDevices() { int nums = 0; for (Map.Entry entry : map.entrySet()) { DeviceConInfo devConInfo = (DeviceConInfo) entry.getValue(); if (BJProtocolConst.LOGINED == devConInfo.onState) { nums++; } } return nums; } public static void diConnect() { for (Map.Entry entry : map.entrySet()) { DeviceConInfo devConInfo = 
(DeviceConInfo) entry.getValue(); if (null != devConInfo.socketChannel) devConInfo.socketChannel.close(); try { devConInfo.socketChannel.closeFuture().sync(); } catch (InterruptedException e) { // TODO Auto-generated catch block e.printStackTrace(); } map.remove(entry.getKey()); } } public static void stop() { diConnect(); group.shutdownGracefully(); } public static DeviceConInfo update(String identiCode, DeviceConInfo deviceConInfo) { return map.put(identiCode, deviceConInfo); } public static DeviceConInfo get(String identiCode) { return map.get(identiCode); } public static void remove(SocketChannel socketChannel) { for (Map.Entry entry : map.entrySet()) { DeviceConInfo devConInfo = (DeviceConInfo) entry.getValue(); if (devConInfo.socketChannel == socketChannel) { map.remove(entry.getKey()); } } } } ``` ``` package org.dht.vehicle.com.socketfactory; import java.util.ArrayList; import java.util.Iterator; import java.util.List; import io.netty.bootstrap.ServerBootstrap; import io.netty.buffer.PooledByteBufAllocator; import io.netty.channel.AdaptiveRecvByteBufAllocator; import io.netty.channel.Channel; import io.netty.channel.ChannelFuture; import io.netty.channel.ChannelHandler; import io.netty.channel.ChannelOption; import io.netty.channel.EventLoopGroup; import io.netty.channel.nio.NioEventLoopGroup; import io.netty.channel.socket.nio.NioServerSocketChannel; import io.netty.handler.logging.LogLevel; import io.netty.handler.logging.LoggingHandler; public class BasicSocketServer implements SocketServer { protected ChannelHandler serverChannel; protected Channel acceptorChannel; protected ServerBootstrap b ; protected EventLoopGroup bossGroup ; protected EventLoopGroup workerGroup ; protected List<Integer> port; protected List<ChannelFuture> channelFuture; public BasicSocketServer(){ this.channelFuture = new ArrayList<ChannelFuture>(); } public void setServerChannel(ChannelHandler serverChannel){ this.serverChannel = serverChannel; } public ChannelHandler getServerChannel(){ return this.serverChannel ; } public void setPort(List<Integer> port){ this.port = port; } public void Start() throws Exception { // TODO Auto-generated method stub try{ createServerBootstrap(); }finally{ Stop(); } } public void Stop() throws Exception { // TODO Auto-generated method stub closeFuture(); bossGroup.shutdownGracefully(); workerGroup.shutdownGracefully(); } public void Restart() throws Exception { // TODO Auto-generated method stub Stop(); Start(); } public void createServerBootstrap() throws Exception{ // TODO Auto-generated method stub try{ b = new ServerBootstrap(); bossGroup = new NioEventLoopGroup(1); workerGroup = new NioEventLoopGroup(); b.group(bossGroup, workerGroup) .channel(NioServerSocketChannel.class) .option(ChannelOption.SO_BACKLOG, 1024) .option(ChannelOption.SO_RCVBUF, 10*1024*1024) .option(ChannelOption.SO_SNDBUF, 1024*1024) .option(ChannelOption.ALLOCATOR, PooledByteBufAllocator.DEFAULT) .childOption(ChannelOption.ALLOCATOR, PooledByteBufAllocator.DEFAULT) .childOption(ChannelOption.SO_KEEPALIVE, true) .handler(new LoggingHandler(LogLevel.INFO)) .childHandler(serverChannel); bindPort(); // Wait until the server socket is closed. closeFuture(); } finally { // Shut down all event loops to terminate all threads. 
bossGroup.shutdownGracefully(); workerGroup.shutdownGracefully(); } } public void bindPort() throws InterruptedException { // TODO Auto-generated method stub Iterator<Integer> iter = port.iterator(); int nPort; while(iter.hasNext()) { nPort = (Integer)iter.next().intValue(); if(nPort>0){ ChannelFuture f = b.bind(nPort).sync(); channelFuture.add(f); } //port.remove(iter.next()); } } /** * �ر����е�ChannelFuture */ public void closeFuture() throws InterruptedException { // TODO Auto-generated method stub Iterator<ChannelFuture> iter = channelFuture.iterator(); ChannelFuture f = null; while(iter.hasNext()) { f=(ChannelFuture)iter.next(); if(null != f){ f.channel().closeFuture().sync(); } //port.remove(iter.next()); } } } ```
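Not a fix, but a way to narrow the question down: when thousands of simulated terminals connect and packets appear to vanish, it helps to know whether the bytes never reach the server socket (client pacing, send buffers, dropped connections) or whether they arrive but BJVehicleDeviceDataDecoder never emits a frame for them (sticky or half packets under load). A small pass-through counting handler, added before and after the decoder in the server pipeline, can separate the two cases; the class name below is made up for illustration:
```java
import io.netty.buffer.ByteBuf;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
import java.util.concurrent.atomic.AtomicLong;

// Add a new instance before the decoder (it will see raw ByteBufs) and another after it
// (it will see decoded messages); compare the two static counters under load.
public class CountingHandler extends ChannelInboundHandlerAdapter {
    public static final AtomicLong RAW_BYTES = new AtomicLong();
    public static final AtomicLong MESSAGES = new AtomicLong();

    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        if (msg instanceof ByteBuf) {
            RAW_BYTES.addAndGet(((ByteBuf) msg).readableBytes());
        } else {
            MESSAGES.incrementAndGet();
        }
        ctx.fireChannelRead(msg); // pass everything through unchanged
    }
}
```
As a side note, SO_BACKLOG is a server-side (ServerBootstrap) option and has no effect on the client Bootstrap shown in the question, so tuning it there will not change the result.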
springboot, netty, redis
springboot 整合netty搭建socket,使用redis缓存,可以正常启动,在不适用缓存时可以交互,使用缓存出异常 ``` java.lang.NoSuchMethodError: io.netty.bootstrap.Bootstrap.channel(Ljava/lang/Class;)Lio/netty/bootstrap/AbstractBootstrap; at io.lettuce.core.AbstractRedisClient.channelType(AbstractRedisClient.java:179) ~[lettuce-core-5.1.4.RELEASE.jar:?] at io.lettuce.core.RedisClient.connectStatefulAsync(RedisClient.java:304) ~[lettuce-core-5.1.4.RELEASE.jar:?] at io.lettuce.core.RedisClient.connectStandaloneAsync(RedisClient.java:271) ~[lettuce-core-5.1.4.RELEASE.jar:?] at io.lettuce.core.RedisClient.connect(RedisClient.java:204) ~[lettuce-core-5.1.4.RELEASE.jar:?] at org.springframework.data.redis.connection.lettuce.StandaloneConnectionProvider.lambda$getConnection$1(StandaloneConnectionProvider.java:113) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE] at java.util.Optional.orElseGet(Optional.java:267) ~[?:1.8.0_201] at org.springframework.data.redis.connection.lettuce.StandaloneConnectionProvider.getConnection(StandaloneConnectionProvider.java:113) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE] at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory$SharedConnection.getNativeConnection(LettuceConnectionFactory.java:1085) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE] at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory$SharedConnection.getConnection(LettuceConnectionFactory.java:1065) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE] at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory.getSharedConnection(LettuceConnectionFactory.java:865) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE] at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory.getConnection(LettuceConnectionFactory.java:340) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE] at org.springframework.data.redis.core.RedisConnectionUtils.doGetConnection(RedisConnectionUtils.java:132) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE] at org.springframework.data.redis.core.RedisConnectionUtils.getConnection(RedisConnectionUtils.java:95) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE] at org.springframework.data.redis.core.RedisConnectionUtils.getConnection(RedisConnectionUtils.java:82) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE] at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:211) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE] at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:184) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE] at org.springframework.data.redis.core.AbstractOperations.execute(AbstractOperations.java:95) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE] at org.springframework.data.redis.core.DefaultValueOperations.get(DefaultValueOperations.java:53) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE] at com.viewhigh.hiot.elec.service.serviceimp.RedisServiceImpl.get(RedisServiceImpl.java:76) ~[classes/:?] at com.viewhigh.hiot.elec.protocol.ProtocolElecService.isCheck(ProtocolElecService.java:177) ~[classes/:?] at com.viewhigh.hiot.elec.protocol.ProtocolElecService.checkMsg(ProtocolElecService.java:161) ~[classes/:?] at com.viewhigh.hiot.elec.server.SocketServerHandler.channelRead(SocketServerHandler.java:89) ~[classes/:?] 
at io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:74) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:138) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at io.netty.channel.DefaultChannelHandlerContext.fireChannelRead(DefaultChannelHandlerContext.java:320) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:154) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:74) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:138) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at io.netty.channel.DefaultChannelHandlerContext.fireChannelRead(DefaultChannelHandlerContext.java:320) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:253) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:74) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:138) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at io.netty.channel.DefaultChannelHandlerContext.fireChannelRead(DefaultChannelHandlerContext.java:320) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:127) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:485) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:452) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:346) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:794) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1] at java.lang.Thread.run(Thread.java:748) [?:1.8.0_201] ``` 可能是个包冲突,但是不确定也没有找到是那个冲突,求助,帮看看
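The top of the stack shows lettuce calling io.netty.bootstrap.Bootstrap.channel(...) with the netty 4.1 signature it was compiled against, while the lower frames run inside netty-all-5.0.0.Alpha1, so this does look like the dependency conflict the question suspects: the 5.0 alpha jar and netty 4.x are both on the classpath and the wrong one wins. A quick way to confirm which jar each disputed class is actually loaded from, in plain Java with no extra libraries, might look like this:
```java
public class NettyClasspathCheck {
    public static void main(String[] args) {
        String[] names = {
                "io.netty.bootstrap.Bootstrap",
                "io.netty.bootstrap.AbstractBootstrap",
                "io.netty.channel.ChannelHandlerInvokerUtil" // this class only exists in the 5.0 alpha line
        };
        for (String name : names) {
            try {
                Class<?> c = Class.forName(name);
                java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
                System.out.println(name + " -> " + (src == null ? "JDK/bootstrap" : src.getLocation()));
            } catch (ClassNotFoundException e) {
                System.out.println(name + " -> not on classpath");
            }
        }
    }
}
```
If Bootstrap turns out to come from netty-all-5.0.0.Alpha1, excluding that artifact from whichever dependency drags it in, and aligning everything on the 4.1.x line that lettuce expects, is the usual fix.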
fabric network environment integrated with java-sdk: grpc connection exception
1. I am using fabric-java-sdk to integrate with a fabric network environment. During startup it keeps reporting this error:
```
23:16:57.825 [main] ERROR org.hyperledger.fabric.sdk.Channel - Channel Channel{id: 1, name: mychannel} Sending proposal with transaction: 5fe505ed0e555ac50cc4773876d8eb3746da951a3ad2f8461b2fa177901e5862 to Peer{ id: 2, name: peer0.org1.example.com, channelName: mychannel, url: grpc://x.x.x.x:7051} failed because of: gRPC failure=Status{code=INTERNAL, description=http2 exception, cause=io.netty.handler.codec.http2.Http2Exception: First received frame was not SETTINGS. Hex dump for first 5 bytes: 1503010002
    at io.netty.handler.codec.http2.Http2Exception.connectionError(Http2Exception.java:85)
    at io.netty.handler.codec.http2.Http2ConnectionHandler$PrefaceDecoder.verifyFirstFrameIsSettings(Http2ConnectionHandler.java:350)
    at io.netty.handler.codec.http2.Http2ConnectionHandler$PrefaceDecoder.decode(Http2ConnectionHandler.java:251)
    at io.netty.handler.codec.http2.Http2ConnectionHandler.decode(Http2ConnectionHandler.java:450)
    at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:502)
    at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:441)
    at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:278)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:359)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:345)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:337)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1408)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:359)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:345)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:930)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:677)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:612)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:529)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:491)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:905)
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    at java.lang.Thread.run(Thread.java:748)
}
java.lang.Exception: io.grpc.StatusRuntimeException: INTERNAL: http2 exception
    at org.hyperledger.fabric.sdk.Channel.sendProposalToPeers(Channel.java:4179)
    at org.hyperledger.fabric.sdk.Channel.getConfigBlock(Channel.java:854)
    at org.hyperledger.fabric.sdk.Channel.parseConfigBlock(Channel.java:1820)
    at org.hyperledger.fabric.sdk.Channel.loadCACertificates(Channel.java:1657)
    at org.hyperledger.fabric.sdk.Channel.initialize(Channel.java:1103)
```
Access to grpc://x.x.x.x:7051 fails. The server is an Alibaba Cloud host. The answers I found online all say this is a grpc communication error and suggest adding the corresponding dependencies, but that did not help. I really cannot work out what is wrong, could someone give me a pointer? The Fabric environment is version 1.0.
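The hex dump in the error is the useful clue: 15 03 01 00 02 is not an HTTP/2 SETTINGS frame at all, it is the start of a TLS record (type 0x15 is an alert, version bytes 0x0301). That usually means the peer at x.x.x.x:7051 is running with TLS enabled while the SDK was pointed at a plaintext grpc:// URL, so switching the peer/orderer URLs to grpcs:// and supplying the peer's TLS CA certificate in the connection properties is the usual remedy rather than adding more dependencies. A tiny sketch that only decodes those five bytes, to show why they read as a TLS alert:
```java
public class FirstBytesDecoder {
    public static void main(String[] args) {
        int[] first = {0x15, 0x03, 0x01, 0x00, 0x02}; // "Hex dump for first 5 bytes: 1503010002"
        System.out.printf("record type : 0x%02x (0x15 = TLS alert, 0x16 would be a TLS handshake)%n", first[0]);
        System.out.printf("version     : %d.%d (0x0301 = TLS record layer)%n", first[1], first[2]);
        System.out.printf("length      : %d bytes%n", (first[3] << 8) | first[4]);
        // A plaintext HTTP/2 connection would instead have to start with the client preface
        // and a SETTINGS frame, which is exactly what netty's Http2ConnectionHandler complained about.
    }
}
```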
redis error, has anyone run into this?
2018.11.09 at 17:51:45.533 CST [lettuce-nioEventLoop-3-4] WARN io.netty.util.internal.logging.Slf4JLogger 141 warn - [/192.168.1.103:55005 -> /192.168.1.101:6379] Unexpected exception during request: java.lang.NullPointerException java.lang.NullPointerException: null at com.lambdaworks.redis.protocol.RedisStateMachine.safeSet(RedisStateMachine.java:195) ~[lettuce-3.5.0.Final.jar:?] at com.lambdaworks.redis.protocol.RedisStateMachine.decode(RedisStateMachine.java:161) ~[lettuce-3.5.0.Final.jar:?] at com.lambdaworks.redis.protocol.RedisStateMachine.decode(RedisStateMachine.java:61) ~[lettuce-3.5.0.Final.jar:?] at com.lambdaworks.redis.pubsub.PubSubCommandHandler.decode(PubSubCommandHandler.java:60) ~[lettuce-3.5.0.Final.jar:?] at com.lambdaworks.redis.protocol.CommandHandler.channelRead(CommandHandler.java:153) ~[lettuce-3.5.0.Final.jar:?] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:333) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:319) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:787) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:130) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116) [netty-all-4.0.24.Final.jar:4.0.24.Final] at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137) [netty-all-4.0.24.Final.jar:4.0.24.Final] at java.lang.Thread.run(Thread.java:722) 
[?:1.7.0_17]
storm keeps logging: connection attempt to Netty-Client-ubuntu1 failed
When I submit a topology to storm, the corresponding logs on both supervisors show the output below. How can I fix this?
```
connection attempt 367 to Netty-Client-ubuntu1:6704 failed: java.nio.channels.UnresolvedAddressException
b.s.m.n.StormClientHandler [INFO] Connection failed Netty-Client-ubuntu1:6704
java.nio.channels.UnresolvedAddressException
    at sun.nio.ch.Net.checkAddress(Net.java:101) ~[?:1.8.0_121]
    at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:622) ~[?:1.8.0_121]
    at org.apache.storm.shade.org.jboss.netty.channel.socket.nio.NioClientSocketPipelineSink.connect(NioClientSocketPipelineSink.java:108) [storm-core-0.10.2.jar:0.10.2]
    at org.apache.storm.shade.org.jboss.netty.channel.socket.nio.NioClientSocketPipelineSink.eventSunk(NioClientSocketPipelineSink.java:70) [storm-core-0.10.2.jar:0.10.2]
    at org.apache.storm.shade.org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendDownstream(DefaultChannelPipeline.java:779) [storm-core-0.10.2.jar:0.10.2]
    at org.apache.storm.shade.org.jboss.netty.handler.codec.oneone.OneToOneEncoder.handleDownstream(OneToOneEncoder.java:54) [storm-core-0.10.2.jar:0.10.2]
    at org.apache.storm.shade.org.jboss.netty.channel.DefaultChannelPipeline.sendDownstream(DefaultChannelPipeline.java:591) [storm-core-0.10.2.jar:0.10.2]
    at org.apache.storm.shade.org.jboss.netty.channel.DefaultChannelPipeline.sendDownstream(DefaultChannelPipeline.java:582) [storm-core-0.10.2.jar:0.10.2]
    at org.apache.storm.shade.org.jboss.netty.channel.Channels.connect(Channels.java:634) [storm-core-0.10.2.jar:0.10.2]
    at org.apache.storm.shade.org.jboss.netty.channel.AbstractChannel.connect(AbstractChannel.java:207) [storm-core-0.10.2.jar:0.10.2]
    at org.apache.storm.shade.org.jboss.netty.bootstrap.ClientBootstrap.connect(ClientBootstrap.java:229) [storm-core-0.10.2.jar:0.10.2]
    at org.apache.storm.shade.org.jboss.netty.bootstrap.ClientBootstrap.connect(ClientBootstrap.java:182) [storm-core-0.10.2.jar:0.10.2]
    at backtype.storm.messaging.netty.Client$Connect.run(Client.java:518) [storm-core-0.10.2.jar:0.10.2]
    at org.apache.storm.shade.org.jboss.netty.util.HashedWheelTimer$HashedWheelTimeout.expire(HashedWheelTimer.java:546) [storm-core-0.10.2.jar:0.10.2]
    at org.apache.storm.shade.org.jboss.netty.util.HashedWheelTimer$Worker.notifyExpiredTimeouts(HashedWheelTimer.java:446) [storm-core-0.10.2.jar:0.10.2]
    at org.apache.storm.shade.org.jboss.netty.util.HashedWheelTimer$Worker.run(HashedWheelTimer.java:395) [storm-core-0.10.2.jar:0.10.2]
    at org.apache.storm.shade.org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) [storm-core-0.10.2.jar:0.10.2]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_121]
```
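java.nio.channels.UnresolvedAddressException means the worker host simply cannot resolve the hostname ubuntu1 that the other supervisor advertises for port 6704, so the usual fix is to add every cluster node's hostname to /etc/hosts (or DNS) on every node, or to set storm.local.hostname to a name or IP that all nodes can resolve. A small check, runnable on each supervisor, to confirm whether the name resolves (the hostname is taken from the log; any other node name can be passed as an argument):
```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class ResolveCheck {
    public static void main(String[] args) {
        String host = args.length > 0 ? args[0] : "ubuntu1"; // the name from the storm log
        try {
            InetAddress addr = InetAddress.getByName(host);
            System.out.println(host + " resolves to " + addr.getHostAddress());
        } catch (UnknownHostException e) {
            System.out.println(host + " does not resolve on this machine; add it to /etc/hosts or DNS");
        }
    }
}
```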
Problems integrating zipkin2 with Elasticsearch 7.0.1
![图片说明](https://img-ask.csdn.net/upload/201905/20/1558343865_955644.png)![图片说明](https://img-ask.csdn.net/upload/201905/20/1558343873_621189.png) 2019-05-20 17:03:03.673 WARN 8144 --- [cking-tasks-1-2] c.l.a.i.a.DefaultExceptionHandler : [id: 0x6bf14625, L:/192.168.5.146:9411 - R:/192.168.5.146:52483][h1c://sc-201904251550:9411/zipkin/api/v2/spans#GET] Unhandled exception from an annotated service: java.lang.IllegalStateException: response for update-template failed: {"error":{"root_cause":[{"type":"illegal_argument_exception","reason":"Setting index.mapper.dynamic was removed after version 6.0.0"}],"type":"illegal_argument_exception","reason":"Setting index.mapper.dynamic was removed after version 6.0.0"},"status":400} at zipkin2.elasticsearch.internal.client.HttpCall.parseResponse(HttpCall.java:151) ~[zipkin-storage-elasticsearch-2.12.9.jar!/:?] at zipkin2.elasticsearch.internal.client.HttpCall.execute(HttpCall.java:79) ~[zipkin-storage-elasticsearch-2.12.9.jar!/:?] at zipkin2.elasticsearch.EnsureIndexTemplate.apply(EnsureIndexTemplate.java:42) ~[zipkin-storage-elasticsearch-2.12.9.jar!/:?] at zipkin2.elasticsearch.ElasticsearchStorage.ensureIndexTemplates(ElasticsearchStorage.java:351) ~[zipkin-storage-elasticsearch-2.12.9.jar!/:?] at zipkin2.elasticsearch.AutoValue_ElasticsearchStorage.ensureIndexTemplates(AutoValue_ElasticsearchStorage.java:31) ~[zipkin-storage-elasticsearch-2.12.9.jar!/:?] at zipkin2.elasticsearch.ElasticsearchStorage.spanStore(ElasticsearchStorage.java:248) ~[zipkin-storage-elasticsearch-2.12.9.jar!/:?] at zipkin2.server.internal.ZipkinQueryApiV2.getSpanNames(ZipkinQueryApiV2.java:86) ~[classes!/:?] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_211] at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:1.8.0_211] at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:1.8.0_211] at java.lang.reflect.Method.invoke(Unknown Source) ~[?:1.8.0_211] at com.linecorp.armeria.internal.annotation.AnnotatedHttpService.invoke(AnnotatedHttpService.java:254) ~[armeria-0.83.0.jar!/:?] at com.linecorp.armeria.internal.annotation.AnnotatedHttpService.lambda$serve0$3(AnnotatedHttpService.java:241) ~[armeria-0.83.0.jar!/:?] at java.util.concurrent.CompletableFuture.uniApply(Unknown Source) ~[?:1.8.0_211] at java.util.concurrent.CompletableFuture$UniApply.tryFire(Unknown Source) ~[?:1.8.0_211] at java.util.concurrent.CompletableFuture$Completion.run(Unknown Source) ~[?:1.8.0_211] at com.linecorp.armeria.common.AbstractRequestContext.lambda$makeContextAware$1(AbstractRequestContext.java:69) ~[armeria-0.83.0.jar!/:?] at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) [?:1.8.0_211] at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [?:1.8.0_211] at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [netty-common-4.1.34.Final.jar!/:4.1.34.Final] at java.lang.Thread.run(Unknown Source) [?:1.8.0_211]
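The Elasticsearch response "Setting index.mapper.dynamic was removed after version 6.0.0" says the index template that zipkin 2.12.9 pushes still uses a 6.x-only setting, so that zipkin build cannot set up its indexes on Elasticsearch 7.0.1; moving to a newer zipkin release with ES 7 support, or pointing this one at a 6.x cluster, is the usual way out. A trivial check of what version the cluster actually reports about itself (the URL below assumes the default port and should be adjusted):
```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class EsVersionCheck {
    public static void main(String[] args) throws Exception {
        // the root endpoint returns a JSON document containing "version": {"number": "..."}
        URL url = new URL(args.length > 0 ? args[0] : "http://localhost:9200/");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);
            }
        } finally {
            conn.disconnect();
        }
    }
}
```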
netty development question: the client sends a request but receives no response
package org.netty_client; import io.netty.bootstrap.Bootstrap; import io.netty.channel.ChannelFuture; import io.netty.channel.ChannelInitializer; import io.netty.channel.ChannelOption; import io.netty.channel.EventLoopGroup; import io.netty.channel.nio.NioEventLoopGroup; import io.netty.channel.socket.SocketChannel; import io.netty.channel.socket.nio.NioSocketChannel; public class Client { public void connect(String host,int port) { EventLoopGroup group = new NioEventLoopGroup(); try { Bootstrap strap = new Bootstrap(); strap.group(group); strap.channel(NioSocketChannel.class); strap.option(ChannelOption.TCP_NODELAY, true); strap.handler(new ChannelInitializer<SocketChannel>() { @Override protected void initChannel(SocketChannel arg0) throws Exception { arg0.pipeline().addLast(new ClientHandler()); } }); ChannelFuture future = strap.connect(host, port).sync(); future.channel().closeFuture().sync(); System.out.println("close"); } catch (InterruptedException e) { e.printStackTrace(); }finally { group.shutdownGracefully(); } } public static void main(String[] args) { new Client().connect("127.0.0.1", 8080); } } package org.netty_client; import java.io.UnsupportedEncodingException; import io.netty.buffer.ByteBuf; import io.netty.buffer.Unpooled; import io.netty.channel.ChannelHandlerAdapter; import io.netty.channel.ChannelHandlerContext; public class ClientHandler extends ChannelHandlerAdapter{ private ByteBuf buf; public ClientHandler() { String r = "first request"; byte[] req = r.getBytes(); buf = Unpooled.buffer(req.length); buf.writeBytes(req); System.out.println("send request:" + r); } public void channelActive(ChannelHandlerContext ctx) { ctx.writeAndFlush(buf); } public void channelkRead(ChannelHandlerContext ctx,Object msg) throws UnsupportedEncodingException { System.out.println(1); ByteBuf buf = (ByteBuf)msg; byte[] req = new byte[buf.readableBytes()]; buf.readBytes(req); String body = new String(req,"UTF-8"); System.out.println("receive response:" + body); } public void exceptionCaught(ChannelHandlerContext ctx,Throwable e) { e.printStackTrace(); ctx.close(); } } package org.netty_server; import io.netty.bootstrap.ServerBootstrap; import io.netty.channel.ChannelFuture; import io.netty.channel.ChannelInitializer; import io.netty.channel.ChannelOption; import io.netty.channel.EventLoopGroup; import io.netty.channel.nio.NioEventLoopGroup; import io.netty.channel.socket.SocketChannel; import io.netty.channel.socket.nio.NioServerSocketChannel; public class Server { public void bind(int port) { EventLoopGroup bossGroup = new NioEventLoopGroup(); EventLoopGroup workerGroup = new NioEventLoopGroup(); try { ServerBootstrap server = new ServerBootstrap(); server.group(bossGroup,workerGroup); server.channel(NioServerSocketChannel.class); server.option(ChannelOption.SO_BACKLOG, 1024); server.childHandler(new ChannelInitializer<SocketChannel>(){ @Override protected void initChannel(SocketChannel arg0) throws Exception { arg0.pipeline().addLast(new ServerHandler()); } }); ChannelFuture future = server.bind(port).sync(); System.out.println("启动服务端口:" + port); future.channel().closeFuture().sync(); System.out.println("close"); } catch (InterruptedException e) { e.printStackTrace(); }finally { bossGroup.shutdownGracefully(); workerGroup.shutdownGracefully(); } } public static void main(String[] args) { new Server().bind(8080); } } package org.netty_server; import java.io.UnsupportedEncodingException; import io.netty.buffer.ByteBuf; import io.netty.buffer.Unpooled; import 
io.netty.channel.ChannelHandlerAdapter; import io.netty.channel.ChannelHandlerContext; public class ServerHandler extends ChannelHandlerAdapter{ @Override public void channelRead(ChannelHandlerContext ctx,Object msg) throws UnsupportedEncodingException { ByteBuf buf = (ByteBuf)msg; byte[] req = new byte[buf.readableBytes()]; buf.readBytes(req); String body = new String(req,"UTF-8"); System.out.println("receive request:" + body); String r = "response"; byte[] re = r.getBytes(); ByteBuf resp = Unpooled.copiedBuffer(re); // resp.readBytes(re); System.out.println("send response:" + r); ctx.write(resp); } @Override public void channelReadComplete(ChannelHandlerContext ctx) { ctx.flush(); } public void exceptionCaught(ChannelHandlerContext ctx,Throwable e) { e.printStackTrace(); ctx.close(); } } 测试结果: send request:first request 启动服务端口:8080 receive request:first request send response:response 但是客户端没收到响应,有朋友能看下么,只是写个简单的例子而已
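Judging by the pasted output, the server prints both "receive request" and "send response" while the client prints nothing, so the client's read callback is simply never invoked: the handler defines channelkRead, which does not override any netty callback, and the response is silently discarded. A corrected ClientHandler sketch is below; it keeps the ChannelHandlerAdapter base class used in the question (on stock netty 4.x the inbound base class would normally be ChannelInboundHandlerAdapter) and releases the inbound buffer once it has been read:
```java
import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;
import io.netty.channel.ChannelHandlerAdapter;
import io.netty.channel.ChannelHandlerContext;
import io.netty.util.ReferenceCountUtil;
import java.nio.charset.StandardCharsets;

public class ClientHandler extends ChannelHandlerAdapter {

    @Override
    public void channelActive(ChannelHandlerContext ctx) {
        ctx.writeAndFlush(Unpooled.copiedBuffer("first request", StandardCharsets.UTF_8));
    }

    // must be named channelRead (not channelkRead), otherwise netty never calls it
    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        ByteBuf buf = (ByteBuf) msg;
        try {
            byte[] resp = new byte[buf.readableBytes()];
            buf.readBytes(resp);
            System.out.println("receive response: " + new String(resp, StandardCharsets.UTF_8));
        } finally {
            ReferenceCountUtil.release(msg); // free the inbound buffer after reading it
        }
    }

    @Override
    public void exceptionCaught(ChannelHandlerContext ctx, Throwable e) {
        e.printStackTrace();
        ctx.close();
    }
}
```
Annotating handlers with @Override, as the server handler already does, is what catches this class of typo at compile time.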
springboot 2.x: adding the spring-boot-starter-data-redis dependency makes startup fail
springboot2.x引入spring-boot-starter-data-redis依赖,启动报错,redis框架用的jedis就没问题,2.x默认用的是lettuce却出现了以下错误,为什么会这样,这个问题折腾了两三天了,快点来个大神拯救我吧 pom: ![图片说明](https://img-ask.csdn.net/upload/201910/10/1570679165_510992.png) 报错: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'application': Unsatisfied dependency expressed through field 'stringredistemplate'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'stringRedisTemplate' defined in class path resource [org/springframework/boot/autoconfigure/data/redis/RedisAutoConfiguration.class]: Unsatisfied dependency expressed through method 'stringRedisTemplate' parameter 0; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'redisConnectionFactory' defined in class path resource [org/springframework/boot/autoconfigure/data/redis/LettuceConnectionConfiguration.class]: Unsatisfied dependency expressed through method 'redisConnectionFactory' parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'lettuceClientResources' defined in class path resource [org/springframework/boot/autoconfigure/data/redis/LettuceConnectionConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [io.lettuce.core.resource.DefaultClientResources]: Factory method 'lettuceClientResources' threw exception; nested exception is java.lang.NoClassDefFoundError: io/netty/util/internal/logging/InternalLoggerFactory at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:596) at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:90) at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessProperties(AutowiredAnnotationBeanPostProcessor.java:374) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1395) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:592) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515) at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320) at org.springframework.beans.factory.support.AbstractBeanFactory$$Lambda$112/464064894.getObject(Unknown Source) at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:849) at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:877) at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:549) at 
org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:142) at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:775) at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:397) at org.springframework.boot.SpringApplication.run(SpringApplication.java:316) at org.springframework.boot.SpringApplication.run(SpringApplication.java:1260) at org.springframework.boot.SpringApplication.run(SpringApplication.java:1248) at com.xxx.demo.Application.main(Application.java:50) Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'stringRedisTemplate' defined in class path resource [org/springframework/boot/autoconfigure/data/redis/RedisAutoConfiguration.class]: Unsatisfied dependency expressed through method 'stringRedisTemplate' parameter 0; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'redisConnectionFactory' defined in class path resource [org/springframework/boot/autoconfigure/data/redis/LettuceConnectionConfiguration.class]: Unsatisfied dependency expressed through method 'redisConnectionFactory' parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'lettuceClientResources' defined in class path resource [org/springframework/boot/autoconfigure/data/redis/LettuceConnectionConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [io.lettuce.core.resource.DefaultClientResources]: Factory method 'lettuceClientResources' threw exception; nested exception is java.lang.NoClassDefFoundError: io/netty/util/internal/logging/InternalLoggerFactory at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:769) at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:509) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1305) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1144) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:555) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515) at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320) at org.springframework.beans.factory.support.AbstractBeanFactory$$Lambda$117/509891820.getObject(Unknown Source) at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:277) at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1247) at 
org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1167) at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:593) ... 20 common frames omitted Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'redisConnectionFactory' defined in class path resource [org/springframework/boot/autoconfigure/data/redis/LettuceConnectionConfiguration.class]: Unsatisfied dependency expressed through method 'redisConnectionFactory' parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'lettuceClientResources' defined in class path resource [org/springframework/boot/autoconfigure/data/redis/LettuceConnectionConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [io.lettuce.core.resource.DefaultClientResources]: Factory method 'lettuceClientResources' threw exception; nested exception is java.lang.NoClassDefFoundError: io/netty/util/internal/logging/InternalLoggerFactory at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:769) at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:509) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1305) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1144) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:555) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515) at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320) at org.springframework.beans.factory.support.AbstractBeanFactory$$Lambda$117/509891820.getObject(Unknown Source) at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:277) at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1247) at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1167) at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:857) at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:760) ... 
34 common frames omitted Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'lettuceClientResources' defined in class path resource [org/springframework/boot/autoconfigure/data/redis/LettuceConnectionConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [io.lettuce.core.resource.DefaultClientResources]: Factory method 'lettuceClientResources' threw exception; nested exception is java.lang.NoClassDefFoundError: io/netty/util/internal/logging/InternalLoggerFactory at org.springframework.beans.factory.support.ConstructorResolver.instantiate(ConstructorResolver.java:627) at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:456) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1305) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1144) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:555) at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515) at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320) at org.springframework.beans.factory.support.AbstractBeanFactory$$Lambda$117/509891820.getObject(Unknown Source) at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222) at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318) at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199) at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:277) at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1247) at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1167) at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:857) at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:760) ... 49 common frames omitted Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [io.lettuce.core.resource.DefaultClientResources]: Factory method 'lettuceClientResources' threw exception; nested exception is java.lang.NoClassDefFoundError: io/netty/util/internal/logging/InternalLoggerFactory at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:185) at org.springframework.beans.factory.support.ConstructorResolver.instantiate(ConstructorResolver.java:622) ... 
64 common frames omitted Caused by: java.lang.NoClassDefFoundError: io/netty/util/internal/logging/InternalLoggerFactory at io.lettuce.core.resource.DefaultClientResources.<clinit>(DefaultClientResources.java:74) at org.springframework.boot.autoconfigure.data.redis.LettuceConnectionConfiguration.lettuceClientResources(LettuceConnectionConfiguration.java:67) at org.springframework.boot.autoconfigure.data.redis.LettuceConnectionConfiguration$$EnhancerBySpringCGLIB$$6bcc0637.CGLIB$lettuceClientResources$1(<generated>) at org.springframework.boot.autoconfigure.data.redis.LettuceConnectionConfiguration$$EnhancerBySpringCGLIB$$6bcc0637$$FastClassBySpringCGLIB$$ef2b1aca.invoke(<generated>) at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:244) at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:363) at org.springframework.boot.autoconfigure.data.redis.LettuceConnectionConfiguration$$EnhancerBySpringCGLIB$$6bcc0637.lettuceClientResources(<generated>) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.lang.reflect.Method.invoke(Unknown Source) at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154) ... 65 common frames omitted Caused by: java.lang.ClassNotFoundException: io.netty.util.internal.logging.InternalLoggerFactory at java.net.URLClassLoader$1.run(Unknown Source) at java.net.URLClassLoader$1.run(Unknown Source) at java.security.AccessController.doPrivileged(Native Method) at java.net.URLClassLoader.findClass(Unknown Source) at java.lang.ClassLoader.loadClass(Unknown Source) at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source) at java.lang.ClassLoader.loadClass(Unknown Source) ... 77 common frames omitted
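The root cause at the bottom is ClassNotFoundException: io.netty.util.internal.logging.InternalLoggerFactory, i.e. lettuce's DefaultClientResources needs netty-common (and the related netty 4.1 modules) at runtime and that class is not visible on the classpath, or an old or partial netty is shadowing it; jedis works because it does not use netty at all. Before fighting the pom, a quick way to see whether the class is visible and which netty artifacts are actually on the classpath might look like this (plain Java, no extra dependencies):
```java
public class LettuceNettyCheck {
    public static void main(String[] args) {
        try {
            // 1. Is the class lettuce needs visible at all, and from which jar?
            Class<?> c = Class.forName("io.netty.util.internal.logging.InternalLoggerFactory");
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            System.out.println("InternalLoggerFactory loaded from "
                    + (src == null ? "JDK/bootstrap" : src.getLocation()));
            // 2. Which netty artifacts and versions does netty itself report?
            io.netty.util.Version.identify().forEach((artifact, version) ->
                    System.out.println(artifact + " -> " + version.artifactVersion()));
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            System.out.println("netty-common is missing from the runtime classpath: " + e);
        }
    }
}
```
If nothing shows up, adding io.netty:netty-common (or, more simply, letting the Spring Boot dependency management bring in the full aligned netty set that lettuce needs, without manual exclusions or an old netty pulled in by another dependency) usually resolves this startup failure.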