Using NIO in a Spring + Netty project

I'm working on a Spring MVC + Netty project. The requirement: when a request reaches the handler method, it should immediately return a success response while another thread processes the backend business logic; by the time that background work runs, the session has already returned normally.

```
@RequestMapping(value = "/static/o_index.do", method = {RequestMethod.POST, RequestMethod.GET})
public void indexSubmit(HttpServletRequest request,
        HttpServletResponse response, ModelMap model, String channelId, String _site_id_param) throws JSONException {
    JSONObject jsonobj = new JSONObject();
    jsonobj.append("messageID", request.getParameter("messageID"));
    jsonobj.append("serviceID", request.getParameter("serviceID"));
    jsonobj.append("respStatus", "1");
    jsonobj.append("bizCode", "");
    jsonobj.append("bizDesc", "首页静态化");
    // write the success response back to the client
    ResponseUtils.renderJson(response, jsonobj.toString());
    // then kick off the background work
    runnableStatic.start();
}
```

That is my code. runnableStatic is a controller that extends Thread (nothing calls it from outside); the idea is to start the thread after the success response has been returned. But this feels wrong to me, because the method has already returned before the thread is created. If I want another thread to do the background work after the response has been sent, how should I do it?
Many thanks!!

1 Answer

You can add debug logging inside the worker thread to check whether it actually runs.
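
To the asker's actual question (send the response first, then do the work on another thread): the usual approach is to hand the job to a shared thread pool instead of subclassing Thread inside a controller; the response is committed once the handler writes and flushes it, so work submitted afterwards runs in the background. A minimal sketch; `IndexController` and `generateIndexPages()` are hypothetical stand-ins, not the original code:

```
import java.io.IOException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.json.JSONException;
import org.json.JSONObject;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;

@Controller
public class IndexController {

    // One shared pool for background jobs; avoid spawning a raw Thread per request.
    private static final ExecutorService BACKGROUND = Executors.newFixedThreadPool(4);

    @RequestMapping(value = "/static/o_index.do", method = {RequestMethod.POST, RequestMethod.GET})
    public void indexSubmit(HttpServletRequest request, HttpServletResponse response)
            throws JSONException, IOException {
        JSONObject jsonobj = new JSONObject();
        jsonobj.put("respStatus", "1");
        // Write and flush the success response first; the client gets it immediately.
        response.setContentType("application/json;charset=UTF-8");
        response.getWriter().print(jsonobj.toString());
        response.getWriter().flush();
        // Then queue the slow work; it runs on a pool thread after this method returns.
        BACKGROUND.submit(this::generateIndexPages);
    }

    private void generateIndexPages() {
        // hypothetical placeholder for whatever runnableStatic does (page staticization)
    }
}
```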

Other related questions

Spring Boot, Netty, Redis

Spring Boot integrated with Netty to build a socket server, with Redis as a cache. The application starts normally and exchanges data fine when the cache is not used, but throws an exception as soon as the cache is used:

```
java.lang.NoSuchMethodError: io.netty.bootstrap.Bootstrap.channel(Ljava/lang/Class;)Lio/netty/bootstrap/AbstractBootstrap;
at io.lettuce.core.AbstractRedisClient.channelType(AbstractRedisClient.java:179) ~[lettuce-core-5.1.4.RELEASE.jar:?]
at io.lettuce.core.RedisClient.connectStatefulAsync(RedisClient.java:304) ~[lettuce-core-5.1.4.RELEASE.jar:?]
at io.lettuce.core.RedisClient.connectStandaloneAsync(RedisClient.java:271) ~[lettuce-core-5.1.4.RELEASE.jar:?]
at io.lettuce.core.RedisClient.connect(RedisClient.java:204) ~[lettuce-core-5.1.4.RELEASE.jar:?]
at org.springframework.data.redis.connection.lettuce.StandaloneConnectionProvider.lambda$getConnection$1(StandaloneConnectionProvider.java:113) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at java.util.Optional.orElseGet(Optional.java:267) ~[?:1.8.0_201]
at org.springframework.data.redis.connection.lettuce.StandaloneConnectionProvider.getConnection(StandaloneConnectionProvider.java:113) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory$SharedConnection.getNativeConnection(LettuceConnectionFactory.java:1085) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory$SharedConnection.getConnection(LettuceConnectionFactory.java:1065) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory.getSharedConnection(LettuceConnectionFactory.java:865) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory.getConnection(LettuceConnectionFactory.java:340) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.springframework.data.redis.core.RedisConnectionUtils.doGetConnection(RedisConnectionUtils.java:132) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.springframework.data.redis.core.RedisConnectionUtils.getConnection(RedisConnectionUtils.java:95) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.springframework.data.redis.core.RedisConnectionUtils.getConnection(RedisConnectionUtils.java:82) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:211) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.springframework.data.redis.core.RedisTemplate.execute(RedisTemplate.java:184) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.springframework.data.redis.core.AbstractOperations.execute(AbstractOperations.java:95) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at org.springframework.data.redis.core.DefaultValueOperations.get(DefaultValueOperations.java:53) ~[spring-data-redis-2.1.5.RELEASE.jar:2.1.5.RELEASE]
at com.viewhigh.hiot.elec.service.serviceimp.RedisServiceImpl.get(RedisServiceImpl.java:76) ~[classes/:?]
at com.viewhigh.hiot.elec.protocol.ProtocolElecService.isCheck(ProtocolElecService.java:177) ~[classes/:?]
at com.viewhigh.hiot.elec.protocol.ProtocolElecService.checkMsg(ProtocolElecService.java:161) ~[classes/:?]
at com.viewhigh.hiot.elec.server.SocketServerHandler.channelRead(SocketServerHandler.java:89) ~[classes/:?]
at io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:74) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:138) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at io.netty.channel.DefaultChannelHandlerContext.fireChannelRead(DefaultChannelHandlerContext.java:320) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:154) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:74) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:138) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at io.netty.channel.DefaultChannelHandlerContext.fireChannelRead(DefaultChannelHandlerContext.java:320) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:253) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at io.netty.channel.ChannelHandlerInvokerUtil.invokeChannelReadNow(ChannelHandlerInvokerUtil.java:74) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at io.netty.channel.DefaultChannelHandlerInvoker.invokeChannelRead(DefaultChannelHandlerInvoker.java:138) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at io.netty.channel.DefaultChannelHandlerContext.fireChannelRead(DefaultChannelHandlerContext.java:320) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:127) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:485) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:452) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:346) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:794) [netty-all-5.0.0.Alpha1.jar:5.0.0.Alpha1]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_201]
```

It's probably a dependency conflict, but I'm not sure and haven't been able to find which one. Please help take a look.
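
For what it's worth, the frames tagged `netty-all-5.0.0.Alpha1.jar` suggest the abandoned Netty 5 alpha is on the runtime classpath, while lettuce-core 5.1.4 is built against Netty 4.1.x, whose `Bootstrap.channel(...)` has the signature the error says is missing. A hedged sketch of the usual remedy: locate the jar with `mvn dependency:tree` and align on a 4.1.x line (the coordinates below are illustrative, not taken from the poster's build):

```
<!-- Sketch: drop netty-all 5.0.0.Alpha1 and align on Netty 4.1.x, which
     lettuce-core 5.1.x expects. Adjust the version to match your Lettuce. -->
<dependency>
    <groupId>io.netty</groupId>
    <artifactId>netty-all</artifactId>
    <version>4.1.33.Final</version>
</dependency>
```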

Where should the Netty framework fit in? Ideally a demo to reference, or at least an approach

I know the SSM stack well and could build this with plain WebSocket, but the server uses wss for communication, which apparently also requires Netty. Where should Netty fit in? Ideally someone could provide a demo to reference, or at least an approach.
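
Not an authoritative answer, but one common place Netty "fits in": terminate TLS inside the Netty pipeline and let `WebSocketServerProtocolHandler` perform the upgrade, so the same server speaks wss directly. A minimal sketch against Netty 4.1.x; the port, certificate files, path, and echo handler are placeholders:

```
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.*;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioServerSocketChannel;
import io.netty.handler.codec.http.HttpObjectAggregator;
import io.netty.handler.codec.http.HttpServerCodec;
import io.netty.handler.codec.http.websocketx.TextWebSocketFrame;
import io.netty.handler.codec.http.websocketx.WebSocketServerProtocolHandler;
import io.netty.handler.ssl.SslContext;
import io.netty.handler.ssl.SslContextBuilder;
import java.io.File;

public class WssServer {
    public static void main(String[] args) throws Exception {
        // wss is just WebSocket over TLS, so the SSL handler comes first.
        SslContext sslCtx = SslContextBuilder
                .forServer(new File("server.crt"), new File("server.key")) // placeholder cert/key
                .build();
        EventLoopGroup boss = new NioEventLoopGroup(1);
        EventLoopGroup worker = new NioEventLoopGroup();
        try {
            ServerBootstrap b = new ServerBootstrap();
            b.group(boss, worker)
             .channel(NioServerSocketChannel.class)
             .childHandler(new ChannelInitializer<SocketChannel>() {
                 @Override
                 protected void initChannel(SocketChannel ch) {
                     ch.pipeline()
                       .addLast(sslCtx.newHandler(ch.alloc()))             // TLS
                       .addLast(new HttpServerCodec())                     // HTTP for the handshake
                       .addLast(new HttpObjectAggregator(65536))
                       .addLast(new WebSocketServerProtocolHandler("/ws")) // upgrades to WebSocket
                       .addLast(new SimpleChannelInboundHandler<TextWebSocketFrame>() {
                           @Override
                           protected void channelRead0(ChannelHandlerContext ctx, TextWebSocketFrame frame) {
                               // placeholder business logic: echo the text frame back
                               ctx.writeAndFlush(new TextWebSocketFrame("echo: " + frame.text()));
                           }
                       });
                 }
             });
            b.bind(8443).sync().channel().closeFuture().sync();
        } finally {
            boss.shutdownGracefully();
            worker.shutdownGracefully();
        }
    }
}
```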

Netty heartbeat mechanism and the device online-status problem

My server is built on Netty and the app is the client. I mark whether a device is online with a flag: 1 for online, 0 for offline. When the app's network flaps, it reconnects to me and the previous connection becomes useless. Netty's heartbeat check watches each connection for five seconds: if nothing is read in that window, it fires an event that removes the connection and sets my online flag to offline. The problem is that this flag interferes with the new connection: if, after a network flap, the new connection arrives within those five seconds, the old connection's teardown happens after the new one registered and clobbers the flag. Is there a good way to restructure this?
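
One common restructuring (a sketch, not the accepted answer): key the online state by the connection that owns it, so a stale connection's teardown cannot clobber a newer one. Map deviceId to the current Channel and clear it with an atomic remove-if-still-owner; the class and method names below are hypothetical:

```
import io.netty.channel.Channel;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public final class OnlineRegistry {
    // deviceId -> the channel that currently owns the "online" state
    private static final ConcurrentMap<String, Channel> ONLINE = new ConcurrentHashMap<>();

    /** New connection (or reconnect): the newest channel always wins. */
    public static void register(String deviceId, Channel ch) {
        Channel old = ONLINE.put(deviceId, ch);
        if (old != null && old != ch) {
            old.close(); // retire the stale connection explicitly
        }
    }

    /** Idle timeout / close of some channel: only mark offline if it is still the owner. */
    public static void unregister(String deviceId, Channel ch) {
        // atomic remove-if-equals: a reconnect that registered first is left untouched
        ONLINE.remove(deviceId, ch);
    }

    public static boolean isOnline(String deviceId) {
        return ONLINE.containsKey(deviceId);
    }
}
```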

Service call errors out: how do I fix this? The code hasn't changed; the error only started after using a VPN

Using the Nacos registry. Authentication works when it goes directly to the auth service: ![screenshot](https://img-ask.csdn.net/upload/202002/17/1581936651_914061.jpg) but requests through the gateway fail: ![screenshot](https://img-ask.csdn.net/upload/202002/17/1581936753_419316.jpg) Log:
```
io.netty.channel.AbstractChannel$AnnotatedSocketException: Permission denied: no further information: /10.242.225.80:3000
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:327)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:340)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:670)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:617)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:534)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:906)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.SocketException: Permission denied: no further information
... 11 common frames omitted
```

How do I set a client connect timeout in Netty? Hoping a passing expert will leave some valuable advice!

1. I want the Netty client to have a connect timeout: if it cannot connect to the server within the allotted time, treat it as a network failure and skip it. 2. I tried bootstrap.option(ChannelOption.CONNECT_TIMEOUT_MILLIS, 5000); but that code did not seem to take effect. The code:
```
// worker thread pool group
EventLoopGroup eventLoopGroup = new NioEventLoopGroup();
Bootstrap bootstrap = new Bootstrap();
bootstrap.group(eventLoopGroup) // set the thread pool
        .channel(NioSocketChannel.class) // set the socket factory
        // .option(ChannelOption.TCP_NODELAY, true)
        .option(ChannelOption.CONNECT_TIMEOUT_MILLIS, 9000) // set the timeout
        .handler(new ChannelInitializer<SocketChannel>() {
            @Override
            protected void initChannel(SocketChannel socketChannel) throws Exception {
                ChannelPipeline pipeline = socketChannel.pipeline();
                // handler
                pipeline.addLast(new EchoClientHandler());
            }
        });
try {
    // initiate the connect
    ChannelFuture future = bootstrap.connect(host, port);
    // retry up to 3 times if the connect fails
    i++;
    if (i <= 3) {
        // retry on startup connect timeout
        future.addListener(new ConnectionListener(i));
    }
    // wait until the client channel is closed
    future.channel().closeFuture().sync();
} catch (InterruptedException e) {
} finally {
    // graceful shutdown, release the thread pool
    eventLoopGroup.shutdownGracefully();
}
```
How should I solve this?
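
For reference, a minimal sketch of how `CONNECT_TIMEOUT_MILLIS` normally surfaces: the connect future completes unsuccessfully (typically with a `ConnectTimeoutException`), so the place to detect it is a listener checking `future.isSuccess()`, not `closeFuture()`. The host and port below are placeholders:

```
import io.netty.bootstrap.Bootstrap;
import io.netty.channel.ChannelFuture;
import io.netty.channel.ChannelFutureListener;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.ChannelOption;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioSocketChannel;

public class ConnectTimeoutDemo {
    public static void main(String[] args) throws InterruptedException {
        EventLoopGroup group = new NioEventLoopGroup();
        try {
            Bootstrap b = new Bootstrap()
                    .group(group)
                    .channel(NioSocketChannel.class)
                    .option(ChannelOption.CONNECT_TIMEOUT_MILLIS, 5000) // fail the connect after 5s
                    .handler(new ChannelInitializer<SocketChannel>() {
                        @Override
                        protected void initChannel(SocketChannel ch) { /* add handlers here */ }
                    });
            ChannelFuture f = b.connect("10.0.0.1", 9999); // placeholder unreachable host
            f.addListener((ChannelFutureListener) future -> {
                if (!future.isSuccess()) {
                    // A ConnectTimeoutException (or ConnectException) lands here, not in
                    // closeFuture(); do the retry/skip logic at this point.
                    System.err.println("connect failed: " + future.cause());
                }
            });
            f.await(); // wait only for the connect attempt itself
        } finally {
            group.shutdownGracefully();
        }
    }
}
```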

gateway+nacos+websocket Invalid handshake response getStatus: 403

Begging an expert to help: WebSocket handshakes fail once they go through the gateway; the handshake only succeeds when everything runs on one machine. The flow is gateway --> nacos --> websocket, and then it breaks. Service access flow: socketJS -----> gateway ---------- > nacos -----------> websocket. Spring Boot version 2.2.0. The reported error:
```
2020-05-06 16:17:31 WARN reactor-http-nio-4 reactor.netty.http.client.HttpClientConnect [id: 0xadc98a3b, L:/192.168.0.112:59809 - R:/192.168.0.112:9513] The connection observed an error
io.netty.handler.codec.http.websocketx.WebSocketHandshakeException: Invalid handshake response getStatus: 403
at io.netty.handler.codec.http.websocketx.WebSocketClientHandshaker13.verify(WebSocketClientHandshaker13.java:267)
at io.netty.handler.codec.http.websocketx.WebSocketClientHandshaker.finishHandshake(WebSocketClientHandshaker.java:303)
at reactor.netty.http.client.WebsocketClientOperations.onInboundNext(WebsocketClientOperations.java:117)
at reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:91)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:352)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:352)
at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:438)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:328)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:302)
at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:253)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:352)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1422)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:931)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:700)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:635)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:552)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:514)
at io.netty.util.concurrent.SingleThreadEventExecutor$6.run(SingleThreadEventExecutor.java:1044)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:748)
```
**With all three services configured on the same server it works; spread across machines it does not.** ![websocket service configuration](https://img-ask.csdn.net/upload/202005/06/1588758882_458714.png) ## Can the 403 be solved without using STOMP? That is the crux of my question; please help

Some questions about restarting a Netty service on Linux; experts please come in.

When deploying the Netty-based service on Linux, I normally start it with this script:
```
nohup java -jar xxx-xxx.jar start 8080 8888 &
```
and stop it by just killing the process. Today the service reported userBind (address already in use) on startup, and when I checked the processes I found:
```
root 724 1 0 1406059 1091384 1 Apr08 ? 00:39:15 /usr/java/jdk1.7.0_76/jre/bin/java -Xmx2048m -Xms2048m -Xmn768m -Xss256k -XX:PermSize=512m -XX:MaxPermSize=1024m -XX:+UseParallelGC -XX:ParallelGCThreads=4 -XX:+UseParallelOldGC -XX:+AggressiveOpts -XX:+UseBiasedLocking -jar xxx-xxx.jar start 8080 8765
root 14084 14056 0 25813 844 2 19:14 pts/4 00:00:00 grep java
```
I don't dare touch it, and my own project won't start. Experts: what is this service? If I restart, what script should I use, or which services do I need to start, and in what order?

Spring Cloud Gateway reports "Only one connection receive subscriber allowed" after receiving a request

## Code setup
The frontend uses the Vue.js framework with proxy middleware as a dev proxy. The backend is written with Spring Boot and uses Consul for service discovery and Spring Cloud Gateway as the gateway. The current situation: through the gateway I can open the backend swagger-api page, send test requests, and get normal responses. But once the frontend is configured to call through the gateway, requests from the page fail straight away with a connection error, and the Spring Cloud Gateway service logs java.lang.IllegalStateException: Only one connection receive subscriber allowed. From searching around, this roughly means the request body can only be consumed once. The full exception:
```
2018-09-20 10:30:54.952 ERROR 13772 --- [ctor-http-nio-2] .a.w.r.e.DefaultErrorWebExceptionHandler : Failed to handle request [POST http://127.0.0.1:8088/gacenter/api/v1/login]
java.lang.IllegalStateException: Only one connection receive subscriber allowed.
at reactor.ipc.netty.channel.FluxReceive.startReceiver(FluxReceive.java:279) ~[reactor-netty-0.7.9.RELEASE.jar:0.7.9.RELEASE]
at reactor.ipc.netty.channel.FluxReceive.lambda$subscribe$2(FluxReceive.java:129) ~[reactor-netty-0.7.9.RELEASE.jar:0.7.9.RELEASE]
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute$$$capture(AbstractEventExecutor.java:163) ~[netty-common-4.1.29.Final.jar:4.1.29.Final]
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java) ~[netty-common-4.1.29.Final.jar:4.1.29.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404) ~[netty-common-4.1.29.Final.jar:4.1.29.Final]
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446) ~[netty-transport-4.1.29.Final.jar:4.1.29.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:884) ~[netty-common-4.1.29.Final.jar:4.1.29.Final]
at java.lang.Thread.run(Thread.java:745) ~[na:1.8.0_60]
```
Has anyone run into this? Please leave a note if you pass by.

Help: CLOSE_WAIT problem on a Netty 4.x server

**1. Summary:** I recently built an app server with Netty 4.x. After deploying it to the server, a lot of TCP connections end up in CLOSE_WAIT, and the server gets stuck and stops accepting new connections; back in local testing the problem never appears.

**2. Details:**

1) When the server hangs, `netstat -ano` shows the server's connections still in ESTABLISHED, and a packet capture shows the client still sending packets normally, but the server only answers with an ACK (by then the server is already stuck: no activity on the console and nothing in the logs). ![screenshot](https://img-ask.csdn.net/upload/201807/12/1531381902_865010.jpg) The lower one is a heartbeat packet. As long as the client does not close the connection it stays ESTABLISHED; once the client disconnects it turns into CLOSE_WAIT. Only once did the server recover from this "stuck" state and print logs (such as "user disconnected"). At first I suspected some step was blocking and causing this, so I also checked thread states with jstack; see the next item.

2) The result: ![screenshot](https://img-ask.csdn.net/upload/201807/12/1531382616_657392.png)

3) The official user guide and "Netty in Action" both mention that inbound/outbound messages need to be released manually with ReferenceCountUtil.release(), but my codec classes extend ByteToMessageDecoder and MessageToByteEncoder respectively, and their source already handles releasing the ByteBuf, so the problem shouldn't be there, right?

Here is part of the codec code:

encoder
```
@Override
protected void encode(ChannelHandlerContext ctx, YingHeMessage msg, ByteBuf out) throws Exception {
    checkMsg(msg); // not null
    int type = msg.getProtoId();
    int contentLength = msg.getContentLength();
    String body = msg.getBody();
    out.writeInt(type);
    out.writeInt(contentLength);
    out.writeBytes(body.getBytes(Charset.forName("UTF-8")));
}
```
decoder
```
// int + int
private static final int HEADER_SIZE = 8;
private static final int LEAST_SIZE = 4;
private static final Logger LOG = LoggerFactory.getLogger(YingHeMessageDecoder.class);

@Override
protected void decode(ChannelHandlerContext ctx, ByteBuf in, List<Object> out) throws Exception {
    in.markReaderIndex(); // first mark
    int readable = in.readableBytes();
    LOG.info("check:{}", in.readableBytes() < HEADER_SIZE);
    LOG.info("readable:{}", readable);
    if (in.readableBytes() < HEADER_SIZE) { // message too small: roll back the pointer, do nothing
        LOG.warn(">>>readable bytes less than header size!");
        LOG.info("before reset:{}", in.readerIndex());
        in.resetReaderIndex();
        LOG.info("after reset:{}", in.readerIndex());
        return;
    }
    // read the message type
    int type = in.readInt();
    int contentLength = in.readInt();
    LOG.info("type:{},contentLength:{}", type, contentLength);
    in.markReaderIndex(); // second mark
    int readable2 = in.readableBytes();
    if (readable2 < contentLength) {
        LOG.error("bad content length! length=" + contentLength);
        in.resetReaderIndex(); // reset the readerIndex
        LOG.info("reset, current readerIndex:" + in.readerIndex());
        return;
    }
    // read the body
    ByteBuf buf = in.readBytes(contentLength);
    byte[] content = new byte[buf.readableBytes()];
    buf.readBytes(content);
    String body = new String(content, "UTF-8");
    YingHeMessage message = new YingHeMessage(type, contentLength, body);
    out.add(message);
}
```
The server bootstrap configuration:
```
public void run() throws Exception {
    EventLoopGroup boss = new NioEventLoopGroup();
    EventLoopGroup worker = new NioEventLoopGroup(5);
    try {
        ServerBootstrap b = new ServerBootstrap();
        b.group(boss, worker)
                .channel(NioServerSocketChannel.class)
                .option(ChannelOption.SO_BACKLOG, 1024)
                .option(ChannelOption.SO_REUSEADDR, true)
                .childOption(ChannelOption.TCP_NODELAY, true)
                .childOption(ChannelOption.SO_KEEPALIVE, true)
                .handler(new LoggingHandler(LogLevel.DEBUG))
                .childHandler(new ChannelInitializer<SocketChannel>() {
                    @Override
                    protected void initChannel(SocketChannel ch) throws Exception {
                        ch.pipeline()
                                .addLast(new LengthFieldBasedFrameDecoder(MAX_LENGTH, LENGTH_FIELD_OFFSET,
                                        LENGTH_FIELD_LENGTH, LENGTH_ADJUSTMENT, INITIAL_BYTES_TO_STRIP))
                                .addLast(new ReadTimeoutHandler(60))
                                .addLast(new YingHeMessageDecoder())
                                .addLast(new YingHeMessageEncoder())
                                .addLast(new ServerHandlerInitializer())
                                .addLast(new Zenith());
                    }
                });
        Properties properties = new Properties();
        InputStream in = YingHeServer.class.getClassLoader().getResourceAsStream("net.properties");
        properties.load(in);
        Integer port = Integer.valueOf(properties.getProperty("port"));
        ChannelFuture f = b.bind(port).sync();
        LOG.info("server started, bound port:" + port);
        DiscardProcessorUtil.init();
        System.out.println(">>>flush all:" + RedisConnector.getConnector().flushAll());
        LOG.info(">>>redis connect test:ping---received:{}", RedisConnector.getConnector().ping());
        f.channel().closeFuture().sync();
    } catch (IOException e) {
        e.printStackTrace();
    } finally {
        // cleanup
        LOG.info("graceful shutdown...");
        boss.shutdownGracefully();
        worker.shutdownGracefully();
        ChannelGroups.clear();
    }
}
```
4) Additional notes: the server is Windows Server 2012r; the client is written in C#; the server side uses Netty 4.1.26.Final, MyBatis, Spring, fastjson, redis (cache) and c3p0 (connection pool); the problem never occurs in local tests! 40 C-coins offered; please don't hold back your advice and rescue this junior from the fire!
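
One detail worth double-checking in the decoder above (an observation, not a confirmed diagnosis of the CLOSE_WAIT hang): `in.readBytes(contentLength)` returns a new reference-counted ByteBuf that is never released, so each decoded message leaks one buffer. A sketch of a release-safe version of that read, using the post's own names (`type`, `contentLength`, `YingHeMessage`):

```
// Fragment of decode(...): release the intermediate buffer in all cases.
ByteBuf buf = in.readBytes(contentLength);
try {
    byte[] content = new byte[buf.readableBytes()];
    buf.readBytes(content);
    out.add(new YingHeMessage(type, contentLength, new String(content, "UTF-8")));
} finally {
    buf.release(); // readBytes() allocates a new, reference-counted buffer
}
// Alternatively, avoid the intermediate buffer entirely:
// byte[] content = new byte[contentLength];
// in.readBytes(content); // fills the array straight from the cumulation buffer
```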

Fabric network + fabric-java-sdk: gRPC connection exception

1. I'm using the fabric-java-sdk to integrate with a Fabric network, and startup keeps failing with this error:
```
23:16:57.825 [main] ERROR org.hyperledger.fabric.sdk.Channel - Channel Channel{id: 1, name: mychannel} Sending proposal with transaction: 5fe505ed0e555ac50cc4773876d8eb3746da951a3ad2f8461b2fa177901e5862 to Peer{ id: 2, name: peer0.org1.example.com, channelName: mychannel, url: grpc://x.x.x.x:7051} failed because of: gRPC failure=Status{code=INTERNAL, description=http2 exception, cause=io.netty.handler.codec.http2.Http2Exception: First received frame was not SETTINGS. Hex dump for first 5 bytes: 1503010002
at io.netty.handler.codec.http2.Http2Exception.connectionError(Http2Exception.java:85)
at io.netty.handler.codec.http2.Http2ConnectionHandler$PrefaceDecoder.verifyFirstFrameIsSettings(Http2ConnectionHandler.java:350)
at io.netty.handler.codec.http2.Http2ConnectionHandler$PrefaceDecoder.decode(Http2ConnectionHandler.java:251)
at io.netty.handler.codec.http2.Http2ConnectionHandler.decode(Http2ConnectionHandler.java:450)
at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:502)
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:441)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:278)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:359)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:345)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:337)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1408)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:359)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:345)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:930)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:677)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:612)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:529)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:491)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:905)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:748)
}
java.lang.Exception: io.grpc.StatusRuntimeException: INTERNAL: http2 exception
at org.hyperledger.fabric.sdk.Channel.sendProposalToPeers(Channel.java:4179)
at org.hyperledger.fabric.sdk.Channel.getConfigBlock(Channel.java:854)
at org.hyperledger.fabric.sdk.Channel.parseConfigBlock(Channel.java:1820)
at org.hyperledger.fabric.sdk.Channel.loadCACertificates(Channel.java:1657)
at org.hyperledger.fabric.sdk.Channel.initialize(Channel.java:1103)
```
Access to grpc://x.x.x.x:7051 fails; the server is an Alibaba Cloud host. Every solution I found online calls it a gRPC communication error and suggests adding the corresponding dependencies, but that did not help. I really can't figure out what's going on; could an expert please advise? The Fabric environment is version 1.0.

java.lang.OutOfMemoryError exception

Experts: I ran into this problem while implementing an Excel export. As an intern I find it completely baffling, so I'd appreciate a small favor.
```
java.lang.OutOfMemoryError: GC overhead limit exceeded
16:31:22.700 [nioEventLoopGroup-2-8] WARN io.netty.channel.nio.NioEventLoop - Unexpected exception in the selector loop.
java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.ArrayList.iterator(ArrayList.java:814) ~[na:1.7.0_80]
at sun.nio.ch.WindowsSelectorImpl.updateSelectedKeys(WindowsSelectorImpl.java:496) ~[na:1.7.0_80]
at sun.nio.ch.WindowsSelectorImpl.doSelect(WindowsSelectorImpl.java:172) ~[na:1.7.0_80]
at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:87) ~[na:1.7.0_80]
at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:98) ~[na:1.7.0_80]
at io.netty.channel.nio.NioEventLoop.select(NioEventLoop.java:622) ~[netty-transport-4.0.32.Final.jar:4.0.32.Final]
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:310) ~[netty-transport-4.0.32.Final.jar:4.0.32.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:112) [netty-common-4.0.32.Final.jar:4.0.32.Final]
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137) [netty-common-4.0.32.Final.jar:4.0.32.Final]
at java.lang.Thread.run(Thread.java:745) [na:1.7.0_80]
16:31:23.334 [Thread-6] ERROR o.a.e.i.a.AcquireTimerJobsRunnable - exception during timer job acquisition: GC overhead limit exceeded
java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.Arrays.copyOf(Arrays.java:2367) ~[na:1.7.0_80]
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:130) ~[na:1.7.0_80]
at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:114) ~[na:1.7.0_80]
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:415) ~[na:1.7.0_80]
at java.lang.StringBuilder.append(StringBuilder.java:132) ~[na:1.7.0_80]
at org.apache.ibatis.reflection.wrapper.BeanWrapper.getBeanProperty(BeanWrapper.java:171) ~[mybatis-3.3.0.jar:3.3.0]
at org.apache.ibatis.reflection.wrapper.BeanWrapper.get(BeanWrapper.java:49) ~[mybatis-3.3.0.jar:3.3.0]
at org.apache.ibatis.reflection.MetaObject.getValue(MetaObject.java:122) ~[mybatis-3.3.0.jar:3.3.0]
at org.apache.ibatis.executor.BaseExecutor.createCacheKey(BaseExecutor.java:212) ~[mybatis-3.3.0.jar:3.3.0]
at org.apache.ibatis.executor.CachingExecutor.createCacheKey(CachingExecutor.java:139) ~[mybatis-3.3.0.jar:3.3.0]
at org.apache.ibatis.executor.CachingExecutor.query(CachingExecutor.java:81) ~[mybatis-3.3.0.jar:3.3.0]
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:120) ~[mybatis-3.3.0.jar:3.3.0]
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:113) ~[mybatis-3.3.0.jar:3.3.0]
at org.activiti.engine.impl.db.DbSqlSession.selectListWithRawParameter(DbSqlSession.java:438) ~[activiti-engine-5.19.0.2.jar:5.19.0.2]
at org.activiti.engine.impl.db.DbSqlSession.selectList(DbSqlSession.java:429) ~[activiti-engine-5.19.0.2.jar:5.19.0.2]
at org.activiti.engine.impl.db.DbSqlSession.selectList(DbSqlSession.java:424) ~[activiti-engine-5.19.0.2.jar:5.19.0.2]
at org.activiti.engine.impl.db.DbSqlSession.selectList(DbSqlSession.java:411) ~[activiti-engine-5.19.0.2.jar:5.19.0.2]
at org.activiti.engine.impl.persistence.entity.JobEntityManager.findNextTimerJobsToExecute(JobEntityManager.java:157) ~[activiti-engine-5.19.0.2.jar:5.19.0.2]
at org.activiti.engine.impl.cmd.AcquireTimerJobsCmd.execute(AcquireTimerJobsCmd.java:45) ~[activiti-engine-5.19.0.2.jar:5.19.0.2]
at org.activiti.engine.impl.cmd.AcquireTimerJobsCmd.execute(AcquireTimerJobsCmd.java:29) ~[activiti-engine-5.19.0.2.jar:5.19.0.2]
at org.activiti.engine.impl.interceptor.CommandInvoker.execute(CommandInvoker.java:24) ~[activiti-engine-5.19.0.2.jar:5.19.0.2]
at org.activiti.engine.impl.interceptor.CommandContextInterceptor.execute(CommandContextInterceptor.java:57) ~[activiti-engine-5.19.0.2.jar:5.19.0.2]
at org.activiti.spring.SpringTransactionInterceptor$1.doInTransaction(SpringTransactionInterceptor.java:47) ~[activiti-spring-5.19.0.2.jar:5.19.0.2]
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:133) ~[spring-tx-4.2.2.RELEASE.jar:4.2.2.RELEASE]
at org.activiti.spring.SpringTransactionInterceptor.execute(SpringTransactionInterceptor.java:45) ~[activiti-spring-5.19.0.2.jar:5.19.0.2]
at org.activiti.engine.impl.interceptor.LogInterceptor.execute(LogInterceptor.java:31) ~[activiti-engine-5.19.0.2.jar:5.19.0.2]
at org.activiti.engine.impl.cfg.CommandExecutorImpl.execute(CommandExecutorImpl.java:40) ~[activiti-engine-5.19.0.2.jar:5.19.0.2]
at org.activiti.engine.impl.cfg.CommandExecutorImpl.execute(CommandExecutorImpl.java:35) ~[activiti-engine-5.19.0.2.jar:5.19.0.2]
at org.activiti.engine.impl.asyncexecutor.AcquireTimerJobsRunnable.run(AcquireTimerJobsRunnable.java:52) ~[activiti-engine-5.19.0.2.jar:5.19.0.2]
at java.lang.Thread.run(Thread.java:745) [na:1.7.0_80]
/report/collectionPaymentClass-report-down
java.lang.OutOfMemoryError: GC overhead limit exceeded
```
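
The trace alone doesn't prove the export is the allocation hot spot (GC overhead is global, so the OOM also surfaces in Netty's selector loop and an Activiti timer job), but if the Excel export builds the whole workbook in memory, the usual mitigation is a streaming writer. A sketch with Apache POI's SXSSFWorkbook, assuming XLSX output; file name, sheet name and row count are placeholders:

```
import java.io.FileOutputStream;
import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.xssf.streaming.SXSSFWorkbook;

public class StreamingExportDemo {
    public static void main(String[] args) throws Exception {
        SXSSFWorkbook wb = new SXSSFWorkbook(100); // keep ~100 rows in memory, flush the rest to disk
        try {
            Sheet sheet = wb.createSheet("export");
            for (int i = 0; i < 100_000; i++) {
                Row row = sheet.createRow(i);
                row.createCell(0).setCellValue("row-" + i);
            }
            try (FileOutputStream out = new FileOutputStream("export.xlsx")) {
                wb.write(out);
            }
        } finally {
            wb.dispose(); // delete the temp files backing the flushed rows
        }
    }
}
```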

Netty startup error: ServerBootstrap.channel not found

In a Java web project running in a Tomcat container, I plan to use Netty to communicate with hardware devices, and wrote the following Netty service:
```
public class NettyServer extends HttpServlet {

    private static final int port = 12888;

    // java.lang.NoSuchMethodError:
    public void bind() throws Exception {
        EventLoopGroup bossGroup = new NioEventLoopGroup();
        EventLoopGroup workerGroup = new NioEventLoopGroup();
        System.out.println("start group handler");
        try {
            ServerBootstrap b = new ServerBootstrap();
            b.group(bossGroup, workerGroup)
                    .channel(NioServerSocketChannel.class)
                    .option(ChannelOption.SO_BACKLOG, 100)
                    .handler(new LoggingHandler(LogLevel.INFO))
                    .childHandler(new ChannelInitializer<SocketChannel>() {
                        @Override
                        protected void initChannel(SocketChannel socketChannel) throws Exception {
                            // socketChannel.pipeline().addLast(new MsgDecoder(1024*1024,4,4));
                            // socketChannel.pipeline().addLast(new MsgDecoder(1024*1024,4,4));
                            socketChannel.pipeline().addLast("readTimeOutHandler", new ReadTimeoutHandler(50));
                            socketChannel.pipeline().addLast(new LoginAuthRespHandler());
                            socketChannel.pipeline().addLast(new HeartBeatRespHandler());
                            socketChannel.pipeline().addLast(new NettyServerHandler());
                        }
                    });
            System.out.println("start bind");
            // b.bind(port).sync();
            ChannelFuture f = b.bind(port).sync();
            System.out.println("netty Server start listen in port " + port);
            f.channel().closeFuture().sync();
        } finally {
            bossGroup.shutdownGracefully();
            workerGroup.shutdownGracefully();
        }
    }

    @Override
    public void init() throws ServletException {
        try {
            new NettyServer().bind();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    // @Override
    // public void contextInitialized(ServletContextEvent servletContextEvent) {
    //     try {
    //         new Thread(new Runnable() {
    //             @Override
    //             public void run() {
    //                 try {
    //                     new NettyServer().bind();
    //                 } catch (Exception e) {
    //                     e.printStackTrace();
    //                 }
    //             }
    //         }).start();
    //     } catch (Exception e) {
    //         e.printStackTrace();
    //     }
    // }
    //
    // @Override
    // public void contextDestroyed(ServletContextEvent servletContextEvent) {
    // }

    public static void main(String[] args) throws Exception {
        new NettyServer().bind();
    }
}
```
Starting the service via the main method works fine in local tests, but after deploying to Tomcat, whether I initialize the Netty service from a Listener or from a Servlet, startup fails with:
```
java.lang.NoSuchMethodError: io.netty.bootstrap.ServerBootstrap.channel(Ljava/lang/Class;)Lio/netty/bootstrap/ServerBootstrap;
at cn.blue.netty.NettyServer.bind(NettyServer.java:39)
at cn.blue.netty.NettyServer.init(NettyServer.java:68)
at javax.servlet.GenericServlet.init(GenericServlet.java:158)
at org.apache.catalina.core.StandardWrapper.initServlet(StandardWrapper.java:1282)
at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1195)
at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:1085)
at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:5318)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5610)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:899)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:875)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:652)
at org.apache.catalina.startup.HostConfig.manageApp(HostConfig.java:1863)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.tomcat.util.modeler.BaseModelMBean.invoke(BaseModelMBean.java:301)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:819)
at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:801)
at org.apache.catalina.mbeans.MBeanFactory.createStandardContext(MBeanFactory.java:618)
at org.apache.catalina.mbeans.MBeanFactory.createStandardContext(MBeanFactory.java:565)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.tomcat.util.modeler.BaseModelMBean.invoke(BaseModelMBean.java:301)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:819)
at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:801)
at javax.management.remote.rmi.RMIConnectionImpl.doOperation(RMIConnectionImpl.java:1468)
at javax.management.remote.rmi.RMIConnectionImpl.access$300(RMIConnectionImpl.java:76)
at javax.management.remote.rmi.RMIConnectionImpl$PrivilegedOperation.run(RMIConnectionImpl.java:1309)
at javax.management.remote.rmi.RMIConnectionImpl.doPrivilegedOperation(RMIConnectionImpl.java:1401)
at javax.management.remote.rmi.RMIConnectionImpl.invoke(RMIConnectionImpl.java:829)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.rmi.server.UnicastServerRef.dispatch(UnicastServerRef.java:324)
at sun.rmi.transport.Transport$1.run(Transport.java:200)
at sun.rmi.transport.Transport$1.run(Transport.java:197)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.Transport.serviceCall(Transport.java:196)
at sun.rmi.transport.tcp.TCPTransport.handleMessages(TCPTransport.java:568)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run0(TCPTransport.java:826)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.lambda$run$0(TCPTransport.java:683)
at java.security.AccessController.doPrivileged(Native Method)
at sun.rmi.transport.tcp.TCPTransport$ConnectionHandler.run(TCPTransport.java:682)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
```
Spring: 4.3.7.RELEASE, Tomcat: 7, JDK: 1.8.0, netty: 5.0.0.Alpha1, maven: 3.3.9. Please help me take a look.

Starting Tomcat reports a Dubbo port conflict

Starting Tomcat also produces the following error:
```
14:56:08.607 [localhost-startStop-1] ERROR o.s.web.context.ContextLoader - Context initialization failed
com.alibaba.dubbo.rpc.RpcException: Fail to start server(url: dubbo://172.16.0.62:21889/com.winit.sms.spi.base.customeracct.CustomerAcctService?anyhost=true&application=sms&channel.readonly.sent=true&codec=dubbo&default.cluster=failover&default.executes=500&default.loadbalance=random&default.retries=0&default.service.filter=dubboProviderFilter&default.timeout=60000&default.validation=true&dispatcher=all&dubbo=2.8.3&dynamic=true&generic=false&heartbeat=60000&interface=com.winit.sms.spi.base.customeracct.CustomerAcctService&methods=queryCustomerAcctById,queryCustomerAcctByBpartnerCode,createCustomerAcct,pageCustomerAcct,modifyCustomerAcct&organization=winit&owner=liuyan.chen&pid=14588&revision=1.21.6.NewWorldPhase2.SNAPSHOT&side=provider&threadpool=limited&threads=300&timestamp=1453791368562&version=5.0.0) Failed to bind NettyServer on /172.16.0.62:21889, cause: Failed to bind to: /0.0.0.0:21889
at com.alibaba.dubbo.rpc.protocol.dubbo.DubboProtocol.createServer(DubboProtocol.java:331) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.rpc.protocol.dubbo.DubboProtocol.openServer(DubboProtocol.java:308) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.rpc.protocol.dubbo.DubboProtocol.export(DubboProtocol.java:258) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper.export(ProtocolFilterWrapper.java:55) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.rpc.protocol.ProtocolListenerWrapper.export(ProtocolListenerWrapper.java:56) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.rpc.Protocol$Adpative.export(Protocol$Adpative.java) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.registry.integration.RegistryProtocol.doLocalExport(RegistryProtocol.java:153) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.registry.integration.RegistryProtocol.export(RegistryProtocol.java:107) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper.export(ProtocolFilterWrapper.java:53) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.rpc.protocol.ProtocolListenerWrapper.export(ProtocolListenerWrapper.java:54) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.rpc.Protocol$Adpative.export(Protocol$Adpative.java) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.config.ServiceConfig.doExportUrlsFor1Protocol(ServiceConfig.java:489) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.config.ServiceConfig.doExportUrls(ServiceConfig.java:285) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.config.ServiceConfig.doExport(ServiceConfig.java:246) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.config.ServiceConfig.export(ServiceConfig.java:145) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.config.spring.ServiceBean.onApplicationEvent(ServiceBean.java:109) ~[dubbo-2.8.3.jar:2.8.3]
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:96) ~[spring-context-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:334) ~[spring-context-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:948) ~[spring-context-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:482) ~[spring-context-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:410) ~[spring-web-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:306) ~[spring-web-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:112) [spring-web-3.2.8.RELEASE.jar:3.2.8.RELEASE]
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:5003) [catalina.jar:7.0.65]
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5517) [catalina.jar:7.0.65]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150) [catalina.jar:7.0.65]
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:901) [catalina.jar:7.0.65]
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:877) [catalina.jar:7.0.65]
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:652) [catalina.jar:7.0.65]
at org.apache.catalina.startup.HostConfig.deployDirectory(HostConfig.java:1263) [catalina.jar:7.0.65]
at org.apache.catalina.startup.HostConfig$DeployDirectory.run(HostConfig.java:1978) [catalina.jar:7.0.65]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [na:1.7.0_80]
at java.util.concurrent.FutureTask.run(FutureTask.java:262) [na:1.7.0_80]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_80]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_80]
at java.lang.Thread.run(Thread.java:745) [na:1.7.0_80]
Caused by: com.alibaba.dubbo.remoting.RemotingException: Failed to bind NettyServer on /172.16.0.62:21889, cause: Failed to bind to: /0.0.0.0:21889
at com.alibaba.dubbo.remoting.transport.AbstractServer.<init>(AbstractServer.java:72) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.remoting.transport.netty.NettyServer.<init>(NettyServer.java:63) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.remoting.transport.netty.NettyTransporter.bind(NettyTransporter.java:33) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.remoting.Transporter$Adpative.bind(Transporter$Adpative.java) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.remoting.Transporters.bind(Transporters.java:48) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.remoting.exchange.support.header.HeaderExchanger.bind(HeaderExchanger.java:41) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.remoting.exchange.Exchangers.bind(Exchangers.java:63) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.rpc.protocol.dubbo.DubboProtocol.createServer(DubboProtocol.java:329) ~[dubbo-2.8.3.jar:2.8.3]
... 35 common frames omitted
Caused by: org.jboss.netty.channel.ChannelException: Failed to bind to: /0.0.0.0:21889
at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:303) ~[netty-3.2.5.Final.jar:na]
at com.alibaba.dubbo.remoting.transport.netty.NettyServer.doOpen(NettyServer.java:94) ~[dubbo-2.8.3.jar:2.8.3]
at com.alibaba.dubbo.remoting.transport.AbstractServer.<init>(AbstractServer.java:67) ~[dubbo-2.8.3.jar:2.8.3]
... 42 common frames omitted
Caused by: java.net.BindException: Address already in use: bind
at sun.nio.ch.Net.bind0(Native Method) ~[na:1.7.0_80]
at sun.nio.ch.Net.bind(Net.java:463) ~[na:1.7.0_80]
at sun.nio.ch.Net.bind(Net.java:455) ~[na:1.7.0_80]
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223) ~[na:1.7.0_80]
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74) ~[na:1.7.0_80]
at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.bind(NioServerSocketPipelineSink.java:148) ~[netty-3.2.5.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.handleServerSocket(NioServerSocketPipelineSink.java:100) ~[netty-3.2.5.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.eventSunk(NioServerSocketPipelineSink.java:74) ~[netty-3.2.5.Final.jar:na]
at org.jboss.netty.channel.Channels.bind(Channels.java:468) ~[netty-3.2.5.Final.jar:na]
at org.jboss.netty.channel.AbstractChannel.bind(AbstractChannel.java:192) ~[netty-3.2.5.Final.jar:na]
at org.jboss.netty.bootstrap.ServerBootstrap$Binder.channelOpen(ServerBootstrap.java:348) ~[netty-3.2.5.Final.jar:na]
at org.jboss.netty.channel.Channels.fireChannelOpen(Channels.java:176) ~[netty-3.2.5.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioServerSocketChannel.<init>(NioServerSocketChannel.java:85) ~[netty-3.2.5.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory.newChannel(NioServerSocketChannelFactory.java:142) ~[netty-3.2.5.Final.jar:na]
at org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory.newChannel(NioServerSocketChannelFactory.java:90) ~[netty-3.2.5.Final.jar:na]
at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:282) ~[netty-3.2.5.Final.jar:na]
... 44 common frames omitted
```

ElasticSearch 6.3.2 integrated with Spring Boot reports java.io.StreamCorruptedException: invalid internal transport message format, got (48,54,54,50)

I set up a single-node ElasticSearch 6.3.2 on a server. Because I couldn't reach the server's port 9200 locally (and setting network.host: 0.0.0.0 made startup fail), I used nginx as a proxy: port 9002 maps to 9200 and port 9001 maps to 9300. Direct access works, as shown: ![screenshot](https://img-ask.csdn.net/upload/202002/08/1581096007_348190.jpg) ![screenshot](https://img-ask.csdn.net/upload/202002/08/1581096016_248026.png) Then I wrote a demo, but it keeps failing with java.io.StreamCorruptedException: invalid internal transport message format, got (48,54,54,50). The configuration: ![screenshot](https://img-ask.csdn.net/upload/202002/08/1581096398_340973.jpg) ![screenshot](https://img-ask.csdn.net/upload/202002/08/1581096609_898535.jpg) The error log and pom.xml are below:
```
exception caught on transport layer [NettyTcpChannel{localAddress=/192.168.1.103:61678, remoteAddress=/xxx.xxx.xxx.xxx:9001}], closing connection
io.netty.handler.codec.DecoderException: java.io.StreamCorruptedException: invalid internal transport message format, got (48,54,54,50)
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:472) ~[netty-codec-4.1.31.Final.jar:4.1.31.Final]
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:278) ~[netty-codec-4.1.31.Final.jar:4.1.31.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
at io.netty.handler.logging.LoggingHandler.channelRead(LoggingHandler.java:241) [netty-handler-4.1.31.Final.jar:4.1.31.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1434) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:965) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:648) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:548) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:502) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:462) [netty-transport-4.1.31.Final.jar:4.1.31.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:897) [netty-common-4.1.31.Final.jar:4.1.31.Final]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_191]
Caused by: java.io.StreamCorruptedException: invalid internal transport message format, got (48,54,54,50)
at org.elasticsearch.transport.TcpTransport.validateMessageHeader(TcpTransport.java:1315) ~[elasticsearch-6.3.2.jar:6.4.3]
at org.elasticsearch.transport.netty4.Netty4SizeHeaderFrameDecoder.decode(Netty4SizeHeaderFrameDecoder.java:36) ~[transport-netty4-client-6.4.3.jar:6.4.3]
at io.netty.handler.codec.ByteToMessageDecoder.decodeRemovalReentryProtection(ByteToMessageDecoder.java:502) ~[netty-codec-4.1.31.Final.jar:4.1.31.Final]
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:441) ~[netty-codec-4.1.31.Final.jar:4.1.31.Final]
... 19 common frames omitted
```
The pom.xml:
```
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.1.1.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>com</groupId>
    <artifactId>elasticsearchtest</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>elasticsearchtest</name>
    <description>Demo project for Spring Boot</description>
    <properties>
        <java.version>1.8</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
            <exclusions>
                <exclusion>
                    <groupId>org.junit.vintage</groupId>
                    <artifactId>junit-vintage-engine</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-thymeleaf</artifactId>
        </dependency>
        <dependency>
            <groupId>org.elasticsearch</groupId>
            <artifactId>elasticsearch</artifactId>
            <version>6.3.2</version>
        </dependency>
        <dependency>
            <groupId>org.elasticsearch.client</groupId>
            <artifactId>transport</artifactId>
            <version>6.3.2</version>
        </dependency>
        <dependency>
            <groupId>org.elasticsearch.client</groupId>
            <artifactId>elasticsearch-rest-high-level-client</artifactId>
            <version>6.3.2</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
            <version>1.16.10</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
```
Please take a look, experts. Thank you!

Spring Cloud gateway service reports an error.

When starting the gateway, loading configuration from Consul fails:
```
ERROR - TraceId: - SpanId: - PspanId: - Fail fast is set and there was an error reading configuration from consul。
```
Then I checked the config-server logs; the errors are below.
```
[nio-8080-exec-9] o.apache.coyote.http11.Http11Processor : Error parsing HTTP request header
Note: further occurrences of HTTP header parsing errors will be logged at DEBUG level.
java.lang.IllegalArgumentException: Invalid character found in the HTTP protocol
at org.apache.coyote.http11.Http11InputBuffer.parseRequestLine(Http11InputBuffer.java:541) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:684) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:800) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1471) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) [tomcat-embed-core-8.5.32.jar:8.5.32]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [na:1.8.0_181]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [na:1.8.0_181]
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-embed-core-8.5.32.jar:8.5.32]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_181]
2019-08-16 06:36:27.773 ERROR 1 --- [io-8080-exec-10] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception
org.springframework.security.web.firewall.RequestRejectedException: The request was rejected because the URL contained a potentially malicious String "%2e"
at org.springframework.security.web.firewall.StrictHttpFirewall.rejectedBlacklistedUrls(StrictHttpFirewall.java:265) ~[spring-security-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.security.web.firewall.StrictHttpFirewall.getFirewalledRequest(StrictHttpFirewall.java:245) ~[spring-security-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:194) ~[spring-security-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:178) ~[spring-security-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:357) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:270) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.springframework.web.filter.HttpPutFormContentFilter.doFilterInternal(HttpPutFormContentFilter.java:109) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:93) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.filterAndRecordMetrics(WebMvcMetricsFilter.java:155) ~[spring-boot-actuator-2.0.4.RELEASE.jar:2.0.4.RELEASE]
at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.filterAndRecordMetrics(WebMvcMetricsFilter.java:123) ~[spring-boot-actuator-2.0.4.RELEASE.jar:2.0.4.RELEASE]
at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.doFilterInternal(WebMvcMetricsFilter.java:108) ~[spring-boot-actuator-2.0.4.RELEASE.jar:2.0.4.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:200) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:96) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:493) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:81) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:342) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:800) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:800) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1471) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) [tomcat-embed-core-8.5.32.jar:8.5.32]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [na:1.8.0_181]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [na:1.8.0_181]
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-embed-core-8.5.32.jar:8.5.32]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_181]
2019-08-16 08:07:36.261 ERROR 1 --- [nio-8080-exec-1] o.a.c.c.C.[.[.[/].[dispatcherServlet] : Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception
org.springframework.security.web.firewall.RequestRejectedException: The request was rejected because the URL contained a potentially malicious String "%2e"
at org.springframework.security.web.firewall.StrictHttpFirewall.rejectedBlacklistedUrls(StrictHttpFirewall.java:265) ~[spring-security-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.security.web.firewall.StrictHttpFirewall.getFirewalledRequest(StrictHttpFirewall.java:245) ~[spring-security-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:194) ~[spring-security-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:178) ~[spring-security-web-5.0.7.RELEASE.jar:5.0.7.RELEASE]
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:357) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:270) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:99) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.springframework.web.filter.HttpPutFormContentFilter.doFilterInternal(HttpPutFormContentFilter.java:109) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.springframework.web.filter.HiddenHttpMethodFilter.doFilterInternal(HiddenHttpMethodFilter.java:93) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.filterAndRecordMetrics(WebMvcMetricsFilter.java:155) ~[spring-boot-actuator-2.0.4.RELEASE.jar:2.0.4.RELEASE]
at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.filterAndRecordMetrics(WebMvcMetricsFilter.java:123) ~[spring-boot-actuator-2.0.4.RELEASE.jar:2.0.4.RELEASE]
at org.springframework.boot.actuate.metrics.web.servlet.WebMvcMetricsFilter.doFilterInternal(WebMvcMetricsFilter.java:108) ~[spring-boot-actuator-2.0.4.RELEASE.jar:2.0.4.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:200) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107) ~[spring-web-5.0.8.RELEASE.jar:5.0.8.RELEASE]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:198) ~[tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:96) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:493) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:81) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:342) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:800) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66) [tomcat-embed-core-8.5.32.jar:8.5.32]
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:800)
```
[tomcat-embed-core-8.5.32.jar:8.5.32] at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1471) [tomcat-embed-core-8.5.32.jar:8.5.32] at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) [tomcat-embed-core-8.5.32.jar:8.5.32] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [na:1.8.0_181] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [na:1.8.0_181] at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-embed-core-8.5.32.jar:8.5.32] at java.lang.Thread.run(Thread.java:748) [na:1.8.0_181] ``` 求大神告诉解决方法。
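The firewall rejection is the telling error: Spring Security's StrictHttpFirewall on the config-server blacklists the URL-encoded period ("%2e"), and Spring Cloud Config clients can produce request paths containing encoded dots, for example when an application name, profile, or label contains a ".". One possible workaround, sketched under the assumption that the config-server runs Spring Security 5.0.x with a WebSecurityConfigurerAdapter (the class name FirewallConfig here is illustrative, not from the original post), is to allow encoded periods while keeping the rest of the firewall intact:

```
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.config.annotation.web.builders.WebSecurity;
import org.springframework.security.config.annotation.web.configuration.WebSecurityConfigurerAdapter;
import org.springframework.security.web.firewall.HttpFirewall;
import org.springframework.security.web.firewall.StrictHttpFirewall;

@Configuration
public class FirewallConfig extends WebSecurityConfigurerAdapter {

    // Relax only the rule that rejects "%2e"; all other
    // StrictHttpFirewall protections remain active.
    @Bean
    public HttpFirewall allowEncodedPeriodFirewall() {
        StrictHttpFirewall firewall = new StrictHttpFirewall();
        firewall.setAllowUrlEncodedPeriod(true);
        return firewall;
    }

    @Override
    public void configure(WebSecurity web) throws Exception {
        web.httpFirewall(allowEncodedPeriodFirewall());
    }
}
```

A cleaner alternative, where feasible, is to rename the offending application/profile/label so no "." has to be encoded into the URL at all. The earlier "Invalid character found in the HTTP protocol" parse error is usually a separate problem, typically a client speaking TLS (an https:// URL) to the plain-HTTP port.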

Errors when deploying the project, could an expert please advise?

```
2016-05-18 18:05:56.711:INFO:/:Initializing log4j system
2016-05-18 18:05:56.714:INFO:/:Could not find log4j configuration file "/WEB-INF/log4j.xml" in webapp context. Using default configurations.
INFO: configuring "log4j" using jar:file:/D:/maven/mvnRespo/com/alibaba/citrus/citrus-webx-all/3.0.9/citrus-webx-all-3.0.9.jar!/com/alibaba/citrus/logconfig/log4j/log4j-default.xml
 - with property localAddress = 100.81.168.223
 - with property localHost = CP-WB179986-01
 - with property log.level = INFO
 - with property log4j.defaultInitOverride = true
 - with property log_print_to_console = log_print_to_console
 - with property loggingCharset = UTF-8
 - with property loggingLevel = INFO
 - with property loggingRoot = D:\Users\wb-hqm179986\logs
2016-05-18 18:05:56.898:INFO:/:Initializing Spring root WebApplicationContext
Web Context替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\WEB-INF\webx.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\WEB-INF\webx.xml】
HSFJettyWebAppContext replace servlet context get file /C:/Users/WB-HQM~1/AppData/Local/Temp/hsf_jetty_placeholder/WEB-INF/webx.xml
Web Context替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\WEB-INF\common\webx-component-and-root.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\WEB-INF\common\webx-component-and-root.xml】
HSFJettyWebAppContext replace servlet context get file /C:/Users/WB-HQM~1/AppData/Local/Temp/hsf_jetty_placeholder/WEB-INF/common/webx-component-and-root.xml
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\WEB-INF\common\webx-component-and-root.xml】
HSFJettyWebAppContext replace servlet context get file /C:/Users/WB-HQM~1/AppData/Local/Temp/hsf_jetty_placeholder/WEB-INF/common/webx-component-and-root.xml
Web Context替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\WEB-INF\common\uris.xml】
HSFJettyWebAppContext replace servlet context get file /C:/Users/wb-hqm179986/AppData/Local/Temp/hsf_jetty_placeholder/WEB-INF/common/uris.xml
HSFJettyWebAppContext replace servlet context get file /C:/Users/wb-hqm179986/AppData/Local/Temp/hsf_jetty_placeholder/WEB-INF/common/uris.xml
HSFJettyWebAppContext replace servlet context get file /C:/Users/wb-hqm179986/AppData/Local/Temp/hsf_jetty_placeholder/WEB-INF/
HSFJettyWebAppContext replace servlet context get file /C:/Users/wb-hqm179986/AppData/Local/Temp/hsf_jetty_placeholder/WEB-INF/
Web Context替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\WEB-INF\webx-bizconsole.xml】
2016-05-18 18:06:10.255:INFO:/:Initializing Spring sub WebApplicationContext: bizconsole
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\WEB-INF\webx-bizconsole.xml】
HSFJettyWebAppContext replace servlet context get file /C:/Users/WB-HQM~1/AppData/Local/Temp/hsf_jetty_placeholder/WEB-INF/webx-bizconsole.xml
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\WEB-INF\common\webx-component-and-root.xml】
HSFJettyWebAppContext replace servlet context get file /C:/Users/WB-HQM~1/AppData/Local/Temp/hsf_jetty_placeholder/WEB-INF/common/webx-component-and-root.xml
Web Context替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\WEB-INF\common\webx-component.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\config\hsf\buc-client-hsf.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\config\hsf\buc-client-hsf.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\config\hsf\buc-client-hsf.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\config\hsf\biz-hsf-client.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\config\hsf\biz-hsf-client.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\config\hsf\biz-hsf-client.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\config\hsf\forest-hsf-client.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\config\hsf\forest-hsf-client.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\config\hsf\forest-hsf-client.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\spring\common\sls-client.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\spring\common\sls-client.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\spring\common\sls-client.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\spring\common\keycenter-client.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\spring\common\keycenter-client.xml】
替换文件到【C:\Users\WB-HQM~1\AppData\Local\Temp\hsf_jetty_placeholder\spring\common\keycenter-client.xml】
JM.Log:INFO Init JM logger with Slf4jLoggerFactory
JM.Log:INFO Log root path: D:\Users\wb-hqm179986\logs\
JM.Log:INFO Set diamond-client log path: D:\Users\wb-hqm179986\logs\diamond-client
JM.Log:INFO Init JM logger with Slf4jLoggerFactory
JM.Log:INFO Log root path: D:\Users\wb-hqm179986\logs\
JM.Log:INFO Set hsf log path: D:\Users\wb-hqm179986\logs\hsf
18:06:25.305 [main] DEBUG i.n.u.i.l.InternalLoggerFactory - Using SLF4J as the default logging framework
18:06:25.308 [main] DEBUG i.n.c.MultithreadEventLoopGroup - -Dio.netty.eventLoopThreads: 8
18:06:25.333 [main] DEBUG i.n.util.internal.PlatformDependent0 - java.nio.Buffer.address: available
18:06:25.334 [main] DEBUG i.n.util.internal.PlatformDependent0 - sun.misc.Unsafe.theUnsafe: available
18:06:25.334 [main] DEBUG i.n.util.internal.PlatformDependent0 - sun.misc.Unsafe.copyMemory: available
18:06:25.334 [main] DEBUG i.n.util.internal.PlatformDependent0 - java.nio.Bits.unaligned: true
18:06:25.334 [main] DEBUG i.n.util.internal.PlatformDependent - Platform: Windows
18:06:25.334 [main] DEBUG i.n.util.internal.PlatformDependent - Java version: 7
18:06:25.335 [main] DEBUG i.n.util.internal.PlatformDependent - -Dio.netty.noUnsafe: false
18:06:25.335 [main] DEBUG i.n.util.internal.PlatformDependent - sun.misc.Unsafe: available
18:06:25.335 [main] DEBUG i.n.util.internal.PlatformDependent - -Dio.netty.noJavassist: false
18:06:25.664 [main] DEBUG i.n.util.internal.PlatformDependent - Javassist: available
18:06:25.664 [main] DEBUG i.n.util.internal.PlatformDependent - -Dio.netty.tmpdir: C:\Users\WB-HQM~1\AppData\Local\Temp (java.io.tmpdir)
18:06:25.664 [main] DEBUG i.n.util.internal.PlatformDependent - -Dio.netty.bitMode: 64 (sun.arch.data.model)
18:06:25.664 [main] DEBUG i.n.util.internal.PlatformDependent - -Dio.netty.noPreferDirect: false
18:06:25.758 [main] DEBUG io.netty.channel.nio.NioEventLoop - -Dio.netty.noKeySetOptimization: false
18:06:25.759 [main] DEBUG io.netty.channel.nio.NioEventLoop - -Dio.netty.selectorAutoRebuildThreshold: 512
18:06:25.802 [main] DEBUG io.netty.util.ResourceLeakDetector - -Dio.netty.leakDetectionLevel: simple
18:06:25.967 [main] DEBUG i.n.u.i.JavassistTypeParameterMatcherGenerator - Generated: io.netty.util.internal.__matchers__.com.taobao.hsf.remoting.BaseRequestMatcher
18:06:25.994 [main] DEBUG i.n.u.i.JavassistTypeParameterMatcherGenerator - Generated: io.netty.util.internal.__matchers__.com.taobao.hsf.remoting.netty.server.http.domain.NettyHttpRpcRequestMatcher
JM.Log:INFO Log root path: D:\Users\wb-hqm179986\logs\
JM.Log:INFO Init JM logger with Slf4jLoggerFactory
JM.Log:INFO Set configclient log path: D:\Users\wb-hqm179986\logs\configclient
Exception in thread "HSF-Remoting-Timer-6-thread-1"
Exception in thread "BufferedStatLogWriter-Flush-Timer"
2016-05-18 18:07:48.650:WARN::failed runjettyrun.HSFJettyWebAppContext@6aaaa67b{/,src/main/webapp}: java.lang.OutOfMemoryError: PermGen space
2016-05-18 18:07:48.651:WARN::Error starting handlers
java.lang.OutOfMemoryError: PermGen space
2016-05-18 18:07:49.189:WARN::failed org.mortbay.jetty.nio.SelectChannelConnector$1@360987dc: java.lang.OutOfMemoryError: PermGen space
2016-05-18 18:07:49.190:WARN::failed SelectChannelConnector@0.0.0.0:8081: java.lang.OutOfMemoryError: PermGen space
Exception in thread "-thread-2"
2016-05-18 18:07:52.172:WARN::failed Ajp13SocketConnector@0.0.0.0:8009: java.lang.OutOfMemoryError: PermGen space
2016-05-18 18:07:54.105:WARN::failed Server@1b897a77: org.mortbay.util.MultiException[java.lang.OutOfMemoryError: PermGen space, java.lang.OutOfMemoryError: PermGen space]
Exception in thread "main"
```
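Everything before the failures is ordinary startup noise; the fatal line is `java.lang.OutOfMemoryError: PermGen space`, meaning the JVM's permanent generation filled up while run-jetty-run was loading the webapp's classes, which then cascades into the failed handlers and connectors. A common remedy on Java 7 (a sketch; the sizes below are starting points to tune, not values from the original post) is to enlarge the heap and PermGen in the launch configuration's VM arguments:

```
-Xms512m -Xmx1024m -XX:PermSize=256m -XX:MaxPermSize=512m
```

Note these flags only exist up to Java 7; on Java 8+ PermGen was replaced by Metaspace (see `-XX:MaxMetaspaceSize`).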

Several ways to implement listening on a port and publishing a service.

1. Publish the service via a Spring listener class, accept client sockets, and handle request tasks with an Executor thread pool.
2. Publish the service with Java NIO (in the JDK since 1.4), handling client requests in a synchronous non-blocking fashion.
3. Handle client requests with the open-source Netty framework.

**So here is the question:** can an MQ message queue also implement service publishing, listening, and request handling? Is it actually used like this in real projects? Could someone show some pseudocode? (A sketch follows below.)
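Yes, an MQ can play this role: instead of a listening socket, the "service" listens on a queue, and the broker does the accepting and buffering for you. Below is a minimal request/reply sketch over plain JMS, assuming ActiveMQ as the broker; the broker URL `tcp://localhost:61616` and the queue name `service.request` are illustrative, not from the original post:

```
import javax.jms.*;
import org.apache.activemq.ActiveMQConnectionFactory;

public class MqRequestListener {

    public static void main(String[] args) throws Exception {
        // Connect to the broker (assumed to be ActiveMQ on its default port).
        ConnectionFactory factory = new ActiveMQConnectionFactory("tcp://localhost:61616");
        Connection connection = factory.createConnection();
        connection.start();

        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        Queue requestQueue = session.createQueue("service.request");
        MessageConsumer consumer = session.createConsumer(requestQueue);
        // Anonymous producer: the reply destination comes from each request's JMSReplyTo.
        MessageProducer replier = session.createProducer(null);

        // The broker dispatches requests to this callback one at a time;
        // this replaces the accept()/Selector loop of options 1 and 2.
        consumer.setMessageListener(message -> {
            try {
                String body = ((TextMessage) message).getText();
                // ... real business logic would go here ...
                TextMessage reply = session.createTextMessage("handled: " + body);
                reply.setJMSCorrelationID(message.getJMSMessageID());
                replier.send(message.getJMSReplyTo(), reply);
            } catch (JMSException e) {
                e.printStackTrace();
            }
        });

        Thread.currentThread().join(); // keep the listener alive
    }
}
```

The trade-off versus a socket/NIO/Netty server is that calls become asynchronous and broker-mediated: you gain buffering, retries, and decoupling, but lose low-latency connection-oriented semantics, so MQ tends to suit command/event style services better than request-heavy RPC.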

Dubbo cannot connect to the server

```
严重: Exception sending context initialized event to listener instance of class org.springframework.web.context.ContextLoaderListener
com.alibaba.dubbo.rpc.RpcException: Fail to start server(url: dubbo://192.168.10.123:20880/com.jgxin.service.ItemService?anyhost=true&application=jgxin-manager&channel.readonly.sent=true&codec=dubbo&dubbo=2.5.3&heartbeat=60000&interface=com.jgxin.service.ItemService&methods=getItemList,addItem,getItemById&pid=5584&side=provider&timeout=100000&timestamp=1465717104749) Failed to bind NettyServer on /192.168.10.123:20880, cause: Failed to create a selector.
	at com.alibaba.dubbo.rpc.protocol.dubbo.DubboProtocol.createServer(DubboProtocol.java:289)
	at com.alibaba.dubbo.rpc.protocol.dubbo.DubboProtocol.openServer(DubboProtocol.java:266)
	at com.alibaba.dubbo.rpc.protocol.dubbo.DubboProtocol.export(DubboProtocol.java:253)
	at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper.export(ProtocolFilterWrapper.java:55)
	at com.alibaba.dubbo.rpc.protocol.ProtocolListenerWrapper.export(ProtocolListenerWrapper.java:56)
	at com.alibaba.dubbo.rpc.Protocol$Adpative.export(Protocol$Adpative.java)
	at com.alibaba.dubbo.registry.integration.RegistryProtocol.doLocalExport(RegistryProtocol.java:153)
	at com.alibaba.dubbo.registry.integration.RegistryProtocol.export(RegistryProtocol.java:107)
	at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper.export(ProtocolFilterWrapper.java:53)
	at com.alibaba.dubbo.rpc.protocol.ProtocolListenerWrapper.export(ProtocolListenerWrapper.java:54)
	at com.alibaba.dubbo.rpc.Protocol$Adpative.export(Protocol$Adpative.java)
	at com.alibaba.dubbo.config.ServiceConfig.doExportUrlsFor1Protocol(ServiceConfig.java:485)
	at com.alibaba.dubbo.config.ServiceConfig.doExportUrls(ServiceConfig.java:281)
	at com.alibaba.dubbo.config.ServiceConfig.doExport(ServiceConfig.java:242)
	at com.alibaba.dubbo.config.ServiceConfig.export(ServiceConfig.java:143)
	at com.alibaba.dubbo.config.spring.ServiceBean.onApplicationEvent(ServiceBean.java:109)
	at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:151)
	at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:128)
	at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:331)
	at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:773)
	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:483)
	at org.springframework.web.context.ContextLoader.configureAndRefreshWebApplicationContext(ContextLoader.java:403)
	at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:306)
	at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:106)
	at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4939)
	at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5434)
	at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1559)
	at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1549)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:744)
Caused by: com.alibaba.dubbo.remoting.RemotingException: Failed to bind NettyServer on /192.168.10.123:20880, cause: Failed to create a selector.
	at com.alibaba.dubbo.remoting.transport.AbstractServer.<init>(AbstractServer.java:72)
	at com.alibaba.dubbo.remoting.transport.netty.NettyServer.<init>(NettyServer.java:63)
	at com.alibaba.dubbo.remoting.transport.netty.NettyTransporter.bind(NettyTransporter.java:33)
	at com.alibaba.dubbo.remoting.Transporter$Adpative.bind(Transporter$Adpative.java)
	at com.alibaba.dubbo.remoting.Transporters.bind(Transporters.java:48)
	at com.alibaba.dubbo.remoting.exchange.support.header.HeaderExchanger.bind(HeaderExchanger.java:41)
	at com.alibaba.dubbo.remoting.exchange.Exchangers.bind(Exchangers.java:63)
	at com.alibaba.dubbo.rpc.protocol.dubbo.DubboProtocol.createServer(DubboProtocol.java:287)
	... 32 more
Caused by: org.jboss.netty.channel.ChannelException: Failed to create a selector.
	at org.jboss.netty.channel.socket.nio.AbstractNioSelector.openSelector(AbstractNioSelector.java:337)
	at org.jboss.netty.channel.socket.nio.AbstractNioSelector.<init>(AbstractNioSelector.java:95)
	at org.jboss.netty.channel.socket.nio.AbstractNioWorker.<init>(AbstractNioWorker.java:53)
	at org.jboss.netty.channel.socket.nio.NioWorker.<init>(NioWorker.java:45)
	at org.jboss.netty.channel.socket.nio.NioWorkerPool.createWorker(NioWorkerPool.java:45)
	at org.jboss.netty.channel.socket.nio.NioWorkerPool.createWorker(NioWorkerPool.java:28)
	at org.jboss.netty.channel.socket.nio.AbstractNioWorkerPool.newWorker(AbstractNioWorkerPool.java:99)
	at org.jboss.netty.channel.socket.nio.AbstractNioWorkerPool.init(AbstractNioWorkerPool.java:69)
	at org.jboss.netty.channel.socket.nio.NioWorkerPool.<init>(NioWorkerPool.java:39)
	at org.jboss.netty.channel.socket.nio.NioWorkerPool.<init>(NioWorkerPool.java:33)
	at org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory.<init>(NioServerSocketChannelFactory.java:149)
	at org.jboss.netty.channel.socket.nio.NioServerSocketChannelFactory.<init>(NioServerSocketChannelFactory.java:131)
	at com.alibaba.dubbo.remoting.transport.netty.NettyServer.doOpen(NettyServer.java:71)
	at com.alibaba.dubbo.remoting.transport.AbstractServer.<init>(AbstractServer.java:67)
	... 39 more
Caused by: java.io.IOException: Unable to establish loopback connection
	at sun.nio.ch.PipeImpl$Initializer.run(PipeImpl.java:125)
	at sun.nio.ch.PipeImpl$Initializer.run(PipeImpl.java:69)
	at java.security.AccessController.doPrivileged(Native Method)
	at sun.nio.ch.PipeImpl.<init>(PipeImpl.java:141)
	at sun.nio.ch.SelectorProviderImpl.openPipe(SelectorProviderImpl.java:50)
	at java.nio.channels.Pipe.open(Pipe.java:150)
	at sun.nio.ch.WindowsSelectorImpl.<init>(WindowsSelectorImpl.java:127)
	at sun.nio.ch.WindowsSelectorProvider.openSelector(WindowsSelectorProvider.java:44)
	at java.nio.channels.Selector.open(Selector.java:227)
	at org.jboss.netty.channel.socket.nio.SelectorUtil.open(SelectorUtil.java:63)
	at org.jboss.netty.channel.socket.nio.AbstractNioSelector.openSelector(AbstractNioSelector.java:335)
	... 52 more
Caused by: java.io.IOException: 你的主机中的软件中止了一个已建立的连接。
	at sun.nio.ch.SocketDispatcher.write0(Native Method)
	at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:51)
	at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93)
	at sun.nio.ch.IOUtil.write(IOUtil.java:65)
	at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:487)
	at sun.nio.ch.PipeImpl$Initializer.run(PipeImpl.java:102)
	... 62 more
```
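The deepest cause is the Windows-localized `java.io.IOException: 你的主机中的软件中止了一个已建立的连接。` ("An established connection was aborted by the software in your host machine"). On Windows, `Selector.open()` creates a loopback socket pair internally, and here something on the machine, typically firewall, antivirus, or proxy software, is killing that loopback connection, so Netty cannot create a selector and Dubbo never gets to bind 192.168.10.123:20880. Two things commonly tried (workarounds to confirm the culprit, not guaranteed fixes): temporarily disable the local security software, and force the IPv4 stack in the JVM arguments:

```
-Djava.net.preferIPv4Stack=true
```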

Why does Elasticsearch fail to get node info?

In a Spring Boot project I integrated Elasticsearch (the DAO layer talks to ES through spring-data-elasticsearch). I first installed the server-side ES service plus the head plugin; the ES service starts normally, with node-1 as the default master node and my-cluster as the cluster name, as shown:

![图片说明](https://img-ask.csdn.net/upload/201605/24/1464076273_991814.png)

In the application, starting an embedded node works fine:

```
Node node = NodeBuilder.nodeBuilder().node();
node.start();
```

But if I build the ES Client with a TransportClient, like this:

```
TransportClient client = new TransportClient(settings);
client.addTransportAddress(new InetSocketTransportAddress("127.0.0.1", 9300));
```

(this is the TransportClient usage from the ES 2.3 docs: https://www.elastic.co/guide/en/elasticsearch/client/java-api/current/transport-client.html)

then the application still starts, but the console keeps logging errors. It appears to keep probing for nodes and never discovers any. Application console output:

```
2016-05-23 19:40:15.823  INFO 27655 --- [           main] org.elasticsearch.client.transport      : [Aliyah Bishop] failed to get node info for [#transport#-1][XXX-MBP.lan][inet[/127.0.0.1:9300]], disconnecting...

org.elasticsearch.transport.RemoteTransportException: Failed to deserialize exception response from stream
Caused by: org.elasticsearch.transport.TransportSerializationException: Failed to deserialize exception response from stream
	at org.elasticsearch.transport.netty.MessageChannelHandler.handlerResponseError(MessageChannelHandler.java:173)
	at org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(MessageChannelHandler.java:125)
	at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
	at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
	at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:296)
	at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
	at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
	at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
	at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
	at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
	at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:268)
	at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:255)
	at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
	at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
	at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318)
	at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
	at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
	at org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
	at org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.StreamCorruptedException: Unsupported version: 1
	at org.elasticsearch.common.io.ThrowableObjectInputStream.readStreamHeader(ThrowableObjectInputStream.java:46)
	at java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
	at org.elasticsearch.common.io.ThrowableObjectInputStream.<init>(ThrowableObjectInputStream.java:38)
	at org.elasticsearch.transport.netty.MessageChannelHandler.handlerResponseError(MessageChannelHandler.java:170)
	... 23 common frames omitted
………….
failed to load elasticsearch nodes : org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: []
```

ES server console output:

```
[2016-05-23 21:45:56,807][WARN ][transport.netty          ] [node-1] exception caught on transport layer [[id: 0x8e4b89bc, /127.0.0.1:62566 => /127.0.0.1:9300]], closing connection
java.lang.IllegalStateException: Message not fully read (request) for requestId [233], action [cluster/nodes/info], readerIndex [39] vs expected [57]; resetting
	at org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(MessageChannelHandler.java:121)
	at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
	at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
	at org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
	at org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
	at org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
	at org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
	at org.elasticsearch.common.netty.OpenChannelsHandler.handleUpstream(OpenChannelsHandler.java:75)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
	at org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
	at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
	at org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
	at org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
	at org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
	at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337)
	at org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
	at org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
	at org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
	at org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
```

Also, when I start an embedded node, the system spins up a new node with a random node.name; it does not join my running ES cluster, and the indexes and data I add never show up in the my-cluster cluster.

So I have two questions:

1. Why doesn't the embedded node join the cluster when started this way?
2. With the TransportClient approach, why do both the application console and the ES server keep logging errors about failing to get node info (the errors above)?

Environment versions:

- server-side ES: 2.3.3
- JDK: 1.7.0_79
- spring-data-elasticsearch: 1.2.0.RELEASE (i.e. elasticsearch-1.4.4.jar)

Server-side ES config (elasticsearch.yml):

- cluster.name: my-cluster
- node.name: node-1
- HTTP port: 9200
- inter-node transport port: 9300

For the second question I have already tried several suggested fixes:

1. Mismatched JDK versions between the application and ES. Checked; they match: https://github.com/elastic/elasticsearch/issues/3835
2. Mismatched JDK versions across ES nodes. I run a single node, entirely on my local machine: http://jontai.me/blog/2013/06/elasticsearch-remotetransportexception-failed-to-deserialize-exception-response-from-stream/
3. JDK too old (1.7+ recommended). Already on 1.7+.

Finally, the complete code for both ways of starting the client.

1. Embedded node:

```
import org.elasticsearch.client.Client;
import org.elasticsearch.node.Node;
import org.elasticsearch.node.NodeBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.elasticsearch.repository.config.EnableElasticsearchRepositories;

@Configuration
@EnableElasticsearchRepositories(basePackages = "xx.xxx.domain.repository.elastic")
public class ElasticsearchConfiguration {

    @Bean
    public Client client() {
        Node node = NodeBuilder.nodeBuilder().node();
        node.start();
        return node.client();
    }
}
```

2. TransportClient:

```
import org.elasticsearch.client.Client;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.ImmutableSettings;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.elasticsearch.core.ElasticsearchOperations;
import org.springframework.data.elasticsearch.core.ElasticsearchTemplate;
import org.springframework.data.elasticsearch.repository.config.EnableElasticsearchRepositories;

@Configuration
@EnableElasticsearchRepositories(basePackages = "xx.xxx.domain.repository.elastic")
public class ElasticsearchConfiguration {

    @Bean
    public Client client() {
        TransportClient client = new TransportClient();
        client.addTransportAddress(new InetSocketTransportAddress("127.0.0.1", 9300));
        return client;
    }

    @Bean
    public ElasticsearchOperations elasticsearchTemplate() {
        return new ElasticsearchTemplate(client());
    }
}
```
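Both symptoms point at a client/server version mismatch rather than a network problem: spring-data-elasticsearch 1.2.0.RELEASE ships the elasticsearch-1.4.4 client, the server is 2.3.3, and the ES transport protocol is not compatible across major versions; `StreamCorruptedException: Unsupported version` on the client and `Message not fully read` on the server are the classic signatures of that mismatch. The embedded node has the same root issue, and in addition it defaults to `cluster.name` "elasticsearch", so it forms its own one-node cluster instead of joining my-cluster. A sketch of both fixes, assuming the client jar is first upgraded to a 2.3.x-compatible line (the class name Es2ClientFactory is illustrative, not from the original post):

```
import java.net.InetAddress;
import java.net.UnknownHostException;

import org.elasticsearch.client.Client;
import org.elasticsearch.client.transport.TransportClient;
import org.elasticsearch.common.settings.Settings;
import org.elasticsearch.common.transport.InetSocketTransportAddress;

public class Es2ClientFactory {

    // ES 2.x TransportClient: built via a builder, and it must be told the
    // cluster.name, otherwise it expects the default "elasticsearch".
    public static Client transportClient() throws UnknownHostException {
        Settings settings = Settings.settingsBuilder()
                .put("cluster.name", "my-cluster")
                .build();
        return TransportClient.builder().settings(settings).build()
                .addTransportAddress(new InetSocketTransportAddress(
                        InetAddress.getByName("127.0.0.1"), 9300));
    }
}
```

For the embedded-node variant, the NodeBuilder likewise needs the cluster name, and `client(true)` makes it join as a non-data client node instead of standing up its own cluster:

```
Node node = NodeBuilder.nodeBuilder()
        .clusterName("my-cluster") // must match elasticsearch.yml
        .client(true)              // join my-cluster as a client-only node
        .node();
```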
