How can my local machine call a service running in a Docker container on a remote server through Nacos (the service registry)?

Background:

Nacos (the service registry) hands callers an [IP + port] pair, and the service is then invoked directly at that address.

The Docker containers use virtual (bridge-network) IPs.

Calling directly from my local machine (the development PC) obviously fails, because the container's virtual IP cannot even be pinged from my machine.

Problem:

How can my local machine (development PC) call a service running inside a Docker container on the server through Nacos (the registry)?

What I have tried:

My machine can ping the server, and the server can ping the container IP. What I still need is for my machine to reach the container IP directly.

So I added a persistent route locally to map the container's virtual IP range through the server:

route -p add 171.205.0.0 mask 255.255.0.0 172.31.11.41

Result:

The route was added successfully, but the call still fails.
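A sketch of a commonly used alternative, not taken from this thread: instead of trying to route to the container's bridge IP, have the containerized service register an address the development machine can actually reach, namely the server's IP with the container port published on the host. The property names are real Spring Cloud Alibaba settings; the port, image name, and the assumption that 172.31.11.41 (from the route command above) is the server's address are placeholders.

```
# application.properties of the service running inside the container (sketch)
# Register the server's reachable IP instead of the container's bridge-network IP:
spring.cloud.nacos.discovery.ip=172.31.11.41
# Register the port as it is published on the host:
spring.cloud.nacos.discovery.port=8080
```

and publish the same port when starting the container, so the address registered in Nacos is actually reachable from the development machine:

```
docker run -d -p 8080:8080 --name demo-service demo-service:latest
```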

2 answers

qq_36357399
CsDn_小白: I'm on my local Windows machine; how do I make the call for development and debugging?
3 months ago

qq_36357399
CsDn_小白: Thanks, solved it with the same method.
about a month ago
Other related recommendations
With Nacos as service discovery, why can a service still be called after I take it offline?
I use Nacos as the service registry. After taking an instance offline on the console page, the service's interfaces can still be called. Is that normal? What does "offline" actually mean then? I'm new to this, any guidance is appreciated. ![图片说明](https://img-ask.csdn.net/upload/201910/18/1571390952_311104.png)
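A hedged way to tell a registry-side problem from consumer-side caching (for example a load-balancer server list that has not refreshed yet) is to ask Nacos directly which instances it still returns for the service, via its v1 open API; the service name below is a placeholder:

```
curl "http://127.0.0.1:8848/nacos/v1/ns/instance/list?serviceName=your-service-name"
```

If the instance no longer appears in this response (or is marked as not enabled) while calls still succeed, the stale entry is most likely being served from the consumer's local cache rather than from Nacos itself.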
With Nacos as the registry, after startup it keeps looking up a service I never registered (named after my database IP)
I use Nacos as the registry and register my services under the project name. Startup reports no errors, but once the application is accessed, errors appear: Nacos keeps looking up a service whose name is my database's IP address. What could cause this?

```
2019-07-02 14:48:17.850 ERROR [smallcircle-member,,,] 11616 --- [.naming.updater] com.alibaba.nacos.client.naming : [] [] [CALL-SERVER] failed to req API:http://127.0.0.1:8848/nacos/v1/ns/api/srvIPXT. code:404 msg: service not found: DEFAULT_GROUP@@47.98.52.252
2019-07-02 14:48:17.851 ERROR [smallcircle-member,,,] 11616 --- [.naming.updater] com.alibaba.nacos.client.naming : [] [] [NA] req api:/nacos/v1/ns/api/srvIPXT failed, server(127.0.0.1:8848
com.alibaba.nacos.api.exception.NacosException: null
	at com.alibaba.nacos.client.naming.net.NamingProxy.callServer(NamingProxy.java:304) [nacos-client-0.6.2.jar:na]
	at com.alibaba.nacos.client.naming.net.NamingProxy.reqAPI(NamingProxy.java:327) [nacos-client-0.6.2.jar:na]
	at com.alibaba.nacos.client.naming.net.NamingProxy.reqAPI(NamingProxy.java:310) [nacos-client-0.6.2.jar:na]
	at com.alibaba.nacos.client.naming.net.NamingProxy.reqAPI(NamingProxy.java:257) [nacos-client-0.6.2.jar:na]
	at com.alibaba.nacos.client.naming.core.HostReactor.updateServiceNow(HostReactor.java:340) [nacos-client-0.6.2.jar:na]
	at com.alibaba.nacos.client.naming.core.HostReactor$UpdateTask.run(HostReactor.java:429) [nacos-client-0.6.2.jar:na]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_171]
	at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266) [na:1.8.0_171]
	at java.util.concurrent.FutureTask.run(FutureTask.java) [na:1.8.0_171]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_171]
	at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [na:1.8.0_171]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [na:1.8.0_171]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [na:1.8.0_171]
	at java.lang.Thread.run(Thread.java:748) [na:1.8.0_171]
2019-07-02 14:48:17.852 ERROR [smallcircle-member,,,] 11616 --- [.naming.updater] com.alibaba.nacos.client.naming : [] [] [NA] failed to update serviceName: 47.98.52.252
```
Nacos as the registry plus Spring Cloud Config as the config center: DiscoveryClient conflict
1. The project is built on JDK 1.8. The Nacos registry is deployed in a Docker container and the service instances are published there as well. After enabling `spring.cloud.config.discovery.enabled=true` in a service instance, the instance fails at runtime, complaining that no DiscoveryClient can be found.
2. Application class WaiterServiceApplication:
```
@EnableDiscoveryClient
@SpringBootApplication
public class WaiterServiceApplication {
    public static void main(String[] args) {
        SpringApplication.run(WaiterServiceApplication.class, args);
    }
}
```
3. bootstrap.properties:
```
spring.application.name=waiter-service
spring.cloud.nacos.discovery.server-addr=192.168.40.129:8848
spring.cloud.config.discovery.enabled=true
spring.cloud.config.discovery.service-id=config-service
```
4. Main Maven dependencies in pom.xml:
```
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-alibaba-nacos-discovery</artifactId>
    <version>0.9.0.RELEASE</version>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-config</artifactId>
</dependency>
```
5. Startup error:
```
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'org.springframework.cloud.client.discovery.DiscoveryClient' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {}
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.raiseNoMatchingBeanFound(DefaultListableBeanFactory.java:1654) ~[spring-beans-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1213) ~[spring-beans-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1167) ~[spring-beans-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:857) ~[spring-beans-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:760) ~[spring-beans-5.1.6.RELEASE.jar:5.1.6.RELEASE]
	... 43 common frames omitte
```
6. What I tried:
Option 1: after removing spring.cloud.config.discovery.enabled=true, the application starts normally and registers with the registry, but cannot fetch the cloud config info.
Option 2: after moving spring.cloud.config.discovery.enabled=true and spring.cloud.config.discovery.service-id=config-service into application.yml, the application starts normally and registers, but still cannot fetch the cloud config info.
7. How should a situation like this be handled? Any advice is appreciated.
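Not an answer from this thread, just a hedged sketch: `spring.cloud.config.discovery.*` needs a DiscoveryClient available during the bootstrap phase, which appears to be exactly what fails here. Since Nacos is already in place, one common way around it is to drop the separate Config Server lookup and let Nacos serve the configuration through spring-cloud-starter-alibaba-nacos-config (swapped in for spring-cloud-starter-config). A minimal bootstrap.properties under that assumption, reusing the address from the question:

```
spring.application.name=waiter-service
spring.cloud.nacos.discovery.server-addr=192.168.40.129:8848
# Nacos doubles as the configuration source, so no DiscoveryClient is needed at bootstrap time
spring.cloud.nacos.config.server-addr=192.168.40.129:8848
spring.cloud.nacos.config.file-extension=yaml
# with these settings the Nacos data-id would default to waiter-service.yaml (naming assumption)
```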
Dubbo service deployed on a server cannot be reached from a local consumer
### /---------- Original question ----------/ (already solved; the answer is further down)

As the title says: the project uses Dubbo + Nacos. When the provider and consumer are both started locally, everything works (likewise when both are deployed on the server). But when the provider is deployed to the Linux server and the consumer is started locally, every call times out:

```
org.apache.dubbo.remoting.TimeoutException: Waiting server-side response timeout by scan timer. start time: 2019-10-21 11:32:02.027, end time: 2019-10-21 11:32:12.047, client elapsed: 1 ms, server elapsed: 10018 ms, timeout: 10000 ms, request: Request [id=5, version=2.0.2, twoway=true, event=false, broken=false, data=RpcInvocation [methodName=sayHello, parameterTypes=[], arguments=[], attachments={path=com.braisedpanda.suanfa.service.TestService, activelimit_filter_start_time=1571628722027, interface=com.braisedpanda.suanfa.service.TestService, version=1.0.0, timeout=10000}]], channel: /172.215.1.27:50876 -> /172.17.0.2:20880
	at org.apache.dubbo.remoting.exchange.support.DefaultFuture.doReceived(DefaultFuture.java:189) ~[dubbo-2.7.3.jar:2.7.3]
	at org.apache.dubbo.remoting.exchange.support.DefaultFuture.received(DefaultFuture.java:153) ~[dubbo-2.7.3.jar:2.7.3]
	at org.apache.dubbo.remoting.exchange.support.DefaultFuture$TimeoutCheckTask.run(DefaultFuture.java:252) ~[dubbo-2.7.3.jar:2.7.3]
	at org.apache.dubbo.common.timer.HashedWheelTimer$HashedWheelTimeout.expire(HashedWheelTimer.java:648) ~[dubbo-2.7.3.jar:2.7.3]
	at org.apache.dubbo.common.timer.HashedWheelTimer$HashedWheelBucket.expireTimeouts(HashedWheelTimer.java:727) ~[dubbo-2.7.3.jar:2.7.3]
	at org.apache.dubbo.common.timer.HashedWheelTimer$Worker.run(HashedWheelTimer.java:449) ~[dubbo-2.7.3.jar:2.7.3]
	at java.lang.Thread.run(Thread.java:748) [na:1.8.0_211]
```

Browser error: ![图片说明](https://img-ask.csdn.net/upload/201910/21/1571629077_503413.png)
Configuration: ![图片说明](https://img-ask.csdn.net/upload/201910/21/1571629427_901327.png) ![图片说明](https://img-ask.csdn.net/upload/201910/21/1571629436_736567.png)
Linux deployment: ![图片说明](https://img-ask.csdn.net/upload/201910/21/1571629510_364799.png)

I have tried everything I could find online for a week: increasing the timeout, disabling the Linux firewall, removing the database access code; at one point I even suspected the server itself and rented another one to test, still with no luck. I'm close to giving up and hope someone can point out the problem.

/---------- Solution ----------/

#### The cause first:
When the Dubbo service is deployed on the Linux server, it defaults to the machine's internal network and exposes the internal address, so my local project cannot reach the service and times out. That also explains why provider and consumer work fine when they are both local or both on the server.

![图片说明](https://img-ask.csdn.net/upload/201910/23/1571810544_540718.png)

As the screenshot shows, the addresses originally registered were all internal 172.x addresses, unreachable from my machine. After the change below, Dubbo exposes the public address and everything works. As for how to make Dubbo expose the public address on Linux, there are many tutorials online: editing the Linux hosts file with the hostname and public IP, changing the Linux DNS settings, even disabling a network interface. I tried all of them and none worked for me; every setup is different, so they may still help someone else.

#### My fix:
1. Add to the project configuration:
   spring.cloud.nacos.discovery.ip=39.98.131.xxx
   (this IP is the public address of the server you deploy to; for example, if you deploy to a server whose public address is 39.98.131.588, write 39.98.131.588 here)
2. When running the Docker image, pass DUBBO_IP_TO_REGISTRY:
```
docker run -d -e DUBBO_IP_TO_REGISTRY=39.98.131.xxx -e DUBBO_PORT_TO_REGISTRY=20880 -p 20880:20880 --name xxx xxxx(image name)
```
With that, the problem was finally solved. What a pit; it cost me a long time. My configuration is below for reference, in case it helps anyone stuck on the same problem.

bootstrap.yml
```
## application name
spring:
  main:
    allow-bean-definition-overriding: true
#  profiles:
#    active: public
  cloud:
    nacos:
      discovery:
        enabled: true
        register-enabled: true
        server-addr: ${spring.cloud.nacos.config.server-addr}
        namespace: ${spring.cloud.nacos.config.namespace}
      config:
        server-addr: 47.98.135.xxx:8848
        group: provider
        namespace: c229ab10-2e39-4444-be97-048b3a5ef49d
        file-extension: yaml
```

application.properties
```
spring.application.name=balance-suanfa-provider
dubbo.application.name=balance-suanfa-provider
dubbo.scan.base-packages=com.braisedpanda.suanfa.service
dubbo.protocol.name=dubbo
dubbo.protocol.port=20880
# comment this out for local testing; when deploying, fill in the server's public IP
#spring.cloud.nacos.discovery.ip=39.98.131.xxx
dubbo.registry.address=nacos://47.98.135.xxx:8848
dubbo.provider.loadbalance=myRoundRobin
#dubbo.provider.actives=8
#dubbo.provider.executes=8
dubbo.provider.dispatcher=message
dubbo.provider.threadpool=cached
dubbo.provider.timeout=50000
dubbo.provider.delay=-1
dubbo.application.dump-directory=/tmp
dubbo.provider.cluster=failfast
dubbo.consumer.check=false
dubbo.registry.check=false
```

Two useful references:
https://juejin.im/post/5b2072016fb9a01e2d704431
https://www.jianshu.com/p/7c29a24a917d
Nacos as the registry: startup reports no errors, but after accessing the application it cannot find a service
![图片说明](https://img-ask.csdn.net/upload/201907/02/1562053574_327512.png) That IP address is the database IP; there is no service registered under it, so why is Nacos looking for such a service?
A few problems with docker-maven-plugin build; any help would be appreciated
1. com.spotify:docker-maven-plugin:0.4.13:build (default-cli) on project springboot-nacos-register: Exception caught: Request error: POST http://192.168.120.134:2375/build?t=nacos: 500: HTTP 500 Internal Server Error -> [Help 1]
   Everything I found online says the repository name does not meet the requirements, but changing it to nacos and building again still fails.
2. When using the IntelliJ IDEA Docker plugin, the Dockerfile line `ADD * app.jar` is flagged with an error saying there is nothing after ADD.
Spring Cloud project fails to start after integrating Alibaba Cloud Nacos configuration management
A Spring Boot project was converted to Spring Cloud. It ran without problems before integrating Alibaba Nacos; after integrating it, startup reports that a bean cannot be found, although the functionality itself still works. The cloud configuration file is definitely being loaded.
![报错示意](https://img-ask.csdn.net/upload/201911/14/1573717658_234284.png) ![配置](https://img-ask.csdn.net/upload/201911/14/1573717700_562227.png) ![配置](https://img-ask.csdn.net/upload/201911/14/1573717710_263011.png)
Error output:
```
2019-11-14 15:31:28 [ com.alibaba.nacos.client.Worker.longPullingacm.aliyun.com-32f1587d-47d1-4060-bf86-870b0d47c500:17212 ] - [ WARN ] org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:558) Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'invoiceCallbackApiController': Unsatisfied dependency expressed through field 'invoiceCallbackProducer'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'yongYouInvoiceCallbackProducer': Unsatisfied dependency expressed through field 'amqpTemplate'; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'org.springframework.amqp.core.AmqpTemplate' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}
2019-11-14 15:31:28 [ com.alibaba.nacos.client.Worker.longPullingacm.aliyun.com-32f1587d-47d1-4060-bf86-870b0d47c500:17220 ] - [ ERROR ] org.springframework.boot.diagnostics.LoggingFailureAnalysisReporter.report(LoggingFailureAnalysisReporter.java:42)

***************************
APPLICATION FAILED TO START
***************************

Description:

Field amqpTemplate in cn.xxx.invoice.service.rabbitmq.producer.YongYouInvoiceCallbackProducer required a bean of type 'org.springframework.amqp.core.AmqpTemplate' that could not be found.

Action:

Consider defining a bean of type 'org.springframework.amqp.core.AmqpTemplate' in your configuration.
```
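A hedged note, not from the thread: the reported failure is about a missing `org.springframework.amqp.core.AmqpTemplate` bean, which Spring Boot only auto-configures when spring-boot-starter-amqp is on the classpath. If the move to Nacos-managed configuration coincided with dropping that dependency or the RabbitMQ settings it used, a sketch of the settings that would need to reach the application again, whether from a local file or from the Nacos data-id (all values are placeholders):

```
# RabbitMQ connection settings consumed by Spring Boot's AMQP auto-configuration
spring.rabbitmq.host=127.0.0.1
spring.rabbitmq.port=5672
spring.rabbitmq.username=guest
spring.rabbitmq.password=guest
```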
@RefreshScope has no effect (Spring Cloud + Nacos)
Adding @RefreshScope to a service implementation class has no effect:
```
@Service
@Transactional(rollbackFor = {Exception.class, RuntimeException.class})
@RefreshScope // annotation added
public class MessageServiceImpl implements MessageService {
    private Logger logger = LoggerFactory.getLogger(MessageServiceImpl.class);

    @Autowired
    private MessageMapper messageMapper;

    @Value("${batchInsertCount}")
    private int maxInsertSize;

    @Value("${deleteMsg.days}")
    private int msgDays;

    @Scheduled(cron = "${scheduStr}")
    public void timerDeleteMsg(){
        logger.info("message retention days: " + msgDays); // after the config changes, msgDays still holds the old value
        // code omitted
    }
}
```
bootstrap.properties contains:
```
management.endpoints.web.exposure.include=*
```
After the configuration is changed, msgDays still has the old value.
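A hedged workaround, not from this thread: one frequent gotcha is that a @Scheduled task keeps running against the originally created instance, so a @Value field cached in that instance never sees the refreshed value even when the Environment has been updated. Assuming the refreshed property does reach the Environment, reading it at execution time sidesteps the stale field. The class name and the default value below are illustrative assumptions; the property and cron keys mirror the question.

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.env.Environment;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Service;

@Service
public class MessageCleanupTask {

    private static final Logger logger = LoggerFactory.getLogger(MessageCleanupTask.class);

    @Autowired
    private Environment environment;

    // cron expression still comes from configuration, as in the original question
    @Scheduled(cron = "${scheduStr}")
    public void timerDeleteMsg() {
        // Look the value up on every run instead of caching it in a @Value field,
        // so a refreshed value in the Environment is picked up immediately.
        // The fallback of 7 days is an assumption for illustration only.
        int msgDays = environment.getProperty("deleteMsg.days", Integer.class, 7);
        logger.info("message retention days: {}", msgDays);
    }
}
```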
In Spring Cloud, does the Gateway count as a service consumer?
I built microservices with Spring Cloud, Nacos as the registry, and Gateway as the API gateway (routing and security checks); everything else is business microservices. Requests enter through the gateway and are routed straight to the services. Are the business microservices the producers, and does that make the gateway a service consumer? I don't quite understand the relationship between service producers and consumers when a gateway sits in front.
Spring Boot project starts fine locally, but the Maven-packaged jar fails with the errors below; removing the @KafkaListener annotation makes the packaged jar run, so I suspect a jar dependency conflict. Any ideas?
[2019-11-25 14:53:48.299][main][KEY:] INFO- Stopping service [Tomcat] -[org.apache.juli.logging.DirectJDKLog:log] [DirectJDKLog.java:180] [2019-11-25 14:53:48.499][localhost-startStop-2][KEY:] WARN- The web application [ROOT] appears to have started a thread named [logback-1] but ha s failed to stop it. This is very likely to create a memory leak. Stack trace of thread: java.base@11.0.3/jdk.internal.misc.Unsafe.park(Native Method) java.base@11.0.3/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234) java.base@11.0.3/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2123) java.base@11.0.3/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1182) java.base@11.0.3/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:899) java.base@11.0.3/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1054) java.base@11.0.3/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1114) java.base@11.0.3/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) java.base@11.0.3/java.lang.Thread.run(Thread.java:834) -[org.apache.juli.logging.DirectJDKLog:log] [DirectJDKLog.java:180] [2019-11-25 14:53:48.503][localhost-startStop-2][KEY:] WARN- The web application [ROOT] appears to have started a thread named [Timer-0] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread: java.base@11.0.3/java.lang.Object.wait(Native Method) java.base@11.0.3/java.util.TimerThread.mainLoop(Timer.java:553) java.base@11.0.3/java.util.TimerThread.run(Timer.java:506) -[org.apache.juli.logging.DirectJDKLog:log] [DirectJDKLog.java:180] [2019-11-25 14:53:48.504][localhost-startStop-2][KEY:] WARN- The web application [ROOT] appears to have started a thread named [com.alibaba.nacos .naming.client.listener] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread: java.base@11.0.3/jdk.internal.misc.Unsafe.park(Native Method) java.base@11.0.3/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234) java.base@11.0.3/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2123) java.base@11.0.3/java.util.concurrent.LinkedBlockingQueue.poll(LinkedBlockingQueue.java:458) com.alibaba.nacos.client.naming.core.EventDispatcher$Notifier.run(EventDispatcher.java:114) java.base@11.0.3/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) java.base@11.0.3/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) java.base@11.0.3/java.lang.Thread.run(Thread.java:834) -[org.apache.juli.logging.DirectJDKLog:log] [DirectJDKLog.java:180] [2019-11-25 14:53:48.508][localhost-startStop-2][KEY:] WARN- The web application [ROOT] appears to have started a thread named [com.alibaba.nacos .naming.beat.sender] but has failed to stop it. This is very likely to create a memory leak. 
Stack trace of thread: java.base@11.0.3/jdk.internal.misc.Unsafe.park(Native Method) java.base@11.0.3/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234) java.base@11.0.3/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2123) java.base@11.0.3/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1182) java.base@11.0.3/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:899) java.base@11.0.3/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1054) java.base@11.0.3/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1114) java.base@11.0.3/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) java.base@11.0.3/java.lang.Thread.run(Thread.java:834) -[org.apache.juli.logging.DirectJDKLog:log] [DirectJDKLog.java:180] [2019-11-25 14:53:48.514][localhost-startStop-2][KEY:] WARN- The web application [ROOT] appears to have started a thread named [com.alibaba.nacos .naming.beat.sender] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread: java.base@11.0.3/jdk.internal.misc.Unsafe.park(Native Method) java.base@11.0.3/java.util.concurrent.locks.LockSupport.park(LockSupport.java:194) java.base@11.0.3/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2081) java.base@11.0.3/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1177) java.base@11.0.3/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:899) java.base@11.0.3/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1054) java.base@11.0.3/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1114) java.base@11.0.3/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) java.base@11.0.3/java.lang.Thread.run(Thread.java:834) -[org.apache.juli.logging.DirectJDKLog:log] [DirectJDKLog.java:180] [2019-11-25 14:53:48.525][localhost-startStop-2][KEY:] WARN- The web application [ROOT] appears to have started a thread named [com.alibaba.nacos .naming.failover] but has failed to stop it. This is very likely to create a memory leak. 
Stack trace of thread: java.base@11.0.3/jdk.internal.misc.Unsafe.park(Native Method) java.base@11.0.3/java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:234) java.base@11.0.3/java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2123) java.base@11.0.3/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:1182) java.base@11.0.3/java.util.concurrent.ScheduledThreadPoolExecutor$DelayedWorkQueue.take(ScheduledThreadPoolExecutor.java:899) java.base@11.0.3/java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1054) java.base@11.0.3/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1114) java.base@11.0.3/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) java.base@11.0.3/java.lang.Thread.run(Thread.java:834) -[org.apache.juli.logging.DirectJDKLog:log] [DirectJDKLog.java:180] [2019-11-25 14:53:48.531][localhost-startStop-2][KEY:] WARN- The web application [ROOT] appears to have started a thread named [com.alibaba.nacos .naming.push.receiver] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread: java.base@11.0.3/java.net.DualStackPlainDatagramSocketImpl.socketReceiveOrPeekData(Native Method) java.base@11.0.3/java.net.DualStackPlainDatagramSocketImpl.receive0(DualStackPlainDatagramSocketImpl.java:124) java.base@11.0.3/java.net.AbstractPlainDatagramSocketImpl.receive(AbstractPlainDatagramSocketImpl.java:181) java.base@11.0.3/java.net.DatagramSocket.receive(DatagramSocket.java:814) java.base@11.0.3/java.net.AbstractPlainDatagramSocketImpl.receive(AbstractPlainDatagramSocketImpl.j java.base@11.0.3/java.net.DatagramSocket.receive(DatagramSocket.java:814) com.alibaba.nacos.client.naming.core.PushReceiver.run(PushReceiver.java:73) java.base@11.0.3/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) java.base@11.0.3/java.util.concurrent.FutureTask.run(FutureTask.java:264) java.base@11.0.3/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Scheduled java.base@11.0.3/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) java.base@11.0.3/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) java.base@11.0.3/java.lang.Thread.run(Thread.java:834) -[org.apache.juli.logging.DirectJDKLog:log]
Please help analyze this Tomcat crash log
``` # # There is insufficient memory for the Java Runtime Environment to continue. # Native memory allocation (malloc) failed to allocate 572522496 bytes for committing reserved memory. # Possible reasons: # The system is out of physical RAM or swap space # In 32 bit mode, the process size limit was hit # Possible solutions: # Reduce memory load on the system # Increase physical memory or swap space # Check if swap backing store is full # Use 64 bit Java on a 64 bit OS # Decrease Java heap size (-Xmx/-Xms) # Decrease number of Java threads # Decrease Java thread stack sizes (-Xss) # Set larger code cache with -XX:ReservedCodeCacheSize= # This output file may be truncated or incomplete. # # Out of Memory Error (os_linux.cpp:2756), pid=25208, tid=140304851973888 # # JRE version: Java(TM) SE Runtime Environment (7.0_80-b15) (build 1.7.0_80-b15) # Java VM: Java HotSpot(TM) 64-Bit Server VM (24.80-b11 mixed mode linux-amd64 compressed oops) # Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again # --------------- T H R E A D --------------- Current thread (0x00007f9b4c06a000): VMThread [stack: 0x00007f9b44ccb000,0x00007f9b44dcc000] [id=25214] Stack: [0x00007f9b44ccb000,0x00007f9b44dcc000], sp=0x00007f9b44dca1a0, free space=1020k Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code) V [libjvm.so+0x9a320a] VMError::report_and_die()+0x2ea V [libjvm.so+0x498d3b] report_vm_out_of_memory(char const*, int, unsigned long, char const*)+0x9b V [libjvm.so+0x82191e] os::Linux::commit_memory_impl(char*, unsigned long, bool)+0xfe V [libjvm.so+0x821e69] os::pd_commit_memory(char*, unsigned long, unsigned long, bool)+0x29 V [libjvm.so+0x81bb6a] os::commit_memory(char*, unsigned long, unsigned long, bool)+0x2a V [libjvm.so+0x88d623] PSVirtualSpace::expand_by(unsigned long)+0x53 V [libjvm.so+0x87ca80] PSOldGen::expand(unsigned long)+0x170 V [libjvm.so+0x87cc8b] PSOldGen::resize(unsigned long)+0x1cb V [libjvm.so+0x885951] PSParallelCompact::invoke_no_policy(bool)+0x9b1 V [libjvm.so+0x88b8cd] PSScavenge::invoke()+0x1ad V [libjvm.so+0x843f40] ParallelScavengeHeap::failed_mem_allocate(unsigned long)+0x70 V [libjvm.so+0x9a4a97] VM_ParallelGCFailedAllocation::doit()+0x97 V [libjvm.so+0x9abf35] VM_Operation::evaluate()+0x55 V [libjvm.so+0x9aa2fa] VMThread::evaluate_operation(VM_Operation*)+0xba V [libjvm.so+0x9aa67e] VMThread::loop()+0x1ce V [libjvm.so+0x9aaaf0] VMThread::run()+0x70 V [libjvm.so+0x8238c8] java_start(Thread*)+0x108 VM_Operation (0x00007f9b360f5340): ParallelGCFailedAllocation, mode: safepoint, requested by thread 0x00007f9af2e2b800 --------------- P R O C E S S --------------- Java Threads: ( => current thread ) 0x00007f9ab8053800 JavaThread "pool-2-thread-10937" [_thread_blocked, id=20834, stack(0x00007f9aa5b9c000,0x00007f9aa5c9d000)] 0x00007f9ab809d000 JavaThread "pool-2-thread-10936" [_thread_blocked, id=20833, stack(0x00007f9aa6bac000,0x00007f9aa6cad000)] 0x00007f9b4c7c4800 JavaThread "pool-8-thread-5804" [_thread_blocked, id=19335, stack(0x00007f9aa81ee000,0x00007f9aa82ef000)] 0x00007f9b4c3e0000 JavaThread "pool-8-thread-5803" [_thread_blocked, id=18507, stack(0x00007f9b342a0000,0x00007f9b343a1000)] 0x00007f9b4c2c4000 JavaThread "MQTT Ping: GID_EDO_NOTICE@@@admin" [_thread_blocked, id=28435, stack(0x00007f9aa9a06000,0x00007f9aa9b07000)] 0x00007f9b4c485800 JavaThread "MQTT Call: GID_EDO_NOTICE@@@admin" [_thread_blocked, id=28434, stack(0x00007f9b081c0000,0x00007f9b082c1000)] 0x00007f9b4c428800 
JavaThread "MQTT Snd: GID_EDO_NOTICE@@@admin" [_thread_blocked, id=28433, stack(0x00007f9aaa915000,0x00007f9aaaa16000)] 0x00007f9b4c81f800 JavaThread "MQTT Rec: GID_EDO_NOTICE@@@admin" [_thread_in_native, id=28432, stack(0x00007f9aa5798000,0x00007f9aa5899000)] 0x00007f9ad0101000 JavaThread "http-bio-8080-exec-34" daemon [_thread_blocked, id=20845, stack(0x00007f9aa72b3000,0x00007f9aa73b4000)] 0x00007f9ad00ff800 JavaThread "http-bio-8080-exec-33" daemon [_thread_blocked, id=20844, stack(0x00007f9aab01c000,0x00007f9aab11d000)] 0x00007f9ad00fc000 JavaThread "http-bio-8080-exec-30" daemon [_thread_blocked, id=20838, stack(0x00007f9aa9703000,0x00007f9aa9804000)] 0x00007f9ad00fb000 JavaThread "http-bio-8080-exec-29" daemon [_thread_blocked, id=20837, stack(0x00007f9aa92ff000,0x00007f9aa9400000)] 0x00007f9ad0110800 JavaThread "http-bio-8080-exec-28" daemon [_thread_blocked, id=20836, stack(0x00007f9aa8cf9000,0x00007f9aa8dfa000)] 0x00007f9ad010f800 JavaThread "http-bio-8080-exec-27" daemon [_thread_blocked, id=20832, stack(0x00007f9aaa814000,0x00007f9aaa915000)] 0x00007f9ad010e000 JavaThread "http-bio-8080-exec-26" daemon [_thread_blocked, id=20826, stack(0x00007f9aa87f4000,0x00007f9aa88f5000)] 0x00007f9ad010d800 JavaThread "http-bio-8080-exec-25" daemon [_thread_blocked, id=20825, stack(0x00007f9aa85f2000,0x00007f9aa86f3000)] 0x00007f9ad0077800 JavaThread "http-bio-8080-exec-24" daemon [_thread_blocked, id=20822, stack(0x00007f9aaa00c000,0x00007f9aaa10d000)] 0x00007f9ad00f0000 JavaThread "http-bio-8080-exec-23" daemon [_thread_blocked, id=20821, stack(0x00007f9aa8ffc000,0x00007f9aa90fd000)] 0x00007f9ad0075800 JavaThread "http-bio-8080-exec-22" daemon [_thread_blocked, id=20820, stack(0x00007f9aaa10d000,0x00007f9aaa20e000)] 0x00007f9af423f800 JavaThread "Java2D Disposer" daemon [_thread_blocked, id=28691, stack(0x00007f9aab7f8000,0x00007f9aab8f9000)] 0x00007f9af1c93800 JavaThread "taskExecutor-5" [_thread_blocked, id=28196, stack(0x00007f9aa9d09000,0x00007f9aa9e0a000)] 0x00007f9af2cef000 JavaThread "taskExecutor-4" [_thread_blocked, id=28195, stack(0x00007f9aa89f6000,0x00007f9aa8af7000)] 0x00007f9af2cee800 JavaThread "taskExecutor-3" [_thread_blocked, id=28194, stack(0x00007f9aaac18000,0x00007f9aaad19000)] 0x00007f9af3870000 JavaThread "taskExecutor-2" [_thread_blocked, id=26218, stack(0x00007f9aa9501000,0x00007f9aa9602000)] 0x00007f9af10db000 JavaThread "taskExecutor-1" [_thread_blocked, id=26216, stack(0x00007f9aaaf1b000,0x00007f9aab01c000)] 0x00007f9b4c4b1000 JavaThread "idle_connection_reaper" daemon [_thread_blocked, id=25553, stack(0x00007f9b3759a000,0x00007f9b3769b000)] 0x00007f9af40df800 JavaThread "HSF-AddressProfiler" daemon [_thread_blocked, id=25354, stack(0x00007f9b37499000,0x00007f9b3759a000)] 0x00007f9ae82ff800 JavaThread "HSF-Worker-2-thread-8" daemon [_thread_blocked, id=25334, stack(0x00007f9aab8f9000,0x00007f9aab9fa000)] 0x00007f9ad80ab800 JavaThread "HSF-Worker-2-thread-7" daemon [_thread_blocked, id=25333, stack(0x00007f9aab9fa000,0x00007f9aabafb000)] 0x00007f9ac003b800 JavaThread "HSF-Worker-2-thread-6" daemon [_thread_blocked, id=25332, stack(0x00007f9aabafb000,0x00007f9aabbfc000)] 0x00007f9aec03c000 JavaThread "HSF-Worker-2-thread-1" daemon [_thread_blocked, id=25331, stack(0x00007f9aabbfc000,0x00007f9aabcfd000)] 0x00007f9ab4054800 JavaThread "HSF-Worker-2-thread-3" daemon [_thread_blocked, id=25330, stack(0x00007f9aabcfd000,0x00007f9aabdfe000)] 0x00007f9af8098800 JavaThread "HSF-Worker-2-thread-4" daemon [_thread_blocked, id=25329, 
stack(0x00007f9aabdfe000,0x00007f9aabeff000)] 0x00007f9ac00d9000 JavaThread "HSF-Worker-2-thread-5" daemon [_thread_blocked, id=25328, stack(0x00007f9aabeff000,0x00007f9aac000000)] 0x00007f9abc136800 JavaThread "HSF-Worker-2-thread-2" daemon [_thread_blocked, id=25327, stack(0x00007f9b08061000,0x00007f9b08162000)] 0x00007f9ad0073800 JavaThread "http-bio-8080-exec-21" daemon [_thread_blocked, id=25324, stack(0x00007f9b084c3000,0x00007f9b085c4000)] 0x00007f9ad0071000 JavaThread "http-bio-8080-exec-20" daemon [_thread_blocked, id=25323, stack(0x00007f9b085c4000,0x00007f9b086c5000)] 0x00007f9ad006f000 JavaThread "http-bio-8080-exec-19" daemon [_thread_blocked, id=25322, stack(0x00007f9b086c5000,0x00007f9b087c6000)] 0x00007f9ad006a800 JavaThread "http-bio-8080-exec-17" daemon [_thread_blocked, id=25320, stack(0x00007f9b088c7000,0x00007f9b089c8000)] 0x00007f9ad0068800 JavaThread "http-bio-8080-exec-16" daemon [_thread_blocked, id=25319, stack(0x00007f9b089c8000,0x00007f9b08ac9000)] 0x00007f9ad0066800 JavaThread "http-bio-8080-exec-15" daemon [_thread_blocked, id=25317, stack(0x00007f9b08ac9000,0x00007f9b08bca000)] 0x00007f9ad0064800 JavaThread "http-bio-8080-exec-14" daemon [_thread_blocked, id=25316, stack(0x00007f9b08bca000,0x00007f9b08ccb000)] 0x00007f9ad0062800 JavaThread "http-bio-8080-exec-13" daemon [_thread_blocked, id=25315, stack(0x00007f9b08ccb000,0x00007f9b08dcc000)] 0x00007f9ad000a800 JavaThread "http-bio-8080-exec-12" daemon [_thread_blocked, id=25314, stack(0x00007f9b08dcc000,0x00007f9b08ecd000)] 0x00007f9ad0008800 JavaThread "http-bio-8080-exec-11" daemon [_thread_blocked, id=25313, stack(0x00007f9b08ecd000,0x00007f9b08fce000)] 0x00007f9b4c0d9000 JavaThread "http-bio-8080-AsyncTimeout" daemon [_thread_blocked, id=25312, stack(0x00007f9b08fce000,0x00007f9b090cf000)] 0x00007f9b4c0d7800 JavaThread "http-bio-8080-Acceptor-0" daemon [_thread_blocked, id=25311, stack(0x00007f9b090cf000,0x00007f9b091d0000)] 0x00007f9b4c0d4800 JavaThread "http-bio-8080-exec-10" daemon [_thread_blocked, id=25310, stack(0x00007f9b091d0000,0x00007f9b092d1000)] 0x00007f9b4c43e800 JavaThread "http-bio-8080-exec-7" daemon [_thread_blocked, id=25307, stack(0x00007f9b094d3000,0x00007f9b095d4000)] 0x00007f9b4c43c800 JavaThread "http-bio-8080-exec-6" daemon [_thread_blocked, id=25306, stack(0x00007f9b095d4000,0x00007f9b096d5000)] 0x00007f9b4c43a800 JavaThread "http-bio-8080-exec-5" daemon [_thread_blocked, id=25305, stack(0x00007f9b096d5000,0x00007f9b097d6000)] 0x00007f9b4c168800 JavaThread "http-bio-8080-exec-4" daemon [_thread_blocked, id=25304, stack(0x00007f9b37095000,0x00007f9b37196000)] 0x00007f9b4c166800 JavaThread "http-bio-8080-exec-3" daemon [_thread_blocked, id=25303, stack(0x00007f9b37196000,0x00007f9b37297000)] 0x00007f9b4c165000 JavaThread "http-bio-8080-exec-2" daemon [_thread_blocked, id=25302, stack(0x00007f9b37297000,0x00007f9b37398000)] 0x00007f9b4c163000 JavaThread "http-bio-8080-exec-1" daemon [_thread_blocked, id=25301, stack(0x00007f9b37398000,0x00007f9b37499000)] 0x00007f9b4c162800 JavaThread "ContainerBackgroundProcessor[StandardEngine[Catalina]]" daemon [_thread_blocked, id=25300, stack(0x00007f9b097d6000,0x00007f9b098d7000)] 0x00007f9af1ae3000 JavaThread "pool-7-thread-1" [_thread_blocked, id=25298, stack(0x00007f9b098d7000,0x00007f9b099d8000)] 0x00007f9af1ae0000 JavaThread "redisMessageListenerContainer-1" [_thread_blocked, id=25297, stack(0x00007f9b354eb000,0x00007f9b355ec000)] 0x00007f9af1589800 JavaThread "commons-pool-EvictionTimer" daemon [_thread_blocked, id=25296, 
stack(0x00007f9b099d8000,0x00007f9b09ad9000)] 0x00007f9b006a9000 JavaThread "ConfigClientNotifier" daemon [_thread_blocked, id=25291, stack(0x00007f9b3409e000,0x00007f9b3419f000)] 0x00007f9b1c023800 JavaThread "SocketConnectorIoProcessor-0.0" daemon [_thread_blocked, id=25289, stack(0x00007f9b3419f000,0x00007f9b342a0000)] 0x00007f9b00221000 JavaThread "PooledByteBufferExpirer-0" daemon [_thread_blocked, id=25287, stack(0x00007f9b343a1000,0x00007f9b344a2000)] 0x00007f9b00180000 JavaThread "com.taobao.remoting.TimerThread" daemon [_thread_blocked, id=25286, stack(0x00007f9b344a2000,0x00007f9b345a3000)] 0x00007f9ae804c000 JavaThread "HSF-RuntimeInfo-Publisher-7-thread-1" daemon [_thread_blocked, id=25285, stack(0x00007f9b345a3000,0x00007f9b346a4000)] 0x00007f9af0676800 JavaThread "com.taobao.config.client.timer" daemon [_thread_blocked, id=25284, stack(0x00007f9b346a4000,0x00007f9b347a5000)] 0x00007f9af3291000 JavaThread "ConfigClientWorker-Thread" daemon [_thread_blocked, id=25283, stack(0x00007f9b347a5000,0x00007f9b348a6000)] 0x00007f9af3985800 JavaThread "pool-4-thread-1" [_thread_blocked, id=25282, stack(0x00007f9b348a6000,0x00007f9b349a7000)] 0x00007f9af3982800 JavaThread "com.taobao.config.client.timer" daemon [_thread_blocked, id=25281, stack(0x00007f9b349a7000,0x00007f9b34aa8000)] 0x00007f9ae81e1800 JavaThread "HSF-Remoting-Timer-3-thread-1" daemon [_thread_blocked, id=25280, stack(0x00007f9b34aa8000,0x00007f9b34ba9000)] 0x00007f9af0f67800 JavaThread "HSF-AddressPool-Default-1-thread-1" daemon [_thread_blocked, id=25277, stack(0x00007f9b34be2000,0x00007f9b34ce3000)] 0x00007f9ad0006000 JavaThread "NacosConfigListener-ThreadPool-4" daemon [_thread_blocked, id=25276, stack(0x00007f9b34ce3000,0x00007f9b34de4000)] 0x00007f9ad4001000 JavaThread "NacosConfigListener-ThreadPool-3" daemon [_thread_blocked, id=25275, stack(0x00007f9b34de4000,0x00007f9b34ee5000)] 0x00007f9ad0004800 JavaThread "com.alibaba.nacos.client.Worker.longPullingfixed-10.201.199.104:8848-product" daemon [_thread_in_native, id=25274, stack(0x00007f9b34ee5000,0x00007f9b34fe6000)] 0x00007f9ad0002800 JavaThread "NacosConfigListener-ThreadPool-2" daemon [_thread_blocked, id=25273, stack(0x00007f9b34fe6000,0x00007f9b350e7000)] 0x00007f9ad0001800 JavaThread "NacosConfigListener-ThreadPool-1" daemon [_thread_blocked, id=25272, stack(0x00007f9b350e7000,0x00007f9b351e8000)] 0x00007f9ac4001800 JavaThread "com.alibaba.nacos.client.Worker.longPullingfixed-10.201.199.104:8848-product" daemon [_thread_blocked, id=25271, stack(0x00007f9b351e8000,0x00007f9b352e9000)] 0x00007f9af38c4000 JavaThread "com.alibaba.nacos.client.Worker.fixed-10.201.199.104:8848-product" daemon [_thread_blocked, id=25270, stack(0x00007f9b352e9000,0x00007f9b353ea000)] 0x00007f9af2a5a800 JavaThread "Timer-1" daemon [_thread_blocked, id=25269, stack(0x00007f9b353ea000,0x00007f9b354eb000)] 0x00007f9acc00c800 JavaThread "EagleEye-StatLogController-writer-thread-1" daemon [_thread_blocked, id=25252, stack(0x00007f9b35af1000,0x00007f9b35bf2000)] 0x00000000044e4800 JavaThread "com.taobao.diamond.client.Worker.default" daemon [_thread_in_native, id=25251, stack(0x00007f9b355ec000,0x00007f9b356ed000)] 0x0000000003dfb000 JavaThread "com.taobao.diamond.client.Timer" daemon [_thread_blocked, id=25250, stack(0x00007f9b356ed000,0x00007f9b357ee000)] 0x0000000003c1f800 JavaThread "Thread-7" daemon [_thread_blocked, id=25249, stack(0x00007f9b357ee000,0x00007f9b358ef000)] 0x0000000003c1d800 JavaThread "server-timer1" daemon [_thread_blocked, id=25248, 
stack(0x00007f9b358ef000,0x00007f9b359f0000)] 0x0000000003bd3800 JavaThread "server-timer" daemon [_thread_blocked, id=25247, stack(0x00007f9b359f0000,0x00007f9b35af1000)] 0x00007f9af2d45000 JavaThread "Pandora pandora-qos-reporter Pool [Thread-1]" daemon [_thread_blocked, id=25245, stack(0x00007f9b35bf2000,0x00007f9b35cf3000)] 0x00007f9af2bc5000 JavaThread "qos-boss-1-1" daemon [_thread_blocked, id=25244, stack(0x00007f9b35cf3000,0x00007f9b35df4000)] 0x00007f9ad800d000 JavaThread "EagleEye-AsyncAppender-Thread-TraceLog-atp" daemon [_thread_blocked, id=25242, stack(0x00007f9b35df4000,0x00007f9b35ef5000)] 0x00007f9af2e2c000 JavaThread "metrics-LogDescriptionManager-thread-1" daemon [_thread_blocked, id=25241, stack(0x00007f9b35ef5000,0x00007f9b35ff6000)] 0x00007f9af2e2b800 JavaThread "metrics-bin-reporter-2-thread-1" daemon [_thread_blocked, id=25240, stack(0x00007f9b35ff6000,0x00007f9b360f7000)] 0x00007f9ae8a63800 JavaThread "metrics-BinAppender-thread-1" daemon [_thread_blocked, id=25239, stack(0x00007f9b360f7000,0x00007f9b361f8000)] 0x00007f9ae8554800 JavaThread "metrics-file-reporter-1-thread-1" daemon [_thread_blocked, id=25236, stack(0x00007f9b361f8000,0x00007f9b362f9000)] 0x00007f9ae84d8800 JavaThread "metrics-RollingFileAppender-thread-1" daemon [_thread_blocked, id=25235, stack(0x00007f9b362f9000,0x00007f9b363fa000)] 0x00007f9ae80ee800 JavaThread "EagleEye-ScheduleTaskController-worker-thread-1" daemon [_thread_blocked, id=25234, stack(0x00007f9b363fa000,0x00007f9b364fb000)] 0x00007f9ae80e1800 JavaThread "EagleEye-StatLogController-roller-thread-1" daemon [_thread_blocked, id=25233, stack(0x00007f9b364fb000,0x00007f9b365fc000)] 0x00007f9ae80a6800 JavaThread "EagleEye-LogDaemon-Thread" daemon [_thread_blocked, id=25232, stack(0x00007f9b365fc000,0x00007f9b366fd000)] 0x00007f9af3f8d800 JavaThread "EagleEye-AsyncAppender-Thread-BizLog" daemon [_thread_blocked, id=25231, stack(0x00007f9b366fd000,0x00007f9b367fe000)] 0x00007f9af3e1d800 JavaThread "EagleEye-AsyncAppender-Thread-RpcLog" daemon [_thread_blocked, id=25230, stack(0x00007f9b367fe000,0x00007f9b368ff000)] 0x00007f9af3cab800 JavaThread "EagleEye-AsyncAppender-Thread-SelfLog" daemon [_thread_blocked, id=25229, stack(0x00007f9b368ff000,0x00007f9b36a00000)] 0x00007f9af173f800 JavaThread "Timer-0" daemon [_thread_blocked, id=25228, stack(0x00007f9b36a00000,0x00007f9b36b01000)] 0x00007f9b4c556800 JavaThread "GC Daemon" daemon [_thread_blocked, id=25222, stack(0x00007f9b4403a000,0x00007f9b4413b000)] 0x00007f9b4c099800 JavaThread "Service Thread" daemon [_thread_blocked, id=25220, stack(0x00007f9b446c5000,0x00007f9b447c6000)] 0x00007f9b4c097000 JavaThread "C2 CompilerThread1" daemon [_thread_blocked, id=25219, stack(0x00007f9b447c6000,0x00007f9b448c7000)] 0x00007f9b4c094000 JavaThread "C2 CompilerThread0" daemon [_thread_blocked, id=25218, stack(0x00007f9b448c7000,0x00007f9b449c8000)] 0x00007f9b4c091800 JavaThread "Signal Dispatcher" daemon [_thread_blocked, id=25217, stack(0x00007f9b449c8000,0x00007f9b44ac9000)] 0x00007f9b4c070800 JavaThread "Finalizer" daemon [_thread_blocked, id=25216, stack(0x00007f9b44ac9000,0x00007f9b44bca000)] 0x00007f9b4c06e800 JavaThread "Reference Handler" daemon [_thread_blocked, id=25215, stack(0x00007f9b44bca000,0x00007f9b44ccb000)] 0x00007f9b4c00a000 JavaThread "main" [_thread_in_native, id=25209, stack(0x00007f9b5280b000,0x00007f9b5290c000)] Other Threads: =>0x00007f9b4c06a000 VMThread [stack: 0x00007f9b44ccb000,0x00007f9b44dcc000] [id=25214] 0x00007f9b4c0a4800 WatcherThread [stack: 
0x00007f9b445c4000,0x00007f9b446c5000] [id=25221] VM state:at safepoint (normal execution) VM Mutex/Monitor currently owned by a thread: ([mutex/lock_event]) [0x00007f9b4c0073d0] ExpandHeap_lock - owner thread: 0x00007f9b4c06a000 [0x00007f9b4c007a50] Threads_lock - owner thread: 0x00007f9b4c06a000 [0x00007f9b4c007f50] Heap_lock - owner thread: 0x00007f9af2e2b800 Heap PSYoungGen total 1122816K, used 0K [0x0000000780000000, 0x00000007eef00000, 0x0000000800000000) eden space 423936K, 0% used [0x0000000780000000,0x0000000780000000,0x0000000799e00000) from space 698880K, 0% used [0x0000000799e00000,0x0000000799e00000,0x00000007c4880000) to space 690176K, 0% used [0x00000007c4d00000,0x00000007c4d00000,0x00000007eef00000) ParOldGen total 3225088K, used 3184634K [0x0000000680000000, 0x0000000744d80000, 0x0000000780000000) object space 3225088K, 98% used [0x0000000680000000,0x00000007425fea80,0x0000000744d80000) PSPermGen total 2097152K, used 111831K [0x0000000600000000, 0x0000000680000000, 0x0000000680000000) object space 2097152K, 5% used [0x0000000600000000,0x0000000606d35ff8,0x0000000680000000) Card table byte_map: [0x00007f9b47fff000,0x00007f9b49000000] byte_map_base: 0x00007f9b44fff000 Polling page: 0x00007f9b52917000 Code Cache [0x00007f9b49000000, 0x00007f9b4a2e0000, 0x00007f9b4c000000) total_blobs=5813 nmethods=5091 adapters=673 free_code_cache=30158Kb largest_free_block=30581696 Compilation events (10 events): Event: 1204524.007 Thread 0x00007f9b4c094000 6182 com.taobao.eagleeye.json.IdentityHashMap::get (52 bytes) Event: 1204524.007 Thread 0x00007f9b4c097000 6181 ! com.taobao.eagleeye.json.JSONSerializer::getObjectWriter (872 bytes) Event: 1204524.009 Thread 0x00007f9b4c094000 nmethod 6182 0x00007f9b49cc2750 code [0x00007f9b49cc28a0, 0x00007f9b49cc29e8] Event: 1204524.009 Thread 0x00007f9b4c094000 6183 com.taobao.eagleeye.json.SerializeWriter::writeFieldName (80 bytes) Event: 1204524.011 Thread 0x00007f9b4c094000 nmethod 6183 0x00007f9b4a280010 code [0x00007f9b4a280160, 0x00007f9b4a280288] Event: 1204524.040 Thread 0x00007f9b4c097000 nmethod 6181 0x00007f9b4a01e110 code [0x00007f9b4a01e600, 0x00007f9b4a0201d8] Event: 1204590.715 Thread 0x00007f9b4c094000 6184 java.util.TreeMap::rotateRight (96 bytes) Event: 1204590.715 Thread 0x00007f9b4c097000 6185 com.taobao.eagleeye.json.BigDecimalCodec::write (88 bytes) Event: 1204590.717 Thread 0x00007f9b4c094000 nmethod 6184 0x00007f9b49f7af90 code [0x00007f9b49f7b0e0, 0x00007f9b49f7b238] Event: 1204590.718 Thread 0x00007f9b4c097000 nmethod 6185 0x00007f9b4a05ddd0 code [0x00007f9b4a05df40, 0x00007f9b4a05e138] GC Heap History (10 events): Event: 1204590.670 GC heap after Heap after GC invocations=36180 (full 121): PSYoungGen total 1093632K, used 0K [0x0000000780000000, 0x00000007ef400000, 0x0000000800000000) eden space 394752K, 0% used [0x0000000780000000,0x0000000780000000,0x0000000798180000) from space 698880K, 0% used [0x0000000798180000,0x0000000798180000,0x00000007c2c00000) to space 698880K, 0% used [0x00000007c4980000,0x00000007c4980000,0x00000007ef400000) ParOldGen total 3225088K, used 2652104K [0x0000000680000000, 0x0000000744d80000, 0x0000000780000000) object space 3225088K, 82% used [0x0000000680000000,0x0000000721df21d8,0x0000000744d80000) PSPermGen total 2097152K, used 111831K [0x0000000600000000, 0x0000000680000000, 0x0000000680000000) object space 2097152K, 5% used [0x0000000600000000,0x0000000606d35e60,0x0000000680000000) } Event: 1204596.643 GC heap before {Heap before GC invocations=36181 (full 121): PSYoungGen total 1093632K, used 
394752K [0x0000000780000000, 0x00000007ef400000, 0x0000000800000000) eden space 394752K, 100% used [0x0000000780000000,0x0000000798180000,0x0000000798180000) from space 698880K, 0% used [0x0000000798180000,0x0000000798180000,0x00000007c2c00000) to space 698880K, 0% used [0x00000007c4980000,0x00000007c4980000,0x00000007ef400000) ParOldGen total 3225088K, used 2652104K [0x0000000680000000, 0x0000000744d80000, 0x0000000780000000) object space 3225088K, 82% used [0x0000000680000000,0x0000000721df21d8,0x0000000744d80000) PSPermGen total 2097152K, used 111831K [0x0000000600000000, 0x0000000680000000, 0x0000000680000000) object space 2097152K, 5% used [0x0000000600000000,0x0000000606d35f38,0x0000000680000000) Event: 1204596.732 GC heap after Heap after GC invocations=36181 (full 121): PSYoungGen total 1095168K, used 123671K [0x0000000780000000, 0x00000007ee600000, 0x0000000800000000) eden space 410624K, 0% used [0x0000000780000000,0x0000000780000000,0x0000000799100000) from space 684544K, 18% used [0x00000007c4980000,0x00000007cc245f30,0x00000007ee600000) to space 698880K, 0% used [0x0000000799100000,0x0000000799100000,0x00000007c3b80000) ParOldGen total 3225088K, used 2652104K [0x0000000680000000, 0x0000000744d80000, 0x0000000780000000) object space 3225088K, 82% used [0x0000000680000000,0x0000000721df21d8,0x0000000744d80000) PSPermGen total 2097152K, used 111831K [0x0000000600000000, 0x0000000680000000, 0x0000000680000000) object space 2097152K, 5% used [0x0000000600000000,0x0000000606d35f38,0x0000000680000000) } Event: 1204602.842 GC heap before {Heap before GC invocations=36182 (full 121): PSYoungGen total 1095168K, used 534295K [0x0000000780000000, 0x00000007ee600000, 0x0000000800000000) eden space 410624K, 100% used [0x0000000780000000,0x0000000799100000,0x0000000799100000) from space 684544K, 18% used [0x00000007c4980000,0x00000007cc245f30,0x00000007ee600000) to space 698880K, 0% used [0x0000000799100000,0x0000000799100000,0x00000007c3b80000) ParOldGen total 3225088K, used 2652104K [0x0000000680000000, 0x0000000744d80000, 0x0000000780000000) object space 3225088K, 82% used [0x0000000680000000,0x0000000721df21d8,0x0000000744d80000) PSPermGen total 2097152K, used 111831K [0x0000000600000000, 0x0000000680000000, 0x0000000680000000) object space 2097152K, 5% used [0x0000000600000000,0x0000000606d35f38,0x0000000680000000) Event: 1204603.067 GC heap after Heap after GC invocations=36182 (full 121): PSYoungGen total 1109504K, used 116361K [0x0000000780000000, 0x00000007ef480000, 0x0000000800000000) eden space 410624K, 0% used [0x0000000780000000,0x0000000780000000,0x0000000799100000) from space 698880K, 16% used [0x0000000799100000,0x00000007a02a2558,0x00000007c3b80000) to space 698880K, 0% used [0x00000007c4a00000,0x00000007c4a00000,0x00000007ef480000) ParOldGen total 3225088K, used 2761705K [0x0000000680000000, 0x0000000744d80000, 0x0000000780000000) object space 3225088K, 85% used [0x0000000680000000,0x00000007288fa6c8,0x0000000744d80000) PSPermGen total 2097152K, used 111831K [0x0000000600000000, 0x0000000680000000, 0x0000000680000000) object space 2097152K, 5% used [0x0000000600000000,0x0000000606d35f38,0x0000000680000000) } Event: 1204611.632 GC heap before {Heap before GC invocations=36183 (full 121): PSYoungGen total 1109504K, used 526985K [0x0000000780000000, 0x00000007ef480000, 0x0000000800000000) eden space 410624K, 100% used [0x0000000780000000,0x0000000799100000,0x0000000799100000) from space 698880K, 16% used [0x0000000799100000,0x00000007a02a2558,0x00000007c3b80000) to space 
698880K, 0% used [0x00000007c4a00000,0x00000007c4a00000,0x00000007ef480000) ParOldGen total 3225088K, used 2761705K [0x0000000680000000, 0x0000000744d80000, 0x0000000780000000) object space 3225088K, 85% used [0x0000000680000000,0x00000007288fa6c8,0x0000000744d80000) PSPermGen total 2097152K, used 111831K [0x0000000600000000, 0x0000000680000000, 0x0000000680000000) object space 2097152K, 5% used [0x0000000600000000,0x0000000606d35f38,0x0000000680000000) Event: 1204611.935 GC heap after Heap after GC invocations=36183 (full 121): PSYoungGen total 1121280K, used 159460K [0x0000000780000000, 0x00000007ef300000, 0x0000000800000000) eden space 423936K, 0% used [0x0000000780000000,0x0000000780000000,0x0000000799e00000) from space 697344K, 22% used [0x00000007c4a00000,0x00000007ce5b9330,0x00000007ef300000) to space 698880K, 0% used [0x0000000799e00000,0x0000000799e00000,0x00000007c4880000) ParOldGen total 3225088K, used 2873393K [0x0000000680000000, 0x0000000744d80000, 0x0000000780000000) object space 3225088K, 89% used [0x0000000680000000,0x000000072f60c6c8,0x0000000744d80000) PSPermGen total 2097152K, used 111831K [0x0000000600000000, 0x0000000680000000, 0x0000000680000000) object space 2097152K, 5% used [0x0000000600000000,0x0000000606d35f38,0x0000000680000000) } Event: 1204620.761 GC heap before {Heap before GC invocations=36184 (full 121): PSYoungGen total 1121280K, used 583396K [0x0000000780000000, 0x00000007ef300000, 0x0000000800000000) eden space 423936K, 100% used [0x0000000780000000,0x0000000799e00000,0x0000000799e00000) from space 697344K, 22% used [0x00000007c4a00000,0x00000007ce5b9330,0x00000007ef300000) to space 698880K, 0% used [0x0000000799e00000,0x0000000799e00000,0x00000007c4880000) ParOldGen total 3225088K, used 2873393K [0x0000000680000000, 0x0000000744d80000, 0x0000000780000000) object space 3225088K, 89% used [0x0000000680000000,0x000000072f60c6c8,0x0000000744d80000) PSPermGen total 2097152K, used 111831K [0x0000000600000000, 0x0000000680000000, 0x0000000680000000) object space 2097152K, 5% used [0x0000000600000000,0x0000000606d35ff8,0x0000000680000000) Event: 1204621.151 GC heap after Heap after GC invocations=36184 (full 121): PSYoungGen total 1122816K, used 169794K [0x0000000780000000, 0x00000007eef00000, 0x0000000800000000) eden space 423936K, 0% used [0x0000000780000000,0x0000000780000000,0x0000000799e00000) from space 698880K, 24% used [0x0000000799e00000,0x00000007a43d0868,0x00000007c4880000) to space 690176K, 0% used [0x00000007c4d00000,0x00000007c4d00000,0x00000007eef00000) ParOldGen total 3225088K, used 3027001K [0x0000000680000000, 0x0000000744d80000, 0x0000000780000000) object space 3225088K, 93% used [0x0000000680000000,0x0000000738c0e520,0x0000000744d80000) PSPermGen total 2097152K, used 111831K [0x0000000600000000, 0x0000000680000000, 0x0000000680000000) object space 2097152K, 5% used [0x0000000600000000,0x0000000606d35ff8,0x0000000680000000) } Event: 1204621.151 GC heap before {Heap before GC invocations=36185 (full 122): PSYoungGen total 1122816K, used 169794K [0x0000000780000000, 0x00000007eef00000, 0x0000000800000000) eden space 423936K, 0% used [0x0000000780000000,0x0000000780000000,0x0000000799e00000) from space 698880K, 24% used [0x0000000799e00000,0x00000007a43d0868,0x00000007c4880000) to space 690176K, 0% used [0x00000007c4d00000,0x00000007c4d00000,0x00000007eef00000) ParOldGen total 3225088K, used 3027001K [0x0000000680000000, 0x0000000744d80000, 0x0000000780000000) object space 3225088K, 93% used 
[0x0000000680000000,0x0000000738c0e520,0x0000000744d80000) PSPermGen total 2097152K, used 111831K [0x0000000600000000, 0x0000000680000000, 0x0000000680000000) object space 2097152K, 5% used [0x0000000600000000,0x0000000606d35ff8,0x0000000680000000) Deoptimization events (10 events): Event: 1204401.184 Thread 0x00007f9ad010d800 Uncommon trap: reason=class_check action=maybe_recompile pc=0x00007f9b49746a14 method=cn.hutool.core.convert.ConverterRegistry.getDefaultConverter(Ljava/lang/reflect/Type;)Lcn/hutool/core/convert/Converter; @ 22 Event: 1204401.471 Thread 0x00007f9ad010d800 Uncommon trap: reason=unreached action=reinterpret pc=0x00007f9b4a227214 method=com.qdport.controller.edo.EdoDynamicController.exportPayOrderData(Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;L Event: 1204401.679 Thread 0x00007f9ad010d800 Uncommon trap: reason=null_check action=make_not_entrant pc=0x00007f9b4a119570 method=com.alibaba.fastjson.parser.ParserConfig.get(Ljava/lang/reflect/Type;)Lcom/alibaba/fastjson/parser/deserializer/ObjectDeserializer; @ 18 Event: 1204401.679 Thread 0x00007f9ad010d800 Uncommon trap: reason=null_check action=make_not_entrant pc=0x00007f9b4a11a0d0 method=com.alibaba.fastjson.parser.ParserConfig.get(Ljava/lang/reflect/Type;)Lcom/alibaba/fastjson/parser/deserializer/ObjectDeserializer; @ 18 Event: 1204402.306 Thread 0x00007f9ad010d800 Uncommon trap: reason=unstable_if action=reinterpret pc=0x00007f9b4a2acd48 method=com.qdport.controller.edo.EdoDynamicController.exportPayOrderData(Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;Ljava/lang/String Event: 1204482.174 Thread 0x00007f9ac4001800 Uncommon trap: reason=class_check action=maybe_recompile pc=0x00007f9b4975a408 method=com.alibaba.nacos.client.config.impl.ClientWorker$LongPullingRunnable.run()V @ 130 Event: 1204482.222 Thread 0x00007f9ad0002800 Uncommon trap: reason=bimorphic action=maybe_recompile pc=0x00007f9b49ef0d6c method=org.springframework.context.event.AbstractApplicationEventMulticaster.getApplicationListeners(Lorg/springframework/context/ApplicationEvent;Lorg/springframework/core/Resol Event: 1204523.968 Thread 0x00007f9af40df800 Uncommon trap: reason=unstable_if action=reinterpret pc=0x00007f9b49e96204 method=com.taobao.hsf.address.AddressProfiler.run()V @ 24 Event: 1204523.969 Thread 0x00007f9af40df800 Uncommon trap: reason=unstable_if action=reinterpret pc=0x00007f9b4a118ef8 method=java.text.FieldPosition.matchesField(Ljava/text/Format$Field;I)Z @ 21 Event: 1204581.597 Thread 0x00007f9ad0068800 Uncommon trap: reason=unstable_if action=reinterpret pc=0x00007f9b49d12f7c method=cn.hutool.core.convert.AbstractConverter.convertToStr(Ljava/lang/Object;)Ljava/lang/String; @ 11 Internal exceptions (10 events): Event: 1204618.577 Thread 0x00007f9b4c166800 Threw 0x000000079141a848 at /HUDSON/workspace/7u-2-build-linux-amd64/jdk7u80/2329/hotspot/src/share/vm/prims/jvm.cpp:1319 Event: 1204619.542 Thread 0x00007f9b4c43a800 Threw 0x0000000792fa6bf8 at /HUDSON/workspace/7u-2-build-linux-amd64/jdk7u80/2329/hotspot/src/share/vm/prims/jvm.cpp:1319 Event: 1204619.542 Thread 0x00007f9b4c43a800 Threw 0x0000000792faeb00 at /HUDSON/workspace/7u-2-build-linux-amd64/jdk7u80/2329/hotspot/src/share/vm/prims/jvm.cpp:1319 Event: 1204619.610 Thread 0x00007f9ad010f800 Threw 0x0000000793b29018 at /HUDSON/workspace/7u-2-build-linux-amd64/jdk7u80/2329/hotspot/src/share/vm/prims/jvm.cpp:1319 Event: 1204619.611 Thread 0x00007f9ad010f800 Threw 
0x0000000793b30a50 at /HUDSON/workspace/7u-2-build-linux-amd64/jdk7u80/2329/hotspot/src/share/vm/prims/jvm.cpp:1319 Event: 1204620.562 Thread 0x00007f9ad0008800 Threw 0x0000000797ff58c8 at /HUDSON/workspace/7u-2-build-linux-amd64/jdk7u80/2329/hotspot/src/share/vm/prims/jvm.cpp:1319 Event: 1204620.562 Thread 0x00007f9ad0008800 Threw 0x0000000797ffd6d0 at /HUDSON/workspace/7u-2-build-linux-amd64/jdk7u80/2329/hotspot/src/share/vm/prims/jvm.cpp:1319 Event: 1204620.743 Thread 0x00007f9af2e2b800 Threw 0x0000000798ee0120 at /HUDSON/workspace/7u-2-build-linux-amd64/jdk7u80/2329/hotspot/src/share/vm/prims/jni.cpp:717 Event: 1204620.743 Thread 0x00007f9af2e2b800 Threw 0x0000000798ee6c38 at /HUDSON/workspace/7u-2-build-linux-amd64/jdk7u80/2329/hotspot/src/share/vm/prims/jni.cpp:717 Event: 1204620.743 Thread 0x00007f9af2e2b800 Threw 0x0000000798eeb330 at /HUDSON/workspace/7u-2-build-linux-amd64/jdk7u80/2329/hotspot/src/share/vm/prims/jni.cpp:717 Events (10 events): Event: 1204619.610 loading class 0x00007f9ab400eee0 done Event: 1204619.731 Thread 0x00007f9abc15f000 Thread added: 0x00007f9abc15f000 Event: 1204619.731 Thread 0x00007f9abc15f000 Thread exited: 0x00007f9abc15f000 Event: 1204620.562 loading class 0x00007f9ab40101e0 Event: 1204620.562 loading class 0x00007f9ab40101e0 done Event: 1204620.562 loading class 0x00007f9ab400eee0 Event: 1204620.562 loading class 0x00007f9ab400eee0 done Event: 1204620.678 Executing VM operation: RevokeBias Event: 1204620.679 Executing VM operation: RevokeBias done Event: 1204620.761 Executing VM operation: ParallelGCFailedAllocation ```
Spring Boot package scanning question
```
@SpringBootApplication(scanBasePackages = {"com.alibaba.nacos.api.*", "com.alibaba.nacos.server.*"})
@NacosPropertySource(dataId = "example", autoRefreshed = true)
public class SpringBootStaterApplication {

    public static void main(String[] args) {
        SpringApplication.run(SpringBootStaterApplication.class, args);
    }

    @Bean
    public JPAQueryFactory jpaQueryFactory(EntityManager entityManager) {
        return new JPAQueryFactory(entityManager);
    }
}
```
The packages to be scanned above have different paths and live in different sub-modules.
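A hedged note, not from this thread: `scanBasePackages` replaces, rather than extends, the default scan of the application class's own package, and its entries are plain package prefixes, so `.*` wildcards are not interpreted. A sketch of listing several root packages from different sub-modules, with the application's own package included (the application's package name is an assumed placeholder):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication(scanBasePackages = {
        "com.example.app",          // the application's own root package (assumed name)
        "com.alibaba.nacos.api",    // sub-module packages, listed as plain prefixes without ".*"
        "com.alibaba.nacos.server"
})
public class SpringBootStaterApplication {
    public static void main(String[] args) {
        SpringApplication.run(SpringBootStaterApplication.class, args);
    }
}
```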
nginx fails to load static resources completely
The problem I am running into is this:

1. If static-resource compression is not enabled, none of the static resources load.
2. With compression enabled, some static resources are served, but once a JS file of roughly 40 KB or more fails to load, every static resource after it fails as well. The behaviour is identical in every browser I tried.

All of this is on Windows 10 1809, with no backend at all, purely static front-end files.

nginx has an MB-level request-size limit, which is far more than enough for KB-sized static files; timeouts and caching are configured, and the static files have the right access permissions. nginx's own log shows status 200 for every request with no errors, yet the browser side reports 4xx/5xx.

The situation only occurs on my own machine. Other machines on the LAN can access my local nginx and its resources without any problem, and if I move the same nginx configuration and resources to another machine and access them locally there, everything works fine, which makes this very strange.

My own suspicion is the Windows firewall, Defender, or some system update, because everything was normal last year and I cannot remember after which upgrade the problem appeared. If nothing else works I will have to reinstall the system.

Here is the main nginx configuration:
```
http {
    include       mime.types;
    default_type  application/octet-stream;

    #log_format  main  '$remote_addr - $remote_user [$time_local] "$request" '
    #                  '$status $body_bytes_sent "$http_referer" '
    #                  '"$http_user_agent" "$http_x_forwarded_for"';
    # log format
    #access_log  logs/access.log  main;

    sendfile        on;   #on/off
    #tcp_nopush     on;

    #keepalive_timeout  0;
    keepalive_timeout  300s 300s;

    fastcgi_connect_timeout      6000s;
    fastcgi_send_timeout         6000s;
    fastcgi_read_timeout         6000s;
    fastcgi_buffer_size          256k;
    fastcgi_buffers              8 256k;
    fastcgi_busy_buffers_size    256k;
    fastcgi_temp_file_write_size 256k;

    # compression settings
    gzip               on;       # enable gzip compression (off by default)
    gzip_static        on;       # serve pre-compressed static .gz files
    gzip_min_length    1k;       # do not compress responses below 1k (they may grow)
    gzip_buffers       16 128k;  # number and size of buffers for the compressed stream
    gzip_http_version  1.1;      # only compress for HTTP/1.1
    gzip_comp_level    9;        # level 1-9: higher ratio, more CPU and time
    gzip_types         text/plain text/xml text/css text/javascript application/xml application/json application/javascript application/x-javascript image/x-icon image/jpg image/jpeg image/gif image/png application/x-font-ttf application/font-woff application/font-woff2;  # MIME types to compress (text, js, css, json, xml, images, fonts)
    gzip_disable       "MSIE [1-6]\.";  # IE 1-6 do not support gzip
    gzip_proxied       any;
    gzip_vary          off;      # do not add the Vary header; clients without gzip support get uncompressed responses

    # upload size limits
    client_max_body_size      50M;
    client_body_buffer_size   512K;
    client_header_buffer_size 10M;
    client_header_timeout     120s;
    client_body_timeout       120s;

    # http_proxy
    proxy_buffers              32 256k;  # buffers for a single connection's upstream response
    proxy_buffer_size          256k;     # buffer for the upstream response headers; could be smaller
    proxy_busy_buffers_size    256k;     # buffer size under high load (proxy_buffers * 2)
    proxy_max_temp_file_size   1024M;    # max temp file size when the response does not fit in proxy_buffers; 0 disables temp files
    proxy_temp_file_write_size 256k;     # max size written to the temp file per write
    proxy_connect_timeout      300s;
    proxy_send_timeout         300s;
    proxy_read_timeout         300s;

    upstream nacos-server {
        server 127.0.0.1:8848;
        #server 127.0.0.1:8841;
        #server 127.0.0.1:8842;
        #server 127.0.0.1:8843;
    }

    include ../servers/*.conf;
}
```

---

One of the included server configurations:
```
server {
    # listen port
    listen       80;
    # custom domain name (add it to the hosts file)
    server_name  nacos.cn;
    # character encoding
    charset      utf-8;
    # static resource root
    root         static;

    #access_log  logs/host.access.log  main;

    # default page
    #location / {
    #    index login.html;
    #}

    # reverse proxy to the Nacos server
    location /nacos/ {
        proxy_pass http://nacos-server/nacos/;
        add_header From nacos.cn;
        proxy_redirect default;
        proxy_set_header Host $host;
        proxy_set_header Server-Name $server_name;
        proxy_set_header Http-Host $http_host;
        proxy_set_header Cookie $http_cookie;
        proxy_set_header Referer $http_referer;
        proxy_set_header Nginx_Version $nginx_version;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Real-Port $remote_port;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header X-Nginx-Proxy true;
        proxy_cookie_path / /;
    }

    # error pages
    #error_page  404  /404.html;
    error_page  500 502 503 504  /50x.html;
}
```
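Not something the original poster reports having tried, but a cheap assumption to rule out on a single misbehaving Windows machine: nginx builds for Windows are known to misbehave around `sendfile` in some environments (stale or truncated static files, sometimes aggravated by antivirus real-time scanning), even while the access log records 200. A minimal sketch of that kind of test configuration, offered as a diagnostic direction rather than a confirmed fix:

```
# Hypothetical diagnostic config, not a confirmed fix: with sendfile off,
# nginx reads static files through its own buffers instead of the OS
# sendfile path, which rules that path (and antivirus interference with it)
# in or out as the cause of truncated responses.
http {
    include       mime.types;
    default_type  application/octet-stream;

    sendfile off;            # the point of the test: bypass sendfile on Windows

    gzip            on;      # compression can stay on; it is not what is being tested
    gzip_min_length 1k;
    gzip_types      text/css application/javascript application/json;

    server {
        listen      80;
        server_name nacos.cn;
        root        static;  # same static root as in the question
    }
}
```

If static files then load reliably, the problem sits in the sendfile path on that particular machine rather than in the gzip or proxy settings; if not, the firewall/Defender suspicion from the post is the next thing to check.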