How can a Kafka consumer acknowledge messages itself with Spring Boot 1.5?

With Spring Boot 1.5 integrated with Kafka, how does the consumer acknowledge consumption itself? Specifically, how do I use the @KafkaListener annotation together with Acknowledgment, i.e. how does the consumer commit its own offsets?
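For reference, the usual wiring with spring-kafka (the 1.x line that ships with Boot 1.5) is: disable auto-commit, set the container's AckMode to MANUAL or MANUAL_IMMEDIATE, and declare an Acknowledgment parameter on the listener method. A minimal sketch, not a verified setup; the broker address, group id, and topic name are placeholders:

```java
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.AbstractMessageListenerContainer;
import org.springframework.kafka.support.Acknowledgment;
import org.springframework.stereotype.Component;

@Configuration
@EnableKafka
class ManualAckConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "manual-ack-group");
        // auto-commit must be off, otherwise the client commits offsets for you
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        // hand offset commits over to the listener; in spring-kafka 1.x the enum
        // lives on AbstractMessageListenerContainer
        factory.getContainerProperties()
               .setAckMode(AbstractMessageListenerContainer.AckMode.MANUAL_IMMEDIATE);
        return factory;
    }
}

@Component
class ManualAckListener {

    @KafkaListener(topics = "some-topic")
    public void listen(String record, Acknowledgment ack) {
        // process the record first, then commit its offset explicitly
        System.out.println("received: " + record);
        ack.acknowledge();
    }
}
```

With MANUAL_IMMEDIATE the commit happens as soon as `acknowledge()` is called; with MANUAL the commits are batched until the records from the current poll have been processed.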

Other related questions
Spring integrated with Kafka: consumer memory usage keeps growing at runtime?

I'm using Spring integrated with a Kafka consumer. After deployment, memory usage keeps climbing until the program eventually dies. Could someone take a look and suggest a fix? My configuration file: ![screenshot](https://img-ask.csdn.net/upload/201810/31/1540978342_260014.png) After running for two days the process was using 1.4 GB of memory. I dumped the heap with jmap and analyzed it with Eclipse MAT: ![screenshot](https://img-ask.csdn.net/upload/201810/31/1540978543_231966.png) ![screenshot](https://img-ask.csdn.net/upload/201810/31/1540978554_565464.png) The growth is inside org.springframework.kafka.listener.KafkaMessageListenerContainer: ![screenshot](https://img-ask.csdn.net/upload/201810/31/1540978671_113331.png) The LinkedBlockingQueue in that class looks like it is never drained. I don't know whether something else needs to be configured; I haven't been able to find a solution.

What determines Kafka consumer speed?

```
@KafkaListener(topics = {"CRBKC0002.000"})
public void sendSmsInfoByBizType(String record) {
}
```

Assume a single-node Kafka setup.
1. With a @KafkaListener-annotated method, does consumption only count as finished once this method returns? And within one instance, can this method only execute once at a time, i.e. it won't be invoked on multiple threads concurrently?
2. If merely receiving the argument counts as consumption being finished, i.e. the consumer is done as soon as it gets the record, then suppose the producer writes 1,000,000 messages per second into Kafka. Would that effectively spawn 1,000,000 concurrent invocations of this method? Tomcat only has 200 threads.
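For context, a hedged sketch of how the threading actually works: each listener container runs its own consumer thread (separate from Tomcat's HTTP pool), invokes the method one record at a time per thread, and any backlog simply stays in Kafka, so a fast producer never spawns a thread per message. Parallelism is set via the factory's concurrency (and capped by the partition count). This assumes a `ConsumerFactory` bean already exists:

```java
// a sketch: bounds @KafkaListener parallelism to 3 threads per listener
@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
        ConsumerFactory<String, String> consumerFactory) {
    ConcurrentKafkaListenerContainerFactory<String, String> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory);
    // at most 3 consumer threads; effective parallelism is also
    // limited by the number of partitions of the topic
    factory.setConcurrency(3);
    return factory;
}
```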

Kafka consumers stop consuming messages

After deploying the Kafka cluster and consumer servers in production, we send real-time logs to the cluster via logstash, and the consumers initially consume them normally. But after about two minutes the consumers stop consuming. How should I go about locating the problem?
1. The Kafka server logs show logstash is still pushing real-time logs to Kafka; the cluster's log retention is two hours.
2. There are two consumer machines, and both are running at the same time.
3. The cluster has three servers. While investigating I noticed the consumers are only connected to one broker; I don't know whether that is the cause.

Spring Boot integrated with Kafka: broadcast-style consumption

The project uses Spring Boot integrated with Kafka, deployed as four instances in production. I want to implement broadcast-style consumption, which requires each instance to have a different groupId. How do I configure that in Spring Boot? Every instance runs the same code and pulls the same configuration from the config center.
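One common approach, sketched below under the assumption that Spring Boot's auto-configured `spring.kafka` properties are in use: let each instance resolve a random suffix via Boot's `RandomValuePropertySource`, so identical config files still yield distinct consumer groups and every instance receives every message.

```yaml
spring:
  kafka:
    consumer:
      # ${random.uuid} is resolved independently on each instance,
      # so each instance joins its own consumer group (broadcast)
      group-id: broadcast-${random.uuid}
```

The trade-off is that the group name changes on every restart, so such groups accumulate on the broker and committed offsets are not reused across restarts; deriving the suffix from something stable (hostname, instance id) avoids that.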

Kafka consumer receives no data

```
[root@hzctc-kafka-5d61 ~]# kafka-run-class.sh kafka.tools.ConsumerOffsetChecker --group sbs-haodian-message1 --topic Message --zookeeper 10.1.5.61:2181
[2018-04-18 16:43:43,467] WARN WARNING: ConsumerOffsetChecker is deprecated and will be dropped in releases following 0.9.0. Use ConsumerGroupCommand instead. (kafka.tools.ConsumerOffsetChecker$)
Exiting due to: org.apache.zookeeper.KeeperException$NoNodeException: KeeperErrorCode = NoNode for /consumers/sbs-haodian-message1/offsets/Message/8.
```

When checking this consumer group's progress with the command above, I get this error; the other consumer groups are fine. Does anyone know what causes it? In my consume logic I take a cache lock, so the interval between poll() calls varies; it might be 10, 20, or 30 seconds, though I set the session timeout to 90 s. Could that have any effect?
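A possible direction, hedged: the NoNode error means ZooKeeper has no offset node for that group/partition, which typically happens when the group commits its offsets to Kafka itself (the default for the "new" consumer) or has never committed for that partition, so the ZooKeeper-based ConsumerOffsetChecker finds nothing. The replacement tool the warning points to can read both; the broker port 9092 below is an assumption:

```
# for groups that commit offsets to Kafka (new consumer, the usual case)
kafka-consumer-groups.sh --bootstrap-server 10.1.5.61:9092 --describe --group sbs-haodian-message1

# for legacy ZooKeeper-based groups
kafka-consumer-groups.sh --zookeeper 10.1.5.61:2181 --describe --group sbs-haodian-message1
```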

Spring Cloud Stream integrated with Kafka: received messages are prefixed with a contentType / application/json string

1. Code that sends the message:
```
this.source.output().send(MessageBuilder.withPayload("{'name':'zyy'}").setHeader(MessageHeaders.CONTENT_TYPE, MimeTypeUtils.APPLICATION_JSON).build());
```
2. Code that receives the message:
```
@StreamListener(target = Sink.INPUT)
public void onProductMsg(@Payload Object object) {
    System.out.println(object);
}
```
3. Result: ![screenshot](https://img-ask.csdn.net/upload/201904/25/1556154220_224090.jpg)
4. How can I get rid of the contentType / application/json string?
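A hedged suggestion: in Spring Cloud Stream 1.x the content-type header is embedded into the Kafka payload itself, which is what shows up as the `contentType`/`application/json` prefix; setting the header mode to raw on both bindings disables that embedding (in 2.x the equivalent value is `none`, and headers travel as native Kafka headers instead). A sketch, assuming the binding names `output`/`input` match the code above:

```yaml
spring:
  cloud:
    stream:
      bindings:
        output:
          producer:
            headerMode: raw
        input:
          consumer:
            headerMode: raw
```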

How do I fix this error when integrating Spring Boot with Kafka?

I've integrated Spring Boot with Kafka and keep getting this error:
```
2018-05-20 18:56:26.920 ERROR 4428 --- [ main] o.s.k.support.LoggingProducerListener : Exception thrown when sending a message with key='null' and payload='myTest--------1' to topic myTest: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.
```
How do I fix this? The relevant code, adapted from examples online, is below:
```
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.5.9.RELEASE</version>

<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.10.2.0</version>
</dependency>
```
Kafka version and broker configuration: kafka_2.11-1.1.0, configured as:
```
port = 9092
host.name = (Aliyun internal IP)
advertised.host.name = xxx.xxx.xx (Aliyun public IP)
```
Everything else is left at the defaults. Then the Spring Boot configuration:
```
kafka:
  bootstrap-servers: xxx.xxx.xxx.xxx:9092
  listener.concurrency: 3
  producer.batch-size: 1000
  consumer.group-id: test-consumer-group
```
And the code:
```
@Configuration
@EnableKafka
public class KafkaConfiguration {
}

@Component
public class MsgProducer {

    private Logger log = LoggerFactory.getLogger(MsgProducer.class);

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    public void sendMessage(String topicName, String jsonData) {
        log.info("Pushing data to Kafka: [{}]", jsonData);
        try {
            kafkaTemplate.send(topicName, jsonData);
        } catch (Exception e) {
            System.out.println("Error sending data!!!" + topicName + "," + jsonData);
            System.out.println("Error sending data=====>" + e);
            log.error("Error sending data!!!{}{}", topicName, jsonData);
            log.error("Error sending data=====>", e);
        }
        // listener for send results, used for callbacks
        kafkaTemplate.setProducerListener(new ProducerListener<String, String>() {
            @Override
            public void onSuccess(String topic, Integer partition, String key, String value, RecordMetadata recordMetadata) {
                System.out.println("topic-----" + topic);
            }

            @Override
            public void onError(String topic, Integer partition, String key, String value, Exception exception) {
            }

            @Override
            public boolean isInterestedInSuccess() {
                log.info("Send finished");
                System.out.println("Send finished");
                return false;
            }
        });
    }
}

@Component
public class MsgConsumer {

    @KafkaListener(topics = {"myTest"})
    public void processMessage(String content) {
        System.out.println("Message consumed: " + content);
    }
}

@RunWith(SpringRunner.class)
@SpringBootTest
public class TestKafka {

    @Autowired
    private MsgProducer msgProducer;

    @Test
    public void test() throws Exception {
        msgProducer.sendMessage("myTest", "myTest--------1");
    }
}
```
That's everything; running the test always ends with the error above.
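One hedged lead to check: "Failed to update metadata" usually means the client cannot reach the address the broker advertises. On brokers as new as 1.1.0, `host.name`/`advertised.host.name` are deprecated in favor of `listeners`/`advertised.listeners`, so it may be worth configuring those instead in server.properties (the public IP is a placeholder, and the port must be open in the cloud security group):

```
listeners=PLAINTEXT://0.0.0.0:9092
advertised.listeners=PLAINTEXT://<public-ip>:9092
```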

spring-kafka with multiple partitions: some partitions consume normally, some never consume?

With spring-kafka (version 1.0.6.RELEASE, kafka-client 0.9.0.1) I created 8 partitions. The data in one partition is never consumed, while the other partitions consume normally. Does anyone know why?

Spring Boot Kafka: configuring multiple consumers in the same consumer group, listening to multiple topics?

With Spring Boot Kafka I'm configuring multiple consumers in the same consumer group (hence several @KafkaListener listeners), listening to specified partitions of multiple topics, as shown in the screenshot. Is this the right way to configure it? But when I use @TopicPartition I get an error: "TopicPartition cannot be resolved to a type". Everyone online seems to use it this way, and no amount of searching turns up anyone else hitting this error. Any guidance? ![screenshot](https://img-ask.csdn.net/upload/201911/28/1574912436_908691.png)
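A hedged guess at the cause: "cannot be resolved to a type" is a compile-time import problem, and the annotation lives in `org.springframework.kafka.annotation`, not `org.apache.kafka.common` (which also has a class named TopicPartition that IDEs tend to auto-import). A sketch with the correct imports; topic/group names are placeholders, and the `groupId` attribute requires spring-kafka 1.3 or later:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.annotation.TopicPartition;
import org.springframework.stereotype.Component;

@Component
public class MultiTopicListener {

    // one listener in group "my-group", pinned to partition 0 of topic-a
    @KafkaListener(groupId = "my-group",
            topicPartitions = @TopicPartition(topic = "topic-a", partitions = "0"))
    public void onTopicA(String record) {
        System.out.println("topic-a p0: " + record);
    }

    // a second listener in the same group, on partition 1 of topic-b
    @KafkaListener(groupId = "my-group",
            topicPartitions = @TopicPartition(topic = "topic-b", partitions = "1"))
    public void onTopicB(String record) {
        System.out.println("topic-b p1: " + record);
    }
}
```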

How does a Kafka consumer get messages?

With ActiveMQ the broker pushes to consumers, so a consumer just registers a MessageListener and receives messages. But Kafka requires the consumer to pull messages from the broker. How does the consumer know the broker already has the topic it wants? By polling on a timer?
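For reference, the pull is exactly what the client's poll loop does, so no extra timer is needed: `poll()` blocks up to a timeout and returns whatever records have arrived, and the consumer's periodic metadata refresh discovers topics (a subscription to a not-yet-existing topic simply yields nothing until the topic appears). A sketch with placeholder broker/topic names; `poll(Duration)` assumes kafka-clients 2.0+, older clients use `poll(long)`:

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class PollLoop {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "demo-group");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("some-topic"));
            while (true) {
                // poll() blocks for up to 1 s and returns whatever arrived;
                // this loop plays the role a MessageListener plays in ActiveMQ
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println(record.topic() + ": " + record.value());
                }
            }
        }
    }
}
```

Frameworks like spring-kafka run this loop for you on a container thread, which is why @KafkaListener feels push-based even though the transport is pull.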

Kafka consumer sometimes receives no messages on startup

When the Kafka consumer starts it sometimes fails to receive any messages; after a restart it works, though occasionally several restarts are needed. I don't know why and would appreciate some guidance.

[ INFO ] [2016-09-29 14:34:53] org.hibernate.validator.internal.util.Version [30] - HV000001: Hibernate Validator 5.2.4.Final
[ INFO ] [2016-09-29 14:34:53] com.coocaa.salad.stat.ApplicationMain [48] - Starting ApplicationMain on zhuxiang with PID 1740 (D:\IdeaProjects\green-salad\adx-stat\target\classes started by zhuxiang in D:\IdeaProjects\green-salad)
[ INFO ] [2016-09-29 14:34:53] com.coocaa.salad.stat.ApplicationMain [663] - The following profiles are active: dev
[ INFO ] [2016-09-29 14:34:54] org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext [581] - Refreshing org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext@5754de72: startup date [Thu Sep 29 14:34:54 CST 2016]; root of context hierarchy
2016-09-29 14:34:55 JRebel: Monitoring Spring bean definitions in 'D:\IdeaProjects\green-salad\adx-stat\target\classes\spring-integration-consumer.xml'.
[ INFO ] [2016-09-29 14:34:55] org.springframework.beans.factory.xml.XmlBeanDefinitionReader [317] - Loading XML bean definitions from URL [file:/D:/IdeaProjects/green-salad/adx-stat/target/classes/spring-integration-consumer.xml]
[ INFO ] [2016-09-29 14:34:56] org.springframework.beans.factory.config.PropertiesFactoryBean [172] - Loading properties file from URL [jar:file:/D:/maven-repo2/org/springframework/integration/spring-integration-core/4.3.1.RELEASE/spring-integration-core-4.3.1.RELEASE.jar!/META-INF/spring.integration.default.properties]
2016-09-29 14:34:56 JRebel: Monitoring properties in 'jar:file:/D:/maven-repo2/org/springframework/integration/spring-integration-core/4.3.1.RELEASE/spring-integration-core-4.3.1.RELEASE.jar!/META-INF/spring.integration.default.properties'.
[ INFO ] [2016-09-29 14:34:56] org.springframework.integration.config.IntegrationRegistrar [330] - No bean named 'integrationHeaderChannelRegistry' has been explicitly defined.
Therefore, a default DefaultHeaderChannelRegistry will be created. [ INFO ] [2016-09-29 14:34:56] org.springframework.beans.factory.support.DefaultListableBeanFactory [843] - Overriding bean definition for bean 'kafkaConsumerService' with a different definition: replacing [Generic bean: class [com.coocaa.salad.stat.service.KafkaConsumerService]; scope=singleton; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null; defined in file [D:\IdeaProjects\green-salad\adx-stat\target\classes\com\coocaa\salad\stat\service\KafkaConsumerService.class]] with [Generic bean: class [com.coocaa.salad.stat.service.KafkaConsumerService]; scope=; abstract=false; lazyInit=false; autowireMode=0; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=null; factoryMethodName=null; initMethodName=null; destroyMethodName=null; defined in URL [file:/D:/IdeaProjects/green-salad/adx-stat/target/classes/spring-integration-consumer.xml]] [ INFO ] [2016-09-29 14:34:57] org.springframework.integration.config.DefaultConfiguringBeanFactoryPostProcessor [130] - No bean named 'errorChannel' has been explicitly defined. Therefore, a default PublishSubscribeChannel will be created. [ INFO ] [2016-09-29 14:34:57] org.springframework.integration.config.DefaultConfiguringBeanFactoryPostProcessor [158] - No bean named 'taskScheduler' has been explicitly defined. Therefore, a default ThreadPoolTaskScheduler will be created. 
[ INFO ] [2016-09-29 14:34:57] org.springframework.context.support.PostProcessorRegistrationDelegate$BeanPostProcessorChecker [328] - Bean 'org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration' of type [class org.springframework.transaction.annotation.ProxyTransactionManagementConfiguration$$EnhancerBySpringCGLIB$$3dea2e76] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) [ INFO ] [2016-09-29 14:34:58] org.springframework.beans.factory.config.PropertiesFactoryBean [172] - Loading properties file from URL [jar:file:/D:/maven-repo2/org/springframework/integration/spring-integration-core/4.3.1.RELEASE/spring-integration-core-4.3.1.RELEASE.jar!/META-INF/spring.integration.default.properties] 2016-09-29 14:34:58 JRebel: Monitoring properties in 'jar:file:/D:/maven-repo2/org/springframework/integration/spring-integration-core/4.3.1.RELEASE/spring-integration-core-4.3.1.RELEASE.jar!/META-INF/spring.integration.default.properties'. 
[ INFO ] [2016-09-29 14:34:58] org.springframework.context.support.PostProcessorRegistrationDelegate$BeanPostProcessorChecker [328] - Bean 'integrationGlobalProperties' of type [class org.springframework.beans.factory.config.PropertiesFactoryBean] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
[ INFO ] [2016-09-29 14:34:58] org.springframework.context.support.PostProcessorRegistrationDelegate$BeanPostProcessorChecker [328] - Bean 'integrationGlobalProperties' of type [class java.util.Properties] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
[ INFO ] [2016-09-29 14:34:59] org.springframework.boot.context.embedded.tomcat.TomcatEmbeddedServletContainer [88] - Tomcat initialized with port(s): 8081 (http)

The content of spring-integration-consumer.xml is as follows:

```
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:int="http://www.springframework.org/schema/integration"
       xmlns:int-kafka="http://www.springframework.org/schema/integration/kafka"
       xmlns:task="http://www.springframework.org/schema/task"
       xsi:schemaLocation="http://www.springframework.org/schema/integration/kafka http://www.springframework.org/schema/integration/kafka/spring-integration-kafka-1.0.xsd
           http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd
           http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/task http://www.springframework.org/schema/task/spring-task.xsd">

    <!-- topic test conf -->
    <int:channel id="inputFromKafka">
        <int:dispatcher task-executor="kafkaMessageExecutor"/>
    </int:channel>

    <!-- ZooKeeper configuration; multiple servers can be listed -->
    <int-kafka:zookeeper-connect id="zookeeperConnect"
                                 zk-connect="172.20.135.95:2181,172.20.135.95:2182"
                                 zk-connection-timeout="10000"
                                 zk-session-timeout="10000"
                                 zk-sync-time="2000"/>

    <!-- channel configuration; auto-startup="true", otherwise no data is received -->
    <int-kafka:inbound-channel-adapter id="kafkaInboundChannelAdapter"
                                       kafka-consumer-context-ref="consumerContext"
                                       auto-startup="true"
                                       channel="inputFromKafka">
        <int:poller fixed-delay="1" time-unit="MILLISECONDS"/>
    </int-kafka:inbound-channel-adapter>

    <task:executor id="kafkaMessageExecutor" pool-size="8" keep-alive="120" queue-capacity="500"/>

    <bean id="kafkaDecoder" class="org.springframework.integration.kafka.serializer.common.StringDecoder"/>

    <bean id="consumerProperties" class="org.springframework.beans.factory.config.PropertiesFactoryBean">
        <property name="properties">
            <props>
                <prop key="auto.offset.reset">smallest</prop>
                <prop key="socket.receive.buffer.bytes">10485760</prop> <!-- 10M -->
                <prop key="fetch.message.max.bytes">5242880</prop>
                <prop key="auto.commit.interval.ms">1000</prop>
                <prop key="auto.commit.enables">true</prop>
            </props>
        </property>
    </bean>

    <!-- bean that receives the messages -->
    <bean id="kafkaConsumerService" class="com.coocaa.salad.stat.service.KafkaConsumerService"/>

    <!-- the method that handles received messages -->
    <int:outbound-channel-adapter channel="inputFromKafka" ref="kafkaConsumerService" method="processMessage"/>

    <int-kafka:consumer-context id="consumerContext"
                                consumer-timeout="1000"
                                zookeeper-connect="zookeeperConnect"
                                consumer-properties="consumerProperties">
        <int-kafka:consumer-configurations>
            <int-kafka:consumer-configuration group-id="group-4"
                                              value-decoder="kafkaDecoder"
                                              key-decoder="kafkaDecoder"
                                              max-messages="5000">
                <!-- two topic configurations -->
                <int-kafka:topic id="clientsRequests2" streams="4"/>
                <!--<int-kafka:topic id="sunneytopic" streams="4" />-->
            </int-kafka:consumer-configuration>
        </int-kafka:consumer-configurations>
    </int-kafka:consumer-context>
</beans>
```

The Kafka version is 0.10.

How do I solve an out-of-memory problem in Kafka consumption?

![screenshot](https://img-ask.csdn.net/upload/201906/06/1559809105_761647.png) ![screenshot](https://img-ask.csdn.net/upload/201906/06/1559809115_821758.png) Concurrent consumption is causing an out-of-memory error.

How can I increase message-processing speed when the Kafka consumer is slow? Adding partitions is not allowed

As in the title. This is from a recent interview and is meant to be answered from Kafka's theoretical properties; I may have misunderstood the exact requirements, so if anyone sees the intended answer at a glance, I'd appreciate a hint. From searching, the usual ways to raise consumption throughput are: increase the partition count (more consumer parallelism), which is not allowed here; or use multiple threads in the consumer. But if message processing is CPU-bound, adding threads doesn't help either. Or is my understanding wrong? To restate the problem: the producer generates 10,000 messages per second, but all consumers together can only process 5,000 per second, and the processing is pure CPU computation. Question: without adding partitions, how do you raise the processing speed?
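If the CPU really is saturated, only faster/more hardware or cheaper per-message work helps; but when cores are idle, one consumer thread can fan a polled batch out to a worker pool and report success (i.e. allow the offset commit) only after the whole batch finishes. A runnable, Kafka-free sketch of that pattern; the `List<String>` batch stands in for what `poll()` would return:

```java
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class BatchWorkerDemo {

    // Process one polled "batch" on a fixed worker pool; return how many
    // records completed. Offsets would be committed only after this returns
    // batch.size(), preserving at-least-once semantics.
    static int processBatch(List<String> batch, int workers) {
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        AtomicInteger done = new AtomicInteger();
        for (String record : batch) {
            pool.submit(() -> {
                // stand-in for the real per-record work
                done.incrementAndGet();
            });
        }
        pool.shutdown();  // stop accepting new work
        try {
            pool.awaitTermination(30, TimeUnit.SECONDS);  // wait for the batch
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return done.get();
    }

    public static void main(String[] args) {
        List<String> batch = Arrays.asList("r1", "r2", "r3", "r4");
        System.out.println(processBatch(batch, 2)); // prints 4
    }
}
```

Note the caveat this pattern carries: in-partition ordering is lost across the worker threads, which is exactly the guarantee that partitions normally provide.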

Spring Boot integrated with Kafka: error on startup

The error log is below. With groupId=MyGroupId it connects fine, but with groupId=etl it fails to connect and errors out. Could someone take a look? I can provide other related files if needed.
```
 _ _ _ _ _ _ __ _ ___ ____ | | | (_) | | | | (_)/ _| | | |__ \ |___ \ | |__| |_ | | | |_ __ _| |_ __ _ ___| |_ ) | __) | | __ | | | | | | '_ \| | _/ _` / __| __| / / |__ < | | | | | | |__| | | | | | || (_| \__ \ |_ / /_ _ ___) | |_| |_|_| \____/|_| |_|_|_| \__,_|___/\__| |____(_)____/
2020-02-21 23:53:40.414 mallSearch [main] INFO c.u.j.c.EnableEncryptablePropertiesConfiguration - Bootstraping jasypt-string-boot auto configuration in context: application-1
2020-02-21 23:53:40.414 mallSearch [main] INFO c.c.mall.mallsearch.ReportManagerApplication - The following profiles are active: dev
2020-02-21 23:53:42.148 mallSearch [main] WARN org.mybatis.spring.mapper.ClassPathMapperScanner - No MyBatis mapper was found in '[cn.chinaunicom.sdsi.**.entity, cn.chinaunicom.mall.**.entity]' package. Please check your configuration.
2020-02-21 23:53:42.476 mallSearch [main] INFO o.s.d.r.config.RepositoryConfigurationDelegate - Multiple Spring Data modules found, entering strict repository configuration mode!
2020-02-21 23:53:42.476 mallSearch [main] INFO o.s.d.r.config.RepositoryConfigurationDelegate - Bootstrapping Spring Data repositories in DEFAULT mode.
2020-02-21 23:53:42.695 mallSearch [main] INFO o.s.d.r.config.RepositoryConfigurationDelegate - Finished Spring Data repository scanning in 219ms. Found 9 repository interfaces.
2020-02-21 23:53:42.710 mallSearch [main] INFO o.s.d.r.config.RepositoryConfigurationDelegate - Multiple Spring Data modules found, entering strict repository configuration mode!
2020-02-21 23:53:42.710 mallSearch [main] INFO o.s.d.r.config.RepositoryConfigurationDelegate - Bootstrapping Spring Data repositories in DEFAULT mode.
2020-02-21 23:53:42.804 mallSearch [main] INFO o.s.d.r.c.RepositoryConfigurationExtensionSupport - Spring Data Redis - Could not safely identify store assignment for repository candidate interface cn.chinaunicom.mall.mallsearch.repository.AttrCategoryDocumentRepository. 2020-02-21 23:53:42.804 mallSearch [main] INFO o.s.d.r.c.RepositoryConfigurationExtensionSupport - Spring Data Redis - Could not safely identify store assignment for repository candidate interface cn.chinaunicom.mall.mallsearch.repository.AttrDocumentRepository. 2020-02-21 23:53:42.819 mallSearch [main] INFO o.s.d.r.c.RepositoryConfigurationExtensionSupport - Spring Data Redis - Could not safely identify store assignment for repository candidate interface cn.chinaunicom.mall.mallsearch.repository.AttrValueDocumentRepository. 2020-02-21 23:53:42.819 mallSearch [main] INFO o.s.d.r.c.RepositoryConfigurationExtensionSupport - Spring Data Redis - Could not safely identify store assignment for repository candidate interface cn.chinaunicom.mall.mallsearch.repository.BrandDocumentRepository. 2020-02-21 23:53:42.819 mallSearch [main] INFO o.s.d.r.c.RepositoryConfigurationExtensionSupport - Spring Data Redis - Could not safely identify store assignment for repository candidate interface cn.chinaunicom.mall.mallsearch.repository.CategorytreeDocumentRepository. 2020-02-21 23:53:42.819 mallSearch [main] INFO o.s.d.r.c.RepositoryConfigurationExtensionSupport - Spring Data Redis - Could not safely identify store assignment for repository candidate interface cn.chinaunicom.mall.mallsearch.repository.GoodsPoolComSpuDocumentRepository. 2020-02-21 23:53:42.819 mallSearch [main] INFO o.s.d.r.c.RepositoryConfigurationExtensionSupport - Spring Data Redis - Could not safely identify store assignment for repository candidate interface cn.chinaunicom.mall.mallsearch.repository.GoodsSkuDocumentRepository. 
2020-02-21 23:53:42.819 mallSearch [main] INFO o.s.d.r.c.RepositoryConfigurationExtensionSupport - Spring Data Redis - Could not safely identify store assignment for repository candidate interface cn.chinaunicom.mall.mallsearch.repository.GoodsSpuCategorytreeDocumentRepository. 2020-02-21 23:53:42.819 mallSearch [main] INFO o.s.d.r.c.RepositoryConfigurationExtensionSupport - Spring Data Redis - Could not safely identify store assignment for repository candidate interface cn.chinaunicom.mall.mallsearch.repository.GoodsSpuDocumentRepository. 2020-02-21 23:53:42.819 mallSearch [main] INFO o.s.d.r.config.RepositoryConfigurationDelegate - Finished Spring Data repository scanning in 31ms. Found 0 repository interfaces. 2020-02-21 23:53:43.522 mallSearch [main] INFO o.springframework.cloud.context.scope.GenericScope - BeanFactory id=f1ad9868-69a4-3087-8cf7-8a78137ec329 2020-02-21 23:53:43.554 mallSearch [main] INFO c.u.j.c.EnableEncryptablePropertiesBeanFactoryPostProcessor - Post-processing PropertySource instances 2020-02-21 23:53:43.601 mallSearch [main] INFO c.u.j.EncryptablePropertySourceConverter - Converting PropertySource configurationProperties [org.springframework.boot.context.properties.source.ConfigurationPropertySourcesPropertySource] to AOP Proxy 2020-02-21 23:53:43.601 mallSearch [main] INFO c.u.j.EncryptablePropertySourceConverter - Converting PropertySource servletConfigInitParams [org.springframework.core.env.PropertySource$StubPropertySource] to EncryptablePropertySourceWrapper 2020-02-21 23:53:43.632 mallSearch [main] INFO c.u.j.EncryptablePropertySourceConverter - Converting PropertySource servletContextInitParams [org.springframework.core.env.PropertySource$StubPropertySource] to EncryptablePropertySourceWrapper 2020-02-21 23:53:43.632 mallSearch [main] INFO c.u.j.EncryptablePropertySourceConverter - Converting PropertySource systemProperties [org.springframework.core.env.MapPropertySource] to EncryptableMapPropertySourceWrapper 2020-02-21 
23:53:43.632 mallSearch [main] INFO c.u.j.EncryptablePropertySourceConverter - Converting PropertySource systemEnvironment [org.springframework.boot.env.SystemEnvironmentPropertySourceEnvironmentPostProcessor$OriginAwareSystemEnvironmentPropertySource] to EncryptableMapPropertySourceWrapper 2020-02-21 23:53:43.632 mallSearch [main] INFO c.u.j.EncryptablePropertySourceConverter - Converting PropertySource random [org.springframework.boot.env.RandomValuePropertySource] to EncryptablePropertySourceWrapper 2020-02-21 23:53:43.632 mallSearch [main] INFO c.u.j.EncryptablePropertySourceConverter - Converting PropertySource applicationConfig: [classpath:/application-dev.yml] [org.springframework.boot.env.OriginTrackedMapPropertySource] to EncryptableMapPropertySourceWrapper 2020-02-21 23:53:43.632 mallSearch [main] INFO c.u.j.EncryptablePropertySourceConverter - Converting PropertySource applicationConfig: [classpath:/config/application.yml] [org.springframework.boot.env.OriginTrackedMapPropertySource] to EncryptableMapPropertySourceWrapper 2020-02-21 23:53:43.632 mallSearch [main] INFO c.u.j.EncryptablePropertySourceConverter - Converting PropertySource applicationConfig: [classpath:/application.yml] [org.springframework.boot.env.OriginTrackedMapPropertySource] to EncryptableMapPropertySourceWrapper 2020-02-21 23:53:43.632 mallSearch [main] INFO c.u.j.EncryptablePropertySourceConverter - Converting PropertySource springCloudClientHostInfo [org.springframework.core.env.MapPropertySource] to EncryptableMapPropertySourceWrapper 2020-02-21 23:53:43.632 mallSearch [main] INFO c.u.j.EncryptablePropertySourceConverter - Converting PropertySource defaultProperties [org.springframework.core.env.MapPropertySource] to EncryptableMapPropertySourceWrapper 2020-02-21 23:53:43.694 mallSearch [main] INFO o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'org.springframework.kafka.annotation.KafkaBootstrapConfiguration' of type 
[org.springframework.kafka.annotation.KafkaBootstrapConfiguration$$EnhancerBySpringCGLIB$$46cfe68c] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2020-02-21 23:53:43.710 mallSearch [main] INFO c.u.j.filter.DefaultLazyPropertyFilter - Property Filter custom Bean not found with name 'encryptablePropertyFilter'. Initializing Default Property Filter 2020-02-21 23:53:43.850 mallSearch [main] INFO o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'org.springframework.security.config.annotation.configuration.ObjectPostProcessorConfiguration' of type [org.springframework.security.config.annotation.configuration.ObjectPostProcessorConfiguration$$EnhancerBySpringCGLIB$$bcb9d43] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2020-02-21 23:53:44.069 mallSearch [main] INFO o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'objectPostProcessor' of type [org.springframework.security.config.annotation.configuration.AutowireBeanFactoryObjectPostProcessor] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2020-02-21 23:53:44.085 mallSearch [main] INFO o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'org.springframework.security.access.expression.method.DefaultMethodSecurityExpressionHandler@70f5f59d' of type [org.springframework.security.oauth2.provider.expression.OAuth2MethodSecurityExpressionHandler] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2020-02-21 23:53:44.085 mallSearch [main] INFO o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'org.springframework.security.config.annotation.method.configuration.GlobalMethodSecurityConfiguration' of type 
[org.springframework.security.config.annotation.method.configuration.GlobalMethodSecurityConfiguration$$EnhancerBySpringCGLIB$$30a03ff5] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2020-02-21 23:53:44.132 mallSearch [main] INFO o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'methodSecurityMetadataSource' of type [org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2020-02-21 23:53:44.163 mallSearch [main] INFO c.u.j.resolver.DefaultLazyPropertyResolver - Property Resolver custom Bean not found with name 'encryptablePropertyResolver'. Initializing Default Property Resolver 2020-02-21 23:53:44.163 mallSearch [main] INFO c.u.j.detector.DefaultLazyPropertyDetector - Property Detector custom Bean not found with name 'encryptablePropertyDetector'. Initializing Default Property Detector 2020-02-21 23:53:44.179 mallSearch [main] INFO o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'spring.cache-org.springframework.boot.autoconfigure.cache.CacheProperties' of type [org.springframework.boot.autoconfigure.cache.CacheProperties] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2020-02-21 23:53:44.194 mallSearch [main] INFO o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'unifastRedisCacheConfig' of type [cn.chinaunicom.sdsi.security.cache.config.UnifastRedisCacheConfig$$EnhancerBySpringCGLIB$$7185313] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2020-02-21 23:53:44.335 mallSearch [main] INFO o.s.c.s.PostProcessorRegistrationDelegate$BeanPostProcessorChecker - Bean 'org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration' of type 
[org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration$$EnhancerBySpringCGLIB$$8f37d806] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) 2020-02-21 23:53:45.006 mallSearch [main] INFO o.s.boot.web.embedded.tomcat.TomcatWebServer - Tomcat initialized with port(s): 9011 (http) 2020-02-21 23:53:45.022 mallSearch [main] INFO org.apache.coyote.http11.Http11NioProtocol - Initializing ProtocolHandler ["http-nio-9011"] 2020-02-21 23:53:45.038 mallSearch [main] INFO org.apache.catalina.core.StandardService - Starting service [Tomcat] 2020-02-21 23:53:45.038 mallSearch [main] INFO org.apache.catalina.core.StandardEngine - Starting Servlet engine: [Apache Tomcat/9.0.16] 2020-02-21 23:53:45.053 mallSearch [main] INFO org.apache.catalina.core.AprLifecycleListener - The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: [C:\Program Files\Java\jdk1.8.0_144\bin;C:\WINDOWS\Sun\Java\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\Program Files (x86)\Intel\iCLS Client\;C:\ProgramData\Oracle\Java\javapath;C:\Program Files\Intel\iCLS Client\;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\Program Files\Java\jdk1.8.0_144\bin;C:\Program Files\Git\cmd;C:\Program Files\apache-maven-3.2.2\bin;C:\Program Files\Mysql\bin;C:\Program Files\MySQL\MySQL Utilities 1.6\;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files (x86)\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files\TortoiseSVN\bin;D:\工具\curl-7.59.0-win64-mingw\bin;C:\WINDOWS\System32\OpenSSH\;C:\Windows\WinSxS\amd64_microsoft-windows-telnet-client_31bf3856ad364e35_10.0.17134.1_none_9db21dbc8e34d070;C:\Program 
Files\nodejs\;D:\nginx-1.13.7;C:\Program Files\erl10.4\bin;D:\工具\rabbitmq_server-3.7.15\sbin;C:\Program Files\Intel\WiFi\bin\;C:\Program Files\Common Files\Intel\WirelessCommon\;C:\Program Files\TortoiseGit\bin;C:\Users\xiaolei\AppData\Local\Microsoft\WindowsApps;;C:\Users\xiaolei\AppData\Local\Programs\Microsoft VS Code\bin;C:\Users\xiaolei\AppData\Roaming\npm;.]
2020-02-21 23:53:45.350 mallSearch [main] INFO o.a.c.core.ContainerBase.[Tomcat].[localhost].[/] - Initializing Spring embedded WebApplicationContext
2020-02-21 23:53:45.350 mallSearch [main] INFO org.springframework.web.context.ContextLoader - Root WebApplicationContext: initialization completed in 4905 ms
2020-02-21 23:53:47.034 mallSearch [main] INFO org.elasticsearch.plugins.PluginsService - no modules loaded
2020-02-21 23:53:47.036 mallSearch [main] INFO org.elasticsearch.plugins.PluginsService - loaded plugin [org.elasticsearch.index.reindex.ReindexPlugin]
2020-02-21 23:53:47.036 mallSearch [main] INFO org.elasticsearch.plugins.PluginsService - loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
2020-02-21 23:53:47.039 mallSearch [main] INFO org.elasticsearch.plugins.PluginsService - loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
2020-02-21 23:53:47.039 mallSearch [main] INFO org.elasticsearch.plugins.PluginsService - loaded plugin [org.elasticsearch.script.mustache.MustachePlugin]
2020-02-21 23:53:47.039 mallSearch [main] INFO org.elasticsearch.plugins.PluginsService - loaded plugin [org.elasticsearch.transport.Netty4Plugin]
2020-02-21 23:53:49.084 mallSearch [main] INFO o.s.d.e.client.TransportClientFactoryBean - Adding transport node : 10.236.6.52:9200
2020-02-21 23:54:20.425 mallSearch [main] ERROR o.s.d.e.r.support.AbstractElasticsearchRepository - failed to load elasticsearch nodes : org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: [{#transport#-1}{yZBpr1VZQfm3KgYuKq0eRA}{10.236.6.52}{10.236.6.52:9200}]
[the same ERROR is logged 8 more times between 23:54:20.940 and 23:54:22.690]
2020-02-21 23:54:21.643 mallSearch [main] INFO org.redisson.Version - Redisson 3.12.0
2020-02-21 23:54:21.909 mallSearch [redisson-netty-4-24] INFO o.r.connection.pool.MasterPubSubConnectionPool - 1 connections initialized for 10.236.6.54/10.236.6.54:6379
2020-02-21 23:54:21.956 mallSearch [redisson-netty-4-28] INFO org.redisson.connection.pool.MasterConnectionPool - 20 connections initialized for 10.236.6.54/10.236.6.54:6379
2020-02-21 23:54:22.971 mallSearch [main] INFO s.d.s.w.PropertySourcedRequestMappingHandlerMapping - Mapped URL path [/v2/api-docs] onto method [public org.springframework.http.ResponseEntity<springfox.documentation.spring.web.json.Json> springfox.documentation.swagger2.web.Swagger2Controller.getDocumentation(java.lang.String,javax.servlet.http.HttpServletRequest)]
2020-02-21 23:54:23.408 mallSearch [main] INFO o.s.security.web.DefaultSecurityFilterChain - Creating filter chain: any request, [org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter@2408ca4c, org.springframework.security.web.context.SecurityContextPersistenceFilter@29509774, org.springframework.security.web.header.HeaderWriterFilter@3741a170, org.springframework.security.web.authentication.logout.LogoutFilter@1988e095, org.springframework.security.oauth2.provider.authentication.OAuth2AuthenticationProcessingFilter@4870d2e1, org.springframework.security.web.savedrequest.RequestCacheAwareFilter@9198fe3, org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter@4dfe14d4, org.springframework.security.web.authentication.AnonymousAuthenticationFilter@2f7e2481, org.springframework.security.web.session.SessionManagementFilter@26c24e5, org.springframework.security.web.access.ExceptionTranslationFilter@4a467f08, org.springframework.security.web.access.intercept.FilterSecurityInterceptor@5de15ce1]
[uReport ASCII-art startup banner]
. uReport, is a Chinese style report engine licensed under the Apache License 2.0, .
. which is opensource, easy to use, high-performance, with browser-based-designer. .
2020-02-21 23:54:25.330 mallSearch [main] WARN com.netflix.config.sources.URLConfigurationSource - No URLs will be polled as dynamic configuration sources.
2020-02-21 23:54:25.330 mallSearch [main] INFO com.netflix.config.sources.URLConfigurationSource - To enable URLs as dynamic configuration sources, define System property archaius.configurationSource.additionalUrls or make config.properties available on classpath.
[the same WARN/INFO pair is logged again at 23:54:25.345]
2020-02-21 23:54:25.720 mallSearch [main] INFO o.s.scheduling.concurrent.ThreadPoolTaskExecutor - Initializing ExecutorService 'applicationTaskExecutor'
2020-02-21 23:54:27.095 mallSearch [main] WARN o.s.b.a.freemarker.FreeMarkerAutoConfiguration - Cannot find template location(s): [classpath:/templates/] (please add some templates, check your FreeMarker configuration, or set spring.freemarker.checkTemplateLocation=false)
2020-02-21 23:54:29.282 mallSearch [main] INFO o.s.cloud.netflix.eureka.InstanceInfoFactory - Setting initial instance status as: STARTING
2020-02-21 23:54:29.360 mallSearch [main] INFO com.netflix.discovery.DiscoveryClient - Initializing Eureka in region us-east-1
2020-02-21 23:54:29.438 mallSearch [main] INFO c.n.discovery.provider.DiscoveryJerseyProvider - Using JSON encoding codec LegacyJacksonJson
2020-02-21 23:54:29.438 mallSearch [main] INFO c.n.discovery.provider.DiscoveryJerseyProvider - Using JSON decoding codec LegacyJacksonJson
2020-02-21 23:54:29.610 mallSearch [main] INFO c.n.discovery.provider.DiscoveryJerseyProvider - Using XML encoding codec XStreamXml
2020-02-21 23:54:29.610 mallSearch [main] INFO c.n.discovery.provider.DiscoveryJerseyProvider - Using XML decoding codec XStreamXml
2020-02-21 23:54:29.954 mallSearch [main] INFO c.n.d.shared.resolver.aws.ConfigClusterResolver - Resolving eureka endpoints via configuration
2020-02-21 23:54:30.266 mallSearch [main] INFO com.netflix.discovery.DiscoveryClient - Disable delta property : false
2020-02-21 23:54:30.375 mallSearch [main] INFO com.netflix.discovery.DiscoveryClient - Single vip registry refresh property : null
2020-02-21 23:54:30.375 mallSearch [main] INFO com.netflix.discovery.DiscoveryClient - Force full registry fetch : false
2020-02-21 23:54:30.375 mallSearch [main] INFO com.netflix.discovery.DiscoveryClient - Application is null : false
2020-02-21 23:54:30.375 mallSearch [main] INFO com.netflix.discovery.DiscoveryClient - Registered Applications size is zero : true
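The repeated NoNodeAvailableException above comes from the Elasticsearch TransportClient, which the log shows being pointed at 10.236.6.52:9200. Port 9200 is Elasticsearch's HTTP/REST port; the TransportClient speaks the binary transport protocol, which listens on port 9300 by default. A minimal sketch of the corrected Spring Data Elasticsearch properties, assuming the default transport port and that `cluster-name` matches `cluster.name` in the server's elasticsearch.yml (the host is taken from the log, everything else is an assumption about this setup):

```yaml
spring:
  data:
    elasticsearch:
      # TransportClient needs the transport port (default 9300), not the HTTP port 9200
      cluster-nodes: 10.236.6.52:9300
      cluster-name: elasticsearch   # assumed; must match the server's cluster.name
```

If the same exception persists with the right port, check network reachability (e.g. `telnet 10.236.6.52 9300`) and the cluster name, since a name mismatch produces the identical error.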
2020-02-21 23:54:30.375 mallSearch [main] INFO com.netflix.discovery.DiscoveryClient - Application version is -1: true
2020-02-21 23:54:30.375 mallSearch [main] INFO com.netflix.discovery.DiscoveryClient - Getting all instance registry info from the eureka server
2020-02-21 23:54:30.828 mallSearch [main] INFO com.netflix.discovery.DiscoveryClient - The response status is 200
2020-02-21 23:54:30.844 mallSearch [main] INFO com.netflix.discovery.DiscoveryClient - Starting heartbeat executor: renew interval is: 30
2020-02-21 23:54:30.844 mallSearch [main] INFO com.netflix.discovery.InstanceInfoReplicator - InstanceInfoReplicator onDemand update allowed rate per min is 4
2020-02-21 23:54:30.844 mallSearch [main] INFO com.netflix.discovery.DiscoveryClient - Discovery Client initialized at timestamp 1582300470844 with initial instances count: 8
2020-02-21 23:54:30.875 mallSearch [main] INFO o.s.c.n.e.serviceregistry.EurekaServiceRegistry - Registering application reportmanager with eureka with status UP
2020-02-21 23:54:30.875 mallSearch [main] INFO com.netflix.discovery.DiscoveryClient - Saw local status change event StatusChangeEvent [timestamp=1582300470875, current=UP, previous=STARTING]
2020-02-21 23:54:30.907 mallSearch [main] INFO org.apache.kafka.clients.consumer.ConsumerConfig - ConsumerConfig values:
    auto.commit.interval.ms = 1000
    auto.offset.reset = earliest
    bootstrap.servers = [10.236.6.52:9092]
    check.crcs = true
    client.id =
    connections.max.idle.ms = 540000
    default.api.timeout.ms = 60000
    enable.auto.commit = true
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = myGroupId
    heartbeat.interval.ms = 3000
    interceptor.classes = []
    internal.leave.group.on.close = true
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 1000
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 10000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
2020-02-21 23:54:31.016 mallSearch [main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka version : 2.0.1
2020-02-21 23:54:31.016 mallSearch [main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka commitId : fa14705e51bd2ce5
2020-02-21 23:54:31.313 mallSearch [DiscoveryClient-InstanceInfoReplicator-0] INFO com.netflix.discovery.DiscoveryClient - DiscoveryClient_REPORTMANAGER/192.168.62.1:9011: registering service...
2020-02-21 23:54:31.547 mallSearch [DiscoveryClient-InstanceInfoReplicator-0] INFO com.netflix.discovery.DiscoveryClient - DiscoveryClient_REPORTMANAGER/192.168.62.1:9011 - registration status: 204
2020-02-21 23:54:31.656 mallSearch [main] INFO org.apache.kafka.clients.Metadata - Cluster ID: rgWO07ohQH6Rn35woczhUQ
2020-02-21 23:54:31.719 mallSearch [main] INFO org.apache.kafka.clients.consumer.ConsumerConfig - ConsumerConfig values:
    [identical to the ConsumerConfig dump above: group.id = myGroupId, bootstrap.servers = [10.236.6.52:9092], same values for all other keys]
2020-02-21 23:54:31.719 mallSearch [main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka version : 2.0.1
2020-02-21 23:54:31.719 mallSearch [main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka commitId : fa14705e51bd2ce5
2020-02-21 23:54:31.719 mallSearch [main] INFO o.s.scheduling.concurrent.ThreadPoolTaskScheduler - Initializing ExecutorService
2020-02-21 23:54:31.734 mallSearch [main] INFO org.apache.kafka.clients.consumer.ConsumerConfig - ConsumerConfig values:
    auto.commit.interval.ms = 5000
    auto.offset.reset = earliest
    bootstrap.servers = [127.0.0.1:9092]
    check.crcs = true
    client.id =
    connections.max.idle.ms = 540000
    default.api.timeout.ms = 60000
    enable.auto.commit = true
    exclude.internal.topics = true
    fetch.max.bytes = 52428800
    fetch.max.wait.ms = 500
    fetch.min.bytes = 1
    group.id = etl
    heartbeat.interval.ms = 3000
    interceptor.classes = []
    internal.leave.group.on.close = true
    isolation.level = read_uncommitted
    key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
    max.partition.fetch.bytes = 1048576
    max.poll.interval.ms = 300000
    max.poll.records = 1000
    metadata.max.age.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
    receive.buffer.bytes = 65536
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 180000
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    send.buffer.bytes = 131072
    session.timeout.ms = 120000
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLS
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer
2020-02-21 23:54:31.734 mallSearch [main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka version : 2.0.1
2020-02-21 23:54:31.750 mallSearch [main] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka commitId : fa14705e51bd2ce5
2020-02-21 23:54:32.016 mallSearch [org.springframework.kafka.KafkaListenerEndpointContainer#6-0-C-1] INFO org.apache.kafka.clients.Metadata - Cluster ID: rgWO07ohQH6Rn35woczhUQ
2020-02-21 23:54:32.047 mallSearch [org.springframework.kafka.KafkaListenerEndpointContainer#6-0-C-1] INFO o.a.k.c.consumer.internals.AbstractCoordinator - [Consumer clientId=consumer-2, groupId=myGroupId] Discovered group coordinator 10.236.6.52:9092 (id: 2147483646 rack: null)
2020-02-21 23:54:32.063 mallSearch [org.springframework.kafka.KafkaListenerEndpointContainer#6-0-C-1] INFO o.a.k.c.consumer.internals.ConsumerCoordinator - [Consumer clientId=consumer-2, groupId=myGroupId] Revoking previously assigned partitions []
2020-02-21 23:54:32.063 mallSearch [org.springframework.kafka.KafkaListenerEndpointContainer#6-0-C-1] INFO o.s.kafka.listener.KafkaMessageListenerContainer - partitions revoked: []
2020-02-21 23:54:32.063 mallSearch [org.springframework.kafka.KafkaListenerEndpointContainer#6-0-C-1] INFO o.a.k.c.consumer.internals.AbstractCoordinator - [Consumer clientId=consumer-2, groupId=myGroupId] (Re-)joining group
2020-02-21 23:54:32.391 mallSearch [org.springframework.kafka.KafkaListenerEndpointContainer#6-0-C-1] INFO o.a.k.c.consumer.internals.AbstractCoordinator - [Consumer clientId=consumer-2, groupId=myGroupId] Successfully joined group with generation 21
2020-02-21 23:54:32.406 mallSearch [org.springframework.kafka.KafkaListenerEndpointContainer#6-0-C-1] INFO o.a.k.c.consumer.internals.ConsumerCoordinator - [Consumer clientId=consumer-2, groupId=myGroupId] Setting newly assigned partitions [es-mall-brand-update-0]
2020-02-21 23:54:32.625 mallSearch [org.springframework.kafka.KafkaListenerEndpointContainer#6-0-C-1] INFO o.s.kafka.listener.KafkaMessageListenerContainer - partitions assigned: [es-mall-brand-update-0]
2020-02-21 23:54:32.828 mallSearch [main] WARN org.apache.kafka.clients.NetworkClient - [Consumer clientId=consumer-3, groupId=etl] Connection to node -1 could not be established. Broker may not be available.
2020-02-21 23:54:33.969 mallSearch [main] WARN org.apache.kafka.clients.NetworkClient - [Consumer clientId=consumer-3, groupId=etl] Connection to node -1 could not be established. Broker may not be available.
2020-02-21 23:54:35.219 mallSearch [main] WARN org.apache.kafka.clients.NetworkClient - [Consumer clientId=consumer-3, groupId=etl] Connection to node -1 could not be established. Broker may not be available.
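The WARN lines above identify the failing consumer: clientId=consumer-3 in group etl was created with bootstrap.servers = [127.0.0.1:9092] (see the third ConsumerConfig dump), while the consumers that join successfully use 10.236.6.52:9092. No broker is running on localhost, so that listener container can never fetch topic metadata, which is what eventually aborts startup. A minimal sketch of the fix, assuming every listener should use the externally configured broker (the property name is standard Spring Boot; the address is taken from the log):

```yaml
spring:
  kafka:
    # all listener containers should resolve to the real broker,
    # not a hard-coded 127.0.0.1 default
    bootstrap-servers: 10.236.6.52:9092
```

If the etl listener is wired through a hand-built ConsumerFactory/@Bean rather than auto-configuration, make sure that bean reads this property instead of defaulting to localhost.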
2020-02-21 23:54:36.468 mallSearch [main] WARN org.apache.kafka.clients.NetworkClient - [Consumer clientId=consumer-3, groupId=etl] Connection to node -1 could not be established. Broker may not be available.
[the same WARN is logged 27 more times, roughly every 2 s, through 23:55:31.611]
2020-02-21 23:55:31.750 mallSearch [main] WARN o.s.b.w.s.c.AnnotationConfigServletWebServerApplicationContext - Exception encountered during context initialization - cancelling refresh attempt: org.springframework.context.ApplicationContextException: Failed to start bean 'org.springframework.kafka.config.internalKafkaListenerEndpointRegistry'; nested exception is org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
2020-02-21 23:55:31.782 mallSearch [main] INFO o.s.scheduling.concurrent.ThreadPoolTaskExecutor - Shutting down ExecutorService 'applicationTaskExecutor'
2020-02-21 23:55:32.516 mallSearch [main] INFO o.s.jmx.export.annotation.AnnotationMBeanExporter - Could not unregister MBean [com.github.tobato.fastdfs.conn:name=fdfsConnectionPool,type=FdfsConnectionPool] as said MBean is not registered (perhaps already unregistered by an external process)
2020-02-21 23:56:00.384 mallSearch [main] INFO com.netflix.discovery.DiscoveryClient - Shutting down DiscoveryClient ...
2020-02-21 23:56:00.431 mallSearch [main] WARN o.s.c.annotation.CommonAnnotationBeanPostProcessor - Destroy method on bean with name 'scopedTarget.eurekaClient' threw an exception: org.springframework.beans.factory.BeanCreationNotAllowedException: Error creating bean with name 'eurekaInstanceConfigBean': Singleton bean creation not allowed while singletons of this factory are in destruction (Do not request a bean from a BeanFactory in a destroy method implementation!)
2020-02-21 23:56:00.431 mallSearch [main] INFO org.apache.catalina.core.StandardService - Stopping service [Tomcat]
2020-02-21 23:56:00.462 mallSearch [main] INFO o.s.b.a.l.ConditionEvaluationReportLoggingListener - Error starting ApplicationContext.
To display the conditions report re-run your application with 'debug' enabled.
2020-02-21 23:56:00.478 mallSearch [main] ERROR org.springframework.boot.SpringApplication - Application run failed
org.springframework.context.ApplicationContextException: Failed to start bean 'org.springframework.kafka.config.internalKafkaListenerEndpointRegistry'; nested exception is org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
    at org.springframework.context.support.DefaultLifecycleProcessor.doStart(DefaultLifecycleProcessor.java:185)
    at org.springframework.context.support.DefaultLifecycleProcessor.access$200(DefaultLifecycleProcessor.java:53)
    at org.springframework.context.support.DefaultLifecycleProcessor$LifecycleGroup.start(DefaultLifecycleProcessor.java:360)
    at org.springframework.context.support.DefaultLifecycleProcessor.startBeans(DefaultLifecycleProcessor.java:158)
    at org.springframework.context.support.DefaultLifecycleProcessor.onRefresh(DefaultLifecycleProcessor.java:122)
    at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:893)
    at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.finishRefresh(ServletWebServerApplicationContext.java:163)
    at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:552)
    at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:142)
    at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:775)
    at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:397)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:316)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1260)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1248)
    at cn.chinaunicom.mall.mallsearch.ReportManagerApplication.main(ReportManagerApplication.java:85)
Caused by: org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
2020-02-21 23:56:01.634 mallSearch [DiscoveryClient-InstanceInfoReplicator-0] WARN com.netflix.discovery.InstanceInfoReplicator - There was a problem with the instance info replicator
org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'eurekaInstanceConfigBean' defined in class path resource [org/springframework/cloud/netflix/eureka/EurekaClientAutoConfiguration.class]: Unsatisfied dependency expressed through method 'eurekaInstanceConfigBean' parameter 0; nested exception is org.springframework.boot.context.properties.ConfigurationPropertiesBindException: Error creating bean with name 'inetUtilsProperties': Could not bind properties to 'InetUtilsProperties' : prefix=spring.cloud.inetutils, ignoreInvalidFields=false, ignoreUnknownFields=true; nested exception is java.lang.IllegalStateException: org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@31c2affc has not been refreshed yet
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:769)
    at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:509)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1305)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1144)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:555)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515)
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320)
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
    at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:277)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1247)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1167)
    at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:857)
    at org.springframework.beans.factory.support.ConstructorResolver.resolvePreparedArguments(ConstructorResolver.java:804)
    at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:430)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1305)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1144)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:555)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515)
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$1(AbstractBeanFactory.java:356)
    at org.springframework.cloud.context.scope.GenericScope$BeanLifecycleWrapper.getBean(GenericScope.java:390)
    at org.springframework.cloud.context.scope.GenericScope.get(GenericScope.java:184)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:353)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
    at org.springframework.aop.target.SimpleBeanTargetSource.getTarget(SimpleBeanTargetSource.java:35)
    at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:672)
    at com.netflix.appinfo.ApplicationInfoManager$$EnhancerBySpringCGLIB$$918f0f04.refreshDataCenterInfoIfRequired(<generated>)
    at com.netflix.discovery.DiscoveryClient.refreshInstanceInfo(DiscoveryClient.java:1377)
    at com.netflix.discovery.InstanceInfoReplicator.run(InstanceInfoReplicator.java:117)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run$$$capture(FutureTask.java:266)
    at java.util.concurrent.FutureTask.run(FutureTask.java)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.springframework.boot.context.properties.ConfigurationPropertiesBindException: Error creating bean with name 'inetUtilsProperties': Could not bind properties to 'InetUtilsProperties' : prefix=spring.cloud.inetutils, ignoreInvalidFields=false, ignoreUnknownFields=true; nested exception is java.lang.IllegalStateException: org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@31c2affc has not been refreshed yet
    at org.springframework.boot.context.properties.ConfigurationPropertiesBindingPostProcessor.bind(ConfigurationPropertiesBindingPostProcessor.java:110)
    at org.springframework.boot.context.properties.ConfigurationPropertiesBindingPostProcessor.postProcessBeforeInitialization(ConfigurationPropertiesBindingPostProcessor.java:93)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyBeanPostProcessorsBeforeInitialization(AbstractAutowireCapableBeanFactory.java:414)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1754)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:593)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515)
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320)
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
    at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:277)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1247)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1167)
    at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:857)
    at org.springframework.beans.factory.support.ConstructorResolver.resolvePreparedArguments(ConstructorResolver.java:804)
    at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:430)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1305)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1144)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:555)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:515)
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:320)
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:318)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:199)
    at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:277)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1247)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1167)
    at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:857)
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:760)
    ...
37 common frames omitted Caused by: java.lang.IllegalStateException: org.springframework.boot.web.servlet.context.AnnotationConfigServletWebServerApplicationContext@31c2affc has not been refreshed yet at org.springframework.context.support.AbstractApplicationContext.assertBeanFactoryActive(AbstractApplicationContext.java:1092) at org.springframework.context.support.AbstractApplicationContext.getBeanProvider(AbstractApplicationContext.java:1134) at org.springframework.boot.context.properties.ConfigurationPropertiesBinder.getBindHandlerAdvisors(ConfigurationPropertiesBinder.java:138) at org.springframework.boot.context.properties.ConfigurationPropertiesBinder.getBindHandler(ConfigurationPropertiesBinder.java:130) at org.springframework.boot.context.properties.ConfigurationPropertiesBinder.bind(ConfigurationPropertiesBinder.java:82) at org.springframework.boot.context.properties.ConfigurationPropertiesBindingPostProcessor.bind(ConfigurationPropertiesBindingPostProcessor.java:107) ... 65 common frames omitted Disconnected from the target VM, address: '127.0.0.1:62410', transport: 'socket' ```

Spring Boot integrated with Kafka: Kafka initializes normally, but the consumer's listener is never invoked.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.1.2.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>
    <groupId>com.unitdream.kafka</groupId>
    <artifactId>consumer</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>consumer</name>
    <description>Demo project for Spring Boot</description>
    <properties>
        <java.version>1.8</java.version>
    </properties>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-data-redis</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>org.mybatis.spring.boot</groupId>
            <artifactId>mybatis-spring-boot-starter</artifactId>
            <version>2.0.0</version>
        </dependency>
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>fastjson</artifactId>
            <version>1.2.54</version>
        </dependency>
        <!-- kafka -->
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
```

```java
package com.unitdream.kafka.consumer.kafkaconfig;

import com.unitdream.kafka.consumer.kafkaconfig.listener.KafkaConsumerListener;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.config.KafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;

import java.util.HashMap;
import java.util.Map;

@Configuration
@EnableKafka
public class KafkaConsumerConfig {

    public Map<String, Object> consumerProperties() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "n1:9092,n2:9092,n3:9092,n4:9092,es1:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "alarm");
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "true");
        props.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "1000");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return props;
    }

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerProperties());
    }

    @Bean
    public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, String>> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        factory.setConcurrency(3);
        factory.getContainerProperties().setPollTimeout(3000);
        return factory;
    }

    @Bean
    public KafkaConsumerListener kafkaConsumerListener() {
        return new KafkaConsumerListener();
    }
}
```

```java
package com.unitdream.kafka.consumer.kafkaconfig.listener;

import com.unitdream.kafka.consumer.services.MqKafkaAnalysisService;
import com.unitdream.kafka.consumer.services.MqKafkaMataService;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.annotation.KafkaListener;

public class KafkaConsumerListener {

    private static final Logger logger = LoggerFactory.getLogger(KafkaConsumerListener.class);

    @Autowired
    private MqKafkaMataService mqKafkaMataService;
    @Autowired
    private MqKafkaAnalysisService mqKafkaAnalysisService;

    @KafkaListener(topics = {"alarm"})
    public void listener(ConsumerRecord<String, String> record) {
        logger.info("Kafka consuming [{}-{}]-{} -------------------- start", record.topic(), record.partition(), record.offset());
        if (mqKafkaMataService.insertOne(record) > 0) {
            mqKafkaAnalysisService.insertOne(record);
        }
        logger.info("Kafka consuming [{}-{}]-{} -------------------- end", record.topic(), record.partition(), record.offset());
    }
}
```
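This question, like the one that opens the page, ends up at offset handling: with `enable.auto.commit=true` the client commits on a timer, so the listener never "confirms" anything itself. Below is a minimal sketch of manual acknowledgment — illustrative only, not compiled here; the class names are made up, and note that in the spring-kafka 1.x (Boot 1.5) and 2.2.x lines discussed on this page the `AckMode` enum sits on `AbstractMessageListenerContainer` (it moved to `ContainerProperties` in later releases):

```java
package com.example.kafka; // illustrative package

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.AbstractMessageListenerContainer;
import org.springframework.kafka.support.Acknowledgment;

@Configuration
@EnableKafka
public class ManualAckConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumption
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "alarm");
        // let the listener container own the commit, not the kafka client
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        // MANUAL_IMMEDIATE commits as soon as the listener calls acknowledge()
        factory.getContainerProperties()
               .setAckMode(AbstractMessageListenerContainer.AckMode.MANUAL_IMMEDIATE);
        return factory;
    }
}

class ManualAckListener {
    @KafkaListener(topics = {"alarm"})
    public void listen(ConsumerRecord<String, String> record, Acknowledgment ack) {
        // ... process the record ...
        ack.acknowledge(); // the consumer "confirms" for itself; skip this call and
                           // the offset stays put, so the record is redelivered later
    }
}
```

The listener class still has to be registered as a bean (as the question's own config already does with `kafkaConsumerListener()`); with `MANUAL_IMMEDIATE`, skipping `ack.acknowledge()` leaves the offset uncommitted, so the record is redelivered after the next rebalance or restart.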

Spring Boot + Kafka: many consuming systems subscribing to many producing systems — how should multiple consumer groups with multiple consumers be coded?

1. Kafka used with Spring Boot. 2. There are multiple message-producing systems, and the set of consuming systems can change dynamically. Example: systems A, B, and C each produce messages and want different sets of systems to consume them in broadcast mode: A wants systems 1, 2, and 3 to each receive the same message; B wants systems 2, 3, 4, and 5; C wants systems 1, 2, 5, and 6. So the number of consumer groups effectively changes over time and may grow or shrink later. In production there are dozens of producing systems and dozens of consuming systems. I can't find any code examples for this online — everything is either trivial or hard-coded. How should this be written? It's so complex I have no idea where to start!
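One way to frame it: broadcast consumption in Kafka is just "one consumer group per receiving system" — every group gets its own copy of each message, so broadcasting itself needs no code. What remains is a routing table from producer topics to consuming systems, which can live in config or a database and change at runtime. A plain-Java sketch of that inversion (the names `topic-A`, `system1`, ... mirror the question and are made up):

```java
import java.util.*;

// Routing table: which consuming systems should each receive every message on a topic.
class TopicRouting {
    // topic -> consumer systems (each system = one consumer group)
    static final Map<String, Set<String>> ROUTES = new HashMap<>();
    static {
        ROUTES.put("topic-A", new TreeSet<>(Arrays.asList("system1", "system2", "system3")));
        ROUTES.put("topic-B", new TreeSet<>(Arrays.asList("system2", "system3", "system4", "system5")));
        ROUTES.put("topic-C", new TreeSet<>(Arrays.asList("system1", "system2", "system5", "system6")));
    }

    // Invert the table: which topics must a given consuming system subscribe to?
    public static Set<String> topicsFor(String system) {
        Set<String> topics = new TreeSet<>();
        for (Map.Entry<String, Set<String>> e : ROUTES.entrySet()) {
            if (e.getValue().contains(system)) {
                topics.add(e.getKey());
            }
        }
        return topics;
    }

    public static void main(String[] args) {
        // prints: system2 subscribes to: [topic-A, topic-B, topic-C]
        System.out.println("system2 subscribes to: " + topicsFor("system2"));
    }
}
```

In Spring Kafka terms, each system would then run one listener with `groupId` set to its own system name and `topics` resolved from `topicsFor(...)` — adding system 7 to B's broadcast list becomes a config change, not new code.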

Kafka consumer group loses uncommitted messages

I am using a consumer group with just one consumer and one broker (the docker wurstmeister image). The code decides whether to commit the offset — if the handler returns an error, the message is not committed. I need to ensure the system never loses a message, even if that means retrying the same message forever (for now ;) ). To test this I created a simple handler that does not commit the offset when the string 'error' is sent as the message; all other strings are committed.

```
kafka-console-producer --broker-list localhost:9092 --topic test
>this will be commited
```

Now running

```
kafka-run-class kafka.admin.ConsumerGroupCommand --bootstrap-server localhost:9092 --group michalgrupa --describe
```

returns

```
TOPIC  PARTITION  CURRENT-OFFSET  LOG-END-OFFSET  LAG  CONSUMER-ID  HOST  CLIENT-ID
test   0          13              13              0
```

so that's OK, there is no lag. Now we pass the 'error' string to fake that something bad happened, and the message is not committed:

```
TOPIC  PARTITION  CURRENT-OFFSET  LOG-END-OFFSET  LAG  CONSUMER-ID  HOST  CLIENT-ID
test   0          13              14              1
```

The current offset stays in the right position, and there is one lagged message. If we now pass a correct message again, the offset moves on to 15:

```
TOPIC  PARTITION  CURRENT-OFFSET  LOG-END-OFFSET  LAG
test   0          15              15              0
```

and message number 14 will never be picked up again. Is this the default behaviour? Do I need to track the last offset and load the message at offset+1 manually? I have set the commit interval to 0 in the hope of not using any auto-commit mechanism.

fetch/commit code:

```go
go func() {
    for {
        ctx := context.Background()
        m, err := mr.brokerReader.FetchMessage(ctx)
        if err != nil {
            break
        }
        if err := msgFunc(m); err != nil {
            log.Errorf("# messaging # cannot commit a message: %v", err)
            continue
        }
        // commit message if no error
        if err := mr.brokerReader.CommitMessages(ctx, m); err != nil {
            // should we do something else besides just logging the uncommitted message?
            log.Errorf("cannot commit message [%s] %v/%v: %s = %s; with error: %v",
                m.Topic, m.Partition, m.Offset, string(m.Key), string(m.Value), err)
        }
    }
}()
```

reader configuration:

```go
kafkaReader := kafka.NewReader(kafka.ReaderConfig{
    Brokers:        brokers,
    GroupID:        groupID,
    Topic:          topic,
    CommitInterval: 0,
    MinBytes:       10e3, // 10KB
    MaxBytes:       10e6, // 10MB
})
```

library used: https://github.com/segmentio/kafka-go
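Yes, this is the default behaviour: a Kafka consumer-group commit records a *position*, not a per-message flag, so committing the message at offset 15 implicitly covers the skipped message at 14. A broker-free sketch of these semantics (plain Java, deliberately not the kafka-go API):

```java
// Why message 14 is skipped: committing the message at offset N records
// "next offset to read = N + 1", which marks every lower offset as consumed too.
class OffsetLedger {
    private long nextToRead = 0; // the committed position for one partition

    // Committing a message at `offset` advances the position past it (never backwards).
    public void commit(long offset) {
        nextToRead = Math.max(nextToRead, offset + 1);
    }

    // A message is only redelivered (after restart/rebalance) if it sits at or
    // beyond the committed position.
    public boolean wouldRedeliver(long offset) {
        return offset >= nextToRead;
    }

    public static void main(String[] args) {
        OffsetLedger partition = new OffsetLedger();
        partition.commit(13);                              // messages 0..13 processed
        // message 14 fails, nothing committed -> it still lags
        System.out.println(partition.wouldRedeliver(14));  // true
        partition.commit(15);                              // committing 15 also covers 14
        System.out.println(partition.wouldRedeliver(14));  // false
    }
}
```

So to avoid losing a message, the consumer must not commit past the failed offset: either retry in place (block or pause/seek on that partition), or hand the failed record to a retry topic or dead-letter store *before* committing the later offset.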

How do I get consumer-group information in Kafka 2.1.0?

Older Kafka versions exposed consumer information through ZooKeeper (ZkClient); for newer versions everything I find online says to use AdminClient, but AdminClient has no listAllConsumerGroupsf method. How do I get Kafka consumer-group information so I can monitor the groups?
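For what it's worth, the Java `AdminClient` in kafka-clients 2.x does expose group listing — the methods are `listConsumerGroups()`, `describeConsumerGroups(...)` and `listConsumerGroupOffsets(...)` (the `listAllConsumerGroups...` naming belonged to the old internal Scala admin client). A sketch of how monitoring code could use it — not compiled here, and the bootstrap address is an assumption:

```java
Properties props = new Properties();
props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumption
try (AdminClient admin = AdminClient.create(props)) {
    // every consumer group known to the cluster
    for (ConsumerGroupListing group : admin.listConsumerGroups().all().get()) {
        System.out.println("group: " + group.groupId());
        // committed offsets per partition -- compare with log-end offsets to compute lag
        admin.listConsumerGroupOffsets(group.groupId())
             .partitionsToOffsetAndMetadata().get()
             .forEach((tp, meta) -> System.out.println("  " + tp + " -> " + meta.offset()));
    }
}
```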

Can different consumers within the same Kafka consumer group subscribe to different topics?

Suppose a consumer group contains two consumers, c1 and c2, where c1 subscribes to topic1 and c2 subscribes to topic2. What happens? 1. Each consumer consumes only its own topic; 2. both consumers in the group effectively end up subscribed to topic1 and topic2; 3. or is it an error?
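The short answer is option 1: the group coordinator only assigns a topic's partitions to the group members actually subscribed to that topic, so c1 gets topic1's partitions and c2 gets topic2's — though both still share one group, so a rebalance triggered by either member affects both. A broker-free sketch of that assignment rule (plain Java, a simplified stand-in for Kafka's real assignors, not its actual code):

```java
import java.util.*;

// Simplified model of a group assignor with mixed subscriptions: each topic's
// partitions are spread only over the members subscribed to that topic.
class MixedSubscriptionAssignment {

    // subscriptions: member -> topics it subscribed to
    // partitionsPerTopic: topic -> partition count
    // returns: member -> assigned partitions (as "topic-partition" strings)
    public static Map<String, Set<String>> assign(Map<String, Set<String>> subscriptions,
                                                  Map<String, Integer> partitionsPerTopic) {
        Map<String, Set<String>> result = new TreeMap<>();
        subscriptions.keySet().forEach(m -> result.put(m, new TreeSet<>()));
        for (Map.Entry<String, Integer> topic : partitionsPerTopic.entrySet()) {
            // only members subscribed to this topic are eligible for its partitions
            List<String> eligible = new ArrayList<>();
            for (Map.Entry<String, Set<String>> s : subscriptions.entrySet()) {
                if (s.getValue().contains(topic.getKey())) eligible.add(s.getKey());
            }
            if (eligible.isEmpty()) continue; // no subscriber -> nobody gets it
            Collections.sort(eligible);
            for (int p = 0; p < topic.getValue(); p++) {
                // round-robin the topic's partitions over its eligible members
                String member = eligible.get(p % eligible.size());
                result.get(member).add(topic.getKey() + "-" + p);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        Map<String, Set<String>> subs = new TreeMap<>();
        subs.put("c1", new TreeSet<>(Arrays.asList("topic1")));
        subs.put("c2", new TreeSet<>(Arrays.asList("topic2")));
        Map<String, Integer> parts = new TreeMap<>();
        parts.put("topic1", 2);
        parts.put("topic2", 1);
        // prints: {c1=[topic1-0, topic1-1], c2=[topic2-0]}
        System.out.println(assign(subs, parts));
    }
}
```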
