MySQL 5.7.19 on Windows: why won't the binary log start?

I installed the ZIP (no-installer) version of MySQL 5.7.19. The server starts normally and I can log in to mysql, but the binary log never starts.
[screenshot]
Below is the content of my my.ini; I'd appreciate any pointers:
```
# For advice on how to change settings please see
# http://dev.mysql.com/doc/refman/5.7/en/server-configuration-defaults.html
# *** DO NOT EDIT THIS FILE. It's a template which will be copied to the
# *** default location during install, and will be replaced if you
# *** upgrade to a newer version of MySQL.

[client]
default-character-set=utf8

[mysqld]
port=3306
log-bin=mysql-bin
default-storage-engine=INNODB
character-set-server=utf8
collation-server=utf8_general_ci
basedir="E:\MySQL\mysql-5.7.19-winx64/"
datadir="E:\MySQL\mysql-5.7.19-winx64/data/"
tmpdir="E:\MySQL\mysql-5.7.19-winx64/data/"
socket="E:\MySQL\mysql-5.7.19-winx64/data/mysql.sock"
log-error="E:\MySQL\mysql-5.7.19-winx64/data/mysql_error.log"
server-id=1
#skip-locking
max_connections=100
table_open_cache=256
query_cache_size=1M
tmp_table_size=32M
thread_cache_size=8
innodb_data_home_dir="E:\MySQL\mysql-5.7.19-winx64/data/"
innodb_flush_log_at_trx_commit=1
innodb_log_buffer_size=128M
innodb_buffer_pool_size=128M
innodb_log_file_size=10M
innodb_thread_concurrency=16
innodb-autoextend-increment=1000
join_buffer_size=128M
sort_buffer_size=32M
read_rnd_buffer_size=32M
max_allowed_packet=32M
explicit_defaults_for_timestamp=true
sql-mode="STRICT_TRANS_TABLES,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION"
skip-grant-tables
#sql_mode=NO_ENGINE_SUBSTITUTION,STRICT_TRANS_TABLES
```
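One thing worth noting about the paths in the file above is the mixed separator style (`E:\MySQL\...winx64/data/`). MySQL option files on Windows accept forward slashes, while a bare backslash can be read as an escape character, which is a common source of path problems. Below is a minimal, self-contained sketch of how such values could be flagged before MySQL ever reads the file; the sample values are hypothetical stand-ins, not the poster's full config:

```python
import configparser

# A minimal sketch, not the poster's actual file: flag option values that
# still contain Windows backslashes, since forward slashes are the safer
# style in MySQL option files.
sample = r"""
[mysqld]
basedir = "E:\MySQL\mysql-5.7.19-winx64/"
datadir = "E:/MySQL/mysql-5.7.19-winx64/data/"
"""

cp = configparser.ConfigParser(allow_no_value=True)
cp.read_string(sample)

flagged = [key for key, value in cp["mysqld"].items()
           if value and "\\" in value]
print(flagged)  # ['basedir']
```

This only checks separator style; whether it is the actual cause here would still need to be confirmed against the error log named in `log-error`.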

2 answers

Which Winows version are you on?

Typo there, I meant Windows~~~

15:55:55 org.apache.jk.common.ChannelSocket processConnection 警告: processCallbacks status 2 2014-5-12 15:56:20 org.apache.jk.common.ChannelSocket processConnection 警告: processCallbacks status 2 2014-5-12 16:29:55 org.apache.jk.common.ChannelSocket processConnection 警告: processCallbacks status 2 2014-5-12 17:09:33 org.apache.jk.common.ChannelSocket processConnection 警告: processCallbacks status 2 2014-5-13 8:59:49 org.apache.catalina.core.StandardService stop 信息: Stopping service EmpSearchPro 2014-5-13 8:59:49 org.apache.catalina.core.ApplicationContext log 信息: SessionListener: contextDestroyed() 2014-5-13 8:59:49 org.apache.catalina.core.ApplicationContext log 信息: ContextListener: contextDestroyed() 2014-5-13 8:59:50 org.apache.catalina.core.ApplicationContext log 信息: Closing Spring root WebApplicationContext 2014-5-13 8:59:50 org.apache.catalina.loader.WebappClassLoader clearReferencesJdbc 严重: The web application [/WebQuery] registered the JBDC driver [oracle.jdbc.driver.OracleDriver] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered. 2014-5-13 8:59:50 org.apache.catalina.loader.WebappClassLoader clearThreadLocalMap 严重: The web application [/WebQuery] created a ThreadLocal with key of type [null] (value [com.opensymphony.xwork2.inject.ContainerImpl$10@1ab1379]) and a value of type [java.lang.Object[]] (value [[Ljava.lang.Object;@8b8c90]) but failed to remove it when the web application was stopped. This is very likely to create a memory leak. 
2014-5-13 8:59:51 org.apache.catalina.core.StandardWrapper unload
信息: Waiting for 538 instance(s) to be deallocated
2014-5-13 8:59:52 org.apache.catalina.core.StandardWrapper unload
信息: Waiting for 538 instance(s) to be deallocated
2014-5-13 8:59:53 org.apache.catalina.core.StandardWrapper unload
信息: Waiting for 538 instance(s) to be deallocated
2014-5-13 8:59:54 org.apache.catalina.core.ApplicationContext log
信息: Closing Spring root WebApplicationContext
2014-5-13 8:59:54 org.apache.catalina.loader.WebappClassLoader clearReferencesJdbc
严重: The web application [] registered the JBDC driver [com.mysql.jdbc.Driver] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
2014-5-13 8:59:54 org.apache.catalina.loader.WebappClassLoader clearReferencesJdbc
严重: The web application [] registered the JBDC driver [oracle.jdbc.driver.OracleDriver] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
2014-5-13 8:59:54 org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
严重: The web application [] is still processing a request that has yet to finish. This is very likely to create a memory leak. You can control the time allowed for requests to finish by using the unloadDelay attribute of the standard Context implementation.
... (the entry above is repeated verbatim several more times; repeats omitted)
2014-5-13 8:59:54 org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
严重: The web application [] appears to have started a thread named [AWT-Windows] but has failed to stop it. This is very likely to create a memory leak.
2014-5-13 8:59:54 org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
严重: The web application [] appears to have started a thread named [Timer-0] but has failed to stop it. This is very likely to create a memory leak.
2014-5-13 8:59:54 org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
严重: The web application [] appears to have started a thread named [com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread-#0] but has failed to stop it. This is very likely to create a memory leak.
2014-5-13 8:59:54 org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
严重: The web application [] appears to have started a thread named [com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread-#1] but has failed to stop it. This is very likely to create a memory leak.
2014-5-13 8:59:54 org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
严重: The web application [] appears to have started a thread named [com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread-#2] but has failed to stop it. This is very likely to create a memory leak.
... (the "is still processing a request" entry then repeats verbatim many more times before the log ends; repeats omitted)
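The warnings above keep pointing at the `unloadDelay` attribute of the standard Context implementation. As a sketch only (the 10000 ms value is an example, not a recommendation), the attribute can be raised on the webapp's `<Context>` element, e.g. in `META-INF/context.xml`:

```xml
<!-- Example only: give in-flight requests up to 10 s (Tomcat's default is
     2000 ms) to finish before the servlet is unloaded on shutdown/reload. -->
<Context unloadDelay="10000" />
```

Raising the delay only hides the symptom; the leaked threads and unregistered JDBC drivers reported above still need to be cleaned up by the application itself.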
Hive beeline connection: User: root is not allowed to impersonate root
Beeline cannot connect, and this has been troubling me for half a month; I would appreciate some pointers. My Hadoop deployment is single-node. Hive itself can run queries and create databases. Connecting with `!connect jdbc:hive2://devcrm:10000` fails with a permission error:
```
beeline> !connect jdbc:hive2://devcrm:10000
Connecting to jdbc:hive2://devcrm:10000
Enter username for jdbc:hive2://devcrm:10000: hadoop
Enter password for jdbc:hive2://devcrm:10000: ******
19/04/23 15:36:53 [main]: WARN jdbc.HiveConnection: Failed to connect to devcrm:10000
Error: Could not open client transport with JDBC Uri: jdbc:hive2://devcrm:10000: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate hadoop (state=08S01,code=0)
```
Connecting with `beeline -u jdbc:hive2//devcrm:10000 -n hadoop` does not work either:
```
[root@devcrm hadoop]# beeline -u jdbc:hive2//devcrm:10000 -n hadoop
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/kafka/hive/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/kafka/hadoop-2.7.6/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
scan complete in 1ms
scan complete in 963ms
No known driver to handle "jdbc:hive2//devcrm:10000"
Beeline version 2.3.0 by Apache Hive
```
hive-site.xml:
```
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
    <description>username to use against metastore database</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>123</value>
    <description>password to use against metastore database</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://192.168.12.77:3306/hive?createDatabaseIfNotExist=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>
  <property>
    <name>hive.server2.thrift.client.user</name>
    <value>hadoop</value>
    <description>Username to use against thrift client</description>
  </property>
  <property>
    <name>hive.server2.thrift.client.password</name>
    <value>hadoop</value>
    <description>Password to use against thrift client</description>
  </property>
```
core-site.xml:
```
<configuration>
  <!-- NameNode address -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://192.168.11.207:9000</value>
  </property>
  <!-- Directory for files Hadoop generates at runtime -->
  <property>
    <name>hadoop.tmp.dir</name>
    <!--<value>file:/usr/local/kafka/hadoop-2.7.6/tmp</value>-->
    <value>file:/home/hadoop/temp</value>
  </property>
  <!-- Maximum interval between checkpoint backups of the log -->
  <!--
  <name>fs.checkpoint.period</name>
  <value>3600</value>
  -->
  <!-- Hadoop proxy-user settings -->
  <property>
    <!-- hosts from which the proxy user may access the HDFS cluster -->
    <name>hadoop.proxyuser.root.hosts</name>
    <value>*</value>
  </property>
  <property>
    <!-- groups the proxied users may belong to -->
    <name>hadoop.proxyuser.root.groups</name>
    <value>*</value>
  </property>
</configuration>
```
hdfs-site.xml:
```
<configuration>
  <!-- HDFS replica count -->
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <!-- NameNode storage location -->
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/usr/local/kafka/hadoop-2.7.6/tmp/dfs/name</value>
  </property>
  <!-- DataNode storage location -->
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/usr/local/kafka/hadoop-2.7.6/tmp/dfs/data</value>
  </property>
  <property>
    <name>dfs.secondary.http.address</name>
    <value>192.168.11.207:50090</value>
  </property>
  <property>
    <name>dfs.permissions</name>
    <value>false</value>
  </property>
  <!-- enable WebHDFS -->
  <property>
    <name>dfs.webhdfs.enabled</name>
    <value>true</value>
  </property>
</configuration>
```
The page at http://192.168.11.207:10002/ does show HiveServer2's start time:
![图片说明](https://img-ask.csdn.net/upload/201904/23/1556005658_291513.png)
Hive log:
```
2019-04-24T09:20:11,829 INFO [main] http.HttpServer: Started HttpServer[hiveserver2] on port 10002
2019-04-24T09:20:50,464 INFO [HiveServer2-Handler-Pool: Thread-38] thrift.ThriftCLIService: Client protocol version: HIVE_CLI_SERVICE_PROTOCOL_V10
2019-04-24T09:20:50,494 INFO [HiveServer2-Handler-Pool: Thread-38] conf.HiveConf: Using the default value passed in for log id: b0f59ac1-d17a-404f-8bf5-fbe4693c9964
2019-04-24T09:20:50,494 INFO [b0f59ac1-d17a-404f-8bf5-fbe4693c9964 HiveServer2-Handler-Pool: Thread-38] conf.HiveConf: Using the default value passed in for log id: b0f59ac1-d17a-404f-8bf5-fbe4693c9964
2019-04-24T09:20:50,494 INFO [HiveServer2-Handler-Pool: Thread-38] conf.HiveConf: Using the default value passed in for log id: b0f59ac1-d17a-404f-8bf5-fbe4693c9964
2019-04-24T09:20:50,495 INFO [b0f59ac1-d17a-404f-8bf5-fbe4693c9964 HiveServer2-Handler-Pool: Thread-38] conf.HiveConf: Using the default value passed in for log id: b0f59ac1-d17a-404f-8bf5-fbe4693c9964
2019-04-24T09:20:50,494 WARN [HiveServer2-Handler-Pool: Thread-38] service.CompositeService: Failed to open session
java.lang.RuntimeException: java.lang.RuntimeException:
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate hadoop at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:89) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63) ~[hive-service-2.3.0.jar:2.3.0] at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_80] at javax.security.auth.Subject.doAs(Subject.java:415) ~[?:1.7.0_80] at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1758) ~[hadoop-common-2.7.6.jar:?] at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59) ~[hive-service-2.3.0.jar:2.3.0] at com.sun.proxy.$Proxy36.open(Unknown Source) ~[?:?] at org.apache.hive.service.cli.session.SessionManager.createSession(SessionManager.java:410) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.SessionManager.openSession(SessionManager.java:362) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:193) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:440) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:322) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1377) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1362) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39) ~[hive-exec-2.3.0.jar:2.3.0] at 
org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286) ~[hive-exec-2.3.0.jar:2.3.0] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [?:1.7.0_80] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [?:1.7.0_80] at java.lang.Thread.run(Thread.java:745) [?:1.7.0_80] Caused by: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate hadoop at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:606) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:544) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.HiveSessionImpl.open(HiveSessionImpl.java:164) ~[hive-service-2.3.0.jar:2.3.0] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_80] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_80] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_80] at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_80] at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78) ~[hive-service-2.3.0.jar:2.3.0] ... 21 more Caused by: org.apache.hadoop.ipc.RemoteException: User: root is not allowed to impersonate hadoop at org.apache.hadoop.ipc.Client.call(Client.java:1476) ~[hadoop-common-2.7.6.jar:?] at org.apache.hadoop.ipc.Client.call(Client.java:1413) ~[hadoop-common-2.7.6.jar:?] at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) ~[hadoop-common-2.7.6.jar:?] 
at com.sun.proxy.$Proxy29.getFileInfo(Unknown Source) ~[?:?] at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:776) ~[hadoop-hdfs-2.7.6.jar:?] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_80] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_80] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_80] at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_80] at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) ~[hadoop-common-2.7.6.jar:?] at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.7.6.jar:?] at com.sun.proxy.$Proxy30.getFileInfo(Unknown Source) ~[?:?] at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2117) ~[hadoop-hdfs-2.7.6.jar:?] at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305) ~[hadoop-hdfs-2.7.6.jar:?] at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301) ~[hadoop-hdfs-2.7.6.jar:?] at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-2.7.6.jar:?] at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317) ~[hadoop-hdfs-2.7.6.jar:?] at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1425) ~[hadoop-common-2.7.6.jar:?] 
at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:704) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:650) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:582) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:544) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.HiveSessionImpl.open(HiveSessionImpl.java:164) ~[hive-service-2.3.0.jar:2.3.0] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_80] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_80] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_80] at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_80] at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78) ~[hive-service-2.3.0.jar:2.3.0] ... 
21 more 2019-04-24T09:20:50,494 INFO [HiveServer2-Handler-Pool: Thread-38] session.SessionState: Updating thread name to b0f59ac1-d17a-404f-8bf5-fbe4693c9964 HiveServer2-Handler-Pool: Thread-38 2019-04-24T09:20:50,494 INFO [HiveServer2-Handler-Pool: Thread-38] session.SessionState: Resetting thread name to HiveServer2-Handler-Pool: Thread-38 2019-04-24T09:20:50,494 INFO [HiveServer2-Handler-Pool: Thread-38] session.SessionState: Updating thread name to b0f59ac1-d17a-404f-8bf5-fbe4693c9964 HiveServer2-Handler-Pool: Thread-38 2019-04-24T09:20:50,495 INFO [HiveServer2-Handler-Pool: Thread-38] session.SessionState: Resetting thread name to HiveServer2-Handler-Pool: Thread-38 2019-04-24T09:20:50,509 WARN [HiveServer2-Handler-Pool: Thread-38] thrift.ThriftCLIService: Error opening session: org.apache.hive.service.cli.HiveSQLException: Failed to open new session: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate hadoop at org.apache.hive.service.cli.session.SessionManager.createSession(SessionManager.java:419) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.session.SessionManager.openSession(SessionManager.java:362) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.CLIService.openSessionWithImpersonation(CLIService.java:193) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.thrift.ThriftCLIService.getSessionHandle(ThriftCLIService.java:440) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.cli.thrift.ThriftCLIService.OpenSession(ThriftCLIService.java:322) ~[hive-service-2.3.0.jar:2.3.0] at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1377) ~[hive-exec-2.3.0.jar:2.3.0] at org.apache.hive.service.rpc.thrift.TCLIService$Processor$OpenSession.getResult(TCLIService.java:1362) ~[hive-exec-2.3.0.jar:2.3.0] at 
org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39) ~[hive-exec-2.3.0.jar:2.3.0]
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39) ~[hive-exec-2.3.0.jar:2.3.0]
    at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56) ~[hive-service-2.3.0.jar:2.3.0]
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286) ~[hive-exec-2.3.0.jar:2.3.0]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [?:1.7.0_80]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [?:1.7.0_80]
    at java.lang.Thread.run(Thread.java:745) [?:1.7.0_80]
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate hadoop
    at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:89) ~[hive-service-2.3.0.jar:2.3.0]
    at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36) ~[hive-service-2.3.0.jar:2.3.0]
    at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63) ~[hive-service-2.3.0.jar:2.3.0]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_80]
    at javax.security.auth.Subject.doAs(Subject.java:415) ~[?:1.7.0_80]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1758) ~[hadoop-common-2.7.6.jar:?]
    at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59) ~[hive-service-2.3.0.jar:2.3.0]
    at com.sun.proxy.$Proxy36.open(Unknown Source) ~[?:?]
    at org.apache.hive.service.cli.session.SessionManager.createSession(SessionManager.java:410) ~[hive-service-2.3.0.jar:2.3.0]
    ... 13 more
Caused by: java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException): User: root is not allowed to impersonate hadoop
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:606) ~[hive-exec-2.3.0.jar:2.3.0]
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:544) ~[hive-exec-2.3.0.jar:2.3.0]
    at org.apache.hive.service.cli.session.HiveSessionImpl.open(HiveSessionImpl.java:164) ~[hive-service-2.3.0.jar:2.3.0]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_80]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_80]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_80]
    at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_80]
    at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78) ~[hive-service-2.3.0.jar:2.3.0]
    at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36) ~[hive-service-2.3.0.jar:2.3.0]
    at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63) ~[hive-service-2.3.0.jar:2.3.0]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_80]
    at javax.security.auth.Subject.doAs(Subject.java:415) ~[?:1.7.0_80]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1758) ~[hadoop-common-2.7.6.jar:?]
    at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59) ~[hive-service-2.3.0.jar:2.3.0]
    at com.sun.proxy.$Proxy36.open(Unknown Source) ~[?:?]
    at org.apache.hive.service.cli.session.SessionManager.createSession(SessionManager.java:410) ~[hive-service-2.3.0.jar:2.3.0]
    ... 13 more
Caused by: org.apache.hadoop.ipc.RemoteException: User: root is not allowed to impersonate hadoop
    at org.apache.hadoop.ipc.Client.call(Client.java:1476) ~[hadoop-common-2.7.6.jar:?]
    at org.apache.hadoop.ipc.Client.call(Client.java:1413) ~[hadoop-common-2.7.6.jar:?]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229) ~[hadoop-common-2.7.6.jar:?]
    at com.sun.proxy.$Proxy29.getFileInfo(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:776) ~[hadoop-hdfs-2.7.6.jar:?]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_80]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_80]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_80]
    at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_80]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191) ~[hadoop-common-2.7.6.jar:?]
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102) ~[hadoop-common-2.7.6.jar:?]
    at com.sun.proxy.$Proxy30.getFileInfo(Unknown Source) ~[?:?]
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2117) ~[hadoop-hdfs-2.7.6.jar:?]
    at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1305) ~[hadoop-hdfs-2.7.6.jar:?]
    at org.apache.hadoop.hdfs.DistributedFileSystem$22.doCall(DistributedFileSystem.java:1301) ~[hadoop-hdfs-2.7.6.jar:?]
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-2.7.6.jar:?]
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1317) ~[hadoop-hdfs-2.7.6.jar:?]
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1425) ~[hadoop-common-2.7.6.jar:?]
    at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:704) ~[hive-exec-2.3.0.jar:2.3.0]
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:650) ~[hive-exec-2.3.0.jar:2.3.0]
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:582) ~[hive-exec-2.3.0.jar:2.3.0]
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:544) ~[hive-exec-2.3.0.jar:2.3.0]
    at org.apache.hive.service.cli.session.HiveSessionImpl.open(HiveSessionImpl.java:164) ~[hive-service-2.3.0.jar:2.3.0]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.7.0_80]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[?:1.7.0_80]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.7.0_80]
    at java.lang.reflect.Method.invoke(Method.java:606) ~[?:1.7.0_80]
    at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78) ~[hive-service-2.3.0.jar:2.3.0]
    at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36) ~[hive-service-2.3.0.jar:2.3.0]
    at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63) ~[hive-service-2.3.0.jar:2.3.0]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_80]
    at javax.security.auth.Subject.doAs(Subject.java:415) ~[?:1.7.0_80]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1758) ~[hadoop-common-2.7.6.jar:?]
    at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59) ~[hive-service-2.3.0.jar:2.3.0]
    at com.sun.proxy.$Proxy36.open(Unknown Source) ~[?:?]
    at org.apache.hive.service.cli.session.SessionManager.createSession(SessionManager.java:410) ~[hive-service-2.3.0.jar:2.3.0]
    ... 13 more
```
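The root cause repeated through every `Caused by` above is `User: root is not allowed to impersonate hadoop`: HiveServer2 runs as `root`, and HDFS refuses to let `root` impersonate the connecting user because `root` is not registered as a proxy user. A common fix — sketched here on the assumption that you can edit Hadoop's `core-site.xml` and restart HDFS and HiveServer2 — is to whitelist the proxy user on the Hadoop side:

```xml
<!-- core-site.xml: allow the OS user running HiveServer2 ("root", taken from
     the error message above) to impersonate other users.
     "*" is the most permissive value; narrow it to specific hosts/groups in production. -->
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>
```

Alternatively, setting `hive.server2.enable.doAs` to `false` in `hive-site.xml` makes HiveServer2 execute queries as its own user instead of impersonating the client, which sidesteps the proxy-user check entirely.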
[Urgent] Java access to HBase fails with no obvious exception; the client cannot read the target table's information (the requirement is simply to read data from one HBase table)
I'm a bit pressed for time — any pointers would be appreciated. ## Local hosts configuration (the hosts file is identical on every machine): -------------------------- 127.0.0.1 localhost 192.168.0.25 Master.Hadoop 192.168.0.26 Slave1.Hadoop 192.168.0.27 Slave2.Hadoop 192.168.0.28 Slave3.Hadoop ## Basic HBase information: -------------------------- http://master.hadoop:60010/master-status HBase Root Directory: hdfs://Master.Hadoop:9000/hbase ## Code --------------------------
```java
Configuration configuration = HBaseConfiguration.create();
configuration.set("hbase.zookeeper.property.clientPort", "2181");
configuration.set("hbase.zookeeper.quorum", "Master.Hadoop,Slave1.Hadoop,Slave2.Hadoop,Slave1.Hadoop");
configuration.set("hbase.master", "Master.Hadoop:60010"); //"Master.Hadoop:60010"
System.setProperty("hadoop.home.dir", "C:\\Program Files\\hadoop\\hadoop-common-2.2.0-bin-master");
connection = ConnectionFactory.createConnection(configuration);
System.out.println(connection);
Admin admin = connection.getAdmin();
System.out.println(admin);
TableName tableName1 = TableName.valueOf("hbase:meta");
System.out.println(tableName1);
System.out.println(admin.tableExists(tableName1));
Table table = connection.getTable(tableName1);
HTableDescriptor a = table.getTableDescriptor();
System.out.println(a);
```
## Log output: -------------------------- "C:\Program Files\Java\jdk1.8.0_181\bin\java.exe" "-javaagent:C:\Program Files\JetBrains\IntelliJ IDEA 2018.2.3\lib\idea_rt.jar=57269:C:\Program Files\JetBrains\IntelliJ IDEA 2018.2.3\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program Files\Java\jdk1.8.0_181\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\cldrdata.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\jfxrt.jar;C:\Program 
Files\Java\jdk1.8.0_181\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\sunmscapi.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\zipfs.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\javaws.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\jce.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\jfxswt.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\management-agent.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\resources.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\rt.jar;C:\Users\Administrator\IdeaProjects\TestHbase\out\production\TestHbase;D:\lib\xz-1.0.jar;D:\lib\asm-3.1.jar;D:\lib\avro-1.7.4.jar;D:\lib\common-1.0.jar;D:\lib\domain-1.0.jar;D:\lib\jfinal-3.1.jar;D:\lib\joni-2.1.2.jar;D:\lib\noggit-0.6.jar;D:\lib\jsch-0.1.42.jar;D:\lib\service-1.0.jar;D:\lib\xmlenc-0.52.jar;D:\lib\druid-1.0.31.jar;D:\lib\guava-12.0.1.jar;D:\lib\jcifs-1.3.17.jar;D:\lib\jetty-6.1.26.jar;D:\lib\jsr305-1.3.9.jar;D:\lib\log4j-1.2.16.jar;D:\lib\cos-26Dec2008.jar;D:\lib\paranamer-2.3.jar;D:\lib\activation-1.1.jar;D:\lib\commons-el-1.0.jar;D:\lib\commons-io-2.4.jar;D:\lib\httpcore-4.4.4.jar;D:\lib\httpmime-4.4.1.jar;D:\lib\jaxb-api-2.2.2.jar;D:\lib\jcodings-1.0.8.jar;D:\lib\jsp-2.1-6.1.14.jar;D:\lib\stax-api-1.0-2.jar;D:\lib\cglib-nodep-3.1.jar;D:\lib\commons-cli-1.2.jar;D:\lib\commons-net-3.1.jar;D:\lib\disruptor-3.3.0.jar;D:\lib\fastjson-1.2.37.jar;D:\lib\jersey-core-1.9.jar;D:\lib\servlet-api-2.4.jar;D:\lib\slf4j-api-1.6.6.jar;D:\lib\stax2-api-3.1.4.jar;D:\lib\zookeeper-3.4.6.jar;D:\lib\commons-lang-2.6.jar;D:\lib\commons-math-2.2.jar;D:\lib\httpclient-4.5.2.jar;D:\lib\so
lr-solrj-6.1.0.jar;D:\lib\freemarker-2.3.23.jar;D:\lib\hadoop-auth-2.5.1.jar;D:\lib\hadoop-hdfs-2.7.4.jar;D:\lib\jersey-server-1.9.jar;D:\lib\jetty-util-6.1.26.jar;D:\lib\netty-3.6.2.Final.jar;D:\lib\api-util-1.0.0-M20.jar;D:\lib\commons-codec-1.11.jar;D:\lib\hbase-client-1.2.3.jar;D:\lib\hbase-common-1.2.3.jar;D:\lib\hbase-server-1.4.0.jar;D:\lib\jsp-api-2.1-6.1.14.jar;D:\lib\leveldbjni-all-1.8.jar;D:\lib\metrics-core-2.2.0.jar;D:\lib\metrics-core-3.1.2.jar;D:\lib\commons-logging-1.2.jar;D:\lib\commons-math3-3.1.1.jar;D:\lib\hadoop-client-2.7.4.jar;D:\lib\hadoop-common-2.5.1.jar;D:\lib\hbase-metrics-1.4.0.jar;D:\lib\jamon-runtime-2.4.1.jar;D:\lib\protobuf-java-2.5.0.jar;D:\lib\slf4j-log4j12-1.6.6.jar;D:\lib\snappy-java-1.0.4.1.jar;D:\lib\commons-digester-1.8.jar;D:\lib\hbase-protocol-1.2.3.jar;D:\lib\jackson-jaxrs-1.9.13.jar;D:\lib\jcl-over-slf4j-1.7.7.jar;D:\lib\commons-daemon-1.0.13.jar;D:\lib\hadoop-yarn-api-2.7.4.jar;D:\lib\hbase-procedure-1.4.0.jar;D:\lib\jasper-runtime-5.5.23.jar;D:\lib\api-asn1-api-1.0.0-M20.jar;D:\lib\commons-compress-1.4.1.jar;D:\lib\commons-httpclient-3.1.jar;D:\lib\jasper-compiler-5.5.23.jar;D:\lib\jetty-sslengine-6.1.26.jar;D:\lib\netty-all-4.0.23.Final.jar;D:\lib\servlet-api-2.5-6.1.14.jar;D:\lib\apacheds-i18n-2.0.0-M15.jar;D:\lib\commons-beanutils-1.7.0.jar;D:\lib\hbase-annotations-1.2.3.jar;D:\lib\hbase-metrics-api-1.4.0.jar;D:\lib\hbase-prefix-tree-1.4.0.jar;D:\lib\jackson-core-asl-1.9.13.jar;D:\lib\woodstox-core-asl-4.4.1.jar;D:\lib\hadoop-annotations-2.5.1.jar;D:\lib\hadoop-yarn-client-2.7.4.jar;D:\lib\hadoop-yarn-common-2.5.1.jar;D:\lib\hbase-common-1.4.0-tests.jar;D:\lib\commons-collections-3.2.2.jar;D:\lib\commons-configuration-1.6.jar;D:\lib\hbase-hadoop-compat-1.4.0.jar;D:\lib\jackson-mapper-asl-1.9.13.jar;D:\lib\hbase-hadoop2-compat-1.4.0.jar;D:\lib\mysql-connector-java-5.1.38.jar;D:\lib\commons-beanutils-core-1.8.0.jar;D:\lib\findbugs-annotations-1.3.9-1.jar;D:\lib\htrace-core-3.1.0-incubating.jar;D:\lib\hadoop-yarn-server-
common-2.7.4.jar;D:\lib\apacheds-kerberos-codec-2.0.0-M15.jar;D:\lib\hadoop-mapreduce-client-app-2.7.4.jar;D:\lib\hadoop-mapreduce-client-core-2.5.1.jar;D:\lib\hadoop-mapreduce-client-common-2.7.4.jar;D:\lib\hadoop-mapreduce-client-shuffle-2.7.4.jar;D:\lib\hadoop-mapreduce-client-jobclient-2.7.4.jar" TestHbase [DEBUG] [10:23:53] org.apache.hadoop.security.Groups - Creating new Groups object [DEBUG] [10:23:53] org.apache.hadoop.util.NativeCodeLoader - Trying to load the custom-built native-hadoop library... [DEBUG] [10:23:53] org.apache.hadoop.util.NativeCodeLoader - Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path [DEBUG] [10:23:53] org.apache.hadoop.util.NativeCodeLoader - java.library.path=C:\Program Files\Java\jdk1.8.0_181\bin;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;C:\Program Files (x86)\Common Files\Oracle\Java\javapath;C:\ProgramData\Oracle\Java\javapath;C:\Program Files\Java\jdk1.7.0_51\bin;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;. [WARN ] [10:23:53] org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... 
using builtin-java classes where applicable [DEBUG] [10:23:53] org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback - Falling back to shell based [DEBUG] [10:23:53] org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback - Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping [DEBUG] [10:23:53] org.apache.hadoop.security.Groups - Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000 [DEBUG] [10:23:53] org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, value=[Rate of successful kerberos logins and latency (milliseconds)], valueName=Time) [DEBUG] [10:23:53] org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, value=[Rate of failed kerberos logins and latency (milliseconds)], valueName=Time) [DEBUG] [10:23:53] org.apache.hadoop.metrics2.lib.MutableMetricsFactory - field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(about=, sampleName=Ops, always=false, type=DEFAULT, value=[GetGroups], valueName=Time) [DEBUG] [10:23:53] org.apache.hadoop.metrics2.impl.MetricsSystemImpl - UgiMetrics, User and group related metrics [DEBUG] [10:23:54] org.apache.hadoop.security.authentication.util.KerberosName - Kerberos krb5 configuration not found, setting default realm to empty [DEBUG] [10:23:54] org.apache.hadoop.security.UserGroupInformation - hadoop login [DEBUG] [10:23:54] 
org.apache.hadoop.security.UserGroupInformation - hadoop login commit [DEBUG] [10:23:54] org.apache.hadoop.security.UserGroupInformation - using local user:NTUserPrincipal: admin [DEBUG] [10:23:54] org.apache.hadoop.security.UserGroupInformation - UGI loginUser:admin (auth:SIMPLE) [INFO ] [10:23:55] org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper - Process identifier=hconnection-0x17d0685f connecting to ZooKeeper ensemble=192.168.0.25:2181 [INFO ] [10:23:55] org.apache.zookeeper.ZooKeeper - Client environment:zookeeper.version=3.4.6-1569965, built on 02/20/2014 09:09 GMT [INFO ] [10:23:55] org.apache.zookeeper.ZooKeeper - Client environment:host.name=WIN-SSJFMH6ELVT [INFO ] [10:23:55] org.apache.zookeeper.ZooKeeper - Client environment:java.version=1.8.0_181 [INFO ] [10:23:55] org.apache.zookeeper.ZooKeeper - Client environment:java.vendor=Oracle Corporation [INFO ] [10:23:55] org.apache.zookeeper.ZooKeeper - Client environment:java.home=C:\Program Files\Java\jdk1.8.0_181\jre [INFO ] [10:23:55] org.apache.zookeeper.ZooKeeper - Client environment:java.class.path=C:\Program Files\Java\jdk1.8.0_181\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\cldrdata.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\jfxrt.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\sunmscapi.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\ext\zipfs.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\javaws.jar;C:\Program 
Files\Java\jdk1.8.0_181\jre\lib\jce.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\jfxswt.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\management-agent.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\resources.jar;C:\Program Files\Java\jdk1.8.0_181\jre\lib\rt.jar;C:\Users\Administrator\IdeaProjects\TestHbase\out\production\TestHbase;D:\lib\xz-1.0.jar;D:\lib\asm-3.1.jar;D:\lib\avro-1.7.4.jar;D:\lib\common-1.0.jar;D:\lib\domain-1.0.jar;D:\lib\jfinal-3.1.jar;D:\lib\joni-2.1.2.jar;D:\lib\noggit-0.6.jar;D:\lib\jsch-0.1.42.jar;D:\lib\service-1.0.jar;D:\lib\xmlenc-0.52.jar;D:\lib\druid-1.0.31.jar;D:\lib\guava-12.0.1.jar;D:\lib\jcifs-1.3.17.jar;D:\lib\jetty-6.1.26.jar;D:\lib\jsr305-1.3.9.jar;D:\lib\log4j-1.2.16.jar;D:\lib\cos-26Dec2008.jar;D:\lib\paranamer-2.3.jar;D:\lib\activation-1.1.jar;D:\lib\commons-el-1.0.jar;D:\lib\commons-io-2.4.jar;D:\lib\httpcore-4.4.4.jar;D:\lib\httpmime-4.4.1.jar;D:\lib\jaxb-api-2.2.2.jar;D:\lib\jcodings-1.0.8.jar;D:\lib\jsp-2.1-6.1.14.jar;D:\lib\stax-api-1.0-2.jar;D:\lib\cglib-nodep-3.1.jar;D:\lib\commons-cli-1.2.jar;D:\lib\commons-net-3.1.jar;D:\lib\disruptor-3.3.0.jar;D:\lib\fastjson-1.2.37.jar;D:\lib\jersey-core-1.9.jar;D:\lib\servlet-api-2.4.jar;D:\lib\slf4j-api-1.6.6.jar;D:\lib\stax2-api-3.1.4.jar;D:\lib\zookeeper-3.4.6.jar;D:\lib\commons-lang-2.6.jar;D:\lib\commons-math-2.2.jar;D:\lib\httpclient-4.5.2.jar;D:\lib\solr-solrj-6.1.0.jar;D:\lib\freemarker-2.3.23.jar;D:\lib\hadoop-auth-2.5.1.jar;D:\lib\hadoop-hdfs-2.7.4.jar;D:\lib\jersey-server-1.9.jar;D:\lib\jetty-util-6.1.26.jar;D:\lib\netty-3.6.2.Final.jar;D:\lib\api-util-1.0.0-M20.jar;D:\lib\commons-codec-1.11.jar;D:\lib\hbase-client-1.2.3.jar;D:\lib\hbase-common-1.2.3.jar;D:\lib\hbase-server-1.4.0.jar;D:\lib\jsp-api-2.1-6.1.14.jar;D:\lib\leveldbjni-all-1.8.jar;D:\lib\metrics-core-2.2.0.jar;D:\lib\metrics-core-3.1.2.jar;D:\lib\commons-
logging-1.2.jar;D:\lib\commons-math3-3.1.1.jar;D:\lib\hadoop-client-2.7.4.jar;D:\lib\hadoop-common-2.5.1.jar;D:\lib\hbase-metrics-1.4.0.jar;D:\lib\jamon-runtime-2.4.1.jar;D:\lib\protobuf-java-2.5.0.jar;D:\lib\slf4j-log4j12-1.6.6.jar;D:\lib\snappy-java-1.0.4.1.jar;D:\lib\commons-digester-1.8.jar;D:\lib\hbase-protocol-1.2.3.jar;D:\lib\jackson-jaxrs-1.9.13.jar;D:\lib\jcl-over-slf4j-1.7.7.jar;D:\lib\commons-daemon-1.0.13.jar;D:\lib\hadoop-yarn-api-2.7.4.jar;D:\lib\hbase-procedure-1.4.0.jar;D:\lib\jasper-runtime-5.5.23.jar;D:\lib\api-asn1-api-1.0.0-M20.jar;D:\lib\commons-compress-1.4.1.jar;D:\lib\commons-httpclient-3.1.jar;D:\lib\jasper-compiler-5.5.23.jar;D:\lib\jetty-sslengine-6.1.26.jar;D:\lib\netty-all-4.0.23.Final.jar;D:\lib\servlet-api-2.5-6.1.14.jar;D:\lib\apacheds-i18n-2.0.0-M15.jar;D:\lib\commons-beanutils-1.7.0.jar;D:\lib\hbase-annotations-1.2.3.jar;D:\lib\hbase-metrics-api-1.4.0.jar;D:\lib\hbase-prefix-tree-1.4.0.jar;D:\lib\jackson-core-asl-1.9.13.jar;D:\lib\woodstox-core-asl-4.4.1.jar;D:\lib\hadoop-annotations-2.5.1.jar;D:\lib\hadoop-yarn-client-2.7.4.jar;D:\lib\hadoop-yarn-common-2.5.1.jar;D:\lib\hbase-common-1.4.0-tests.jar;D:\lib\commons-collections-3.2.2.jar;D:\lib\commons-configuration-1.6.jar;D:\lib\hbase-hadoop-compat-1.4.0.jar;D:\lib\jackson-mapper-asl-1.9.13.jar;D:\lib\hbase-hadoop2-compat-1.4.0.jar;D:\lib\mysql-connector-java-5.1.38.jar;D:\lib\commons-beanutils-core-1.8.0.jar;D:\lib\findbugs-annotations-1.3.9-1.jar;D:\lib\htrace-core-3.1.0-incubating.jar;D:\lib\hadoop-yarn-server-common-2.7.4.jar;D:\lib\apacheds-kerberos-codec-2.0.0-M15.jar;D:\lib\hadoop-mapreduce-client-app-2.7.4.jar;D:\lib\hadoop-mapreduce-client-core-2.5.1.jar;D:\lib\hadoop-mapreduce-client-common-2.7.4.jar;D:\lib\hadoop-mapreduce-client-shuffle-2.7.4.jar;D:\lib\hadoop-mapreduce-client-jobclient-2.7.4.jar;C:\Program Files\JetBrains\IntelliJ IDEA 2018.2.3\lib\idea_rt.jar [INFO ] [10:23:55] org.apache.zookeeper.ZooKeeper - Client environment:java.library.path=C:\Program 
Files\Java\jdk1.8.0_181\bin;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;C:\Program Files (x86)\Common Files\Oracle\Java\javapath;C:\ProgramData\Oracle\Java\javapath;C:\Program Files\Java\jdk1.7.0_51\bin;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;. [INFO ] [10:23:55] org.apache.zookeeper.ZooKeeper - Client environment:java.io.tmpdir=C:\Users\ADMINI~1\AppData\Local\Temp\1\ [INFO ] [10:23:55] org.apache.zookeeper.ZooKeeper - Client environment:java.compiler=<NA> [INFO ] [10:23:55] org.apache.zookeeper.ZooKeeper - Client environment:os.name=Windows Server 2008 R2 [INFO ] [10:23:55] org.apache.zookeeper.ZooKeeper - Client environment:os.arch=amd64 [INFO ] [10:23:55] org.apache.zookeeper.ZooKeeper - Client environment:os.version=6.1 [INFO ] [10:23:55] org.apache.zookeeper.ZooKeeper - Client environment:user.name=admin [INFO ] [10:23:55] org.apache.zookeeper.ZooKeeper - Client environment:user.home=C:\Users\Administrator [INFO ] [10:23:55] org.apache.zookeeper.ZooKeeper - Client environment:user.dir=C:\Users\Administrator\IdeaProjects\TestHbase [INFO ] [10:23:55] org.apache.zookeeper.ZooKeeper - Initiating client connection, connectString=192.168.0.25:2181 sessionTimeout=90000 watcher=hconnection-0x17d0685f0x0, quorum=192.168.0.25:2181, baseZNode=/hbase [DEBUG] [10:23:55] org.apache.zookeeper.ClientCnxn - zookeeper.disableAutoWatchReset is false [INFO ] [10:23:55] org.apache.zookeeper.ClientCnxn - Opening socket connection to server 192.168.0.25/192.168.0.25:2181. 
Will not attempt to authenticate using SASL (unknown error) [INFO ] [10:23:55] org.apache.zookeeper.ClientCnxn - Socket connection established to 192.168.0.25/192.168.0.25:2181, initiating session [DEBUG] [10:23:55] org.apache.zookeeper.ClientCnxn - Session establishment request sent on 192.168.0.25/192.168.0.25:2181 [INFO ] [10:23:56] org.apache.zookeeper.ClientCnxn - Session establishment complete on server 192.168.0.25/192.168.0.25:2181, sessionid = 0x165a4d303830022, negotiated timeout = 40000 [DEBUG] [10:23:56] org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher - hconnection-0x17d0685f0x0, quorum=192.168.0.25:2181, baseZNode=/hbase Received ZooKeeper Event, type=None, state=SyncConnected, path=null [DEBUG] [10:23:56] org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher - hconnection-0x17d0685f-0x165a4d303830022 connected [DEBUG] [10:23:56] org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x165a4d303830022, packet:: clientPath:null serverPath:null finished:false header:: 1,3 replyHeader:: 1,184683596486,0 request:: '/hbase/hbaseid,F response:: s{4294967310,184683593733,1421399758722,1536068918058,938,0,0,0,60,0,4294967310} [DEBUG] [10:23:56] org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x165a4d303830022, packet:: clientPath:null serverPath:null finished:false header:: 2,4 replyHeader:: 2,184683596486,0 request:: '/hbase/hbaseid,F response:: #ffffffff000133135313538404d61737465722e4861646f6f7066626162613563302d313737332d343731342d613630622d643233626232623865373831,s{4294967310,184683593733,1421399758722,1536068918058,938,0,0,0,60,0,4294967310} [DEBUG] [10:23:56] org.apache.hadoop.hbase.ipc.AbstractRpcClient - Codec=org.apache.hadoop.hbase.codec.KeyValueCodec@7c1e2a9e, compressor=null, tcpKeepAlive=true, tcpNoDelay=true, connectTO=10000, readTO=20000, writeTO=60000, minIdleTimeBeforeClose=120000, maxRetries=0, fallbackAllowed=false, bind address=null hconnection-0x17d0685f org.apache.hadoop.hbase.client.HBaseAdmin@272ed83b hbase:meta true 
[DEBUG] [10:23:56] org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x165a4d303830022, packet:: clientPath:null serverPath:null finished:false header:: 3,3 replyHeader:: 3,184683596486,0 request:: '/hbase,F response:: s{4294967298,4294967298,1421399756338,1421399756338,0,1956,0,0,0,12,184683593750} [DEBUG] [10:23:56] org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x165a4d303830022, packet:: clientPath:null serverPath:null finished:false header:: 4,4 replyHeader:: 4,184683596486,0 request:: '/hbase/master,F response:: #ffffffff000133135313538404d61737465722e4861646f6f70004d61737465722e4861646f6f702c36303030302c31353336303638393135363739,s{184683593731,184683593731,1536068917156,1536068917156,0,0,0,316840575388352512,59,0,184683593731} [DEBUG] [10:23:56] org.apache.hadoop.hbase.ipc.RpcClientImpl - Use SIMPLE authentication for service MasterService, sasl=false [DEBUG] [10:23:56] org.apache.hadoop.hbase.ipc.RpcClientImpl - Connecting to Master.Hadoop/192.168.0.25:60000 [DEBUG] [10:23:57] org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x165a4d303830022, packet:: clientPath:null serverPath:null finished:false header:: 5,3 replyHeader:: 5,184683596486,0 request:: '/hbase,F response:: s{4294967298,4294967298,1421399756338,1421399756338,0,1956,0,0,0,12,184683593750} [DEBUG] [10:23:57] org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x165a4d303830022, packet:: clientPath:null serverPath:null finished:false header:: 6,4 replyHeader:: 6,184683596486,0 request:: '/hbase/master,F response:: #ffffffff000133135313538404d61737465722e4861646f6f70004d61737465722e4861646f6f702c36303030302c31353336303638393135363739,s{184683593731,184683593731,1536068917156,1536068917156,0,0,0,316840575388352512,59,0,184683593731} [DEBUG] [10:23:57] org.apache.hadoop.hbase.ipc.RpcClientImpl - Use SIMPLE authentication for service MasterService, sasl=false [DEBUG] [10:23:57] org.apache.hadoop.hbase.ipc.RpcClientImpl - Not trying to connect to 
Master.Hadoop/192.168.0.25:60000 this server is in the failed servers list [DEBUG] [10:23:57] org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x165a4d303830022, packet:: clientPath:null serverPath:null finished:false header:: 7,3 replyHeader:: 7,184683596486,0 request:: '/hbase,F response:: s{4294967298,4294967298,1421399756338,1421399756338,0,1956,0,0,0,12,184683593750} [DEBUG] [10:23:57] org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x165a4d303830022, packet:: clientPath:null serverPath:null finished:false header:: 8,4 replyHeader:: 8,184683596486,0 request:: '/hbase/master,F response:: #ffffffff000133135313538404d61737465722e4861646f6f70004d61737465722e4861646f6f702c36303030302c31353336303638393135363739,s{184683593731,184683593731,1536068917156,1536068917156,0,0,0,316840575388352512,59,0,184683593731} [DEBUG] [10:23:57] org.apache.hadoop.hbase.ipc.RpcClientImpl - Use SIMPLE authentication for service MasterService, sasl=false [DEBUG] [10:23:57] org.apache.hadoop.hbase.ipc.RpcClientImpl - Not trying to connect to Master.Hadoop/192.168.0.25:60000 this server is in the failed servers list [DEBUG] [10:23:57] org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x165a4d303830022, packet:: clientPath:null serverPath:null finished:false header:: 9,3 replyHeader:: 9,184683596486,0 request:: '/hbase,F response:: s{4294967298,4294967298,1421399756338,1421399756338,0,1956,0,0,0,12,184683593750} [DEBUG] [10:23:57] org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x165a4d303830022, packet:: clientPath:null serverPath:null finished:false header:: 10,4 replyHeader:: 10,184683596486,0 request:: '/hbase/master,F response:: #ffffffff000133135313538404d61737465722e4861646f6f70004d61737465722e4861646f6f702c36303030302c31353336303638393135363739,s{184683593731,184683593731,1536068917156,1536068917156,0,0,0,316840575388352512,59,0,184683593731} [DEBUG] [10:23:57] org.apache.hadoop.hbase.ipc.RpcClientImpl - Use SIMPLE authentication for service 
MasterService, sasl=false [DEBUG] [10:23:57] org.apache.hadoop.hbase.ipc.RpcClientImpl - Not trying to connect to Master.Hadoop/192.168.0.25:60000 this server is in the failed servers list [DEBUG] [10:23:58] org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x165a4d303830022, packet:: clientPath:null serverPath:null finished:false header:: 11,3 replyHeader:: 11,184683596486,0 request:: '/hbase,F response:: s{4294967298,4294967298,1421399756338,1421399756338,0,1956,0,0,0,12,184683593750} [DEBUG] [10:23:58] org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x165a4d303830022, packet:: clientPath:null serverPath:null finished:false header:: 12,4 replyHeader:: 12,184683596486,0 request:: '/hbase/master,F response:: #ffffffff000133135313538404d61737465722e4861646f6f70004d61737465722e4861646f6f702c36303030302c31353336303638393135363739,s{184683593731,184683593731,1536068917156,1536068917156,0,0,0,316840575388352512,59,0,184683593731} [DEBUG] [10:23:58] org.apache.hadoop.hbase.ipc.RpcClientImpl - Use SIMPLE authentication for service MasterService, sasl=false [DEBUG] [10:23:58] org.apache.hadoop.hbase.ipc.RpcClientImpl - Not trying to connect to Master.Hadoop/192.168.0.25:60000 this server is in the failed servers list [DEBUG] [10:23:59] org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x165a4d303830022, packet:: clientPath:null serverPath:null finished:false header:: 13,3 replyHeader:: 13,184683596486,0 request:: '/hbase,F response:: s{4294967298,4294967298,1421399756338,1421399756338,0,1956,0,0,0,12,184683593750} [DEBUG] [10:23:59] org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x165a4d303830022, packet:: clientPath:null serverPath:null finished:false header:: 14,4 replyHeader:: 14,184683596486,0 request:: '/hbase/master,F response:: 
#ffffffff000133135313538404d61737465722e4861646f6f70004d61737465722e4861646f6f702c36303030302c31353336303638393135363739,s{184683593731,184683593731,1536068917156,1536068917156,0,0,0,316840575388352512,59,0,184683593731}
[DEBUG] [10:23:59] org.apache.hadoop.hbase.ipc.RpcClientImpl - Use SIMPLE authentication for service MasterService, sasl=false
[DEBUG] [10:23:59] org.apache.hadoop.hbase.ipc.RpcClientImpl - Connecting to Master.Hadoop/192.168.0.25:60000
[DEBUG] [10:24:01] org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x165a4d303830022, packet:: clientPath:null serverPath:null finished:false header:: 15,3 replyHeader:: 15,184683596486,0 request:: '/hbase,F response:: s{4294967298,4294967298,1421399756338,1421399756338,0,1956,0,0,0,12,184683593750}
[DEBUG] [10:24:01] org.apache.zookeeper.ClientCnxn - Reading reply sessionid:0x165a4d303830022, packet:: clientPath:null serverPath:null finished:false header:: 16,4 replyHeader:: 16,184683596486,0 request:: '/hbase/master,F response:: #ffffffff000133135313538404d61737465722e4861646f6f70004d61737465722e4861646f6f702c36303030302c31353336303638393135363739,s{184683593731,184683593731,1536068917156,1536068917156,0,0,0,316840575388352512,59,0,184683593731}
[DEBUG] [10:24:01] org.apache.hadoop.hbase.ipc.RpcClientImpl - Use SIMPLE authentication for service MasterService, sasl=false
[DEBUG] [10:24:01] org.apache.hadoop.hbase.ipc.RpcClientImpl - Connecting to Master.Hadoop/192.168.0.25:60000

(the identical ZooKeeper read of /hbase and /hbase/master followed by a connect attempt to Master.Hadoop/192.168.0.25:60000 repeats every 10 to 20 seconds; only the retry counter and elapsed time advance)

[INFO ] [10:24:35] org.apache.hadoop.hbase.client.RpcRetryingCaller - Call exception, tries=10, retries=35, started=38894 ms ago, cancelled=false, msg=
[INFO ] [10:24:45] org.apache.hadoop.hbase.client.RpcRetryingCaller - Call exception, tries=11, retries=35, started=48956 ms ago, cancelled=false, msg=
[INFO ] [10:25:05] org.apache.hadoop.hbase.client.RpcRetryingCaller - Call exception, tries=12, retries=35, started=69113 ms ago, cancelled=false, msg=
[INFO ] [10:25:25] org.apache.hadoop.hbase.client.RpcRetryingCaller - Call exception, tries=13, retries=35, started=89176 ms ago, cancelled=false, msg=
[INFO ] [10:25:45] org.apache.hadoop.hbase.client.RpcRetryingCaller - Call exception, tries=14, retries=35, started=109299 ms ago, cancelled=false, msg=
[INFO ] [10:26:05] org.apache.hadoop.hbase.client.RpcRetryingCaller - Call exception, tries=15, retries=35, started=129364 ms ago, cancelled=false, msg=
[INFO ] [10:26:26] org.apache.hadoop.hbase.client.RpcRetryingCaller - Call exception, tries=16, retries=35, started=149567 ms ago, cancelled=false, msg=
[INFO ] [10:26:46] org.apache.hadoop.hbase.client.RpcRetryingCaller - Call exception, tries=17, retries=35, started=169742 ms ago, cancelled=false, msg=
[INFO ] [10:27:06] org.apache.hadoop.hbase.client.RpcRetryingCaller - Call exception, tries=18, retries=35, started=189961 ms ago, cancelled=false, msg=
[INFO ] [10:27:26] org.apache.hadoop.hbase.client.RpcRetryingCaller - Call exception, tries=19, retries=35, started=210070 ms ago, cancelled=false, msg=
[INFO ] [10:27:46] org.apache.hadoop.hbase.client.RpcRetryingCaller - Call exception, tries=20, retries=35, started=230074 ms ago, cancelled=false, msg=
[DEBUG] [10:28:00] org.apache.zookeeper.ClientCnxn - Got ping response for sessionid: 0x165a4d303830022 after 1ms
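Reading the log above, the client reaches ZooKeeper fine (every read of /hbase and /hbase/master succeeds), but each RPC to Master.Hadoop/192.168.0.25:60000 stalls, so RpcRetryingCaller keeps retrying, from tries=10 at 38 s elapsed up to tries=20 at 230 s. When eyeballing a log like this, the retry progression can be pulled out with a throwaway script. This is just a sketch; `parse_retries` is my own helper, not any HBase API:

```python
import re

# Matches the RpcRetryingCaller lines in the log above and captures
# the retry counter, the configured retry limit, and the elapsed time.
PATTERN = re.compile(r"tries=(\d+), retries=(\d+), started=(\d+) ms ago")

def parse_retries(log_text):
    """Return a list of (tries, elapsed_ms) tuples from RpcRetryingCaller lines."""
    return [(int(m.group(1)), int(m.group(3))) for m in PATTERN.finditer(log_text)]

# Two sample lines copied from the log above.
sample = """\
[INFO ] [10:24:35] org.apache.hadoop.hbase.client.RpcRetryingCaller - Call exception, tries=10, retries=35, started=38894 ms ago, cancelled=false, msg=
[INFO ] [10:24:45] org.apache.hadoop.hbase.client.RpcRetryingCaller - Call exception, tries=11, retries=35, started=48956 ms ago, cancelled=false, msg=
"""

for tries, elapsed in parse_retries(sample):
    print(tries, elapsed)
```

A steadily growing elapsed time with an empty msg= field, as here, usually means the TCP connection to the master port never completes, which points at the master process being down on 60000 or a firewall dropping the packets, rather than at ZooKeeper.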
Urgent: MySQL database keeps stopping on its own
Database: MySQL 5.6.17. OS: Windows Server 2008, 32-bit. The application was running normally when the database service suddenly stopped on its own. Log:

2016-12-03 11:14:06 4860 [ERROR] Can't create thread to handle request (errno= 12)
2016-12-03 11:15:25 4860 [ERROR] InnoDB: InnoDB: Unable to allocate memory of size 1065016.
2016-12-03 11:15:25 1398 InnoDB: Assertion failure in thread 5016 in file ha_innodb.cc line 17080
InnoDB: We intentionally generate a memory trap.
InnoDB: Submit a detailed bug report to http://bugs.mysql.com.
InnoDB: If you get repeated assertion failures or crashes, even
InnoDB: immediately after the mysqld startup, there may be
InnoDB: corruption in the InnoDB tablespace. Please refer to
InnoDB: http://dev.mysql.com/doc/refman/5.6/en/forcing-innodb-recovery.html
InnoDB: about forcing recovery.
2016-12-03 11:15:39 4860 [ERROR] Error log throttle: 7 'Can't create thread to handle new connection' error(s) suppressed
2016-12-03 11:15:39 4860 [ERROR] Can't create thread to handle request (errno= 12)
2016-12-03 11:16:40 4860 [ERROR] Error log throttle: 963 'Can't create thread to handle new connection' error(s) suppressed
2016-12-03 11:16:40 4860 [ERROR] Can't create thread to handle request (errno= 12)
2016-12-03 11:18:00 5108 [Note] Plugin 'FEDERATED' is disabled.
2016-12-03 11:18:00 171c InnoDB: Warning: Using innodb_additional_mem_pool_size is DEPRECATED. This option may be removed in future releases, together with the option innodb_use_sys_malloc and with the InnoDB's internal memory allocator.
2016-12-03 11:18:00 5108 [Note] InnoDB: Using atomics to ref count buffer pool pages
2016-12-03 11:18:00 5108 [Note] InnoDB: The InnoDB memory heap is disabled
2016-12-03 11:18:00 5108 [Note] InnoDB: Mutexes and rw_locks use Windows interlocked functions
2016-12-03 11:18:00 5108 [Note] InnoDB: Compressed tables use zlib 1.2.3
2016-12-03 11:18:00 5108 [Note] InnoDB: Not using CPU crc32 instructions
2016-12-03 11:18:00 5108 [Note] InnoDB: Initializing buffer pool, size = 823.0M
2016-12-03 11:18:00 5108 [Note] InnoDB: Completed initialization of buffer pool
2016-12-03 11:18:00 5108 [Note] InnoDB: Highest supported file format is Barracuda.
2016-12-03 11:18:00 5108 [Note] InnoDB: Log scan progressed past the checkpoint lsn 37892781681
2016-12-03 11:18:00 5108 [Note] InnoDB: Database was not shutdown normally!
2016-12-03 11:18:00 5108 [Note] InnoDB: Starting crash recovery.
2016-12-03 11:18:00 5108 [Note] InnoDB: Reading tablespace information from the .ibd files...
2016-12-03 11:18:04 5108 [Note] InnoDB: Restoring possible half-written data pages
2016-12-03 11:18:04 5108 [Note] InnoDB: from the doublewrite buffer...
InnoDB: Doing recovery: scanned up to log sequence number 37893927335
InnoDB: 6 transaction(s) which must be rolled back or cleaned up
InnoDB: in total 63 row operations to undo
InnoDB: Trx id counter is 12591616
2016-12-03 11:18:05 5108 [Note] InnoDB: Starting an apply batch of log records to the database...
InnoDB: Progress in percent: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99 InnoDB: Apply batch completed
2016-12-03 11:18:06 5108 [Note] InnoDB: 128 rollback segment(s) are active.
InnoDB: Starting in background the rollback of uncommitted transactions

I run five Tomcat instances behind nginx, and roughly 100 clients access the database frequently. I assumed that would be no problem, but the database has now stopped suddenly two days in a row. Urgently hoping someone can help, thanks!
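errno 12 is ENOMEM: mysqld failed to allocate memory for new connection threads, and InnoDB then hit the same allocation failure and aborted. On 32-bit Windows a process gets roughly 2 GB of user address space, so an 823 MB buffer pool plus per-connection buffers leaves little headroom once the address space fragments. A back-of-the-envelope check; the buffer pool size comes from the log above, while the per-thread figures and max_connections are assumed defaults, not the poster's actual settings:

```python
# Rough worst-case estimate of mysqld's memory footprint, to compare
# against the ~2 GB user address space of a 32-bit Windows process.
MB = 1024 * 1024

innodb_buffer_pool = 823 * MB                   # from the error log
key_buffer = 16 * MB                            # assumed default
per_thread = (256 + 256 + 256 + 1024) * 1024    # read/read_rnd/sort buffers + thread stack, assumed
max_connections = 151                           # MySQL default, assumed

worst_case = innodb_buffer_pool + key_buffer + per_thread * max_connections
print(round(worst_case / MB), "MB")
```

Even when the paper total stays under 2 GB, as in this estimate, thread creation can still fail once the 32-bit address space is fragmented, which matches the "Can't create thread" errors here. The usual fixes are a 64-bit OS and mysqld build, or shrinking innodb_buffer_pool_size and max_connections to regain headroom.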
Error when installing the sqljieba plugin in MySQL
Installing the sqljieba plugin in MySQL crashes the server:

sudo cp libsqljieba.so /usr/lib/mysql/plugin
sudo cp -r ./dict /usr/share

# the install statement fails
install plugin sqljieba soname 'libsqljieba.so';
ERROR 2013 (HY000): Lost connection to MySQL server during query

The library is present under the plugin directory:

+------------------------+
| @@plugin_dir           |
+------------------------+
| /usr/lib/mysql/plugin/ |
+------------------------+

c@PC:/usr/lib/mysql/plugin$ ll
total 792
drwxr-xr-x 2 root root   4096 Jun 14 13:40 ./
drwxr-xr-x 3 root root   4096 Jun 14 13:40 ../
-rw-r--r-- 1 root root  21224 Apr 20 19:52 adt_null.so
-rw-r--r-- 1 root root   6288 Apr 20 19:52 auth_socket.so
-rw-r--r-- 1 root root  44144 Apr 20 19:52 connection_control.so
-rw-r--r-- 1 root root 108696 Apr 20 19:52 innodb_engine.so
-rw-r--r-- 1 root root  88608 Apr 20 19:52 keyring_file.so
-rw-r--r-- 1 root root 154592 Apr 20 19:52 libmemcached.so
-rwxr-xr-x 1 root root 141592 Jun 14 02:20 libsqljieba.so

Checking the error log afterwards, the server threw while executing the statement: the read of /usr/share/dict/jieba.dict.utf8 failed.

# error log
vim /var/log/mysql/error.log

2018-06-14 14:37:01 ./deps/cppjieba/DictTrie.hpp:153 FATAL exp: [ifs.is_open()] false. open /usr/share/dict/jieba.dict.utf8 failed.
06:37:01 UTC - mysqld got signal 6 ;
This could be because you hit a bug. It is also possible that this binary or one of the libraries it was linked against is corrupt, improperly built, or misconfigured. This error can also be caused by malfunctioning hardware. Attempting to collect some information that could help diagnose the problem. As this is a crash and something is definitely wrong, the information collection process might fail.
key_buffer_size=16777216
read_buffer_size=131072
max_used_connections=1
max_threads=151
thread_count=1
connection_count=1
It is possible that mysqld could use up to key_buffer_size + (read_buffer_size + sort_buffer_size)*max_threads = 76387 K bytes of memory
Hope that's ok; if not, decrease some variables in the equation.
Thread pointer: 0x7f8b0c000ae0
Attempting backtrace.
You can use the following information to find out where mysqld died. If you see no messages after this, something went terribly wrong...
stack_bottom = 7f8b2c125e70 thread_stack 0x30000
/usr/sbin/mysqld(my_print_stacktrace+0x3b)[0xe907ab]
/usr/sbin/mysqld(handle_fatal_signal+0x489)[0x789b49]
/lib/x86_64-linux-gnu/libpthread.so.0(+0x11390)[0x7f8b45a80390]
/lib/x86_64-linux-gnu/libc.so.6(gsignal+0x38)[0x7f8b44e39428]
/lib/x86_64-linux-gnu/libc.so.6(abort+0x16a)[0x7f8b44e3b02a]
/usr/lib/mysql/plugin/libsqljieba.so(_ZN6limonp6LoggerD1Ev+0xf5)[0x7f8b245e6495]
/usr/lib/mysql/plugin/libsqljieba.so(_ZN8cppjieba8DictTrie8LoadDictERKNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEE+0x716)[0x7f8b245eecb6]
/usr/lib/mysql/plugin/libsqljieba.so(+0x7ce5)[0x7f8b245e4ce5]
/usr/sbin/mysqld[0xc707f4]
/usr/sbin/mysqld[0xc76b0a]
/usr/sbin/mysqld(_ZN22Sql_cmd_install_plugin7executeEP3THD+0x1a)[0xc76d9a]
/usr/sbin/mysqld(_Z21mysql_execute_commandP3THDb+0x24de)[0xc4e29e]
/usr/sbin/mysqld(_Z11mysql_parseP3THDP12Parser_state+0x3ad)[0xc52b3d]
/usr/sbin/mysqld(_Z16dispatch_commandP3THDPK8COM_DATA19enum_server_command+0x102a)[0xc53c7a]
/usr/sbin/mysqld(_Z10do_commandP3THD+0x1c7)[0xc55137]
/usr/sbin/mysqld(handle_connection+0x288)[0xd16788]
/usr/sbin/mysqld(pfs_spawn_thread+0x1b4)[0xec9294]
/lib/x86_64-linux-gnu/libpthread.so.0(+0x76ba)[0x7f8b45a766ba]
/lib/x86_64-linux-gnu/libc.so.6(clone+0x6d)[0x7f8b44f0b41d]

Finally, to confirm: /usr/share/dict/jieba.dict.utf8 does exist, and its permissions look fine:

pc@PC:/usr/share/dict$ ll
total 11356
drwxr-xr-x   3 mysql mysql    4096 Jun 14 02:20 ./
drwxr-xr-x 111 root  root     4096 Jun 14 13:40 ../
-rw-r--r--   1 mysql mysql  519739 Jun 14 13:43 hmm_model.utf8
-rw-r--r--   1 mysql mysql 5998717 Jun 14 13:43 idf.utf8
-rw-r--r--   1 mysql mysql 5071204 Jun 14 13:43 jieba.dict.utf8
drwxr-xr-x   2 mysql mysql    4096 Jun 14 02:20 pos_dict/
-rw-r--r--   1 mysql mysql     683 Jun 14 13:43 README.md
-rw-r--r--   1 mysql mysql    8974 Jun 14 13:43 stop_words.utf8
-rw-r--r--   1 mysql mysql      49 Jun 14 13:43 user.dict.utf8
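Since the dictionary exists and is readable by the mysql user, one plausible culprit on Ubuntu is AppArmor: the stock mysqld profile only whitelists certain paths, and reads outside them fail with exactly this kind of "open ... failed" even though Unix permissions are fine (denials would show up in `dmesg | grep -i apparmor`). Below is a sketch of the profile change, assuming the stock Ubuntu profile location; verify the path on your own system before editing:

```
# /etc/apparmor.d/usr.sbin.mysqld  (add inside the mysqld profile block)
  /usr/share/dict/ r,
  /usr/share/dict/** r,

# then reload the profile and restart MySQL:
#   sudo apparmor_parser -r /etc/apparmor.d/usr.sbin.mysqld
#   sudo systemctl restart mysql
```

Alternatively, placing the dict files under a path the profile already allows (or setting the profile to complain mode with aa-complain while testing) confirms or rules out this hypothesis quickly.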
Cloudera Manager offline install: agent times out downloading resources from the master node
Error log:

[19/Nov/2018 16:16:04 +0000] 2789 MainThread stacks_collection_manager INFO Using max_uncompressed_file_size_bytes: 5242880
[19/Nov/2018 16:16:04 +0000] 2789 MainThread __init__ INFO Importing metric schema from file /opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/cmf-5.10.2-py2.6.egg/cmf/monitor/schema.json
[19/Nov/2018 16:16:04 +0000] 2789 MainThread agent INFO Supervised processes will add the following to their environment (in addition to the supervisor's env): {'CDH_PARQUET_HOME': '/usr/lib/parquet', 'JSVC_HOME': '/usr/libexec/bigtop-utils', 'CMF_PACKAGE_DIR': '/opt/cloudera-manager/cm-5.10.2/lib64/cmf/service', 'CDH_HADOOP_BIN': '/usr/bin/hadoop', 'MGMT_HOME': '/opt/cloudera-manager/cm-5.10.2/share/cmf', 'CDH_IMPALA_HOME': '/usr/lib/impala', 'CDH_YARN_HOME': '/usr/lib/hadoop-yarn', 'CDH_HDFS_HOME': '/usr/lib/hadoop-hdfs', 'PATH': '/sbin:/usr/sbin:/bin:/usr/bin', 'CDH_HUE_PLUGINS_HOME': '/usr/lib/hadoop', 'CM_STATUS_CODES': u'STATUS_NONE HDFS_DFS_DIR_NOT_EMPTY HBASE_TABLE_DISABLED HBASE_TABLE_ENABLED JOBTRACKER_IN_STANDBY_MODE YARN_RM_IN_STANDBY_MODE', 'KEYTRUSTEE_KP_HOME': '/usr/share/keytrustee-keyprovider', 'CLOUDERA_ORACLE_CONNECTOR_JAR': '/usr/share/java/oracle-connector-java.jar', 'CDH_SQOOP2_HOME': '/usr/lib/sqoop2', 'KEYTRUSTEE_SERVER_HOME': '/usr/lib/keytrustee-server', 'CDH_MR2_HOME': '/usr/lib/hadoop-mapreduce', 'HIVE_DEFAULT_XML': '/etc/hive/conf.dist/hive-default.xml', 'CLOUDERA_POSTGRESQL_JDBC_JAR': '/opt/cloudera-manager/cm-5.10.2/share/cmf/lib/postgresql-9.0-801.jdbc4.jar', 'CDH_KMS_HOME': '/usr/lib/hadoop-kms', 'CDH_HBASE_HOME': '/usr/lib/hbase', 'CDH_SQOOP_HOME': '/usr/lib/sqoop', 'WEBHCAT_DEFAULT_XML': '/etc/hive-webhcat/conf.dist/webhcat-default.xml', 'CDH_OOZIE_HOME': '/usr/lib/oozie', 'CDH_ZOOKEEPER_HOME': '/usr/lib/zookeeper', 'CDH_HUE_HOME': '/usr/lib/hue', 'CLOUDERA_MYSQL_CONNECTOR_JAR': '/usr/share/java/mysql-connector-java.jar', 'CDH_HBASE_INDEXER_HOME': '/usr/lib/hbase-solr', 'CDH_MR1_HOME': '/usr/lib/hadoop-0.20-mapreduce', 'CDH_SOLR_HOME': '/usr/lib/solr', 'CDH_PIG_HOME': '/usr/lib/pig', 'CDH_SENTRY_HOME': '/usr/lib/sentry', 'CDH_CRUNCH_HOME': '/usr/lib/crunch', 'CDH_LLAMA_HOME': '/usr/lib/llama/', 'CDH_HTTPFS_HOME': '/usr/lib/hadoop-httpfs', 'ROOT': '/opt/cloudera-manager/cm-5.10.2/lib64/cmf', 'CDH_HADOOP_HOME': '/usr/lib/hadoop', 'CDH_HIVE_HOME': '/usr/lib/hive', 'ORACLE_HOME': '/usr/share/oracle/instantclient', 'CDH_HCAT_HOME': '/usr/lib/hive-hcatalog', 'CDH_KAFKA_HOME': '/usr/lib/kafka', 'CDH_SPARK_HOME': '/usr/lib/spark', 'TOMCAT_HOME': '/usr/lib/bigtop-tomcat', 'CDH_FLUME_HOME': '/usr/lib/flume-ng'}
[19/Nov/2018 16:16:04 +0000] 2789 MainThread agent INFO To override these variables, use /etc/cloudera-scm-agent/config.ini. Environment variables for CDH locations are not used when CDH is installed from parcels.
[19/Nov/2018 16:16:04 +0000] 2789 MainThread agent INFO Created /opt/cloudera-manager/cm-5.10.2/run/cloudera-scm-agent/process
[19/Nov/2018 16:16:04 +0000] 2789 MainThread agent INFO Chmod'ing /opt/cloudera-manager/cm-5.10.2/run/cloudera-scm-agent/process to 0751
[19/Nov/2018 16:16:04 +0000] 2789 MainThread agent INFO Created /opt/cloudera-manager/cm-5.10.2/run/cloudera-scm-agent/supervisor
[19/Nov/2018 16:16:04 +0000] 2789 MainThread agent INFO Chmod'ing /opt/cloudera-manager/cm-5.10.2/run/cloudera-scm-agent/supervisor to 0751
[19/Nov/2018 16:16:04 +0000] 2789 MainThread agent INFO Created /opt/cloudera-manager/cm-5.10.2/run/cloudera-scm-agent/flood
[19/Nov/2018 16:16:04 +0000] 2789 MainThread agent INFO Chowning /opt/cloudera-manager/cm-5.10.2/run/cloudera-scm-agent/flood to cloudera-scm (498) cloudera-scm (498)
[19/Nov/2018 16:16:04 +0000] 2789 MainThread agent INFO Chmod'ing /opt/cloudera-manager/cm-5.10.2/run/cloudera-scm-agent/flood to 0751
[19/Nov/2018 16:16:04 +0000] 2789 MainThread agent INFO Created /opt/cloudera-manager/cm-5.10.2/run/cloudera-scm-agent/supervisor/include
[19/Nov/2018 16:16:04 +0000] 2789 MainThread agent INFO Chmod'ing /opt/cloudera-manager/cm-5.10.2/run/cloudera-scm-agent/supervisor/include to 0751
[19/Nov/2018 16:16:04 +0000] 2789 MainThread agent ERROR Failed to connect to previous supervisor.
Traceback (most recent call last):
  File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/cmf-5.10.2-py2.6.egg/cmf/agent.py", line 2073, in find_or_start_supervisor
    self.configure_supervisor_clients()
  File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/cmf-5.10.2-py2.6.egg/cmf/agent.py", line 2254, in configure_supervisor_clients
    supervisor_options.realize(args=["-c", os.path.join(self.supervisor_dir, "supervisord.conf")])
  File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/supervisor-3.0-py2.6.egg/supervisor/options.py", line 1599, in realize
    Options.realize(self, *arg, **kw)
  File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/supervisor-3.0-py2.6.egg/supervisor/options.py", line 333, in realize
    self.process_config()
  File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/supervisor-3.0-py2.6.egg/supervisor/options.py", line 341, in process_config
    self.process_config_file(do_usage)
  File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/supervisor-3.0-py2.6.egg/supervisor/options.py", line 376, in process_config_file
    self.usage(str(msg))
  File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/supervisor-3.0-py2.6.egg/supervisor/options.py", line 164, in usage
    self.exit(2)
SystemExit: 2
[19/Nov/2018 16:16:04 +0000] 2789 MainThread tmpfs INFO Successfully mounted tmpfs at /opt/cloudera-manager/cm-5.10.2/run/cloudera-scm-agent/process
[19/Nov/2018 16:16:05 +0000] 2789 MainThread agent INFO Trying to connect to newly launched supervisor (Attempt 1)
[19/Nov/2018 16:16:05 +0000] 2789 MainThread agent INFO Supervisor version: 3.0, pid: 2821
[19/Nov/2018 16:16:05 +0000] 2789 MainThread agent INFO Successfully connected to supervisor
[19/Nov/2018 16:16:05 +0000] 2789 MainThread status_server INFO Using maximum impala profile bundle size of 1073741824 bytes.
[19/Nov/2018 16:16:05 +0000] 2789 MainThread status_server INFO Using maximum stacks log bundle size of 1073741824 bytes.
[19/Nov/2018 16:16:05 +0000] 2789 MainThread _cplogging INFO [19/Nov/2018:16:16:05] ENGINE Bus STARTING
[19/Nov/2018 16:16:05 +0000] 2789 MainThread _cplogging INFO [19/Nov/2018:16:16:05] ENGINE Started monitor thread '_TimeoutMonitor'.
[19/Nov/2018 16:16:06 +0000] 2789 MainThread _cplogging INFO [19/Nov/2018:16:16:06] ENGINE Serving on yingzhi01.com:9000
[19/Nov/2018 16:16:06 +0000] 2789 MainThread _cplogging INFO [19/Nov/2018:16:16:06] ENGINE Bus STARTED
[19/Nov/2018 16:16:06 +0000] 2789 MainThread __init__ INFO New monitor: (<cmf.monitor.host.HostMonitor object at 0x2990c50>,)
[19/Nov/2018 16:16:06 +0000] 2789 MonitorDaemon-Scheduler __init__ INFO Monitor ready to report: ('HostMonitor',)
[19/Nov/2018 16:16:06 +0000] 2789 MainThread agent INFO Setting default socket timeout to 30
[19/Nov/2018 16:16:06 +0000] 2789 Monitor-HostMonitor network_interfaces INFO NIC iface eth0 doesn't support ETHTOOL (95)
[19/Nov/2018 16:16:06 +0000] 2789 Monitor-HostMonitor throttling_logger ERROR Error getting directory attributes for /opt/cloudera-manager/cm-5.10.2/log/cloudera-scm-agent
Traceback (most recent call last):
  File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/cmf-5.10.2-py2.6.egg/cmf/monitor/dir_monitor.py", line 90, in _get_directory_attributes
    name = pwd.getpwuid(uid)[0]
KeyError: 'getpwuid(): uid not found: 1106'
[19/Nov/2018 16:16:06 +0000] 2789 MainThread heartbeat_tracker INFO HB stats (seconds): num:1 LIFE_MIN:0.22 min:0.22 mean:0.22 max:0.22 LIFE_MAX:0.22
[19/Nov/2018 16:16:06 +0000] 2789 MainThread agent INFO CM server guid: dceeafae-a884-42f1-ba7b-4ee187ef3bef
[19/Nov/2018 16:16:06 +0000] 2789 MainThread agent INFO Using parcels directory from server provided value: /opt/cloudera/parcels
[19/Nov/2018 16:16:06 +0000] 2789 MainThread agent WARNING Expected user root for /opt/cloudera/parcels but was cloudera-scm
[19/Nov/2018 16:16:06 +0000] 2789 MainThread agent WARNING Expected group root for /opt/cloudera/parcels but was cloudera-scm
[19/Nov/2018 16:16:06 +0000] 2789 MainThread agent INFO Created /opt/cloudera/parcel-cache
[19/Nov/2018 16:16:06 +0000] 2789 MainThread agent INFO Chowning /opt/cloudera/parcel-cache to root (0) root (0)
[19/Nov/2018 16:16:06 +0000] 2789 MainThread agent INFO Chmod'ing /opt/cloudera/parcel-cache to 0755
[19/Nov/2018 16:16:06 +0000] 2789 MainThread parcel INFO Agent does create users/groups and apply file permissions
[19/Nov/2018 16:16:06 +0000] 2789 MainThread downloader INFO Downloader path: /opt/cloudera/parcel-cache
[19/Nov/2018 16:16:06 +0000] 2789 MainThread parcel_cache INFO Using /opt/cloudera/parcel-cache for parcel cache
[19/Nov/2018 16:16:06 +0000] 2789 MainThread agent INFO Flood daemon (re)start attempt
[19/Nov/2018 16:16:06 +0000] 2789 MainThread agent INFO Created /opt/cloudera/parcels/.flood
[19/Nov/2018 16:16:06 +0000] 2789 MainThread agent INFO Chowning /opt/cloudera/parcels/.flood to cloudera-scm (498) cloudera-scm (498)
[19/Nov/2018 16:16:06 +0000] 2789 MainThread agent INFO Chmod'ing /opt/cloudera/parcels/.flood to 0755
[19/Nov/2018 16:16:06 +0000] 2789 MainThread agent INFO Triggering supervisord update.
[19/Nov/2018 16:16:36 +0000] 2789 MainThread downloader ERROR Failed rack peer update: timed out
[19/Nov/2018 16:16:36 +0000] 2789 MainThread agent INFO Active parcel list updated; recalculating component info.
[19/Nov/2018 16:16:36 +0000] 2789 MainThread throttling_logger WARNING CMF_AGENT_JAVA_HOME environment variable host override will be deprecated in future. JAVA_HOME setting configured from CM server takes precedence over host agent override. Configure JAVA_HOME setting from CM server.
[19/Nov/2018 16:16:36 +0000] 2789 MainThread throttling_logger INFO Identified java component java8 with full version JAVA_HOME=/opt/modules/jdk1.8.0_144 java version "1.8.0_144" Java(TM) SE Runtime Environment (build 1.8.0_144-b01) Java HotSpot(TM) 64-Bit Server VM (build 25.144-b01, mixed mode) for requested version .
[19/Nov/2018 16:16:36 +0000] 2789 MainThread agent WARNING Long HB processing time: 30.6659779549
[19/Nov/2018 16:16:36 +0000] 2789 MainThread agent WARNING Delayed HB: 15s since last
[19/Nov/2018 16:16:44 +0000] 2789 Monitor-HostMonitor throttling_logger ERROR Timeout with args ['ntpdc', '-np'] None
[19/Nov/2018 16:16:44 +0000] 2789 Monitor-HostMonitor throttling_logger ERROR Failed to collect NTP metrics
Traceback (most recent call last):
  File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/cmf-5.10.2-py2.6.egg/cmf/monitor/host/ntp_monitor.py", line 48, in collect
    self.collect_ntpd()
  File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/cmf-5.10.2-py2.6.egg/cmf/monitor/host/ntp_monitor.py", line 66, in collect_ntpd
    result, stdout, stderr = self._subprocess_with_timeout(args, self._timeout)
  File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/cmf-5.10.2-py2.6.egg/cmf/monitor/host/ntp_monitor.py", line 38, in _subprocess_with_timeout
    return subprocess_with_timeout(args, timeout)
  File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/cmf-5.10.2-py2.6.egg/cmf/subprocess_timeout.py", line 94, in subprocess_with_timeout
    raise Exception("timeout with args %s" % args)
Exception: timeout with args ['ntpdc', '-np']
[19/Nov/2018 16:17:06 +0000] 2789 DnsResolutionMonitor throttling_logger INFO Using java location: '/opt/modules/jdk1.8.0_144/bin/java'.
[19/Nov/2018 16:17:06 +0000] 2789 MainThread downloader ERROR Failed rack peer update: timed out
[19/Nov/2018 16:17:06 +0000] 2789 MainThread agent WARNING Long HB processing time: 30.1082139015
[19/Nov/2018 16:17:06 +0000] 2789 MainThread agent WARNING Delayed HB: 15s since last
[19/Nov/2018 16:17:36 +0000] 2789 MainThread downloader ERROR Failed rack peer update: timed out
[19/Nov/2018 16:17:36 +0000] 2789 MainThread agent WARNING Long HB processing time: 30.1235852242
[19/Nov/2018 16:17:36 +0000] 2789 MainThread agent WARNING Delayed HB: 15s since last
[19/Nov/2018 16:18:07 +0000] 2789 MainThread downloader ERROR Failed rack peer update: timed out
[19/Nov/2018 16:18:07 +0000] 2789 MainThread agent WARNING Long HB processing time: 30.1040799618
[19/Nov/2018 16:18:07 +0000] 2789 MainThread agent WARNING Delayed HB: 15s since last
[19/Nov/2018 16:18:37 +0000] 2789 MainThread downloader ERROR Failed rack peer update: timed out
[19/Nov/2018 16:18:37 +0000] 2789 MainThread agent WARNING Long HB processing time: 30.1849529743
[19/Nov/2018 16:18:37 +0000] 2789 MainThread agent WARNING Delayed HB: 15s since last
[19/Nov/2018 16:19:07 +0000] 2789 MainThread downloader ERROR Failed rack peer update: timed out
[19/Nov/2018 16:19:07 +0000] 2789 MainThread agent WARNING Long HB processing time: 30.1211960316
[19/Nov/2018 16:19:07 +0000] 2789 MainThread agent WARNING Delayed HB: 15s since last
[19/Nov/2018 16:19:37 +0000] 2789 MainThread downloader ERROR Failed rack peer update: timed out
[19/Nov/2018 16:19:37 +0000] 2789 MainThread agent WARNING Long HB processing time: 30.1215620041
[19/Nov/2018 16:19:37 +0000] 2789 MainThread agent WARNING Delayed HB: 15s since last
[19/Nov/2018 16:20:01 +0000] 2789 CP Server Thread-4 _cplogging INFO 192.168.164.35 - - [19/Nov/2018:16:20:01] "GET /heartbeat HTTP/1.1" 200 2 "" "NING/1.0"
[19/Nov/2018 16:20:04 +0000] 2789 CP Server Thread-5 _cplogging INFO 192.168.164.35 - - [19/Nov/2018:16:20:04] "GET /heartbeat HTTP/1.1" 200 2 "" "NING/1.0"
[19/Nov/2018 16:20:07 +0000] 2789 MainThread downloader ERROR Failed rack peer update: timed out [19/Nov/2018 16:20:07 +0000] 2789 MainThread agent WARNING Long HB processing time: 30.1212861538 [19/Nov/2018 16:20:07 +0000] 2789 MainThread agent WARNING Delayed HB: 15s since last [19/Nov/2018 16:20:37 +0000] 2789 MainThread downloader ERROR Failed rack peer update: timed out [19/Nov/2018 16:20:37 +0000] 2789 MainThread agent WARNING Long HB processing time: 30.1753029823 [19/Nov/2018 16:20:37 +0000] 2789 MainThread agent WARNING Delayed HB: 15s since last [19/Nov/2018 16:20:37 +0000] 2789 Thread-13 downloader INFO Fetching torrent: http://yingzhi01.com:7180/cmf/parcel/download/CDH-5.10.2-1.cdh5.10.2.p0.5-el6.parcel.torrent [19/Nov/2018 16:20:37 +0000] 2789 Thread-13 downloader INFO Starting download of: http://yingzhi01.com:7180/cmf/parcel/download/CDH-5.10.2-1.cdh5.10.2.p0.5-el6.parcel [19/Nov/2018 16:21:07 +0000] 2789 Thread-13 downloader ERROR Unexpected exception during download Traceback (most recent call last): File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/cmf-5.10.2-py2.6.egg/cmf/downloader.py", line 279, in download self.client.AddTorrent(torrent_url) File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/cmf-5.10.2-py2.6.egg/flood/util/cmd.py", line 159, in __call__ return self.fn.__get__(self.binding)(*args, **kwargs) File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/cmf-5.10.2-py2.6.egg/flood/util/rpc.py", line 68, in <lambda> return lambda *pargs, **kwargs: self._invoke(*pargs, **kwargs) File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/cmf-5.10.2-py2.6.egg/flood/util/rpc.py", line 77, in _invoke return rpcClient.requestor.request(self.schema.name, msg) File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/cmf-5.10.2-py2.6.egg/flood/util/rpc.py", line 
129, in requestor return avro.ipc.Requestor(self.SCHEMA, self.transceiver) File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/cmf-5.10.2-py2.6.egg/flood/util/rpc.py", line 125, in transceiver return avro.ipc.HTTPTransceiver(self.server.host, self.server.port) File "/opt/cloudera-manager/cm-5.10.2/lib64/cmf/agent/build/env/lib/python2.6/site-packages/avro-1.6.3-py2.6.egg/avro/ipc.py", line 469, in __init__ self.conn.connect() File "/usr/lib64/python2.6/httplib.py", line 771, in connect self.timeout) File "/usr/lib64/python2.6/socket.py", line 567, in create_connection raise error, msg timeout: timed out [19/Nov/2018 16:21:07 +0000] 2789 Thread-13 downloader INFO Finished download [ url: http://yingzhi01.com:7180/cmf/parcel/download/CDH-5.10.2-1.cdh5.10.2.p0.5-el6.parcel, state: exception, total_bytes: 0, downloaded_bytes: 0, start_time: 2018-11-19 16:20:37, download_end_time: , end_time: 2018-11-19 16:21:07, code: 600, exception_msg: timed out, path: None ] [19/Nov/2018 16:21:07 +0000] 2789 MainThread downloader ERROR Failed rack peer update: timed out [19/Nov/2018 16:21:07 +0000] 2789 MainThread agent WARNING Long HB processing time: 30.1247620583 [19/Nov/2018 16:21:07 +0000] 2789 MainThread agent WARNING Delayed HB: 15s since last [19/Nov/2018 16:21:07 +0000] 2789 Thread-13 downloader INFO Fetching torrent: http://yingzhi01.com:7180/cmf/parcel/download/CDH-5.10.2-1.cdh5.10.2.p0.5-el6.parcel.torrent [19/Nov/2018 16:21:08 +0000] 2789 Thread-13 downloader INFO Starting download of: http://yingzhi01.com:7180/cmf/parcel/download/CDH-5.10.2-1.cdh5.10.2.p0.5-el6.parcel [19/Nov/2018 16:21:38 +0000] 2789 Thread-13 downloader ERROR Unexpected exception during download 然后就是不断重复超时错误求大神指点。。。
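Every failure in this log is the same symptom: a 30-second TCP connect timeout when the agent reaches out to the CM server (yingzhi01.com, port 7180 for the parcel/torrent URLs, port 9000 for the agent itself). Before digging into parcels, it is worth confirming basic reachability from the agent host. A minimal probe sketch; the helper name `can_connect` is mine, and the host/ports are taken from the log above:

```python
import socket

def can_connect(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers DNS failure, connection refused, and connect timeout.
        return False
```

For example, running `can_connect("yingzhi01.com", 7180)` on the agent host should return True; if it does not, check iptables/firewalld rules and /etc/hosts or DNS resolution between the hosts before suspecting Cloudera Manager itself.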
A Java project runs fine locally, but after packaging it as a war and deploying it to the server, some features stop working
Tomcat log:
23-Nov-2017 00:49:07.772 INFO [main] org.apache.catalina.core.StandardServer.await A valid shutdown command was received via the shutdown port. Stopping the Server instance.
23-Nov-2017 00:49:07.773 INFO [main] org.apache.coyote.AbstractProtocol.pause Pausing ProtocolHandler ["http-nio-8080"]
23-Nov-2017 00:49:07.824 INFO [main] org.apache.coyote.AbstractProtocol.pause Pausing ProtocolHandler ["ajp-nio-8009"]
23-Nov-2017 00:49:07.874 INFO [main] org.apache.catalina.core.StandardService.stopInternal Stopping service Catalina
23-Nov-2017 00:49:08.461 WARNING [localhost-startStop-2] org.apache.catalina.loader.WebappClassLoaderBase.clearReferencesJdbc The web application [MaintainSystem] registered the JDBC driver [com.mysql.jdbc.Driver] but failed to unregister it when the web application was stopped. To prevent a memory leak, the JDBC Driver has been forcibly unregistered.
23-Nov-2017 00:49:08.462 WARNING [localhost-startStop-2] org.apache.catalina.loader.WebappClassLoaderBase.clearReferencesThreads The web application [MaintainSystem] appears to have started a thread named [FileWatchdog] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
 java.lang.Thread.sleep(Native Method)
 org.apache.log4j.helpers.FileWatchdog.run(FileWatchdog.java:104)
23-Nov-2017 00:49:08.462 WARNING [localhost-startStop-2] org.apache.catalina.loader.WebappClassLoaderBase.clearReferencesThreads The web application [MaintainSystem] appears to have started a thread named [cluster-ClusterId{value='5a1593f852e92b26c5f5d86c', description='null'}-192.168.7.178:27017] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
 java.net.PlainSocketImpl.socketConnect(Native Method)
 java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
 java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
 java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
 java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
 java.net.Socket.connect(Socket.java:589)
 com.mongodb.connection.SocketStreamHelper.initialize(SocketStreamHelper.java:50)
 com.mongodb.connection.SocketStream.open(SocketStream.java:58)
 com.mongodb.connection.InternalStreamConnection.open(InternalStreamConnection.java:114)
 com.mongodb.connection.DefaultServerMonitor$ServerMonitorRunnable.run(DefaultServerMonitor.java:128)
 java.lang.Thread.run(Thread.java:745)
23-Nov-2017 00:49:08.463 WARNING [localhost-startStop-2] org.apache.catalina.loader.WebappClassLoaderBase.clearReferencesThreads The web application [MaintainSystem] appears to have started a thread named [Thread-5] but has failed to stop it. This is very likely to create a memory leak. Stack trace of thread:
 java.lang.Thread.sleep(Native Method)
 com.xd.util.DatabaseCleaner.run(DatabaseCleaner.java:50)
23-Nov-2017 00:49:08.480 INFO [main] org.apache.coyote.AbstractProtocol.stop Stopping ProtocolHandler ["http-nio-8080"]
23-Nov-2017 00:49:08.490 INFO [main] org.apache.coyote.AbstractProtocol.stop Stopping ProtocolHandler ["ajp-nio-8009"]
23-Nov-2017 00:49:08.589 INFO [main] org.apache.coyote.AbstractProtocol.destroy Destroying ProtocolHandler ["http-nio-8080"]
23-Nov-2017 00:49:08.591 INFO [main] org.apache.coyote.AbstractProtocol.destroy Destroying ProtocolHandler ["ajp-nio-8009"]
23-Nov-2017 00:49:21.473 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server version: Apache Tomcat/8.0.20
23-Nov-2017 00:49:21.475 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server built: Feb 15 2015 18:10:42 UTC
23-Nov-2017 00:49:21.475 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Server number: 8.0.20.0
23-Nov-2017 00:49:21.475 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log OS Name: Linux
23-Nov-2017 00:49:21.475 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log OS Version: 2.6.32-573.el6.x86_64
23-Nov-2017 00:49:21.475 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Architecture: amd64
23-Nov-2017 00:49:21.475 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Java Home: /home/dabai/jdk1.8.0_91/jre
23-Nov-2017 00:49:21.475 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log JVM Version: 1.8.0_91-b14
23-Nov-2017 00:49:21.476 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log JVM Vendor: Oracle Corporation
23-Nov-2017 00:49:21.476 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log CATALINA_BASE: /home/dabai/apache-tomcat-8.0.20
23-Nov-2017 00:49:21.476 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log CATALINA_HOME: /home/dabai/apache-tomcat-8.0.20
23-Nov-2017 00:49:21.476 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.util.logging.config.file=/home/dabai/apache-tomcat-8.0.20/conf/logging.properties
23-Nov-2017 00:49:21.477 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager
23-Nov-2017 00:49:21.477 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.endorsed.dirs=/home/dabai/apache-tomcat-8.0.20/endorsed
23-Nov-2017 00:49:21.477 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dcatalina.base=/home/dabai/apache-tomcat-8.0.20
23-Nov-2017 00:49:21.477 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Dcatalina.home=/home/dabai/apache-tomcat-8.0.20
23-Nov-2017 00:49:21.478 INFO [main] org.apache.catalina.startup.VersionLoggerListener.log Command line argument: -Djava.io.tmpdir=/home/dabai/apache-tomcat-8.0.20/temp
23-Nov-2017 00:49:21.478 INFO [main] org.apache.catalina.core.AprLifecycleListener.lifecycleEvent The APR based Apache Tomcat Native library which allows optimal performance in production environments was not found on the java.library.path: /usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
23-Nov-2017 00:49:21.607 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["http-nio-8080"]
23-Nov-2017 00:49:21.621 INFO [main] org.apache.tomcat.util.net.NioSelectorPool.getSharedSelector Using a shared selector for servlet write/read
23-Nov-2017 00:49:21.624 INFO [main] org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler ["ajp-nio-8009"]
23-Nov-2017 00:49:21.625 INFO [main] org.apache.tomcat.util.net.NioSelectorPool.getSharedSelector Using a shared selector for servlet write/read
23-Nov-2017 00:49:21.626 INFO [main] org.apache.catalina.startup.Catalina.load Initialization processed in 632 ms
23-Nov-2017 00:49:21.658 INFO [main] org.apache.catalina.core.StandardService.startInternal Starting service Catalina
23-Nov-2017 00:49:21.658 INFO [main] org.apache.catalina.core.StandardEngine.startInternal Starting Servlet Engine: Apache Tomcat/8.0.20
23-Nov-2017 00:49:21.681 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployWAR Deploying web application archive /home/dabai/apache-tomcat-8.0.20/webapps/MaintainSystem.war
23-Nov-2017 00:50:06.881 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployWAR Deployment of web application archive /home/dabai/apache-tomcat-8.0.20/webapps/MaintainSystem.war has finished in 45,198 ms
23-Nov-2017 00:50:06.882 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory /home/dabai/apache-tomcat-8.0.20/webapps/examples
23-Nov-2017 00:50:07.077 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory /home/dabai/apache-tomcat-8.0.20/webapps/examples has finished in 195 ms
23-Nov-2017 00:50:07.078 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory /home/dabai/apache-tomcat-8.0.20/webapps/manager
23-Nov-2017 00:50:07.103 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory /home/dabai/apache-tomcat-8.0.20/webapps/manager has finished in 25 ms
23-Nov-2017 00:50:07.104 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory /home/dabai/apache-tomcat-8.0.20/webapps/host-manager
23-Nov-2017 00:50:07.132 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory /home/dabai/apache-tomcat-8.0.20/webapps/host-manager has finished in 28 ms
23-Nov-2017 00:50:07.133 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory /home/dabai/apache-tomcat-8.0.20/webapps/docs
23-Nov-2017 00:50:07.156 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory /home/dabai/apache-tomcat-8.0.20/webapps/docs has finished in 23 ms
23-Nov-2017 00:50:07.157 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deploying web application directory /home/dabai/apache-tomcat-8.0.20/webapps/ROOT
23-Nov-2017 00:50:07.177 INFO [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployDirectory Deployment of web application directory /home/dabai/apache-tomcat-8.0.20/webapps/ROOT has finished in 19 ms
23-Nov-2017 00:50:07.185 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["http-nio-8080"]
23-Nov-2017 00:50:07.198 INFO [main] org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler ["ajp-nio-8009"]
23-Nov-2017 00:50:07.201 INFO [main] org.apache.catalina.startup.Catalina.start Server startup in 45574 ms

After the web app starts, the browser console reports:
Failed to load resource: the server responded with a status of 500 (Internal Server Error) saveOpMiInfo.do?urlCode=49D4:75
Uncaught ReferenceError: maxlengthBind is not defined at saveOpMiInfo.do?urlCode=49D4:75
saveOpMiInfo.do Failed to load resource: the server responded with a status of 500 (Internal Server Error)
Ugh. Has anyone seen an error like this? It's Hibernate-related.
I'm almost out of points, only 5 left... First, the error message (to be clear up front: this is not a simple missing table; none of the mapping files map a company table, at least not with an explicit reference):

[code="java"]ERROR 2009-09-10 13:31:58.343 JDBCExceptionReporter:logExceptions - Table 'bfw_utf8.company' doesn't exist
org.springframework.jdbc.BadSqlGrammarException: Hibernate operation: could not insert: [com.bafang.pojos.SiteAd]; bad SQL grammar [insert into site_ad (subsite_id, site_ad_location_id, site_ad_location_name, site_ad_keyword_id, site_ad_keyword, company_id, company_name, txt, url, media_type, media_file, type, charge, start_time, end_time, item_order, enabled) values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)]; nested exception is com.mysql.jdbc.exceptions.MySQLSyntaxErrorException: Table 'bfw_utf8.company' doesn't exist
Caused by: com.mysql.jdbc.exceptions.MySQLSyntaxErrorException: Table 'bfw_utf8.company' doesn't exist
    at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:936)
    at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:2870)
    at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:1573)
    at com.mysql.jdbc.ServerPreparedStatement.serverExecute(ServerPreparedStatement.java:1160)
    at com.mysql.jdbc.ServerPreparedStatement.executeInternal(ServerPreparedStatement.java:685)
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:1400)
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:1314)
    at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:1299)
    at com.mchange.v2.c3p0.impl.NewProxyPreparedStatement.executeUpdate(NewProxyPreparedStatement.java:105)
    at org.hibernate.id.IdentityGenerator$GetGeneratedKeysDelegate.executeAndExtract(IdentityGenerator.java:73)
    at org.hibernate.id.insert.AbstractReturningDelegate.performInsert(AbstractReturningDelegate.java:33)
    at org.hibernate.persister.entity.AbstractEntityPersister.insert(AbstractEntityPersister.java:2158)
    at org.hibernate.persister.entity.AbstractEntityPersister.insert(AbstractEntityPersister.java:2638)
    at org.hibernate.action.EntityIdentityInsertAction.execute(EntityIdentityInsertAction.java:48)
    at org.hibernate.engine.ActionQueue.execute(ActionQueue.java:248)
    at org.hibernate.event.def.AbstractSaveEventListener.performSaveOrReplicate(AbstractSaveEventListener.java:298)
    at org.hibernate.event.def.AbstractSaveEventListener.performSave(AbstractSaveEventListener.java:181)
    at org.hibernate.event.def.AbstractSaveEventListener.saveWithGeneratedId(AbstractSaveEventListener.java:107)
    at org.hibernate.event.def.DefaultSaveOrUpdateEventListener.saveWithGeneratedOrRequestedId(DefaultSaveOrUpdateEventListener.java:187)
    at org.hibernate.event.def.DefaultSaveEventListener.saveWithGeneratedOrRequestedId(DefaultSaveEventListener.java:33)
    at org.hibernate.event.def.DefaultSaveOrUpdateEventListener.entityIsTransient(DefaultSaveOrUpdateEventListener.java:172)
    at org.hibernate.event.def.DefaultSaveEventListener.performSaveOrUpdate(DefaultSaveEventListener.java:27)
    at org.hibernate.event.def.DefaultSaveOrUpdateEventListener.onSaveOrUpdate(DefaultSaveOrUpdateEventListener.java:70)
    at org.hibernate.impl.SessionImpl.fireSave(SessionImpl.java:535)
    at org.hibernate.impl.SessionImpl.save(SessionImpl.java:523)
    at org.hibernate.impl.SessionImpl.save(SessionImpl.java:519)
    at org.springframework.orm.hibernate3.HibernateTemplate$12.doInHibernate(HibernateTemplate.java:598)
    at org.springframework.orm.hibernate3.HibernateTemplate.execute(HibernateTemplate.java:358)
    at org.springframework.orm.hibernate3.HibernateTemplate.save(HibernateTemplate.java:595)
    at com.bafang.pojos.base._BaseRootDAO.save(_BaseRootDAO.java:25)
    at com.bafang.pojos.base.BaseSiteAdDAO.save(BaseSiteAdDAO.java:20)
    at com.bafang.pojos.base.BaseSiteAdDAO$$FastClassByCGLIB$$33c630a7.invoke(<generated>)
    at net.sf.cglib.proxy.MethodProxy.invoke(MethodProxy.java:149)
    at org.springframework.aop.framework.Cglib2AopProxy$DynamicAdvisedInterceptor.intercept(Cglib2AopProxy.java:609)
    at com.bafang.pojos.dao.SiteAdDAO$$EnhancerByCGLIB$$29bf5095.save(<generated>)
    at com.bafang.web.action.admin.main.AdAction.add(AdAction.java:107)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.struts.actions.DispatchAction.dispatchMethod(DispatchAction.java:269)
    at org.apache.struts.actions.DispatchAction.execute(DispatchAction.java:170)
    at org.springframework.web.struts.DelegatingActionProxy.execute(DelegatingActionProxy.java:106)
    at org.apache.struts.chain.commands.servlet.ExecuteAction.execute(ExecuteAction.java:58)
    at org.apache.struts.chain.commands.AbstractExecuteAction.execute(AbstractExecuteAction.java:67)
    at org.apache.struts.chain.commands.ActionCommandBase.execute(ActionCommandBase.java:51)
    at org.apache.commons.chain.impl.ChainBase.execute(ChainBase.java:190)
    at org.apache.commons.chain.generic.LookupCommand.execute(LookupCommand.java:304)
    at org.apache.commons.chain.impl.ChainBase.execute(ChainBase.java:190)
    at org.apache.struts.chain.ComposableRequestProcessor.process(ComposableRequestProcessor.java:283)
    at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1913)
    at com.bafang.web.servlet.BFActionServlet.process(BFActionServlet.java:26)
    at org.apache.struts.action.ActionServlet.doPost(ActionServlet.java:462)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:709)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:237)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:157)
    at com.bafang.web.filters.HTMLFilter.synthesisURI(HTMLFilter.java:85)
    at com.bafang.web.filters.HTMLFilter.doFilter(HTMLFilter.java:60)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:186)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:157)
    at com.bafang.web.filters.AdminFilter.doFilter(AdminFilter.java:41)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:186)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:157)
    at com.bafang.web.filters.SubsiteFilter.doFilter(SubsiteFilter.java:35)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:186)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:157)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:214)
    at org.apache.catalina.core.StandardValveContext.invokeNext(StandardValveContext.java:104)
    at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:520)
    at org.apache.catalina.core.StandardContextValve.invokeInternal(StandardContextValve.java:198)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:152)
    at org.apache.catalina.core.StandardValveContext.invokeNext(StandardValveContext.java:104)
    at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:520)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:137)
    at org.apache.catalina.core.StandardValveContext.invokeNext(StandardValveContext.java:104)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:118)
    at org.apache.catalina.core.StandardValveContext.invokeNext(StandardValveContext.java:102)
    at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:520)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.core.StandardValveContext.invokeNext(StandardValveContext.java:104)
    at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:520)
    at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:929)
    at org.apache.coyote.tomcat5.CoyoteAdapter.service(CoyoteAdapter.java:160)
    at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:799)
    at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.processConnection(Http11Protocol.java:705)
    at org.apache.tomcat.util.net.TcpWorkerThread.runIt(PoolTcpEndpoint.java:577)
    at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:683)
    at java.lang.Thread.run(Thread.java:619)[/code]

The related Hibernate mapping file, SiteAd.hbm.xml:

[code="xml"]<?xml version="1.0"?>
<!DOCTYPE hibernate-mapping PUBLIC "-//Hibernate/Hibernate Mapping DTD//EN" "http://hibernate.sourceforge.net/hibernate-mapping-3.0.dtd" >
<hibernate-mapping package="com.bafang.pojos">
  <class name="SiteAd" table="site_ad">
    <id name="id" type="integer" column="id">
      <generator class="identity" />
    </id>
    <property name="subsiteId" column="subsite_id" type="integer" not-null="false" length="11" />
    <property name="siteAdLocationId" column="site_ad_location_id" type="string" not-null="false" length="50" />
    <property name="siteAdLocationName" column="site_ad_location_name" type="string" not-null="false" length="255" />
    <many-to-one name="siteAdLocation" column="site_ad_location_id" class="SiteAdLocation" not-null="false" insert="false" update="false" />
    <property name="siteAdKeywordId" column="site_ad_keyword_id" type="integer" not-null="false" length="11" />
    <property name="siteAdKeyword" column="site_ad_keyword" type="string" not-null="false" length="255" />
    <property name="companyId" column="company_id" type="integer" not-null="false" length="11" />
    <property name="companyName" column="company_name" type="string" not-null="false" length="255" />
    <!--<many-to-one name="company" column="company_id" class="Store" not-null="false" insert="false" update="false" />-->
    <property name="txt" column="txt" type="string" not-null="false" length="255" />
    <property name="url" column="url" type="string" not-null="false" length="255" />
    <property name="mediaType" column="media_type" type="string" not-null="false" length="255" />
    <property name="mediaFile" column="media_file" type="string" not-null="false" length="255" />
    <property name="type" column="type" type="string" not-null="false" length="100" />
    <property name="charge" column="charge" type="java.lang.Float" not-null="false" length="12" />
    <property name="startTime" column="start_time" type="timestamp" not-null="false" length="19" />
    <property name="endTime" column="end_time" type="timestamp" not-null="false" length="19" />
    <property name="itemOrder" column="item_order" type="integer" not-null="false" length="11" />
    <property name="enabled" column="enabled" type="boolean" not-null="false" length="1" />
  </class>
</hibernate-mapping>[/code]

The related POJO classes. com.bafang.pojos.SiteAd.java:

[code="java"]package com.bafang.pojos;

import com.bafang.pojos.base.BaseSiteAd;

public class SiteAd extends BaseSiteAd {
    private static final long serialVersionUID = 7092206885096359130L;

    public SiteAd() {
    }

    public String getTypeName() {
        if ("1".equals(getType())) return "多媒体广告";
        else if ("2".equals(getType())) return "连接广告";
        else if ("3".equals(getType())) return "店铺广告";
        else if ("4".equals(getType())) return "店铺评论广告";
        else return "其它广告";
    }
}[/code]

com.bafang.pojos.base.BaseSiteAd.java:

[code="java"]package com.bafang.pojos.base;

import java.io.Serializable;

public abstract class BaseSiteAd implements Serializable {
    protected Integer id;
    protected Integer subsiteId;
    protected String siteAdLocationId;
    protected String siteAdLocationName;
    protected Integer siteAdKeywordId;
    protected String siteAdKeyword;
    protected Integer companyId;
    protected String companyName;
    protected String txt;
    protected String url;
    protected String mediaType;
    protected String mediaFile;
    protected String type;
    protected Float charge;
    protected java.util.Date startTime;
    protected java.util.Date endTime;
    protected Integer itemOrder;
    protected Boolean enabled;
    protected com.bafang.pojos.SiteAdLocation siteAdLocation;

    public Integer getId() { return id; }
    public void setId(Integer id) { this.id = id; }
    public Integer getSubsiteId() { return subsiteId; }
    public void setSubsiteId(Integer subsiteId) { this.subsiteId = subsiteId; }
    public String getSiteAdLocationId() { return siteAdLocationId; }
    public void setSiteAdLocationId(String siteAdLocationId) { this.siteAdLocationId = siteAdLocationId; }
    public String getSiteAdLocationName() { return siteAdLocationName; }
    public void setSiteAdLocationName(String siteAdLocationName) { this.siteAdLocationName = siteAdLocationName; }
    public Integer getSiteAdKeywordId() { return siteAdKeywordId; }
    public void setSiteAdKeywordId(Integer siteAdKeywordId) { this.siteAdKeywordId = siteAdKeywordId; }
    public String getSiteAdKeyword() { return siteAdKeyword; }
    public void setSiteAdKeyword(String siteAdKeyword) { this.siteAdKeyword = siteAdKeyword; }
    public Integer getCompanyId() { return companyId; }
    public void setCompanyId(Integer companyId) { this.companyId = companyId; }
    public String getCompanyName() { return companyName; }
    public void setCompanyName(String companyName) { this.companyName = companyName; }
    public String getTxt() { return txt; }
    public void setTxt(String txt) { this.txt = txt; }
    public String getUrl() { return url; }
    public void setUrl(String url) { this.url = url; }
    public String getMediaType() { return mediaType; }
    public void setMediaType(String mediaType) { this.mediaType = mediaType; }
    public String getMediaFile() { return mediaFile; }
    public void setMediaFile(String mediaFile) { this.mediaFile = mediaFile; }
    public String getType() { return type; }
    public void setType(String type) { this.type = type; }
    public Float getCharge() { return charge; }
    public void setCharge(Float charge) { this.charge = charge; }
    public java.util.Date getStartTime() { return startTime; }
    public void setStartTime(java.util.Date startTime) { this.startTime = startTime; }
    public java.util.Date getEndTime() { return endTime; }
    public void setEndTime(java.util.Date endTime) { this.endTime = endTime; }
    public Integer getItemOrder() { return itemOrder; }
    public void setItemOrder(Integer itemOrder) { this.itemOrder = itemOrder; }
    public Boolean getEnabled() { return enabled; }
    public void setEnabled(Boolean enabled) { this.enabled = enabled; }
    public com.bafang.pojos.SiteAdLocation getSiteAdLocation() { return siteAdLocation; }
    public void setSiteAdLocation(com.bafang.pojos.SiteAdLocation siteAdLocation) { this.siteAdLocation = siteAdLocation; }
}[/code]

The save is a direct call to Spring's helper org.springframework.orm.hibernate3.support.HibernateDaoSupport.save(Object obj). I have stepped through it in the debugger; the error is raised right after the save. Database query log:

[code="java"]090910 13:31:50  2 Query    SET autocommit=0
  2 Prepare  select siteadloca0_.id as id17_, siteadloca0_.subsite_id as subsite2_17_, siteadloca0_.name as name17_, siteadloca0_.type as type17_, siteadloca0_.width as width17_, siteadloca0_.height as height17_, siteadloca0_.item_count as item7_17_, siteadloca0_.url as url17_, siteadloca0_.enabled as enabled17_ from site_ad_location siteadloca0_ where 1=1 and siteadloca0_.subsite_id=1 limit ?
  2 Execute  select siteadloca0_.id as id17_, siteadloca0_.subsite_id as subsite2_17_, siteadloca0_.name as name17_, siteadloca0_.type as type17_, siteadloca0_.width as width17_, siteadloca0_.height as height17_, siteadloca0_.item_count as item7_17_, siteadloca0_.url as url17_, siteadloca0_.enabled as enabled17_ from site_ad_location siteadloca0_ where 1=1 and siteadloca0_.subsite_id=1 limit 20
  2 Close stmt
  2 Query    commit
  2 Query    SET autocommit=1
  2 Query    SET autocommit=0
  2 Prepare  select count(*) as col_0_0_ from site_ad_location siteadloca0_ where 1=1 and siteadloca0_.subsite_id=?
  2 Execute  select count(*) as col_0_0_ from site_ad_location siteadloca0_ where 1=1 and siteadloca0_.subsite_id=1
  2 Close stmt
  2 Query    commit
  2 Query    SET autocommit=1
090910 13:31:58  2 Prepare  insert into site_ad (subsite_id, site_ad_location_id, site_ad_location_name, site_ad_keyword_id, site_ad_keyword, company_id, company_name, txt, url, media_type, media_file, type, charge, start_time, end_time, item_order, enabled) values (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
  2 Query    select name into str from site_ad_location where id=new.site_ad_location_id
  2 Query    select name into str from company where id=new.company_id
  2 Query    SELECT 1
  2 Close stmt
D:\Program Files\MySQL\MySQL Server 5.1\bin\mysqld, Version: 5.1.37-community-log (MySQL Community Server (GPL)). started with:
TCP Port: 3306, Named Pipe: (null)
Time Id Command Argument
090910 13:33:15  1 Connect  root@localhost on
  1 Query    SET NAMES utf8
  2 Connect  root@localhost on
  2 Query    SELECT @@sql_mode
  2 Query    SET SESSION sql_mode='STRICT_TRANS_TABLES,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION'
  2 Query    SET NAMES utf8
  2 Quit
090910 13:33:17  1 Quit
  3 Connect  root@localhost on
  3 Query    SELECT @@sql_mode
  3 Query    SET SESSION sql_mode='STRICT_TRANS_TABLES,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION'
  3 Query    SET NAMES utf8[/code]

[b]Follow-up:[/b]
[quote]dwangel[/quote] I've already cleaned the project; same result. Thanks.
Update: found the problem. A programmer before me had created a trigger in the database; I only just discovered it. Thanks; I don't have many points but they're yours.
Linux disk reports 100% used — looking for an answer
System version:

[root@localhost mysql]# cat /etc/centos-release
CentOS Linux release 7.3.1611 (Core)

[root@localhost mysql]# df -h
Filesystem           Size  Used Avail Use% Mounted on
/dev/mapper/cl-root   50G   50G   20K 100% /
devtmpfs             3.9G     0  3.9G   0% /dev
tmpfs                3.9G     0  3.9G   0% /dev/shm
tmpfs                3.9G  377M  3.5G  10% /run
tmpfs                3.9G     0  3.9G   0% /sys/fs/cgroup
/dev/sda1           1014M  139M  876M  14% /boot
/dev/mapper/cl-home  198G   23G  175G  12% /home
tmpfs                783M     0  783M   0% /run/user/0

The inode usage here is fairly high because /usr contains a lot of files:

[root@localhost mysql]# df -i
Filesystem             Inodes IUsed(I)  IFree(I) IUse% Mounted on
/dev/mapper/cl-root     55680    49112      6568   89% /
devtmpfs               998610      364    998246    1% /dev
tmpfs                 1001351        1   1001350    1% /dev/shm
tmpfs                 1001351      576   1000775    1% /run
tmpfs                 1001351       16   1001335    1% /sys/fs/cgroup
/dev/sda1              524288      330    523958    1% /boot
/dev/mapper/cl-home 103346176    25421 103320755    1% /home
tmpfs                 1001351        1   1001350    1% /run/user/0

[root@localhost mysql]# find /usr | wc -l
44229

Note: /home is mounted on a separate disk, so it can be ignored. That means `du` only accounts for roughly 29-23=6G on /dev/mapper/cl-root, while `df` claims 50G is used:

[root@localhost mysql]# du -sh /
29G
[root@localhost mysql]# du -sh /*
0     /bin
106M  /boot
0     /dev
30M   /etc
23G   /home
0     /lib
0     /lib64
0     /media
0     /mnt
0     /proc
69M   /root
377M  /run
0     /sbin
0     /srv
0     /sys
2.0M  /tmp
2.0G  /usr
3.5G  /var

This machine wasn't originally mine to manage; I was asked to look at it after the problem appeared. According to my manager, the machine was fine until it was rebooted one day, after which MySQL would no longer start. The disk turned out to be 100% used, so some log files were deleted, freeing about 2G. After another reboot the disk was back at 100% and the reclaimed 2G was gone again.

Things I've already tried:

1) Suspected malware, but CPU usage looks normal — only the disk is abnormal.

[root@localhost mysql]# top
top - 18:43:57 up 27 days, 19:01,  2 users,  load average: 0.12, 0.13, 0.27
Tasks: 126 total,   1 running, 125 sleeping,   0 stopped,   0 zombie
%Cpu(s):  1.1 us,  2.3 sy,  0.0 ni, 96.7 id,  0.0 wa,  0.0 hi,  0.0 si,  0.0 st
KiB Mem :  8010812 total,   141016 free,  2661320 used,  5208476 buff/cache
KiB Swap:  8257532 total,  8223976 free,    33556 used.  4563024 avail Mem

  PID USER  PR NI    VIRT    RES   SHR S %CPU %MEM    TIME+ COMMAND
 9220 root  20  0 6698700 359604  9476 S  5.6  4.5 733:59.96 java
12082 root  20  0 6449520 1.702g 12528 S  2.6 22.3  36:43.26 java
25297 root  20  0  357720   9564  2252 S  2.3  0.1 442:00.22 xfrpc
32298 root  20  0  430396   9236  2256 S  2.3  0.1 437:20.19 xfrpc
 3133 root  20  0  169688  10560  1104 S  0.7  0.1 114:14.54 redis-server
 2993 root  20  0  157704   2216  1536 R  0.3  0.0   0:00.05 top
    1 root  20  0   45816   5408  3004 S  0.0  0.1   1:05.65 systemd
... (remaining idle kernel threads: kthreadd, ksoftirqd, migration, rcu_sched, watchdog, all near 0% CPU) ...

2) The main services running are Tomcat, Redis, and Mosquitto.
I've ruled out oversized Tomcat logs: I restarted Tomcat, so if its logs were the culprit the space would have been released.
Redis memory usage is small too, only about a hundred MB.

3) Already tried lsof | grep deleted — nothing significant is being held open.
Since the disk is still full after a reboot, it's unlikely a process is holding deleted files open anyway.

[root@localhost conf]# lsof | grep deleted
tuned  930      root 7u REG 253,0 4096 67161125 /tmp/ffirMlJJF (deleted)
gmain  930 2261 root 7u REG 253,0 4096 67161125 /tmp/ffirMlJJF (deleted)
tuned  930 2262 root 7u REG 253,0 4096 67161125 /tmp/ffirMlJJF (deleted)
tuned  930 2263 root 7u REG 253,0 4096 67161125 /tmp/ffirMlJJF (deleted)
tuned  930 2264 root 7u REG 253,0 4096 67161125 /tmp/ffirMlJJF (deleted)
[root@localhost conf]#

4) The disk itself is failing — in which case there's nothing to be done.

Any help is appreciated — thanks in advance.
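The core puzzle above is that `df` reports 50G used on `/` while the `du` walk only finds about 6G. The two measure different things: `df` asks the filesystem for its block counts, while `du` sums whatever it can reach through the directory tree, so space held by deleted-but-still-open files, or by files shadowed underneath a mount point (e.g. data written into /home before the separate disk was mounted there), is visible to `df` but never to `du`. A minimal sketch of the `du` side of that comparison (class and directory names here are made up for the demo, not from the original post):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;

public class DuCheck {
    // Sum apparent file sizes reachable through the directory tree, the way
    // `du` does. `df` instead reports filesystem-level block usage, so
    // deleted-but-open files and files shadowed by a mount point count for
    // `df` but can never show up in this walk.
    static long treeSize(Path root) {
        try (Stream<Path> s = Files.walk(root)) {
            return s.filter(Files::isRegularFile)
                    .mapToLong(p -> p.toFile().length())
                    .sum();
        } catch (IOException e) {
            return -1L; // unreadable or missing path
        }
    }

    public static void main(String[] args) throws Exception {
        // Build a tiny known tree so the result is checkable.
        Path tmp = Files.createTempDirectory("ducheck");
        Files.write(tmp.resolve("a.log"), new byte[1024]);
        Path sub = Files.createDirectory(tmp.resolve("sub"));
        Files.write(sub.resolve("b.log"), new byte[2048]);
        System.out.println(treeSize(tmp)); // prints 3072
    }
}
```

On the real machine, a common next step for the "shadowed by a mount point" case is `mount --bind / /mnt` and then `du -sh /mnt/home`: the bind mount exposes whatever sits in the /home directory underneath the mounted disk.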
Spring Boot AOP: advising the controller layer works, but advising the service layer doesn't — why?
Spring Boot with AOP: pointcuts on the controller layer fire, but pointcuts on the service layer don't. I've been stuck on this for a long time without finding the cause.

pom.xml:
```
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>

    <!-- mybatis -->
    <dependency>
        <groupId>org.mybatis.spring.boot</groupId>
        <artifactId>mybatis-spring-boot-starter</artifactId>
        <version>1.3.1</version>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
    </dependency>
    <!-- end mybatis -->

    <!-- tk.mybatis common Mapper -->
    <dependency>
        <groupId>tk.mybatis</groupId>
        <artifactId>mapper-spring-boot-starter</artifactId>
        <version>1.1.5</version>
    </dependency>

    <!-- PageHelper pagination plugin (commented out) -->
    <!--<dependency>-->
        <!--<groupId>com.github.pagehelper</groupId>-->
        <!--<artifactId>pagehelper-spring-boot-starter</artifactId>-->
        <!--<version>1.2.3</version>-->
    <!--</dependency>-->

    <!-- Alibaba Druid connection pool -->
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
    </dependency>
    <dependency>
        <groupId>com.alibaba</groupId>
        <artifactId>druid</artifactId>
        <version>1.0.19</version>
    </dependency>
    <!-- end connection pool -->

    <!-- hot reload (devtools) -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-devtools</artifactId>
        <optional>true</optional>
    </dependency>
    <!-- end hot reload -->

    <!-- default Logback logging -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-logging</artifactId>
    </dependency>
    <dependency>
        <groupId>org.apache.commons</groupId>
        <artifactId>commons-lang3</artifactId>
        <version>3.4</version>
    </dependency>

    <!-- Shiro dependencies -->
    <dependency>
        <groupId>org.apache.shiro</groupId>
        <artifactId>shiro-spring</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency>
        <groupId>com.alibaba</groupId>
        <artifactId>fastjson</artifactId>
        <version>1.2.31</version>
    </dependency>

    <!-- Spring support for Redis (commented out) -->
    <!--<dependency>-->
        <!--<groupId>org.springframework.boot</groupId>-->
        <!--<artifactId>spring-boot-starter-data-redis</artifactId>-->
    <!--</dependency>-->

    <!-- Redis integration -->
    <!-- https://mvnrepository.com/artifact/org.springframework.boot/spring-boot-starter-data-redis -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-redis</artifactId>
        <version>2.0.1.RELEASE</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.springframework.data/spring-data-redis -->
    <dependency>
        <groupId>org.springframework.data</groupId>
        <artifactId>spring-data-redis</artifactId>
        <version>2.0.6.RELEASE</version>
    </dependency>
    <!-- Redis client -->
    <dependency>
        <groupId>redis.clients</groupId>
        <artifactId>jedis</artifactId>
        <version>2.9.0</version>
    </dependency>

    <!-- aop -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-aop</artifactId>
    </dependency>
    <!--<dependency>-->
        <!--<groupId>org.aspectj</groupId>-->
        <!--<artifactId>aspectjrt</artifactId>-->
        <!--<version>1.7.4</version>-->
    <!--</dependency>-->
</dependencies>
```
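A frequent cause of exactly this symptom — controller advice fires, service advice doesn't — is either a pointcut expression that only matches the controller package, or the service method being reached by self-invocation: called from another method of the same bean through `this`, which never passes through the Spring proxy that carries the advice. The class and method names below are hypothetical; this is a plain-JDK dynamic-proxy sketch of the mechanism Spring AOP relies on, not Spring itself:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;
import java.util.ArrayList;
import java.util.List;

public class SelfInvocationDemo {
    static final List<String> LOG = new ArrayList<>();

    interface UserService {
        void outer();
        void inner();
    }

    static class UserServiceImpl implements UserService {
        public void outer() {
            LOG.add("outer");
            inner(); // self-invocation: goes through `this`, not the proxy
        }
        public void inner() {
            LOG.add("inner");
        }
    }

    // Wrap the target in a logging proxy, analogous to the proxy Spring AOP
    // wraps beans in when an aspect matches them.
    static UserService proxy(UserService target) {
        InvocationHandler h = (p, method, args) -> {
            LOG.add("advice:" + method.getName()); // the "aspect"
            return method.invoke(target, args);
        };
        return (UserService) Proxy.newProxyInstance(
                UserService.class.getClassLoader(),
                new Class<?>[]{UserService.class}, h);
    }

    public static void main(String[] args) {
        UserService svc = proxy(new UserServiceImpl());
        svc.outer();
        // Advice fires for outer() only; the internal call to inner()
        // bypasses the proxy, so no "advice:inner" entry appears.
        System.out.println(LOG); // prints [advice:outer, outer, inner]
    }
}
```

With real Spring the equivalent checks are: make the pointcut cover the service package (e.g. `execution(* com.example.service..*.*(..))` — package name hypothetical), confirm the aspect and services are both in component-scanned packages, and avoid advising methods that are only ever reached through internal calls.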
JVM exits on its own after a scheduled task runs for a while — urgently asking for help
The program has been very unstable lately: after running for a while, the JVM exits on its own. Could someone take a look? Here is the error log:

#
# A fatal error has been detected by the Java Runtime Environment:
#
# Internal Error (safepoint.cpp:308), pid=16953, tid=139961315587840
# guarantee(PageArmed == 0) failed: invariant
#
# JRE version: 6.0_31-b04
# Java VM: Java HotSpot(TM) 64-Bit Server VM (20.6-b01 mixed mode linux-amd64 compressed oops)
# If you would like to submit a bug report, please visit:
# http://java.sun.com/webapps/bugreport/crash.jsp
#

--------------- T H R E A D ---------------

Current thread (0x00007f4b4c070000): VMThread [stack: 0x00007f4b486f7000,0x00007f4b487f8000] [id=16963]

Stack: [0x00007f4b486f7000,0x00007f4b487f8000], sp=0x00007f4b487f6b30, free space=1022k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
V [libjvm.so+0x85eba5] VMError::report_and_die()+0x265
V [libjvm.so+0x3e40d6] report_vm_error(char const*, int, char const*, char const*)+0x56
V [libjvm.so+0x783666] SafepointSynchronize::begin()+0x4f6
V [libjvm.so+0x86d01c] VMThread::loop()+0x18c
V [libjvm.so+0x86cb2e] VMThread::run()+0x6e
V [libjvm.so+0x710bdf] java_start(Thread*)+0x13f

VM_Operation (0x00007f4b11fb9380): ParallelGCFailedAllocation, mode: safepoint, requested by thread 0x00007f4b4c90c000

--------------- P R O C E S S ---------------

Java Threads: ( => current thread )
0x00007f4ab8004800 JavaThread "Keep-Alive-Timer" daemon [_thread_blocked, id=5410]
0x00007f4ae8014000 JavaThread "FetchRunnable-0" [_thread_blocked, id=5171]
... (a dozen "Thread-20-SendThread(mota3x.domain:2181)" / "Thread-20-EventThread" ZooKeeper client daemon threads, all _thread_blocked) ...
0x00007f4a94073000 JavaThread "MySQL Statement Cancellation Timer" daemon [_thread_blocked, id=17076]
0x00007f4aa000a800 JavaThread "Thread-82" daemon [_thread_blocked, id=17075]
0x00007f4ab028c800 JavaThread "prism_mota28-1399511679861-89dd70f0_watcher_executor" [_thread_blocked, id=17073]
0x00007f4ab0295800 JavaThread "Kafka-consumer-autocommit-0" [_thread_blocked, id=17072]
... (six "ZkClient-EventThread-NN-mota30:2181,mota31:2181,mota32:2181" daemon threads, all _thread_blocked) ...
0x00007f4af0019000 JavaThread "Timer-0" daemon [_thread_blocked, id=17045]
0x00007f4b4c00a800 JavaThread "DestroyJavaVM" [_thread_blocked, id=16954]
... (worker threads "Thread-1" through "Thread-74", all _thread_blocked except the two below) ...
0x00007f4b4c92e800 JavaThread "Thread-43" [_thread_in_Java, id=17013]
0x00007f4b4c90c000 JavaThread "Thread-26" [_thread_blocked, id=16996]
0x00007f4b4c09c000 JavaThread "Low Memory Detector" daemon [_thread_blocked, id=16969]
0x00007f4b4c099800 JavaThread "C2 CompilerThread1" daemon [_thread_blocked, id=16968]
0x00007f4b4c096800 JavaThread "C2 CompilerThread0" daemon [_thread_blocked, id=16967]
0x00007f4b4c094800 JavaThread "Signal Dispatcher" daemon [_thread_blocked, id=16966]
0x00007f4b4c078800 JavaThread "Finalizer" daemon [_thread_blocked, id=16965]
0x00007f4b4c076800 JavaThread "Reference Handler" daemon [_thread_blocked, id=16964]

Other Threads:
=>0x00007f4b4c070000 VMThread [stack: 0x00007f4b486f7000,0x00007f4b487f8000] [id=16963]
0x00007f4b4c0ae800 WatcherThread [stack: 0x00007f4b13dfe000,0x00007f4b13eff000] [id=16970]

VM state: synchronizing (normal execution)

VM Mutex/Monitor currently owned by a thread: ([mutex/lock_event])
[0x00007f4b4c0071f0] Safepoint_lock - owner thread: 0x00007f4b4c070000
[0x00007f4b4c007270] Threads_lock - owner thread: 0x00007f4b4c070000
[0x00007f4b4c007770] Heap_lock - owner thread: 0x00007f4b4c90c000

Heap
 PSYoungGen  total 613504K, used 544384K [0x00000007d6560000, 0x00000007ffe80000, 0x0000000800000000)
  eden space 544384K, 100% used [0x00000007d6560000,0x00000007f7900000,0x00000007f7900000)
  from space 69120K, 0% used [0x00000007f7900000,0x00000007f7900000,0x00000007fbc80000)
  to   space 67200K, 0% used [0x00000007fbce0000,0x00000007fbce0000,0x00000007ffe80000)
 PSOldGen    total 440320K, used 197954K [0x0000000783000000, 0x000000079de00000, 0x00000007d6560000)
  object space 440320K, 44% used [0x0000000783000000,0x000000078f150960,0x000000079de00000)
 PSPermGen   total 65792K, used 42880K [0x000000077de00000, 0x0000000781e40000, 0x0000000783000000)
  object space 65792K, 65% used [0x000000077de00000,0x00000007807e00e0,0x0000000781e40000)

Code Cache [0x00007f4b49000000, 0x00007f4b49a20000, 0x00007f4b4c000000)
 total_blobs=3270 nmethods=2829 adapters=393 free_code_cache=39915648 largest_free_block=25536

Dynamic libraries:
40000000-40009000 r-xp 00000000 08:02 1680687 /usr/lib/jvm/j2sdk1.6-oracle/bin/java
... (remaining memory mappings omitted) ...
7f4b1009f000-7f4b1019d000 rwxp 00000000 00:00 0 7f4b1019d000-7f4b101a0000 ---p 00000000 00:00 0 7f4b101a0000-7f4b1029e000 rwxp 00000000 00:00 0 7f4b1029e000-7f4b102a1000 ---p 00000000 00:00 0 7f4b102a1000-7f4b1039f000 rwxp 00000000 00:00 0 7f4b1039f000-7f4b103a2000 ---p 00000000 00:00 0 7f4b103a2000-7f4b104a0000 rwxp 00000000 00:00 0 7f4b104a0000-7f4b104a3000 ---p 00000000 00:00 0 7f4b104a3000-7f4b105a1000 rwxp 00000000 00:00 0 7f4b105a1000-7f4b105a4000 ---p 00000000 00:00 0 7f4b105a4000-7f4b106a2000 rwxp 00000000 00:00 0 7f4b106a2000-7f4b106a5000 ---p 00000000 00:00 0 7f4b106a5000-7f4b107a3000 rwxp 00000000 00:00 0 7f4b107a3000-7f4b107a6000 ---p 00000000 00:00 0 7f4b107a6000-7f4b108a4000 rwxp 00000000 00:00 0 7f4b108a4000-7f4b108a7000 ---p 00000000 00:00 0 7f4b108a7000-7f4b109a5000 rwxp 00000000 00:00 0 7f4b109a5000-7f4b109a8000 ---p 00000000 00:00 0 7f4b109a8000-7f4b10aa6000 rwxp 00000000 00:00 0 7f4b10aa6000-7f4b10aa9000 ---p 00000000 00:00 0 7f4b10aa9000-7f4b10ba7000 rwxp 00000000 00:00 0 7f4b10ba7000-7f4b10baa000 ---p 00000000 00:00 0 7f4b10baa000-7f4b10ca8000 rwxp 00000000 00:00 0 7f4b10ca8000-7f4b10cab000 ---p 00000000 00:00 0 7f4b10cab000-7f4b10da9000 rwxp 00000000 00:00 0 7f4b10da9000-7f4b10dac000 ---p 00000000 00:00 0 7f4b10dac000-7f4b10eaa000 rwxp 00000000 00:00 0 7f4b10eaa000-7f4b10ead000 ---p 00000000 00:00 0 7f4b10ead000-7f4b10fab000 rwxp 00000000 00:00 0 7f4b10fab000-7f4b10fae000 ---p 00000000 00:00 0 7f4b10fae000-7f4b110ac000 rwxp 00000000 00:00 0 7f4b110ac000-7f4b110af000 ---p 00000000 00:00 0 7f4b110af000-7f4b111ad000 rwxp 00000000 00:00 0 7f4b111ad000-7f4b111b0000 ---p 00000000 00:00 0 7f4b111b0000-7f4b112ae000 rwxp 00000000 00:00 0 7f4b112ae000-7f4b112b1000 ---p 00000000 00:00 0 7f4b112b1000-7f4b113af000 rwxp 00000000 00:00 0 7f4b113af000-7f4b113b2000 ---p 00000000 00:00 0 7f4b113b2000-7f4b114b0000 rwxp 00000000 00:00 0 7f4b114b0000-7f4b114b3000 ---p 00000000 00:00 0 7f4b114b3000-7f4b115b1000 rwxp 00000000 00:00 0 7f4b115b1000-7f4b115b4000 ---p 
00000000 00:00 0 7f4b115b4000-7f4b116b2000 rwxp 00000000 00:00 0 7f4b116b2000-7f4b116b5000 ---p 00000000 00:00 0 7f4b116b5000-7f4b117b3000 rwxp 00000000 00:00 0 7f4b117b3000-7f4b117b6000 ---p 00000000 00:00 0 7f4b117b6000-7f4b118b4000 rwxp 00000000 00:00 0 7f4b118b4000-7f4b118b7000 ---p 00000000 00:00 0 7f4b118b7000-7f4b119b5000 rwxp 00000000 00:00 0 7f4b119b5000-7f4b119b8000 ---p 00000000 00:00 0 7f4b119b8000-7f4b11ab6000 rwxp 00000000 00:00 0 7f4b11ab6000-7f4b11ab9000 ---p 00000000 00:00 0 7f4b11ab9000-7f4b11bb7000 rwxp 00000000 00:00 0 7f4b11bb7000-7f4b11bba000 ---p 00000000 00:00 0 7f4b11bba000-7f4b11cb8000 rwxp 00000000 00:00 0 7f4b11cb8000-7f4b11cbb000 ---p 00000000 00:00 0 7f4b11cbb000-7f4b11db9000 rwxp 00000000 00:00 0 7f4b11db9000-7f4b11dbc000 ---p 00000000 00:00 0 7f4b11dbc000-7f4b11eba000 rwxp 00000000 00:00 0 7f4b11eba000-7f4b11ebd000 ---p 00000000 00:00 0 7f4b11ebd000-7f4b11fbb000 rwxp 00000000 00:00 0 7f4b11fbb000-7f4b11fbe000 ---p 00000000 00:00 0 7f4b11fbe000-7f4b120bc000 rwxp 00000000 00:00 0 7f4b120bc000-7f4b120bf000 ---p 00000000 00:00 0 7f4b120bf000-7f4b121bd000 rwxp 00000000 00:00 0 7f4b121bd000-7f4b121c0000 ---p 00000000 00:00 0 7f4b121c0000-7f4b122be000 rwxp 00000000 00:00 0 7f4b122be000-7f4b122c1000 ---p 00000000 00:00 0 7f4b122c1000-7f4b123bf000 rwxp 00000000 00:00 0 7f4b123bf000-7f4b123c2000 ---p 00000000 00:00 0 7f4b123c2000-7f4b124c0000 rwxp 00000000 00:00 0 7f4b124c0000-7f4b124c3000 ---p 00000000 00:00 0 7f4b124c3000-7f4b125c1000 rwxp 00000000 00:00 0 7f4b125c1000-7f4b125c4000 ---p 00000000 00:00 0 7f4b125c4000-7f4b126c2000 rwxp 00000000 00:00 0 7f4b126c2000-7f4b126c5000 ---p 00000000 00:00 0 7f4b126c5000-7f4b127c3000 rwxp 00000000 00:00 0 7f4b127c3000-7f4b127c6000 ---p 00000000 00:00 0 7f4b127c6000-7f4b128c4000 rwxp 00000000 00:00 0 7f4b128c4000-7f4b128c7000 ---p 00000000 00:00 0 7f4b128c7000-7f4b129c5000 rwxp 00000000 00:00 0 7f4b129c5000-7f4b129c8000 ---p 00000000 00:00 0 7f4b129c8000-7f4b12ac6000 rwxp 00000000 00:00 0 
7f4b12ac6000-7f4b12ac9000 ---p 00000000 00:00 0 7f4b12ac9000-7f4b12bc7000 rwxp 00000000 00:00 0 7f4b12bc7000-7f4b12bca000 ---p 00000000 00:00 0 7f4b12bca000-7f4b12cc8000 rwxp 00000000 00:00 0 7f4b12cc8000-7f4b12ccb000 ---p 00000000 00:00 0 7f4b12ccb000-7f4b12dc9000 rwxp 00000000 00:00 0 7f4b12dc9000-7f4b12dcc000 ---p 00000000 00:00 0 7f4b12dcc000-7f4b12eca000 rwxp 00000000 00:00 0 7f4b12eca000-7f4b12ecd000 ---p 00000000 00:00 0 7f4b12ecd000-7f4b12fcb000 rwxp 00000000 00:00 0 7f4b12fcb000-7f4b12fce000 ---p 00000000 00:00 0 7f4b12fce000-7f4b130cc000 rwxp 00000000 00:00 0 7f4b130cc000-7f4b130cf000 ---p 00000000 00:00 0 7f4b130cf000-7f4b131cd000 rwxp 00000000 00:00 0 7f4b131cd000-7f4b131d0000 ---p 00000000 00:00 0 7f4b131d0000-7f4b132ce000 rwxp 00000000 00:00 0 7f4b132ce000-7f4b132d1000 ---p 00000000 00:00 0 7f4b132d1000-7f4b133cf000 rwxp 00000000 00:00 0 7f4b133cf000-7f4b133d2000 ---p 00000000 00:00 0 7f4b133d2000-7f4b134d0000 rwxp 00000000 00:00 0 7f4b134d0000-7f4b134d3000 ---p 00000000 00:00 0 7f4b134d3000-7f4b135d1000 rwxp 00000000 00:00 0 7f4b135d1000-7f4b135d4000 ---p 00000000 00:00 0 7f4b135d4000-7f4b136d2000 rwxp 00000000 00:00 0 7f4b136d2000-7f4b136d5000 ---p 00000000 00:00 0 7f4b136d5000-7f4b137d3000 rwxp 00000000 00:00 0 7f4b137d3000-7f4b137d6000 ---p 00000000 00:00 0 7f4b137d6000-7f4b138d4000 rwxp 00000000 00:00 0 7f4b138d4000-7f4b138df000 r-xs 00091000 08:02 2916355 /home/mota/usr/prism/client/spider-client-0.0.1-SNAPSHOT-runnable-production.jar 7f4b138df000-7f4b138eb000 r-xs 000b3000 08:02 1982496 /home/mota/usr/prism/client/lib/zookeeper-3.4.5.jar 7f4b138eb000-7f4b138ed000 r-xs 0000e000 08:02 1982481 /home/mota/usr/prism/client/lib/zkclient-0.1.jar 7f4b138ed000-7f4b138ee000 r-xs 00001000 08:02 1982590 /home/mota/usr/prism/client/lib/xmlpull-1.1.3.1.jar 7f4b138ee000-7f4b138f4000 r-xs 00019000 08:02 1982498 /home/mota/usr/prism/client/lib/xmlParserAPIs-2.6.2.jar 7f4b138f4000-7f4b138f5000 r-xs 00003000 08:02 1982490 
/home/mota/usr/prism/client/lib/xmlenc-0.52.jar 7f4b138f5000-7f4b138fd000 r-xs 00028000 08:02 1982500 /home/mota/usr/prism/client/lib/xml-apis-1.3.04.jar 7f4b138fd000-7f4b13915000 r-xs 00115000 08:02 1982514 /home/mota/usr/prism/client/lib/xercesImpl-2.9.1.jar 7f4b13915000-7f4b1393a000 r-xs 002e3000 08:02 1982503 /home/mota/usr/prism/client/lib/xalan-2.7.1.jar 7f4b1393a000-7f4b13942000 r-xs 0007c000 08:02 1982611 /home/mota/usr/prism/client/lib/webfetcher-3.2.4.jar 7f4b13942000-7f4b1394a000 r-xs 00066000 08:02 1982532 /home/mota/usr/prism/client/lib/velocity-1.7.jar 7f4b1394a000-7f4b13972000 r-xs 00241000 08:02 1982582 /home/mota/usr/prism/client/lib/trove4j-3.0.3.jar 7f4b13972000-7f4b13979000 r-xs 0006e000 08:02 1982478 /home/mota/usr/prism/client/lib/tika-core-1.3.jar 7f4b13979000-7f4b1397b000 r-xs 00005000 08:02 1982560 /home/mota/usr/prism/client/lib/stax-api-1.0.1.jar 7f4b1397b000-7f4b1397e000 r-xs 0000e000 08:02 1982566 /home/mota/usr/prism/client/lib/statistics-console-0.0.3.jar 7f4b1397e000-7f4b139a2000 r-xs 00265000 08:02 1982528 /home/mota/usr/prism/client/lib/standfordparser-1.0.0.jar 7f4b139a2000-7f4b139a6000 r-xs 00038000 08:02 1982539 /home/mota/usr/prism/client/lib/ssh2-2.1.0.jar 7f4b139a6000-7f4b139aa000 r-xs 00020000 08:02 1982563 /home/mota/usr/prism/client/lib/spider-base-0.0.1-SNAPSHOT.jar 7f4b139aa000-7f4b139ac000 r-xs 000f2000 08:02 1982468 /home/mota/usr/prism/client/lib/snappy-java-1.0.3.2.jar 7f4b139ac000-7f4b139ad000 r-xs 00002000 08:02 1982558 /home/mota/usr/prism/client/lib/slf4j-log4j12-1.6.1.jar 7f4b139ad000-7f4b139af000 r-xs 00005000 08:02 1982584 /home/mota/usr/prism/client/lib/slf4j-api-1.6.6.jar 7f4b139af000-7f4b139b0000 r-xs 00013000 08:02 1982616 /home/mota/usr/prism/client/lib/shadow-0.0.6.jar 7f4b139b0000-7f4b139b2000 r-xs 0000a000 08:02 1982565 /home/mota/usr/prism/client/lib/shadow-0.0.5.jar 7f4b139b2000-7f4b139b5000 r-xs 0001e000 08:02 1982600 /home/mota/usr/prism/client/lib/servlet-api-2.5-6.1.14.jar 
7f4b139b5000-7f4b139b7000 r-xs 0001f000 08:02 1982479 /home/mota/usr/prism/client/lib/servlet-api-2.5-20081211.jar 7f4b139b7000-7f4b139ba000 r-xs 00041000 08:02 1982614 /home/mota/usr/prism/client/lib/serializer-2.7.1.jar 7f4b139ba000-7f4b13a52000 r-xs 007ae000 08:02 1982486 /home/mota/usr/prism/client/lib/scala-compiler-1.0.jar 7f4b13a52000-7f4b13a53000 r-xs 00003000 08:02 1982581 /home/mota/usr/prism/client/lib/sac-1.3.jar 7f4b13a53000-7f4b13a57000 r-xs 00037000 08:02 1982548 /home/mota/usr/prism/client/lib/rome-2.0.0.jar 7f4b13a57000-7f4b13a5b000 r-xs 00032000 08:02 1982477 /home/mota/usr/prism/client/lib/rome-1.0.jar 7f4b13a5b000-7f4b13a5e000 r-xs 0001b000 08:02 1982501 /home/mota/usr/prism/client/lib/reflections-0.9.9-RC1.jar 7f4b13a5e000-7f4b13a5f000 r-xs 00001000 08:02 1982507 /home/mota/usr/prism/client/lib/redis-cluster-0.1.0.jar 7f4b13a5f000-7f4b13a61000 r-xs 00007000 08:02 1982561 /home/mota/usr/prism/client/lib/redis-bloomfilter-0.0.1.jar 7f4b13a61000-7f4b13a6a000 r-xs 00081000 08:02 1982530 /home/mota/usr/prism/client/lib/quartz-2.1.6.jar 7f4b13a6a000-7f4b13a70000 r-xs 00068000 08:02 1982506 /home/mota/usr/prism/client/lib/protobuf-java-2.4.0a.jar 7f4b13a70000-7f4b13a71000 r-xs 00001000 08:02 1982580 /home/mota/usr/prism/client/lib/prism-common-0.0.1-SNAPSHOT.jar 7f4b13a71000-7f4b13a73000 r-xs 00008000 08:02 1982574 /home/mota/usr/prism/client/lib/parser-0.0.1.jar 7f4b13a73000-7f4b13a75000 r-xs 0000e000 08:02 1982531 /home/mota/usr/prism/client/lib/oro-2.0.8.jar 7f4b13a75000-7f4b13a7c000 r-xs 00069000 08:02 1982469 /home/mota/usr/prism/client/lib/nutch-2.2.1.jar 7f4b13a7c000-7f4b13a85000 r-xs 00059000 08:02 1982608 /home/mota/usr/prism/client/lib/nlp-fudan-1.0.0.jar 7f4b13a85000-7f4b13a94000 r-xs 000b1000 08:02 1982604 /home/mota/usr/prism/client/lib/netty-3.2.2.Final.jar 7f4b13a94000-7f4b13a97000 r-xs 0001c000 08:02 1982606 /home/mota/usr/prism/client/lib/nekohtml-1.9.19.jar 7f4b13a97000-7f4b13a9a000 r-xs 0001e000 08:02 1982610 
/home/mota/usr/prism/client/lib/mysql-Interface-0.0.1-SNAPSHOT.jar 7f4b13a9a000-7f4b13aa0000 r-xs 000ab000 08:02 1982549 /home/mota/usr/prism/client/lib/mysql-connector-java-5.1.9.jar 7f4b13aa0000-7f4b13aa4000 r-xs 00066000 08:02 1982499 /home/mota/usr/prism/client/lib/mota-nlp-0.1.6.jar 7f4b13aa4000-7f4b13aa5000 r-xs 00006000 08:02 1982529 /home/mota/usr/prism/client/lib/mota-kafka-0.0.4.jar 7f4b13aa5000-7f4b13aa8000 r-xs 00012000 08:02 1982509 /home/mota/usr/prism/client/lib/metrics-core-2.1.2.jar 7f4b13aa8000-7f4b13aaa000 r-xs 0000a000 08:02 1982472 /home/mota/usr/prism/client/lib/lucene-sandbox-4.0.0.jar 7f4b13aaa000-7f4b13ab3000 r-xs 00055000 08:02 1982474 /home/mota/usr/prism/client/lib/lucene-queryparser-4.0.0.jar 7f4b13ab3000-7f4b13ab8000 r-xs 0002b000 08:02 1982497 /home/mota/usr/prism/client/lib/lucene-queries-4.0.0.jar 7f4b13ab8000-7f4b13ada000 r-xs 001d0000 08:02 1982615 /home/mota/usr/prism/client/lib/lucene-core-4.0.0.jar 7f4b13ada000-7f4b13ae9000 r-xs 00166000 08:02 1982605 /home/mota/usr/prism/client/lib/lucene-analyzers-common-4.0.0.jar 7f4b13ae9000-7f4b13af2000 r-xs 0006d000 08:02 1982516 /home/mota/usr/prism/client/lib/log4j-1.2.16.jar 7f4b13af2000-7f4b13af9000 r-xs 0004c000 08:02 1982471 /home/mota/usr/prism/client/lib/libthrift-0.8.0.jar 7f4b13af9000-7f4b13b58000 r-xs 00582000 08:02 1982557 /home/mota/usr/prism/client/lib/library-1.0.jar 7f4b13b58000-7f4b13b59000 r-xs 00002000 08:02 1982598 /home/mota/usr/prism/client/lib/kfs-0.3.jar 7f4b13b59000-7f4b13b74000 r-xs 00126000 08:02 1982476 /home/mota/usr/prism/client/lib/kafka-0.7.2.jar 7f4b13b74000-7f4b13b77000 r-xs 00033000 08:02 1982489 /home/mota/usr/prism/client/lib/juniversalchardet-1.0.3.jar 7f4b13b77000-7f4b13b7d000 r-xs 00036000 08:02 1982541 /home/mota/usr/prism/client/lib/junit-4.11.jar 7f4b13b7d000-7f4b13b80000 r-xs 0001e000 08:02 1982522 /home/mota/usr/prism/client/lib/jsp-api-2.1-6.1.14.jar 7f4b13b80000-7f4b13b91000 r-xs 000ea000 08:02 1982573 
/home/mota/usr/prism/client/lib/jsp-2.1-6.1.14.jar 7f4b13b91000-7f4b13b97000 r-xs 00043000 08:02 1982513 /home/mota/usr/prism/client/lib/jsoup-1.7.3.jar 7f4b13b97000-7f4b13b98000 r-xs 0000b000 08:02 1982572 /home/mota/usr/prism/client/lib/jsonplugin-0.34.jar 7f4b13b98000-7f4b13b9b000 r-xs 00024000 08:02 1982576 /home/mota/usr/prism/client/lib/json-lib-2.4-jdk15.jar 7f4b13b9b000-7f4b13c81000 r-xs 00bcd000 08:02 1982542 /home/mota/usr/prism/client/lib/jruby-complete-1.6.5.jar 7f4b13c81000-7f4b13c83000 r-xs 00014000 08:02 1982482 /home/mota/usr/prism/client/lib/jline-0.9.94.jar 7f4b13c83000-7f4b13c86000 r-xs 0001c000 08:02 1982551 /home/mota/usr/prism/client/lib/jetty-websocket-8.1.12.v20130726.jar 7f4b13c86000-7f4b13c8c000 r-xs 00047000 08:02 1982569 /home/mota/usr/prism/client/lib/jetty-util-9.0.0.RC2.jar 7f4b13c8c000-7f4b13c91000 r-xs 00042000 08:02 1982475 /home/mota/usr/prism/client/lib/jetty-util-8.1.12.v20130726.jar 7f4b13c91000-7f4b13c95000 r-xs 00028000 08:02 1982546 /home/mota/usr/prism/client/lib/jetty-util-6.1.26.jar 7f4b13c95000-7f4b13c96000 r-xs 00005000 08:02 1982571 /home/mota/usr/prism/client/lib/jetty-util5-6.1.26.jar 7f4b13c96000-7f4b13c97000 r-xs 00004000 08:02 1982505 /home/mota/usr/prism/client/lib/jetty-sslengine-6.1.26.jar 7f4b13c97000-7f4b13c9a000 r-xs 00017000 08:02 1982535 /home/mota/usr/prism/client/lib/jetty-io-8.1.12.v20130726.jar 7f4b13c9a000-7f4b13c9c000 r-xs 00016000 08:02 1982466 /home/mota/usr/prism/client/lib/jetty-http-8.1.12.v20130726.jar 7f4b13c9c000-7f4b13c9f000 r-xs 0000f000 08:02 1982470 /home/mota/usr/prism/client/lib/jetty-client-6.1.26.jar 7f4b13c9f000-7f4b13ca6000 r-xs 0007d000 08:02 1982586 /home/mota/usr/prism/client/lib/jetty-6.1.26.jar 7f4b13ca6000-7f4b13ca8000 r-xs 0000f000 08:02 1982567 /home/mota/usr/prism/client/lib/jettison-1.1.jar 7f4b13ca8000-7f4b13cae000 r-xs 00057000 08:02 1982555 /home/mota/usr/prism/client/lib/jets3t-0.7.1.jar 7f4b13cae000-7f4b13cbf000 r-xs 00099000 08:02 1982502 
/home/mota/usr/prism/client/lib/jersey-server-1.8.jar 7f4b13cbf000-7f4b13cc3000 r-xs 00021000 08:02 1982602 /home/mota/usr/prism/client/lib/jersey-json-1.8.jar 7f4b13cc3000-7f4b13ccd000 r-xs 00066000 08:02 1982587 /home/mota/usr/prism/client/lib/jersey-core-1.8.jar 7f4b13ccd000-7f4b13cd0000 r-xs 00020000 08:02 1982593 /home/mota/usr/prism/client/lib/jedis-2.1.0.jar 7f4b13cd0000-7f4b13cd5000 r-xs 00044000 08:02 1982595 /home/mota/usr/prism/client/lib/jdom2-2.0.4.jar 7f4b13cd5000-7f4b13cd8000 r-xs 00023000 08:02 1982556 /home/mota/usr/prism/client/lib/jdom-1.0.jar 7f4b13cd8000-7f4b13cda000 r-xs 00003000 08:02 1982495 /home/mota/usr/prism/client/lib/jcl-over-slf4j-1.6.6.jar 7f4b13cda000-7f4b13ced000 r-xs 000c7000 08:02 1982524 /home/mota/usr/prism/client/lib/jaxb-impl-2.2.3-1.jar 7f4b13ced000-7f4b13cf0000 r-xs 00013000 08:02 1982523 /home/mota/usr/prism/client/lib/jaxb-api-2.1.jar 7f4b13cf0000-7f4b13cf9000 r-xs 00098000 08:02 1982591 /home/mota/usr/prism/client/lib/javassist-3.16.1-GA.jar 7f4b13cf9000-7f4b13cfb000 r-xs 00011000 08:02 1982512 /home/mota/usr/prism/client/lib/jasper-runtime-5.5.12.jar 7f4b13cfb000-7f4b13d00000 r-xs 0005e000 08:02 1982540 /home/mota/usr/prism/client/lib/jasper-compiler-5.5.12.jar 7f4b13d00000-7f4b13d02000 r-xs 00004000 08:02 1982578 /home/mota/usr/prism/client/lib/jamon-runtime-2.3.1.jar 7f4b13d02000-7f4b13d03000 r-xs 00006000 08:02 1982601 /home/mota/usr/prism/client/lib/jakarta-regexp-1.4.jar 7f4b13d03000-7f4b13d04000 r-xs 00007000 08:02 1982533 /home/mota/usr/prism/client/lib/jackson-xc-1.8.8.jar 7f4b13d04000-7f4b13d0b000 r-xs 0003c000 08:02 1982508 /home/mota/usr/prism/client/lib/jackson-mapper-asl-1.0.1.jar 7f4b13d0b000-7f4b13d0c000 r-xs 00004000 08:02 1982525 /home/mota/usr/prism/client/lib/jackson-jaxrs-1.8.8.jar 7f4b13d0c000-7f4b13d10000 r-xs 00034000 08:02 1982579 /home/mota/usr/prism/client/lib/jackson-core-asl-1.8.8.jar 7f4b13d10000-7f4b13d2e000 r-xs 00546000 08:02 1982559 /home/mota/usr/prism/client/lib/icu4j-4.0.1.jar 
7f4b13d2e000-7f4b13d30000 r-xs 00008000 08:02 1982607 /home/mota/usr/prism/client/lib/httpmime-4.3.1.jar 7f4b13d30000-7f4b13d36000 r-xs 0003f000 08:02 1982467 /home/mota/usr/prism/client/lib/httpcore-4.3.jar 7f4b13d36000-7f4b13d41000 r-xs 00084000 08:02 1982552 /home/mota/usr/prism/client/lib/httpclient-4.3.1.jar 7f4b13d41000-7f4b13d4f000 r-xs 000ef000 08:02 1982537 /home/mota/usr/prism/client/lib/htmlunit-core-js-2.13.jar 7f4b13d4f000-7f4b13d66000 r-xs 00124000 08:02 1982568 /home/mota/usr/prism/client/lib/htmlunit-2.13.jar 7f4b13d66000-7f4b13d6f000 r-xs 000a4000 08:02 1982589 /home/mota/usr/prism/client/lib/hsqldb-1.8.0.10.jar 7f4b13d6f000-7f4b13d71000 r-xs 0000b000 08:02 1982510 /home/mota/usr/prism/client/lib/hornet-utils-4.1.0.jar 7f4b13d71000-7f4b13d73000 r-xs 00012000 08:02 1982494 /home/mota/usr/prism/client/lib/hornet-parse-4.1.0.jar 7f4b13d73000-7f4b13d75000 r-xs 00011000 08:02 1982488 /home/mota/usr/prism/client/lib/hornet-dao-4.1.0.jar 7f4b13d75000-7f4b13d77000 r-xs 00012000 08:02 1982484 /home/mota/usr/prism/client/lib/hornet-client-4.1.0.jar 7f4b13d77000-7f4b13d79000 r-xs 0000d000 08:02 1982515 /home/mota/usr/prism/client/lib/hornet-basic-dao-4.1.0.jar 7f4b13d79000-7f4b13d7b000 r-xs 0000c000 08:02 1982543 /home/mota/usr/prism/client/lib/hornet-base-4.1.0.jar 7f4b13d7b000-7f4b13dc7000 r-xs 004bd000 08:02 1982596 /home/mota/usr/prism/client/lib/hbase-0.94.12.jar 7f4b13dc7000-7f4b13dfe000 r-xs 00389000 08:02 1982517 /home/mota/usr/prism/client/lib/hadoop-core-1.0.4.jar 7f4b13dfe000-7f4b13dff000 ---p 00000000 00:00 0 7f4b13dff000-7f4b13eff000 rwxp 00000000 00:00 0 7f4b13eff000-7f4b13f02000 ---p 00000000 00:00 0 7f4b13f02000-7f4b14000000 rwxp 00000000 00:00 0 7f4b14000000-7f4b14021000 rwxp 00000000 00:00 0 7f4b14021000-7f4b18000000 ---p 00000000 00:00 0 7f4b18000000-7f4b1bff5000 rwxp 00000000 00:00 0 7f4b1bff5000-7f4b1c000000 ---p 00000000 00:00 0 7f4b1c000000-7f4b1c021000 rwxp 00000000 00:00 0 7f4b1c021000-7f4b20000000 ---p 00000000 00:00 0 
7f4b20000000-7f4b20043000 rwxp 00000000 00:00 0 7f4b20043000-7f4b24000000 ---p 00000000 00:00 0 7f4b24000000-7f4b24027000 rwxp 00000000 00:00 0 7f4b24027000-7f4b28000000 ---p 00000000 00:00 0 7f4b28000000-7f4b282fa000 rwxp 00000000 00:00 0 7f4b282fa000-7f4b2c000000 ---p 00000000 00:00 0 7f4b2c000000-7f4b2c02b000 rwxp 00000000 00:00 0 7f4b2c02b000-7f4b30000000 ---p 00000000 00:00 0 7f4b30000000-7f4b3002e000 rwxp 00000000 00:00 0 7f4b3002e000-7f4b34000000 ---p 00000000 00:00 0 7f4b34000000-7f4b3402e000 rwxp 00000000 00:00 0 7f4b3402e000-7f4b38000000 ---p 00000000 00:00 0 7f4b38000000-7f4b3803e000 rwxp 00000000 00:00 0 7f4b3803e000-7f4b3c000000 ---p 00000000 00:00 0 7f4b3c000000-7f4b3c033000 rwxp 00000000 00:00 0 7f4b3c033000-7f4b40000000 ---p 00000000 00:00 0 7f4b40000000-7f4b40024000 rwxp 00000000 00:00 0 7f4b40024000-7f4b44000000 ---p 00000000 00:00 0 7f4b44000000-7f4b44038000 rwxp 00000000 00:00 0 7f4b44038000-7f4b48000000 ---p 00000000 00:00 0 7f4b48001000-7f4b48004000 r-xs 00015000 08:02 1982564 /home/mota/usr/prism/client/lib/high-scale-lib-1.1.1.jar 7f4b48004000-7f4b4800d000 r-xs 000c6000 08:02 1982594 /home/mota/usr/prism/client/lib/hessian-4.0.37.jar 7f4b4800d000-7f4b4800f000 r-xs 00011000 08:02 1982550 /home/mota/usr/prism/client/lib/hbase-common-2.2.1.jar 7f4b4800f000-7f4b48039000 r-xs 001e9000 08:02 1982554 /home/mota/usr/prism/client/lib/guava-15.0.jar 7f4b48039000-7f4b4803b000 r-xs 00004000 08:02 1982480 /home/mota/usr/prism/client/lib/gson-xml-java-0.1.7.jar 7f4b4803b000-7f4b48040000 r-xs 0002a000 08:02 1982553 /home/mota/usr/prism/client/lib/gson-2.2.2.jar 7f4b48040000-7f4b48046000 r-xs 00052000 08:02 1982544 /home/mota/usr/prism/client/lib/fastjson-1.1.38.jar 7f4b48046000-7f4b48069000 r-xs 00344000 08:02 1982518 /home/mota/usr/prism/client/lib/core-3.1.1.jar 7f4b48069000-7f4b4806c000 ---p 00000000 00:00 0 7f4b4806c000-7f4b4816a000 rwxp 00000000 00:00 0 7f4b4816a000-7f4b4816d000 ---p 00000000 00:00 0 7f4b4816d000-7f4b4826b000 rwxp 00000000 00:00 0 
7f4b4826b000-7f4b4826e000 ---p 00000000 00:00 0 7f4b4826e000-7f4b4836c000 rwxp 00000000 00:00 0 7f4b4836c000-7f4b484f5000 r-xp 00000000 08:02 1478310 /usr/lib/locale/locale-archive 7f4b484f5000-7f4b484f8000 ---p 00000000 00:00 0 7f4b484f8000-7f4b485f6000 rwxp 00000000 00:00 0 7f4b485f6000-7f4b485f9000 ---p 00000000 00:00 0 7f4b485f9000-7f4b486f7000 rwxp 00000000 00:00 0 7f4b486f7000-7f4b486f8000 ---p 00000000 00:00 0 7f4b486f8000-7f4b49a20000 rwxp 00000000 00:00 0 7f4b49a20000-7f4b4c964000 rwxp 00000000 00:00 0 7f4b4c964000-7f4b50000000 ---p 00000000 00:00 0 7f4b50000000-7f4b50002000 r-xs 00009000 08:02 1982588 /home/mota/usr/prism/client/lib/hamcrest-core-1.3.jar 7f4b50002000-7f4b50005000 r-xs 00013000 08:02 1982547 /home/mota/usr/prism/client/lib/ezmorph-1.0.6.jar 7f4b50005000-7f4b50007000 r-xs 00016000 08:02 1982577 /home/mota/usr/prism/client/lib/event-3.0.0.jar 7f4b50007000-7f4b5000c000 r-xs 00038000 08:02 1982609 /home/mota/usr/prism/client/lib/dozer-5.4.0.jar 7f4b5000c000-7f4b50011000 r-xs 00048000 08:02 1982562 /home/mota/usr/prism/client/lib/dom4j-1.6.1.jar 7f4b50011000-7f4b50014000 r-xs 00016000 08:02 1982491 /home/mota/usr/prism/client/lib/commons-pool-1.5.5.jar 7f4b50014000-7f4b50019000 r-xs 00028000 08:02 1982487 /home/mota/usr/prism/client/lib/commons-net-1.4.1.jar 7f4b50019000-7f4b5002a000 r-xs 000bb000 08:02 1982597 /home/mota/usr/prism/client/lib/commons-math-2.1.jar 7f4b5002a000-7f4b50032000 r-xs 0004b000 08:02 1982511 /home/mota/usr/prism/client/lib/commons-math-1.2.jar 7f4b50032000-7f4b50034000 r-xs 0000e000 08:02 1982526 /home/mota/usr/prism/client/lib/commons-logging-1.1.3.jar 7f4b50034000-7f4b5003a000 r-xs 00048000 08:02 1982493 /home/mota/usr/prism/client/lib/commons-lang3-3.1.jar 7f4b5003a000-7f4b5003f000 r-xs 00040000 08:02 1982485 /home/mota/usr/prism/client/lib/commons-lang-2.5.jar 7f4b5003f000-7f4b50043000 r-xs 0002a000 08:02 1982603 /home/mota/usr/prism/client/lib/commons-io-2.4.jar 7f4b50043000-7f4b50048000 r-xs 00040000 08:02 1982534 
/home/mota/usr/prism/client/lib/commons-httpclient-3.0.1.jar 7f4b50048000-7f4b5004b000 r-xs 00019000 08:02 1982545 /home/mota/usr/prism/client/lib/commons-el-1.0.jar 7f4b5004b000-7f4b5004f000 r-xs 00020000 08:02 1982599 /home/mota/usr/prism/client/lib/commons-digester-1.8.jar 7f4b5004f000-7f4b50055000 r-xs 00043000 08:02 1982592 /home/mota/usr/prism/client/lib/commons-configuration-1.6.jar 7f4b50055000-7f4b50063000 r-xs 0007f000 08:02 1982575 /home/mota/usr/prism/client/lib/commons-collections-3.2.1.jar 7f4b50063000-7f4b5006a000 r-xs 0003a000 08:02 1982483 /home/mota/usr/prism/client/lib/commons-codec-1.8.jar 7f4b5006a000-7f4b5006c000 r-xs 00009000 08:02 1982492 /home/mota/usr/prism/client/lib/commons-cli-1.2.jar 7f4b5006c000-7f4b50070000 r-xs 0002f000 08:02 1982504 /home/mota/usr/prism/client/lib/commons-beanutils-core-1.8.0.jar 7f4b50070000-7f4b50075000 r-xs 00034000 08:02 1982527 /home/mota/usr/prism/client/lib/commons-beanutils-1.8.3.jar 7f4b50075000-7f4b50078000 r-xs 0001e000 08:02 1982521 /home/mota/usr/prism/client/lib/common-0.1.6.jar 7f4b50078000-7f4b50083000 r-xs 0008a000 08:02 1982585 /home/mota/usr/prism/client/lib/c3p0-0.9.1.1.jar 7f4b50083000-7f4b501b8000 rwxp 00000000 00:00 0 7f4b501b8000-7f4b50354000 r-xs 030c2000 08:02 1679867 /usr/lib/jvm/j2sdk1.6-oracle/jre/lib/rt.jar 7f4b50354000-7f4b5037c000 rwxp 00000000 00:00 0 7f4b5037c000-7f4b5037d000 ---p 00000000 00:00 0 7f4b5037d000-7f4b5047d000 rwxp 00000000 00:00 0 7f4b5047d000-7f4b5047e000 ---p 00000000 00:00 0 7f4b5047e000-7f4b5057e000 rwxp 00000000 00:00 0 7f4b5057e000-7f4b5057f000 ---p 00000000 00:00 0 7f4b5057f000-7f4b5067f000 rwxp 00000000 00:00 0 7f4b5067f000-7f4b50680000 ---p 00000000 00:00 0 7f4b50680000-7f4b50780000 rwxp 00000000 00:00 0 7f4b50780000-7f4b50781000 ---p 00000000 00:00 0 7f4b50781000-7f4b50881000 rwxp 00000000 00:00 0 7f4b50881000-7f4b50882000 ---p 00000000 00:00 0 7f4b50882000-7f4b50982000 rwxp 00000000 00:00 0 7f4b50982000-7f4b50983000 ---p 00000000 00:00 0 
7f4b50983000-7f4b50a83000 rwxp 00000000 00:00 0 7f4b50a83000-7f4b50a84000 ---p 00000000 00:00 0 7f4b50a84000-7f4b50ba5000 rwxp 00000000 00:00 0 7f4b50ba5000-7f4b50bac000 ---p 00000000 00:00 0 7f4b50bac000-7f4b50bad000 rwxp 00000000 00:00 0 7f4b50bad000-7f4b50c84000 rwxp 00000000 00:00 0 7f4b50c84000-7f4b50e48000 rwxp 00000000 00:00 0 7f4b50e48000-7f4b50e69000 rwxp 00000000 00:00 0 7f4b50e69000-7f4b50e70000 ---p 00000000 00:00 0 7f4b50e70000-7f4b50e71000 rwxp 00000000 00:00 0 7f4b50e71000-7f4b50f48000 rwxp 00000000 00:00 0 7f4b50f48000-7f4b5110b000 rwxp 00000000 00:00 0 7f4b5110b000-7f4b51259000 rwxp 00000000 00:00 0 7f4b51259000-7f4b5125a000 rwxp 00000000 00:00 0 7f4b5125a000-7f4b51268000 r-xp 00000000 08:02 1680042 /usr/lib/jvm/j2sdk1.6-oracle/jre/lib/amd64/libzip.so 7f4b51268000-7f4b5136a000 ---p 0000e000 08:02 1680042 /usr/lib/jvm/j2sdk1.6-oracle/jre/lib/amd64/libzip.so 7f4b5136a000-7f4b5136d000 rwxp 00010000 08:02 1680042 /usr/lib/jvm/j2sdk1.6-oracle/jre/lib/amd64/libzip.so 7f4b5136d000-7f4b5136e000 rwxp 00000000 00:00 0 7f4b5136e000-7f4b5137a000 r-xp 00000000 08:02 1490975 /lib/x86_64-linux-gnu/libnss_files-2.15.so 7f4b5137a000-7f4b51579000 ---p 0000c000 08:02 1490975 /lib/x86_64-linux-gnu/libnss_files-2.15.so 7f4b51579000-7f4b5157a000 r-xp 0000b000 08:02 1490975 /lib/x86_64-linux-gnu/libnss_files-2.15.so 7f4b5157a000-7f4b5157b000 rwxp 0000c000 08:02 1490975 /lib/x86_64-linux-gnu/libnss_files-2.15.so 7f4b5157b000-7f4b51585000 r-xp 00000000 08:02 1490976 /lib/x86_64-linux-gnu/libnss_nis-2.15.so 7f4b51585000-7f4b51785000 ---p 0000a000 08:02 1490976 /lib/x86_64-linux-gnu/libnss_nis-2.15.so 7f4b51785000-7f4b51786000 r-xp 0000a000 08:02 1490976 /lib/x86_64-linux-gnu/libnss_nis-2.15.so 7f4b51786000-7f4b51787000 rwxp 0000b000 08:02 1490976 /lib/x86_64-linux-gnu/libnss_nis-2.15.so 7f4b51787000-7f4b5178f000 r-xp 00000000 08:02 1490978 /lib/x86_64-linux-gnu/libnss_compat-2.15.so 7f4b5178f000-7f4b5198e000 ---p 00008000 08:02 1490978 
/lib/x86_64-linux-gnu/libnss_compat-2.15.so 7f4b5198e000-7f4b5198f000 r-xp 00007000 08:02 1490978 /lib/x86_64-linux-gnu/libnss_compat-2.15.so 7f4b5198f000-7f4b51990000 rwxp 00008000 08:02 1490978 /lib/x86_64-linux-gnu/libnss_compat-2.15.so 7f4b51990000-7f4b519a7000 r-xp 00000000 08:02 1490970 /lib/x86_64-linux-gnu/libnsl-2.15.so 7f4b519a7000-7f4b51ba6000 ---p 00017000 08:02 1490970 /lib/x86_64-linux-gnu/libnsl-2.15.so 7f4b51ba6000-7f4b51ba7000 r-xp 00016000 08:02 1490970 /lib/x86_64-linux-gnu/libnsl-2.15.so 7f4b51ba7000-7f4b51ba8000 rwxp 00017000 08:02 1490970 /lib/x86_64-linux-gnu/libnsl-2.15.so 7f4b51ba8000-7f4b51baa000 rwxp 00000000 00:00 0 7f4b51baa000-7f4b51bd3000 r-xp 00000000 08:02 1680029 /usr/lib/jvm/j2sdk1.6-oracle/jre/lib/amd64/libjava.so 7f4b51bd3000-7f4b51cd2000 ---p 00029000 08:02 1680029 /usr/lib/jvm/j2sdk1.6-oracle/jre/lib/amd64/libjava.so 7f4b51cd2000-7f4b51cd9000 rwxp 00028000 08:02 1680029 /usr/lib/jvm/j2sdk1.6-oracle/jre/lib/amd64/libjava.so 7f4b51cd9000-7f4b51ce6000 r-xp 00000000 08:02 1680047 /usr/lib/jvm/j2sdk1.6-oracle/jre/lib/amd64/libverify.so 7f4b51ce6000-7f4b51de5000 ---p 0000d000 08:02 1680047 /usr/lib/jvm/j2sdk1.6-oracle/jre/lib/amd64/libverify.so 7f4b51de5000-7f4b51de8000 rwxp 0000c000 08:02 1680047 /usr/lib/jvm/j2sdk1.6-oracle/jre/lib/amd64/libverify.so 7f4b51de8000-7f4b51def000 r-xp 00000000 08:02 1490972 /lib/x86_64-linux-gnu/librt-2.15.so 7f4b51def000-7f4b51fee000 ---p 00007000 08:02 1490972 /lib/x86_64-linux-gnu/librt-2.15.so 7f4b51fee000-7f4b51fef000 r-xp 00006000 08:02 1490972 /lib/x86_64-linux-gnu/librt-2.15.so 7f4b51fef000-7f4b51ff0000 rwxp 00007000 08:02 1490972 /lib/x86_64-linux-gnu/librt-2.15.so 7f4b51ff0000-7f4b51ff3000 ---p 00000000 00:00 0 7f4b51ff3000-7f4b520f1000 rwxp 00000000 00:00 0 7f4b520f1000-7f4b521ea000 r-xp 00000000 08:02 1490961 /lib/x86_64-linux-gnu/libm-2.15.so 7f4b521ea000-7f4b523e9000 ---p 000f9000 08:02 1490961 /lib/x86_64-linux-gnu/libm-2.15.so 7f4b523e9000-7f4b523ea000 r-xp 000f8000 08:02 1490961 
/lib/x86_64-linux-gnu/libm-2.15.so 7f4b523ea000-7f4b523eb000 rwxp 000f9000 08:02 1490961 /lib/x86_64-linux-gnu/libm-2.15.so 7f4b523eb000-7f4b52d05000 r-xp 00000000 08:02 1680023 /usr/lib/jvm/j2sdk1.6-oracle/jre/lib/amd64/server/libjvm.so 7f4b52d05000-7f4b52e07000 ---p 0091a000 08:02 1680023 /usr/lib/jvm/j2sdk1.6-oracle/jre/lib/amd64/server/libjvm.so 7f4b52e07000-7f4b52fbc000 rwxp 0091c000 08:02 1680023 /usr/lib/jvm/j2sdk1.6-oracle/jre/lib/amd64/server/libjvm.so 7f4b52fbc000-7f4b52ff6000 rwxp 00000000 00:00 0 7f4b52ff6000-7f4b531a9000 r-xp 00000000 08:02 1490962 /lib/x86_64-linux-gnu/libc-2.15.so 7f4b531a9000-7f4b533a8000 ---p 001b3000 08:02 1490962 /lib/x86_64-linux-gnu/libc-2.15.so 7f4b533a8000-7f4b533ac000 r-xp 001b2000 08:02 1490962 /lib/x86_64-linux-gnu/libc-2.15.so 7f4b533ac000-7f4b533ae000 rwxp 001b6000 08:02 1490962 /lib/x86_64-linux-gnu/libc-2.15.so 7f4b533ae000-7f4b533b3000 rwxp 00000000 00:00 0 7f4b533b3000-7f4b533b5000 r-xp 00000000 08:02 1490982 /lib/x86_64-linux-gnu/libdl-2.15.so 7f4b533b5000-7f4b535b5000 ---p 00002000 08:02 1490982 /lib/x86_64-linux-gnu/libdl-2.15.so 7f4b535b5000-7f4b535b6000 r-xp 00002000 08:02 1490982 /lib/x86_64-linux-gnu/libdl-2.15.so 7f4b535b6000-7f4b535b7000 rwxp 00003000 08:02 1490982 /lib/x86_64-linux-gnu/libdl-2.15.so 7f4b535b7000-7f4b535cf000 r-xp 00000000 08:02 1490986 /lib/x86_64-linux-gnu/libpthread-2.15.so 7f4b535cf000-7f4b537ce000 ---p 00018000 08:02 1490986 /lib/x86_64-linux-gnu/libpthread-2.15.so 7f4b537ce000-7f4b537cf000 r-xp 00017000 08:02 1490986 /lib/x86_64-linux-gnu/libpthread-2.15.so 7f4b537cf000-7f4b537d0000 rwxp 00018000 08:02 1490986 /lib/x86_64-linux-gnu/libpthread-2.15.so 7f4b537d0000-7f4b537d4000 rwxp 00000000 00:00 0 7f4b537d4000-7f4b537f6000 r-xp 00000000 08:02 1490966 /lib/x86_64-linux-gnu/ld-2.15.so 7f4b537f7000-7f4b537fb000 r-xs 00053000 08:02 1982583 /home/mota/usr/prism/client/lib/cssparser-0.9.11.jar 7f4b537fb000-7f4b537fd000 r-xs 00016000 08:02 1982613 
/home/mota/usr/prism/client/lib/crawler-commons-0.2.jar 7f4b537fd000-7f4b53801000 r-xs 00026000 08:02 1982536 /home/mota/usr/prism/client/lib/avro-ipc-1.5.3.jar 7f4b53801000-7f4b53807000 r-xs 0003b000 08:02 1982519 /home/mota/usr/prism/client/lib/avro-1.5.3.jar 7f4b53807000-7f4b53808000 r-xs 0000a000 08:02 1982473 /home/mota/usr/prism/client/lib/asm-3.1.jar 7f4b53808000-7f4b53817000 r-xs 000ee000 08:02 1982612 /home/mota/usr/prism/client/lib/ant-1.6.5.jar 7f4b53817000-7f4b53819000 r-xs 00002000 08:02 1982570 /home/mota/usr/prism/client/lib/ahocorasick-1.2.jar 7f4b53819000-7f4b53842000 rwxp 00000000 00:00 0 7f4b53842000-7f4b538d9000 rwxp 00000000 00:00 0 7f4b538d9000-7f4b538e1000 rwxs 00000000 08:02 2367662 /tmp/hsperfdata_mota/16953 7f4b538e1000-7f4b538e4000 rwxp 00000000 00:00 0 7f4b538e4000-7f4b538eb000 r-xp 00000000 08:02 1680058 /usr/lib/jvm/j2sdk1.6-oracle/jre/lib/amd64/jli/libjli.so 7f4b538eb000-7f4b539ec000 ---p 00007000 08:02 1680058 /usr/lib/jvm/j2sdk1.6-oracle/jre/lib/amd64/jli/libjli.so 7f4b539ec000-7f4b539ee000 rwxp 00008000 08:02 1680058 /usr/lib/jvm/j2sdk1.6-oracle/jre/lib/amd64/jli/libjli.so 7f4b539ee000-7f4b539ef000 rwxp 00000000 00:00 0 7f4b539ef000-7f4b539f0000 r-xs 00005000 08:02 1982520 /home/mota/usr/prism/client/lib/forumfetcher-0.0.8.jar 7f4b539f0000-7f4b539f2000 r-xs 0000e000 08:02 1982538 /home/mota/usr/prism/client/lib/activation-1.1.jar 7f4b539f2000-7f4b539f3000 rwxp 00000000 00:00 0 7f4b539f3000-7f4b539f4000 ---p 00000000 00:00 0 7f4b539f4000-7f4b539f6000 rwxp 00000000 00:00 0 7f4b539f6000-7f4b539f7000 r-xp 00022000 08:02 1490966 /lib/x86_64-linux-gnu/ld-2.15.so 7f4b539f7000-7f4b539f9000 rwxp 00023000 08:02 1490966 /lib/x86_64-linux-gnu/ld-2.15.so 7fffdde20000-7fffdde43000 rwxp 00000000 00:00 0
Spring + iBATIS transaction configuration problem
XML configuration:

[code="xml"]
<!-- Default data source -->
<bean id="talent.defaultDataSource"
    class="org.springframework.jdbc.datasource.DriverManagerDataSource">
    <property name="driverClassName" value="${jdbc.default.driverClassName}" />
    <property name="url" value="${jdbc.default.url}" />
    <property name="username" value="${jdbc.default.username}" />
    <property name="password" value="${jdbc.default.password}" />
</bean>

<!-- Transaction manager -->
<bean id="talent.defaultTransactionManager"
    class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
    <property name="dataSource" ref="talent.defaultDataSource" />
</bean>

<!-- Transaction attributes -->
<tx:advice id="txAdvice" transaction-manager="talent.defaultTransactionManager">
    <tx:attributes>
        <tx:method name="add*" propagation="REQUIRED" />
        <tx:method name="save*" propagation="REQUIRED" />
        <tx:method name="insert*" propagation="REQUIRED" />
        <tx:method name="del*" propagation="REQUIRED" />
        <tx:method name="update*" propagation="REQUIRED" />
        <tx:method name="main*" propagation="REQUIRED" />
        <tx:method name="*" read-only="true" />
    </tx:attributes>
</tx:advice>

<!-- Which classes/methods get transaction management -->
<aop:config>
    <aop:pointcut id="allManagerMethod"
        expression="execution(* com.jstrd.talent.manager.MyTransactionTemplate.*(..))" />
    <aop:advisor advice-ref="txAdvice" pointcut-ref="allManagerMethod" />
</aop:config>

<!-- Default DaoFactory -->
<bean id="talent.defaultDaoFactory" class="com.jstrd.talent.dao.DaoFactory">
    <constructor-arg value="${jdbc.default.db.dialect}" />
</bean>

<!-- Default SqlMapClient -->
<bean id="talent.defaultSqlMapClient"
    class="org.springframework.orm.ibatis.SqlMapClientFactoryBean">
    <property name="dataSource" ref="talent.defaultDataSource" />
    <property name="configLocation" value="classpath:talent/ibatis/sql-map-config.xml" />
    <!-- Auto-load sql-mapping files -->
    <property name="mappingLocations">
        <value>${ibatis.mappingLocations}</value>
    </property>
    <property name="useTransactionAwareDataSource" value="true" />
</bean>

<!-- Default SqlMapDao -->
<bean id="talent.defaultSqlMapDao" class="com.jstrd.talent.dao.SqlMapDao">
    <property name="sqlMapClient" ref="talent.defaultSqlMapClient" />
</bean>

<!-- Default SqlMapClientTemplate -->
<bean id="talent.defaultSqlMapClientTemplate"
    class="org.springframework.orm.ibatis.SqlMapClientTemplate">
    <property name="sqlMapClient" ref="talent.defaultSqlMapClient" />
</bean>
[/code]

Java code:

[code="java"]
public class MyTransactionTemplate {
    public void addXX() throws Exception {
        SqlMapDao dao = DaoFactory.getSqlMapDao();
        SqlMapClientTemplate sqlMap = dao.getSqlMapClientTemplate();
        // or: (SqlMapClientTemplate) BeanFactory.getBean("talent.defaultSqlMapClientTemplate");
        sqlMap.update("t_user.delete");
        sqlMap.update("t_user.insert", new TUser(29, "tan29"));
        sqlMap.update("t_user.insert", new TUser(30, "tan30"));
        sqlMap.update("t_user.insert", new TUser(32, "tan32"));
        sqlMap.update("t_user.updateById", new TUser(29, "tan30")); // violates a unique constraint and throws
        sqlMap.update("t_user.updateById", new TUser(29, "tan28"));
    }
}
[/code]

In the Java code, several inserts run first, followed by two updates, one of which throws an exception. Yet the earlier inserts have already taken effect in the database and are not rolled back.

[b]Follow-up:[/b] The database is MySQL 5. "MyTransactionTemplate is not inside a transaction at all"?? But it is configured in the pointcut: execution(* com.jstrd.talent.manager.MyTransactionTemplate.*(..))

[b]Follow-up:[/b] Thanks for the answer, kyo100900! After switching to InnoDB, the transaction is still not enforced. The log shows that every sqlMap.update() first does "Fetching JDBC Connection from DataSource" and then "Returning JDBC Connection to DataSource". Could that also be part of the problem?
[b]Follow-up:[/b] I tried an Oracle environment and got the same behavior, so I suspect a misconfiguration. Debugging shows that the SqlMapClientTemplate's transaction manager is com.ibatis.sqlmap.engine.transaction.TransactionManager@19f9c7a, which is not the one I configured.
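The "Fetching JDBC Connection from DataSource … Returning JDBC Connection to DataSource" pair on every update is a strong hint: each statement is running on its own auto-commit connection rather than on one transaction-bound connection. As a language-neutral illustration of why that produces exactly the symptom described (Python + SQLite here purely as a sketch of auto-commit vs. one shared transaction, not the Spring/iBATIS API):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.isolation_level = None  # autocommit: every statement commits immediately
conn.execute("CREATE TABLE t_user (id INTEGER PRIMARY KEY, name TEXT UNIQUE)")

# Auto-commit, like a fresh connection per update:
# earlier inserts survive a later failure -- the behaviour in the question.
conn.execute("INSERT INTO t_user VALUES (29, 'tan29')")
conn.execute("INSERT INTO t_user VALUES (30, 'tan30')")
try:
    conn.execute("UPDATE t_user SET name = 'tan30' WHERE id = 29")  # UNIQUE violation
except sqlite3.IntegrityError:
    pass
count_autocommit = conn.execute("SELECT COUNT(*) FROM t_user").fetchone()[0]

# One explicit transaction: the same failure rolls everything back together.
conn.execute("DELETE FROM t_user")
conn.execute("BEGIN")
try:
    conn.execute("INSERT INTO t_user VALUES (29, 'tan29')")
    conn.execute("INSERT INTO t_user VALUES (30, 'tan30')")
    conn.execute("UPDATE t_user SET name = 'tan30' WHERE id = 29")
    conn.execute("COMMIT")
except sqlite3.IntegrityError:
    conn.execute("ROLLBACK")
count_transactional = conn.execute("SELECT COUNT(*) FROM t_user").fetchone()[0]

print(count_autocommit, count_transactional)  # 2 0
```

If each update really does borrow and return its own connection, no single transaction spans the six statements, which would explain why the inserts persist. In this Spring setup that usually points at the transactional object being created outside the container (e.g. via `new` or a hand-rolled factory), so the `<aop:config>` proxy never wraps `addXX()` — a hypothesis to verify, not a confirmed diagnosis.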
Oozie Sqoop import action fails with an exception
oozie 4.3.1, sqoop 1.4.7

workflow.xml:

```
<workflow-app xmlns="uri:oozie:workflow:0.5" name="sqoop-import-wf">
    <start to="sqoop-node"/>
    <action name="sqoop-node">
        <sqoop xmlns="uri:oozie:sqoop-action:0.3">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
                <property>
                    <name>oozie.sqoop.log.level</name>
                    <value>WARN</value>
                </property>
            </configuration>
            <command>import --connect jdbc:mysql://study:3306/test --username root --password 123456 --table terminal_info --where "update_time between 20180615230000 and 20180616225959" --target-dir "/user/hive/warehouse/temp_terminal_info" --append --fields-terminated-by "," --lines-terminated-by "\n" --num-mappers 1 --direct</command>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Sqoop failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
```

Oozie job log:

```
2018-06-19 10:12:10,615 WARN SqoopActionExecutor:523 - SERVER[study] USER[root] GROUP[-] TOKEN[] APP[sqoop-import-wf] JOB[0000017-180619092621453-oozie-root-W] ACTION[0000017-180619092621453-oozie-root-W@sqoop-node] Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]
```

The MapReduce launcher job itself succeeded:

```
Job Name: oozie:launcher:T=sqoop:W=sqoop-import-wf:A=sqoop-node:ID=0000017-180619092621453-oozie-root-W
User Name: root
Queue: root.root
State: SUCCEEDED
Uberized: true
Submitted: Tue Jun 19 10:11:59 CST 2018
Started: Tue Jun 19 10:12:07 CST 2018
Finished: Tue Jun 19 10:12:08 CST 2018
Elapsed: 1sec
Diagnostics:
Average Map Time 1sec
```

Yet Oozie reports "Launcher ERROR, reason: Main class [org.apache.oozie.action.hadoop.SqoopMain], exit code [1]" and there are no other logs. Has anyone run into this, or how would you go about troubleshooting it? Any help appreciated.
MySQL: SQL for a user's consecutive login days, where a user may log in multiple times per day
![图片说明](https://img-ask.csdn.net/upload/201609/11/1473569207_44549.png)

Given this user login-log table, how do I compute the number of consecutive days each user has logged in? A user may log in several times within one day. DDL and sample data:

```
CREATE TABLE `ap_user_login_logs` (
  `id` bigint(20) NOT NULL AUTO_INCREMENT,
  `user_id` bigint(20) NOT NULL DEFAULT '0',
  `ip` varchar(20) NOT NULL DEFAULT '',
  `login_time` datetime NOT NULL DEFAULT '0000-00-00 00:00:00',
  `updated` datetime NOT NULL DEFAULT '0000-00-00 00:00:00',
  `created` datetime NOT NULL DEFAULT '0000-00-00 00:00:00',
  PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=26 DEFAULT CHARSET=utf8;

/*Data for the table `ap_user_login_logs` */

insert into `ap_user_login_logs`(`id`,`user_id`,`ip`,`login_time`,`updated`,`created`) values
(2,1,'','2016-09-07 19:50:16','0000-00-00 00:00:00','0000-00-00 00:00:00'),
(3,1,'','2016-09-08 19:50:22','0000-00-00 00:00:00','0000-00-00 00:00:00'),
(4,1,'','2016-09-09 19:50:24','0000-00-00 00:00:00','0000-00-00 00:00:00'),
(5,2,'','2016-09-07 19:50:27','0000-00-00 00:00:00','0000-00-00 00:00:00'),
(6,2,'','2016-09-08 19:50:31','0000-00-00 00:00:00','0000-00-00 00:00:00'),
(7,2,'','2016-09-08 19:50:35','0000-00-00 00:00:00','0000-00-00 00:00:00'),
(8,3,'','2016-09-08 19:52:08','0000-00-00 00:00:00','0000-00-00 00:00:00'),
(9,3,'','2016-09-09 19:52:12','0000-00-00 00:00:00','0000-00-00 00:00:00'),
(10,4,'','2016-09-09 19:52:21','0000-00-00 00:00:00','0000-00-00 00:00:00'),
(12,1,'','2016-09-10 16:09:45','0000-00-00 00:00:00','0000-00-00 00:00:00'),
(13,1,'','2016-09-10 21:09:54','0000-00-00 00:00:00','0000-00-00 00:00:00'),
(14,2,'','2016-09-10 16:10:07','0000-00-00 00:00:00','0000-00-00 00:00:00'),
(15,3,'','2016-09-10 16:10:16','0000-00-00 00:00:00','0000-00-00 00:00:00'),
(16,4,'','2016-09-10 16:11:15','0000-00-00 00:00:00','0000-00-00 00:00:00');
```
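One standard approach is the gaps-and-islands trick: reduce the log to one row per (user, day), then subtract `ROW_NUMBER()` from the date. The result is constant within a run of consecutive days, so grouping on it yields each streak's length. A runnable sketch using SQLite (window functions as in MySQL 8+; on the MySQL 5.x in the question you would emulate `ROW_NUMBER()` with user variables), loaded with a subset of the sample rows:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ap_user_login_logs (user_id INTEGER, login_time TEXT);
INSERT INTO ap_user_login_logs (user_id, login_time) VALUES
  (1, '2016-09-07 19:50:16'), (1, '2016-09-08 19:50:22'),
  (1, '2016-09-09 19:50:24'), (1, '2016-09-10 16:09:45'),
  (1, '2016-09-10 21:09:54'),  -- two logins on the same day
  (2, '2016-09-07 19:50:27'), (2, '2016-09-08 19:50:31'),
  (2, '2016-09-10 16:10:07');
""")

streaks = conn.execute("""
WITH days AS (              -- one row per user per calendar day
  SELECT DISTINCT user_id, DATE(login_time) AS d
  FROM ap_user_login_logs
),
grp AS (                    -- d minus row number is constant inside a streak
  SELECT user_id, d,
         DATE(d, '-' || ROW_NUMBER() OVER
              (PARTITION BY user_id ORDER BY d) || ' day') AS anchor
  FROM days
)
SELECT user_id, MAX(len) AS longest_streak
FROM (SELECT user_id, anchor, COUNT(*) AS len
      FROM grp GROUP BY user_id, anchor)
GROUP BY user_id
ORDER BY user_id
""").fetchall()

print(streaks)  # [(1, 4), (2, 2)]
```

The `DISTINCT user_id, DATE(login_time)` step is what makes multiple logins per day count as a single day; user 1 logs in on 09-07 through 09-10 (a 4-day streak), while user 2's longest run is 09-07 to 09-08.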