半吊子的程序狗 2020-06-29 11:34
Closed

The sqoop command imports the data successfully, but when Sqoop is invoked from Java the data is not imported: the table is created, but it contains no rows. I no longer know where to look for the cause. Any help is appreciated.

1. Using sqoop to import data from Informix into Hadoop works: the import succeeds, and the row count of the table can be queried in Hive.
2. When Sqoop is invoked from Java with the same command-line arguments, Sqoop.runSqoop(sqoop, expandArguments) returns 0, and the run also appears to succeed in Eclipse. After the Java call completes, the table metadata is visible in Hive, but the table contains no data. I no longer know where to look for the cause. Any help is appreciated! (Hadoop, Hive, and Sqoop all run on Windows.)
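For reference, a minimal sketch of what such a Java-side invocation usually looks like. The connection URL, driver class, host, and database names below are placeholders I am assuming, not the actual values from the failing run; only the source table pmc_file, the Hive table my_pmc, and the target directory under /user/.../abc appear in the log.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: build the same argument list the sqoop CLI would
// receive, then hand it to Sqoop.runSqoop. Connection details are placeholders.
public class SqoopImportSketch {

    static String[] buildArgs() {
        List<String> args = new ArrayList<>();
        args.add("import");
        args.add("--connect");
        args.add("jdbc:informix-sqli://dbhost:9088/mydb:INFORMIXSERVER=myserver"); // placeholder URL
        args.add("--driver");
        args.add("com.informix.jdbc.IfxDriver");
        args.add("--table");
        args.add("pmc_file");                // source table, as in the log
        args.add("--hive-import");
        args.add("--hive-table");
        args.add("my_pmc");                  // Hive table, as in the log
        args.add("--target-dir");
        args.add("/user/<username>/abc");    // placeholder path
        args.add("-m");
        args.add("1");
        return args.toArray(new String[0]);
    }

    public static void main(String[] argv) {
        String[] expandArguments = buildArgs();
        System.out.println("argument count: " + expandArguments.length);
        // With sqoop-1.4.6.jar and its Hadoop/Hive dependencies on the classpath:
        //   Sqoop sqoop = new Sqoop(new ImportTool());
        //   int ret = Sqoop.runSqoop(sqoop, expandArguments); // 0 means Sqoop reported success
    }
}
```

Note that a return value of 0 from Sqoop.runSqoop only means Sqoop itself reported success; it does not prove the data landed in the Hive warehouse the CLI queries against.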
The full output of the Eclipse run is as follows:
2020-06-29 11:17:39,900 main WARN Unable to instantiate org.fusesource.jansi.WindowsAnsiOutputStream
2020-06-29 11:17:39,907 main WARN Unable to instantiate org.fusesource.jansi.WindowsAnsiOutputStream
expandArguments succeeded!
2020-06-29T11:17:40,063 INFO [main] org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2020-06-29T11:17:40,067 WARN [main] org.apache.sqoop.tool.SqoopTool - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
2020-06-29T11:17:40,092 INFO [main] org.apache.sqoop.Sqoop - Running Sqoop version: 1.4.6
2020-06-29T11:17:40,139 INFO [main] org.apache.sqoop.tool.BaseSqoopTool - Using Hive-specific delimiters for output. You can override
2020-06-29T11:17:40,139 INFO [main] org.apache.sqoop.tool.BaseSqoopTool - delimiters with --fields-terminated-by, etc.
2020-06-29T11:17:40,150 WARN [main] org.apache.sqoop.ConnFactory - $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
2020-06-29T11:17:40,181 WARN [main] org.apache.sqoop.ConnFactory - Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
2020-06-29T11:17:40,189 INFO [main] org.apache.sqoop.manager.SqlManager - Using default fetchSize of 1000
2020-06-29T11:17:40,193 INFO [main] org.apache.sqoop.tool.CodeGenTool - Beginning code generation
2020-06-29T11:17:40,566 INFO [main] org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM pmc_file AS t WHERE 1=0
2020-06-29T11:17:40,575 INFO [main] org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM pmc_file AS t WHERE 1=0
2020-06-29T11:17:40,603 INFO [main] org.apache.sqoop.orm.CompilationManager - $HADOOP_MAPRED_HOME is not set
Note: \tmp\sqoop-機器用戶名稱\compile\8dabd1b206bb53c6f69beab4e93619b6\pmc_file.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
2020-06-29T11:17:42,545 INFO [main] org.apache.sqoop.orm.CompilationManager - Writing jar file: \tmp\sqoop-機器用戶名稱\compile\8dabd1b206bb53c6f69beab4e93619b6\pmc_file.jar
2020-06-29T11:17:42,621 INFO [main] org.apache.sqoop.mapreduce.ImportJobBase - Beginning import of pmc_file
2020-06-29T11:17:42,807 INFO [main] org.apache.hadoop.conf.Configuration.deprecation - mapred.jar is deprecated. Instead, use mapreduce.job.jar
2020-06-29T11:17:42,814 INFO [main] org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM pmc_file AS t WHERE 1=0
2020-06-29T11:17:43,509 INFO [main] org.apache.hadoop.conf.Configuration.deprecation - mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
2020-06-29T11:17:43,527 INFO [main] org.apache.hadoop.conf.Configuration.deprecation - session.id is deprecated. Instead, use dfs.metrics.session-id
2020-06-29T11:17:43,528 INFO [main] org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with processName=JobTracker, sessionId=
2020-06-29T11:17:45,103 INFO [main] org.apache.sqoop.mapreduce.db.DBInputFormat - Using read commited transaction isolation
2020-06-29T11:17:45,126 INFO [main] org.apache.hadoop.mapreduce.JobSubmitter - number of splits:1
2020-06-29T11:17:45,135 INFO [main] org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2020-06-29T11:17:45,193 INFO [main] org.apache.hadoop.mapreduce.JobSubmitter - Submitting tokens for job: job_local528795584_0001
2020-06-29T11:17:46,093 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: \tmp\hadoop-機器用戶名稱\mapred\local\1593400665311\hadoop-common-2.7.7.jar <- D:\WorkFiles\code\JavaPractices\myhadoop/hadoop-common-2.7.7.jar
2020-06-29T11:17:46,294 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized file:/D:/hadoop/job/sqoop-1.4.7/lib/hadoop-common-2.7.7.jar as file:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665311/hadoop-common-2.7.7.jar
2020-06-29T11:17:46,294 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: \tmp\hadoop-機器用戶名稱\mapred\local\1593400665312\mysql-connector-java-8.0.20.jar <- D:\WorkFiles\code\JavaPractices\myhadoop/mysql-connector-java-8.0.20.jar
2020-06-29T11:17:46,344 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized file:/D:/hadoop/job/sqoop-1.4.7/lib/mysql-connector-java-8.0.20.jar as file:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665312/mysql-connector-java-8.0.20.jar
2020-06-29T11:17:46,344 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: \tmp\hadoop-機器用戶名稱\mapred\local\1593400665313\sqoop-1.4.6.jar <- D:\WorkFiles\code\JavaPractices\myhadoop/sqoop-1.4.6.jar
2020-06-29T11:17:46,396 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized file:/D:/hadoop/job/sqoop-1.4.7/lib/sqoop-1.4.6.jar as file:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665313/sqoop-1.4.6.jar
2020-06-29T11:17:46,396 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: \tmp\hadoop-機器用戶名稱\mapred\local\1593400665314\hive-exec-2.3.7.jar <- D:\WorkFiles\code\JavaPractices\myhadoop/hive-exec-2.3.7.jar
2020-06-29T11:17:46,447 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized file:/D:/hadoop/job/sqoop-1.4.7/lib/hive-exec-2.3.7.jar as file:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665314/hive-exec-2.3.7.jar
2020-06-29T11:17:46,447 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: \tmp\hadoop-機器用戶名稱\mapred\local\1593400665315\ant-contrib-1.0b3.jar <- D:\WorkFiles\code\JavaPractices\myhadoop/ant-contrib-1.0b3.jar
2020-06-29T11:17:46,503 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized file:/D:/hadoop/job/sqoop-1.4.7/lib/ant-contrib-1.0b3.jar as file:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665315/ant-contrib-1.0b3.jar
2020-06-29T11:17:46,503 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: \tmp\hadoop-機器用戶名稱\mapred\local\1593400665316\libthrift-0.9.3.jar <- D:\WorkFiles\code\JavaPractices\myhadoop/libthrift-0.9.3.jar
2020-06-29T11:17:46,555 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized file:/D:/hadoop/job/sqoop-1.4.7/lib/libthrift-0.9.3.jar as file:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665316/libthrift-0.9.3.jar
2020-06-29T11:17:46,556 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: \tmp\hadoop-機器用戶名稱\mapred\local\1593400665317\mysql-connector-java-5.0.8-bin.jar <- D:\WorkFiles\code\JavaPractices\myhadoop/mysql-connector-java-5.0.8-bin.jar
2020-06-29T11:17:46,606 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized file:/D:/hadoop/job/sqoop-1.4.7/lib/mysql-connector-java-5.0.8-bin.jar as file:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665317/mysql-connector-java-5.0.8-bin.jar
2020-06-29T11:17:46,606 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: \tmp\hadoop-機器用戶名稱\mapred\local\1593400665318\ifxjdbc.jar <- D:\WorkFiles\code\JavaPractices\myhadoop/ifxjdbc.jar
2020-06-29T11:17:46,660 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized file:/D:/hadoop/job/sqoop-1.4.7/lib/ifxjdbc.jar as file:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665318/ifxjdbc.jar
2020-06-29T11:17:46,660 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Creating symlink: \tmp\hadoop-機器用戶名稱\mapred\local\1593400665319\ant-eclipse-1.0-jvm1.2.jar <- D:\WorkFiles\code\JavaPractices\myhadoop/ant-eclipse-1.0-jvm1.2.jar
2020-06-29T11:17:46,714 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - Localized file:/D:/hadoop/job/sqoop-1.4.7/lib/ant-eclipse-1.0-jvm1.2.jar as file:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665319/ant-eclipse-1.0-jvm1.2.jar
2020-06-29T11:17:46,766 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - file:/D:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665311/hadoop-common-2.7.7.jar
2020-06-29T11:17:46,766 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - file:/D:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665312/mysql-connector-java-8.0.20.jar
2020-06-29T11:17:46,766 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - file:/D:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665313/sqoop-1.4.6.jar
2020-06-29T11:17:46,766 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - file:/D:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665314/hive-exec-2.3.7.jar
2020-06-29T11:17:46,766 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - file:/D:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665315/ant-contrib-1.0b3.jar
2020-06-29T11:17:46,766 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - file:/D:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665316/libthrift-0.9.3.jar
2020-06-29T11:17:46,766 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - file:/D:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665317/mysql-connector-java-5.0.8-bin.jar
2020-06-29T11:17:46,766 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - file:/D:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665318/ifxjdbc.jar
2020-06-29T11:17:46,766 INFO [main] org.apache.hadoop.mapred.LocalDistributedCacheManager - file:/D:/tmp/hadoop-機器用戶名稱/mapred/local/1593400665319/ant-eclipse-1.0-jvm1.2.jar
2020-06-29T11:17:46,772 INFO [main] org.apache.hadoop.mapreduce.Job - The url to track the job: http://localhost:8080/
2020-06-29T11:17:46,773 INFO [main] org.apache.hadoop.mapreduce.Job - Running job: job_local528795584_0001
2020-06-29T11:17:46,775 INFO [Thread-18] org.apache.hadoop.mapred.LocalJobRunner - OutputCommitter set in config null
2020-06-29T11:17:46,794 INFO [Thread-18] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - File Output Committer Algorithm version is 1
2020-06-29T11:17:46,796 INFO [Thread-18] org.apache.hadoop.mapred.LocalJobRunner - OutputCommitter is org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter
2020-06-29T11:17:46,939 INFO [Thread-18] org.apache.hadoop.mapred.LocalJobRunner - Waiting for map tasks
2020-06-29T11:17:46,940 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.LocalJobRunner - Starting task: attempt_local528795584_0001_m_000000_0
2020-06-29T11:17:46,964 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - File Output Committer Algorithm version is 1
2020-06-29T11:17:46,971 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.yarn.util.ProcfsBasedProcessTree - ProcfsBasedProcessTree currently is supported only on Linux.
2020-06-29T11:17:47,021 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.Task - Using ResourceCalculatorProcessTree : org.apache.hadoop.yarn.util.WindowsBasedProcessTree@1a1ab93f
2020-06-29T11:17:47,041 INFO [LocalJobRunner Map Task Executor #0] org.apache.sqoop.mapreduce.db.DBInputFormat - Using read commited transaction isolation
2020-06-29T11:17:47,047 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.MapTask - Processing split: 1=1 AND 1=1
2020-06-29T11:17:47,311 INFO [LocalJobRunner Map Task Executor #0] org.apache.sqoop.mapreduce.db.DBRecordReader - Working on split: 1=1 AND 1=1
2020-06-29T11:17:47,316 INFO [LocalJobRunner Map Task Executor #0] org.apache.sqoop.mapreduce.db.DBRecordReader - Executing query: SELECT pmc01, pmc02, pmc03, pmc04, pmc05, pmc06, pmc07, pmc081, pmc082, pmc091, pmc092, pmc093, pmc094, pmc095, pmc10, pmc11, pmc12, pmc13, pmc14, pmc15, pmc16, pmc17, pmc18, pmc19, pmc20, pmc21, pmc22, pmc23, pmc24, pmc25, pmc26, pmc27, pmc28, pmc30, pmc40, pmc41, pmc42, pmc43, pmc44, pmc45, pmc46, pmc47, pmc48, pmc49, pmc50, pmc51, pmc52, pmc53, pmc54, pmc55, pmc56, pmc901, pmc902, pmc903, pmc904, pmc905, pmc906, pmc907, pmc908, pmc909, pmc910, pmc911, pmc912, pmc913, pmc914, pmc915, pmc916, pmc917, pmc918, pmcacti, pmcuser, pmcgrup, pmcmodu, pmcdate FROM pmc_file AS pmc_file WHERE ( 1=1 ) AND ( 1=1 )
2020-06-29T11:17:47,790 INFO [main] org.apache.hadoop.mapreduce.Job - Job job_local528795584_0001 running in uber mode : false
2020-06-29T11:17:47,791 INFO [main] org.apache.hadoop.mapreduce.Job - map 0% reduce 0%
2020-06-29T11:17:52,980 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:17:55,980 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:17:58,981 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:01,982 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:04,983 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:07,983 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:10,984 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:13,984 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:16,987 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:19,989 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:22,993 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:25,994 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:28,994 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:31,994 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:34,994 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:37,995 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:40,995 INFO [communication thread] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:41,050 INFO [Thread-67] org.apache.sqoop.mapreduce.AutoProgressMapper - Auto-progress thread is finished. keepGoing=false
2020-06-29T11:18:41,051 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:41,126 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.Task - Task:attempt_local528795584_0001_m_000000_0 is done. And is in the process of committing
2020-06-29T11:18:41,130 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.LocalJobRunner - map > map
2020-06-29T11:18:41,130 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.Task - Task attempt_local528795584_0001_m_000000_0 is allowed to commit now
2020-06-29T11:18:41,191 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter - Saved output of task 'attempt_local528795584_0001_m_000000_0' to hdfs://localhost:9000/user/機器用戶名稱/abc/_temporary/0/task_local528795584_0001_m_000000
2020-06-29T11:18:41,192 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.LocalJobRunner - map
2020-06-29T11:18:41,192 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.Task - Task 'attempt_local528795584_0001_m_000000_0' done.
2020-06-29T11:18:41,195 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.Task - Final Counters for attempt_local528795584_0001_m_000000_0: Counters: 20
File System Counters
FILE: Number of bytes read=42995830
FILE: Number of bytes written=43639176
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=0
HDFS: Number of bytes written=95659129
HDFS: Number of read operations=4
HDFS: Number of large read operations=0
HDFS: Number of write operations=3
Map-Reduce Framework
Map input records=119212
Map output records=119212
Input split bytes=87
Spilled Records=0
Failed Shuffles=0
Merged Map outputs=0
GC time elapsed (ms)=24
Total committed heap usage (bytes)=429916160
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=95659129
2020-06-29T11:18:41,195 INFO [LocalJobRunner Map Task Executor #0] org.apache.hadoop.mapred.LocalJobRunner - Finishing task: attempt_local528795584_0001_m_000000_0
2020-06-29T11:18:41,195 INFO [Thread-18] org.apache.hadoop.mapred.LocalJobRunner - map task executor complete.
2020-06-29T11:18:41,825 INFO [main] org.apache.hadoop.mapreduce.Job - map 100% reduce 0%
2020-06-29T11:18:41,826 INFO [main] org.apache.hadoop.mapreduce.Job - Job job_local528795584_0001 completed successfully
2020-06-29T11:18:41,853 INFO [main] org.apache.hadoop.mapreduce.Job - Counters: 20
File System Counters
FILE: Number of bytes read=42995830
FILE: Number of bytes written=43639176
FILE: Number of read operations=0
FILE: Number of large read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=0
HDFS: Number of bytes written=95659129
HDFS: Number of read operations=4
HDFS: Number of large read operations=0
HDFS: Number of write operations=3
Map-Reduce Framework
Map input records=119212
Map output records=119212
Input split bytes=87
Spilled Records=0
Failed Shuffles=0
Merged Map outputs=0
GC time elapsed (ms)=24
Total committed heap usage (bytes)=429916160
File Input Format Counters
Bytes Read=0
File Output Format Counters
Bytes Written=95659129
2020-06-29T11:18:41,854 INFO [main] org.apache.sqoop.mapreduce.ImportJobBase - Transferred 91.2277 MB in 58.3388 seconds (1.5638 MB/sec)
2020-06-29T11:18:41,857 INFO [main] org.apache.sqoop.mapreduce.ImportJobBase - Retrieved 119212 records.
2020-06-29T11:18:41,893 INFO [main] org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM pmc_file AS t WHERE 1=0
2020-06-29T11:18:41,907 INFO [main] org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM pmc_file AS t WHERE 1=0
2020-06-29T11:18:41,913 WARN [main] org.apache.sqoop.hive.TableDefWriter - Column pmc40 had to be cast to a less precise type in Hive
2020-06-29T11:18:41,913 WARN [main] org.apache.sqoop.hive.TableDefWriter - Column pmc41 had to be cast to a less precise type in Hive
2020-06-29T11:18:41,913 WARN [main] org.apache.sqoop.hive.TableDefWriter - Column pmc42 had to be cast to a less precise type in Hive
2020-06-29T11:18:41,913 WARN [main] org.apache.sqoop.hive.TableDefWriter - Column pmc43 had to be cast to a less precise type in Hive
2020-06-29T11:18:41,913 WARN [main] org.apache.sqoop.hive.TableDefWriter - Column pmc44 had to be cast to a less precise type in Hive
2020-06-29T11:18:41,913 WARN [main] org.apache.sqoop.hive.TableDefWriter - Column pmc45 had to be cast to a less precise type in Hive
2020-06-29T11:18:41,913 WARN [main] org.apache.sqoop.hive.TableDefWriter - Column pmc46 had to be cast to a less precise type in Hive
2020-06-29T11:18:41,913 WARN [main] org.apache.sqoop.hive.TableDefWriter - Column pmcdate had to be cast to a less precise type in Hive
2020-06-29T11:18:41,930 INFO [main] org.apache.sqoop.hive.HiveImport - Loading uploaded data into Hive
2020-06-29T11:18:42,041 INFO [main] org.apache.hadoop.hive.conf.HiveConf - Found configuration file file:/D:/hadoop/job/apache-hive-2.3.7-bin/conf/hive-site.xml
2020-06-29 11:18:42,164 main WARN Unable to instantiate org.fusesource.jansi.WindowsAnsiOutputStream
2020-06-29 11:18:43,159 main WARN Unable to instantiate org.fusesource.jansi.WindowsAnsiOutputStream
2020-06-29 11:18:43,160 main WARN Unable to instantiate org.fusesource.jansi.WindowsAnsiOutputStream

Logging initialized using configuration in jar:file:/D:/WorkFiles/code/JavaPractices/myhadoop/lib/hive-common-2.3.7.jar!/hive-log4j2.properties Async: true
2020-06-29T11:18:43,447 INFO [main] org.apache.hadoop.hive.ql.session.SessionState - Created HDFS directory: /tmp/機器用戶名稱/fadb6ae0-af37-4aa2-be7f-2e8b43ecb727
2020-06-29T11:18:43,448 INFO [main] org.apache.hadoop.hive.ql.session.SessionState - Created local directory: D:/hadoop/job/apache-hive-2.3.7-bin/my_hive/scratch_dir/fadb6ae0-af37-4aa2-be7f-2e8b43ecb727
2020-06-29T11:18:43,449 INFO [main] org.apache.hadoop.hive.ql.session.SessionState - Created HDFS directory: /tmp/機器用戶名稱/fadb6ae0-af37-4aa2-be7f-2e8b43ecb727/_tmp_space.db
2020-06-29T11:18:43,455 INFO [main] org.apache.hadoop.hive.conf.HiveConf - Using the default value passed in for log id: fadb6ae0-af37-4aa2-be7f-2e8b43ecb727
2020-06-29T11:18:43,459 INFO [main] org.apache.hadoop.hive.ql.session.SessionState - Updating thread name to fadb6ae0-af37-4aa2-be7f-2e8b43ecb727 main
2020-06-29T11:18:43,460 INFO [fadb6ae0-af37-4aa2-be7f-2e8b43ecb727 main] org.apache.hadoop.hive.conf.HiveConf - Using the default value passed in for log id: fadb6ae0-af37-4aa2-be7f-2e8b43ecb727
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
2020-06-29T11:18:53,696 WARN [fadb6ae0-af37-4aa2-be7f-2e8b43ecb727 main] org.apache.hadoop.hive.ql.session.SessionState - METASTORE_FILTER_HOOK will be ignored, since hive.security.authorization.manager is set to instance of HiveAuthorizerFactory.
OK
Time taken: 12.307 seconds
2020-06-29T11:18:55,797 INFO [fadb6ae0-af37-4aa2-be7f-2e8b43ecb727 main] CliDriver - Time taken: 12.307 seconds
2020-06-29T11:18:55,797 INFO [fadb6ae0-af37-4aa2-be7f-2e8b43ecb727 main] org.apache.hadoop.hive.conf.HiveConf - Using the default value passed in for log id: fadb6ae0-af37-4aa2-be7f-2e8b43ecb727
2020-06-29T11:18:55,798 INFO [fadb6ae0-af37-4aa2-be7f-2e8b43ecb727 main] org.apache.hadoop.hive.ql.session.SessionState - Resetting thread name to main
2020-06-29T11:18:55,798 INFO [main] org.apache.hadoop.hive.conf.HiveConf - Using the default value passed in for log id: fadb6ae0-af37-4aa2-be7f-2e8b43ecb727
2020-06-29T11:18:55,798 INFO [main] org.apache.hadoop.hive.ql.session.SessionState - Updating thread name to fadb6ae0-af37-4aa2-be7f-2e8b43ecb727 main
Loading data to table default.my_pmc
2020-06-29T11:18:56,086 WARN [fadb6ae0-af37-4aa2-be7f-2e8b43ecb727 main] org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
-chgrp: 'CFAG\Domain Users' does not match expected pattern for group
Usage: hadoop fs [generic options] -chgrp [-R] GROUP PATH...
2020-06-29T11:18:59,067 WARN [fadb6ae0-af37-4aa2-be7f-2e8b43ecb727 main] org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
2020-06-29T11:18:59,303 WARN [fadb6ae0-af37-4aa2-be7f-2e8b43ecb727 main] org.apache.hadoop.hive.conf.HiveConf - HiveConf of name hive.internal.ss.authz.settings.applied.marker does not exist
OK
2020-06-29T11:18:59,502 INFO [fadb6ae0-af37-4aa2-be7f-2e8b43ecb727 main] CliDriver - Time taken: 3.704 seconds
2020-06-29T11:18:59,502 INFO [fadb6ae0-af37-4aa2-be7f-2e8b43ecb727 main] org.apache.hadoop.hive.conf.HiveConf - Using the default value passed in for log id: fadb6ae0-af37-4aa2-be7f-2e8b43ecb727
2020-06-29T11:18:59,502 INFO [fadb6ae0-af37-4aa2-be7f-2e8b43ecb727 main] org.apache.hadoop.hive.ql.session.SessionState - Resetting thread name to main
Time taken: 3.704 seconds
2020-06-29T11:18:59,503 INFO [main] org.apache.hadoop.hive.conf.HiveConf - Using the default value passed in for log id: fadb6ae0-af37-4aa2-be7f-2e8b43ecb727
2020-06-29T11:18:59,507 INFO [main] org.apache.hadoop.hive.ql.session.SessionState - Deleted directory: /tmp/機器用戶名稱/fadb6ae0-af37-4aa2-be7f-2e8b43ecb727 on fs with scheme file
2020-06-29T11:18:59,509 INFO [main] org.apache.hadoop.hive.ql.session.SessionState - Deleted directory: D:/hadoop/job/apache-hive-2.3.7-bin/my_hive/scratch_dir/fadb6ae0-af37-4aa2-be7f-2e8b43ecb727 on fs with scheme file
2020-06-29T11:18:59,523 INFO [main] org.apache.sqoop.hive.HiveImport - Hive import complete.
2020-06-29T11:18:59,527 INFO [main] org.apache.sqoop.hive.HiveImport - Export directory is contains the _SUCCESS file only, removing the directory.
0


1 reply

  • dabocaiqq 2020-08-12 09:41
