On Ubuntu, I am trying to export data from Hive into MySQL with Sqoop. The commands I ran:
- cd /usr/local/sqoop
- bin/sqoop export --connect jdbc:mysql://localhost:3306/dbtaobao --username root -P root --table user_log --export-dir '/user/hive/warehouse/dbtaobao.db/inner_user_log' --fields-terminated-by ','

The output was:
Warning: /usr/local/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/local/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
21/02/14 17:21:44 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
Enter password:
21/02/14 17:21:45 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
21/02/14 17:21:45 INFO tool.CodeGenTool: Beginning code generation
21/02/14 17:21:46 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `user_log` AS t LIMIT 1
21/02/14 17:21:46 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `user_log` AS t LIMIT 1
21/02/14 17:21:46 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
Note: /tmp/sqoop-hadoop/compile/716352b8f5fd8b50d5e58683cd97b340/user_log.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
21/02/14 17:21:49 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/716352b8f5fd8b50d5e58683cd97b340/user_log.jar
21/02/14 17:21:49 INFO mapreduce.ExportJobBase: Beginning export of user_log
21/02/14 17:21:50 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
21/02/14 17:21:52 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
21/02/14 17:21:52 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
21/02/14 17:21:52 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
21/02/14 17:21:52 INFO Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
21/02/14 17:21:52 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
21/02/14 17:21:53 INFO mapreduce.JobSubmitter: Cleaning up the staging area file:/usr/local/hadoop/tmp/mapred/staging/hadoop765752367/.staging/job_local765752367_0001
21/02/14 17:21:53 ERROR tool.ExportTool: Encountered IOException running export job: java.io.FileNotFoundException: File does not exist: hdfs://localhost:9000/usr/local/sqoop/lib/parquet-avro-1.4.1.jar
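Two things stand out in the log above, both worth checking. First, the `Enter password:` prompt appears because `-P` tells Sqoop to prompt interactively and takes no value, so `-P root` is likely not doing what was intended; `--password root` passes the password on the command line. Second, the job is running under the local job runner (`job_local765752367_0001`) while the default filesystem points at HDFS, so Sqoop's local dependency jars are being resolved against `hdfs://localhost:9000/usr/local/sqoop/lib/...`, where they do not exist. A common workaround, sketched here under the assumption that the paths match those shown in the log (Sqoop at /usr/local/sqoop, NameNode at hdfs://localhost:9000), is to upload Sqoop's lib directory to the same path in HDFS and re-run:

```shell
# Assumption: paths as in the log. Create the matching directory in HDFS
# and copy Sqoop's dependency jars there, so the job can resolve
# hdfs://localhost:9000/usr/local/sqoop/lib/parquet-avro-1.4.1.jar.
hdfs dfs -mkdir -p /usr/local/sqoop/lib
hdfs dfs -put /usr/local/sqoop/lib/*.jar /usr/local/sqoop/lib/

# Re-run the export, passing the password explicitly instead of "-P root".
cd /usr/local/sqoop
bin/sqoop export \
  --connect jdbc:mysql://localhost:3306/dbtaobao \
  --username root --password root \
  --table user_log \
  --export-dir '/user/hive/warehouse/dbtaobao.db/inner_user_log' \
  --fields-terminated-by ','
```

Alternatively, if the cluster is meant to run on YARN rather than the local runner, checking that `mapreduce.framework.name` is set to `yarn` in mapred-site.xml may address the jar-resolution mismatch at its source. Both suggestions are assumptions based on the log, not a confirmed fix.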