尘世壹俗人 2024-10-11 21:52

Hand-built Presto reports an HDFS access error when testing SQL

I set up a vanilla Presto by hand and everything went smoothly until I tested SQL queries against Hive: show schemas from hive and show tables from hive.default both run fine, but selecting from a table fails with an error about accessing an HDFS path:

com.facebook.presto.spi.PrestoException: Failed to list directory: hdfs://hdp1/hiveData/t2
    at com.facebook.presto.hive.util.HiveFileIterator$FileStatusIterator.processException(HiveFileIterator.java:166)
    at com.facebook.presto.hive.util.HiveFileIterator$FileStatusIterator.<init>(HiveFileIterator.java:134)
    at com.facebook.presto.hive.util.HiveFileIterator$FileStatusIterator.<init>(HiveFileIterator.java:119)
    at com.facebook.presto.hive.util.HiveFileIterator.getLocatedFileStatusRemoteIterator(HiveFileIterator.java:108)
    at com.facebook.presto.hive.util.HiveFileIterator.computeNext(HiveFileIterator.java:101)
    at com.facebook.presto.hive.util.HiveFileIterator.computeNext(HiveFileIterator.java:38)
    at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:145)
    at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:140)
    at java.util.Spliterators$IteratorSpliterator.tryAdvance(Spliterators.java:1811)
    at java.util.stream.StreamSpliterators$WrappingSpliterator.lambda$initPartialTraversalState$0(StreamSpliterators.java:294)
    at java.util.stream.StreamSpliterators$AbstractWrappingSpliterator.fillBuffer(StreamSpliterators.java:206)
    at java.util.stream.StreamSpliterators$AbstractWrappingSpliterator.doAdvance(StreamSpliterators.java:161)
    at java.util.stream.StreamSpliterators$WrappingSpliterator.tryAdvance(StreamSpliterators.java:300)
    at java.util.Spliterators$1Adapter.hasNext(Spliterators.java:681)
    at com.facebook.presto.hive.BackgroundHiveSplitLoader.loadSplits(BackgroundHiveSplitLoader.java:249)
    at com.facebook.presto.hive.BackgroundHiveSplitLoader.access$300(BackgroundHiveSplitLoader.java:87)
    at com.facebook.presto.hive.BackgroundHiveSplitLoader$HiveSplitLoaderTask.process(BackgroundHiveSplitLoader.java:181)
    at com.facebook.presto.hive.util.ResumableTasks.safeProcessTask(ResumableTasks.java:47)
    at com.facebook.presto.hive.util.ResumableTasks.access$000(ResumableTasks.java:20)
    at com.facebook.presto.hive.util.ResumableTasks$1.run(ResumableTasks.java:35)
    at io.airlift.concurrent.BoundedExecutor.drainQueue(BoundedExecutor.java:78)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.ConnectException: Call From hdp3/192.168.239.188 to hdp1:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
    at org.apache.hadoop.ipc.Client.call(Client.java:1480)
    at org.apache.hadoop.ipc.Client.call(Client.java:1413)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
    at com.sun.proxy.$Proxy176.getListing(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:578)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy177.getListing(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2086)
    at org.apache.hadoop.hdfs.DistributedFileSystem$DirListingIterator.<init>(DistributedFileSystem.java:944)
    at org.apache.hadoop.hdfs.DistributedFileSystem$DirListingIterator.<init>(DistributedFileSystem.java:927)
    at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:872)
    at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:868)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.listLocatedStatus(DistributedFileSystem.java:886)
    at org.apache.hadoop.fs.FileSystem.listLocatedStatus(FileSystem.java:1696)
    at org.apache.hadoop.fs.FilterFileSystem.listLocatedStatus(FilterFileSystem.java:263)
    at com.facebook.presto.hive.HadoopDirectoryLister.list(HadoopDirectoryLister.java:30)
    at com.facebook.presto.hive.util.HiveFileIterator$FileStatusIterator.<init>(HiveFileIterator.java:131)
    ... 22 more
Caused by: java.net.ConnectException: Connection refused
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
    at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:615)
    at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:713)
    at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:376)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:1529)
    at org.apache.hadoop.ipc.Client.call(Client.java:1452)
    ... 44 more

From the error, my read is that Presto failed to pick up the Hadoop configuration files and therefore fell back to contacting port 8020. But I did configure the Hadoop environment variables. Could it be that, like DataX, it needs something else configured explicitly? Does anyone know how to fix this?
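
As a quick sanity check (a sketch; hdp1, hdp3, and port 8020 come from the stack trace above, the rest assumes a standard Hadoop client install), you can compare what the Hadoop config files say with what Presto actually tried to connect to:

    # What the client config resolves the default filesystem to:
    hdfs getconf -confKey fs.defaultFS

    # If hdp1 is an HA nameservice rather than a plain hostname, it must
    # be listed here (defined in hdfs-site.xml):
    hdfs getconf -confKey dfs.nameservices

    # Run from the Presto worker (hdp3 in the trace): check whether
    # anything listens on hdp1:8020 at all; "connection refused" means no.
    nc -zv hdp1 8020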


2 answers

  • 尘世壹俗人 2024-10-11 22:09

    Solved it. Every Hadoop node needs the environment variable export HADOOP_CONF_DIR=/opt/hadoop-2.7.2/etc/hadoop, and the Hive catalog file also needs hive.config.resources=/opt/hadoop-2.7.2/etc/hadoop/core-site.xml,/opt/hadoop-2.7.2/etc/hadoop/hdfs-site.xml configured explicitly.
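
    Put together, a minimal sketch of both changes (the catalog file name etc/catalog/hive.properties and the metastore URI are assumptions based on a standard hive-hadoop2 connector setup; the Hadoop paths are the ones from the answer above):

        # On every node, e.g. in /etc/profile, so Presto can locate
        # the Hadoop client configuration:
        export HADOOP_CONF_DIR=/opt/hadoop-2.7.2/etc/hadoop

        # etc/catalog/hive.properties -- point the Hive connector
        # explicitly at the Hadoop config files:
        connector.name=hive-hadoop2
        # assumed metastore host/port; adjust to your environment
        hive.metastore.uri=thrift://hdp1:9083
        hive.config.resources=/opt/hadoop-2.7.2/etc/hadoop/core-site.xml,/opt/hadoop-2.7.2/etc/hadoop/hdfs-site.xml

    With hive.config.resources set, the connector likely reads the filesystem and nameservice definitions from those files instead of treating hdp1 as a plain hostname on the default port 8020, which is consistent with the diagnosis above.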

    This answer was accepted as the best answer by the asker.