
Starting Hive failed: java.net.NoRouteToHostException: No Route to Host from xxx to xxx:22332 failed


I started a MySQL Docker container on the CentOS machine 172.20.237.80 and installed Apache Hive 3.1.3 on 172.20.237.84. The hive-site.xml is:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://172.20.237.80:3306/hive?createDatabaseIfNotExist=true</value>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.cj.jdbc.Driver</value>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>root</value>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>123456</value>
  </property>
</configuration>
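
One quick sanity check for the configuration above is whether the MySQL port published by the Docker container is reachable from the Hive host at all. A minimal probe from 172.20.237.84 (assuming nc and the mysql client are installed there; telnet works as well if this nc build lacks -z):

# Can the Hive host open a TCP connection to the metastore database?
nc -zv 172.20.237.80 3306

# Optionally, try a real login with the credentials from hive-site.xml
mysql -h 172.20.237.80 -P 3306 -u root -p

If both succeed, the metastore configuration itself is probably not the problem.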

Of course there is a mysql-connector-java-xxx.jar in Hive's lib directory. When I run hive on 172.20.237.84, it prints the following error:

which: no hbase in (/usr/local/src/jdk1.8.0_321/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/home/app/ffmpeg-4.4/bin:/usr/local/src/hadoop-3.3.2/bin:/usr/local/src/jdk1.8.0_321/bin://usr/local/src/hadoop-3.3.2/bin:/usr/local/src/apache-hive-3.1.3-bin/bin:/usr/local/src/jdk1.8.0_321/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/home/app/ffmpeg-4.4/bin:/usr/local/src/hadoop-3.3.2/bin:/root/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/src/apache-hive-3.1.3-bin/lib/log4j-slf4j-impl-2.17.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/src/hadoop-3.3.2/share/hadoop/common/lib/slf4j-log4j12-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Hive Session ID = 6e1f526a-367c-4be7-82b9-07f236845d8d

Logging initialized using configuration in jar:file:/usr/local/src/apache-hive-3.1.3-bin/lib/hive-common-3.1.3.jar!/hive-log4j2.properties Async: true
Exception in thread "main" java.lang.RuntimeException: java.net.NoRouteToHostException: No Route to Host from  worker84/172.20.237.84 to worker80:22332 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see:  http://wiki.apache.org/hadoop/NoRouteToHost
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:651)
        at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:591)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:747)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
Caused by: java.net.NoRouteToHostException: No Route to Host from  worker84/172.20.237.84 to worker80:22332 failed on socket timeout exception: java.net.NoRouteToHostException: No route to host; For more details see:  http://wiki.apache.org/hadoop/NoRouteToHost
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:913)
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:855)
        at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1616)
        at org.apache.hadoop.ipc.Client.call(Client.java:1558)
        at org.apache.hadoop.ipc.Client.call(Client.java:1455)
        at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:242)
        at org.apache.hadoop.ipc.ProtobufRpcEngine2$Invoker.invoke(ProtobufRpcEngine2.java:129)
        at com.sun.proxy.$Proxy28.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:965)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
        at com.sun.proxy.$Proxy29.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1739)
        at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1753)
        at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1750)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1765)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1760)
        at org.apache.hadoop.hive.ql.exec.Utilities.ensurePathIsWritable(Utilities.java:4486)
        at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:760)
        at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:701)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:627)
        ... 9 more
Caused by: java.net.NoRouteToHostException: No route to host
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:715)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:205)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:586)
        at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:711)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:833)
        at org.apache.hadoop.ipc.Client$Connection.access$3800(Client.java:414)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1677)
        at org.apache.hadoop.ipc.Client.call(Client.java:1502)
        ... 34 more
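
The bottom of the stack trace (SessionState.createRootHDFSDir -> DistributedFileSystem.getFileStatus) shows that the failure happens while Hive creates its scratch directory on HDFS, so the connection that fails is presumably the HDFS NameNode RPC address worker80:22332 taken from fs.defaultFS, not the MySQL metastore. Two checks from worker84 that narrow this down (assuming the hadoop-3.3.2 binaries from the PATH above, and an nc that supports -z):

# Which NameNode address is the HDFS client on this host configured with?
hdfs getconf -confKey fs.defaultFS

# Can worker84 open a TCP connection to that address and port?
nc -zv worker80 22332

If the second command also reports "No route to host", the problem is at the network or firewall level rather than inside Hive.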

The /etc/hosts file is:

172.20.237.80 worker80
172.20.237.82 worker82
172.20.237.83 worker83
172.20.237.84 worker84
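
To double-check that worker84 resolves worker80 the same way this file says, one can run on worker84:

# Should print 172.20.237.80 for worker80
getent hosts worker80

# Basic IP-level reachability test
ping -c 1 172.20.237.80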

So worker80 is indeed 172.20.237.80. On worker80 I started DFS using start-all.sh from the hadoop-3.3.2/sbin directory. Running netstat -ntlp there shows:

Active Internet connections (only servers)
Proto Recv-Q Send-Q Local Address           Foreign Address         State       PID/Program name
tcp        0      0 0.0.0.0:9867            0.0.0.0:*               LISTEN      16947/java
tcp        0      0 0.0.0.0:6379            0.0.0.0:*               LISTEN      11036/docker-proxy
tcp        0      0 0.0.0.0:22              0.0.0.0:*               LISTEN      2598/sshd
tcp        0      0 127.0.0.1:25            0.0.0.0:*               LISTEN      2834/master
tcp        0      0 172.20.237.80:22331     0.0.0.0:*               LISTEN      17609/java
tcp        0      0 127.0.0.1:39645         0.0.0.0:*               LISTEN      16947/java
tcp        0      0 0.0.0.0:9864            0.0.0.0:*               LISTEN      16947/java
tcp        0      0 0.0.0.0:27017           0.0.0.0:*               LISTEN      6530/docker-proxy
tcp        0      0 0.0.0.0:9866            0.0.0.0:*               LISTEN      16947/java
tcp        0      0 0.0.0.0:3306            0.0.0.0:*               LISTEN      8972/docker-proxy
tcp        0      0 :::6379                 :::*                    LISTEN      11044/docker-proxy
tcp        0      0 :::22                   :::*                    LISTEN      2598/sshd
tcp        0      0 ::1:25                  :::*                    LISTEN      2834/master
tcp        0      0 :::27017                :::*                    LISTEN      6537/docker-proxy
tcp        0      0 :::2377                 :::*                    LISTEN      25215/dockerd
tcp        0      0 :::3306                 :::*                    LISTEN      8979/docker-proxy
tcp        0      0 :::7946                 :::*                    LISTEN      25215/dockerd
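
According to the Hadoop wiki page linked in the error (http://wiki.apache.org/hadoop/NoRouteToHost), this exception usually points at a wrong address in the configuration, routing, or a host firewall rather than at Hive itself. Checks worth running on worker80 (assuming CentOS's firewalld; the port 22332 is taken from the error message above):

# Is anything listening on the port the client tries to reach?
ss -ntlp | grep 22332

# Is a host firewall active, and what does it allow?
firewall-cmd --state
firewall-cmd --list-all

# If firewalld is not in use, inspect iptables directly
iptables -L -n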

Can anyone give me a clue how to solve this Hive startup issue?
