#Spark starts normally, but the Master web UI cannot be reached. Posts online suggested a port problem, but after changing the port to 8089 it is still unreachable. Logs and configuration below.
#logs:
20/03/18 20:07:56 INFO master.Master: Started daemon with process name: 1920@hadoopnode01
20/03/18 20:07:56 INFO util.SignalUtils: Registered signal handler for TERM
20/03/18 20:07:56 INFO util.SignalUtils: Registered signal handler for HUP
20/03/18 20:07:56 INFO util.SignalUtils: Registered signal handler for INT
20/03/18 20:07:56 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
20/03/18 20:07:56 INFO spark.SecurityManager: Changing view acls to: hadoopnode01
20/03/18 20:07:56 INFO spark.SecurityManager: Changing modify acls to: hadoopnode01
20/03/18 20:07:56 INFO spark.SecurityManager: Changing view acls groups to:
20/03/18 20:07:56 INFO spark.SecurityManager: Changing modify acls groups to:
20/03/18 20:07:56 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoopnode01); groups with view permissions: Set(); users with modify permissions: Set(hadoopnode01); groups with modify permissions: Set()
20/03/18 20:07:57 INFO util.Utils: Successfully started service 'sparkMaster' on port 7077.
20/03/18 20:07:57 INFO master.Master: Starting Spark master at spark://hadoopnode01:7077
20/03/18 20:07:57 INFO master.Master: Running Spark version 2.4.5
20/03/18 20:07:57 INFO util.log: Logging initialized @1410ms
20/03/18 20:07:57 INFO server.Server: jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
20/03/18 20:07:57 INFO server.Server: Started @1484ms
20/03/18 20:07:57 INFO server.AbstractConnector: Started ServerConnector@7976972b{HTTP/1.1,[http/1.1]}{0.0.0.0:8089}
20/03/18 20:07:57 INFO util.Utils: Successfully started service 'MasterUI' on port 8089.
20/03/18 20:07:57 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@422e316{/app,null,AVAILABLE,@Spark}
20/03/18 20:07:57 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@15d02cf6{/app/json,null,AVAILABLE,@Spark}
20/03/18 20:07:57 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@74464f69{/,null,AVAILABLE,@Spark}
20/03/18 20:07:57 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@13cad671{/json,null,AVAILABLE,@Spark}
20/03/18 20:07:57 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7e9ff62{/static,null,AVAILABLE,@Spark}
20/03/18 20:07:57 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1fb5f36c{/app/kill,null,AVAILABLE,@Spark}
20/03/18 20:07:57 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@708b72b5{/driver/kill,null,AVAILABLE,@Spark}
20/03/18 20:07:57 INFO ui.MasterWebUI: Bound MasterWebUI to 0.0.0.0, and started at http://hadoopnode01:8089
20/03/18 20:07:57 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@b0f1d0c{/metrics/master/json,null,AVAILABLE,@Spark}
20/03/18 20:07:57 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5891212e{/metrics/applications/json,null,AVAILABLE,@Spark}
20/03/18 20:07:57 INFO master.Master: I have been elected leader! New state: ALIVE
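The log above shows the UI bound to `0.0.0.0:8089`, so the bind address itself is not the problem. A quick way to narrow things down is to test from the node itself first. This is a generic diagnostic sketch (it assumes `ss` and `curl` are installed; nothing here is specific to Spark):

```shell
# Port the MasterUI reported in the log; adjust if you change it again.
PORT=8089

# 1. Is anything listening on the port, and on which address?
#    "0.0.0.0:8089" means all interfaces, so binding is fine.
listening=$(ss -tln 2>/dev/null | grep ":${PORT}" || true)
if [ -n "$listening" ]; then
  echo "listening: $listening"
else
  echo "nothing listening on port ${PORT}"
fi

# 2. Does the UI answer locally? If this prints status 200 on the node
#    but a remote browser cannot connect, the problem is the network
#    path or a firewall, not Spark.
curl -s -o /dev/null -w "local HTTP status: %{http_code}\n" \
  "http://localhost:${PORT}" || echo "no local HTTP response on ${PORT}"
```

If step 2 succeeds on the node while the browser on another machine fails, look at the firewall on the master host and at how the browser machine resolves the hostname in the URL.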
#etc/profile
#set java environment
JAVA_HOME=/usr/lib/jvm/java
PATH=$PATH:$JAVA_HOME/bin
CLASSPATH=$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
export JAVA_HOME CLASSPATH PATH
#Scala env
export SCALA_HOME=/usr/scala/scala-2.13.1
export PATH=$PATH:$SCALA_HOME/bin
#ip addr
1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
inet 127.0.0.1/8 scope host lo
valid_lft forever preferred_lft forever
inet6 ::1/128 scope host
valid_lft forever preferred_lft forever
2: ens33: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc pfifo_fast state UP group default qlen 1000
link/ether 00:50:56:3b:85:db brd ff:ff:ff:ff:ff:ff
inet 192.168.35.130/24 brd 192.168.35.255 scope global noprefixroute dynamic ens33
valid_lft 1139sec preferred_lft 1139sec
inet 192.168.35.10/24 brd 192.168.35.255 scope global secondary noprefixroute ens33
valid_lft forever preferred_lft forever
inet6 fe80::505b:3101:5284:850c/64 scope link noprefixroute
valid_lft forever preferred_lft forever
#spark-env.sh
export SPARK_HOME=/home/hadoopnode01/apps/spark
export PATH=$PATH:$SPARK_HOME/bin
export JAVA_HOME=/usr/lib/jvm/java
export SCALA_HOME=/usr/scala/scala-2.13.1
export HADOOP_HOME=/home/hadoopnode01/apps/hadoop-2.9.2
export HADOOP_CONF_DIR=/home/hadoopnode01/apps/hadoop-2.9.2/etc/hadoop
export SPARK_LOCAL_IP=192.168.35.10
export SPARK_MASTER_HOST=master.lab.hadoop.com
export SPARK_MASTER_PORT=7077
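Since the log shows the UI bound to 0.0.0.0 and started successfully, the most common cause of "works on the node, unreachable from a browser" is the OS firewall on the master host. A hedged sketch, assuming CentOS/RHEL 7 with firewalld as the active firewall (adjust for your distro); run as root on hadoopnode01:

```shell
# Assumption: firewalld is the active firewall (CentOS/RHEL 7 default).
firewall-cmd --permanent --add-port=8089/tcp   # Master web UI
firewall-cmd --permanent --add-port=7077/tcp   # Master RPC (spark://...:7077)
firewall-cmd --reload

# Or, just to rule the firewall out while testing:
# systemctl stop firewalld
```

Also note that `SPARK_MASTER_HOST=master.lab.hadoop.com` and the `http://hadoopnode01:8089` URL from the log only work from a browser machine that can resolve those hostnames; when in doubt, try the IP directly, e.g. `http://192.168.35.10:8089` (the address set in `SPARK_LOCAL_IP`, the secondary address on ens33 above).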
I have tried many suggested fixes and none worked. Spark starts normally and spark-shell is usable; it is only the web UI that I cannot reach.