Error: Hadoop Warning: fs.defaultFS is not set when running "ls" command.
Error background
After Impala was installed on the server, the hadoop command no longer worked as expected: hadoop fs -ls / listed local filesystem directories instead of HDFS directories.
At the same time, the hbase, hive, and other commands all stopped working as well.
Error symptom
[root@basecoalmine bin]# hdfs dfs -ls /
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/software/tez-0.9.2/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Warning: fs.defaultFS is not set when running "ls" command.
Found 24 items
-rw-r--r--   1 root root      0 2022-01-23 04:20 /.autorelabel
dr-xr-xr-x   - root root  28672 2022-03-09 03:26 /bin
dr-xr-xr-x   - root root   4096 2022-01-23 04:20 /boot
drwxr-xr-x   - root root     62 2022-02-22 21:58 /data
drwxr-xr-x   - root root   3220 2022-01-23 04:20 /dev
drwxr-xr-x   - root root   8192 2022-03-08 23:45 /etc
drwxr-xr-x   - root root     20 2022-02-08 04:01 /home
dr-xr-xr-x   - root root   4096 2022-03-07 20:21 /lib
dr-xr-xr-x   - root root  36864 2022-03-07 20:19 /lib64
drwxr-xr-x   - root root     23 2022-02-11 03:03 /logs
drwxr-xr-x   - root root      6 2018-04-11 00:59 /media
drwxr-xr-x   - root root      6 2018-04-11 00:59 /mnt
drwxr-xr-x   - root root    117 2022-02-22 03:18 /opt
dr-xr-xr-x   - root root      0 2022-01-23 04:19 /proc
dr-xr-x---   - root root   4096 2022-03-09 03:23 /root
drwxr-xr-x   - root root    920 2022-03-08 23:18 /run
dr-xr-xr-x   - root root  16384 2022-03-07 01:11 /sbin
drwxr-xr-x   - root root      6 2018-04-11 00:59 /srv
dr-xr-xr-x   - root root      0 2022-01-25 01:01 /sys
drwxrwxrwt   - root root  12288 2022-03-09 03:27 /tmp
drwxr-xr-x   - root root     20 2022-03-08 23:08 /user
drwxr-xr-x   - root root    199 2022-01-25 00:25 /usr
drwxr-xr-x   - root root    278 2022-01-23 02:44 /var
-rw-r--r--   1 root root    958 2022-01-25 01:02 /zookeeper.out
Error cause
When Impala is installed, it creates its own hadoop, hive, hbase, and related commands under /usr/bin. Because /usr/bin comes earlier in PATH than the bin directories of the manually installed software, Impala's wrappers shadow the originals: the shell resolves and runs Impala's commands first, so the commands from your own installation are effectively never found.
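As a quick sanity check (a minimal sketch; the exact paths on your system will differ), you can see how the shell resolves these commands and what Impala placed in /usr/bin:
which -a hadoop hdfs hive hbase      # if /usr/bin/... is listed first, the Impala wrappers win
echo $PATH | tr ':' '\n'             # confirm /usr/bin precedes your own Hadoop/Hive/HBase bin directories
ls -l /usr/bin/hadoop /usr/bin/hdfs /usr/bin/hive /usr/bin/hbase   # the commands created by the Impala installation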
Error solution
Just delete Impala's copies of the commands:
rm -rf /usr/bin/hadoop
rm -rf /usr/bin/hdfs
rm -rf /usr/bin/hbase
rm -rf /usr/bin/hive
rm -rf /usr/bin/hiveserver2
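After removing them, the shell should fall back to your own installation (assuming its bin directory is on PATH), and the same listing should now show HDFS contents rather than the local root directory. A minimal verification sketch:
hash -r                              # clear the shell's cached command locations
which hadoop hdfs hive hbase         # should now point to your own installation's bin directories
hdfs dfs -ls /                       # should list HDFS directories, not local ones like /bin and /etc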