
Could not find or load main class org.apache.hadoop.fs.FsShell

Updated: 2022-10-23 16:03:37



I understand this question might have been answered already; well, my issue is still here:

I have a VM created for Hadoop on VMware, running CentOS 7. I can start the namenode and datanode; however, when I try to view an HDFS file using the following command:

hdfs dfs -ls

it throws the error below:

Could not find or load main class org.apache.hadoop.fs.FsShell

My Google searches suggest this might relate to the Hadoop variable settings in bash; here are my settings:

# .bashrc
# Source global definitions
if [ -f /etc/bashrc ]; then
. /etc/bashrc
fi
export HADOOP_HOME=/opt/hadoop/hadoop-2.7.2
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_PREFIX=$HADOOP_HOME

export HIVE_HOME=/opt/hadoop/hive
export PATH=$HIVE_HOME/bin:$PATH

export ANT_HOME=/usr/local/apache-ant-1.9.7
export PATH=${PATH}:${JAVA_HOME}/bin

export PIG_HOME=/opt/hadoop/pig-0.15.0
export PIG_HADOOP_VERSION=0.15.0
export PIG_CLASSPATH=$HADOOP_HOME/etc/hadoop

export PATH=$PATH:$PIG_HOME/bin
export PATH=$PATH:$HADOOP_HOME/bin
export HADOOP_USER_CLASSPATH_FIRST=true

export SQOOP_HOME=/usr/lib/sqoop
export PATH=$PATH:$SQOOP_HOME/bin

export HADOOP_CLASSPATH=$HADOOP_HOME/share/hadoop/common/
export PATH=$PATH:$HADOOP_CLASSPATH

# Uncomment the following line if you don't like systemctl's auto-paging feature:
# export SYSTEMD_PAGER=
# User specific aliases and functions
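With settings like these, one quick check is whether HADOOP_HOME actually contains the jar that ships org.apache.hadoop.fs.FsShell (it lives in hadoop-common-<version>.jar under share/hadoop/common); if the shell scripts can't find that jar, this exact error appears. A minimal sketch of such a check; check_hadoop_home is a hypothetical helper, not part of Hadoop:

```shell
# Sanity-check a Hadoop install directory for the hadoop-common jar,
# which contains org.apache.hadoop.fs.FsShell.
check_hadoop_home() {
  home="$1"
  if ls "$home"/share/hadoop/common/hadoop-common-*.jar >/dev/null 2>&1; then
    echo "ok: hadoop-common jar present"
  else
    echo "missing: no hadoop-common jar under $home/share/hadoop/common"
    return 1
  fi
}

# usage: check_hadoop_home /opt/hadoop/hadoop-2.7.2
```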

I checked my hadoop folder, /opt/hadoop/hadoop-2.7.2/share/hadoop/common; here is the list (directory listing was attached as a screenshot in the original post):

I am doing this exercise with the root account. Can anyone help find the cause of this issue and fix it? Thank you very much.

This typically happens when you have multiple instances of Hadoop. Run which hadoop and see whether it points to the version you installed.

Say it points to /usr/bin/hadoop and not /your-path/hadoop; then you can point /usr/bin/hadoop at your install (with a symlink).
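The symlink fix can be played out safely in a temporary directory first; the sandbox paths below are stand-ins for /usr/bin/hadoop (the stale copy found by which hadoop) and the real install from the question:

```shell
# Sketch of the symlink fix in a throwaway sandbox. In practice you would
# operate on /usr/bin/hadoop and your real HADOOP_HOME instead (and run
# `hash -r` afterwards so bash forgets the old command location).
sandbox=$(mktemp -d)
mkdir -p "$sandbox/opt/hadoop-2.7.2/bin" "$sandbox/usr/bin"
echo 'real launcher' > "$sandbox/opt/hadoop-2.7.2/bin/hadoop"
echo 'stale copy'    > "$sandbox/usr/bin/hadoop"

# The actual fix: set the stale wrapper aside, then symlink the real
# launcher into the directory that shadows it on PATH.
mv "$sandbox/usr/bin/hadoop" "$sandbox/usr/bin/hadoop.bak"
ln -s "$sandbox/opt/hadoop-2.7.2/bin/hadoop" "$sandbox/usr/bin/hadoop"

readlink "$sandbox/usr/bin/hadoop"   # prints the real launcher's path
```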