Sometimes when you import data from an RDBMS into Hadoop with Sqoop, the import fails with an error like this:
org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory
hdfs://hadoopcluster/user/username/importtable already exists
When Sqoop imports data, it writes its output into a directory under the user's home directory in HDFS and normally removes any temporary files when the job finishes. If a previous import failed or was interrupted, Sqoop can exit without cleaning up that directory, and the next run aborts with this error because the MapReduce job refuses to write into an output directory that already exists. The fix is to delete the leftover directory manually with the hdfs dfs -rm -r command:

$ hdfs dfs -rm -r -skipTrash hdfs://hadoopcluster/user/username/importtable
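If this keeps happening, Sqoop's import command also accepts the --delete-target-dir option, which removes the target directory before the import starts so a stale directory cannot block the job. A minimal sketch, where the JDBC URL, credentials, and table name are placeholders for your own setup:

$ sqoop import \
    --connect jdbc:mysql://dbserver/mydb \
    --username dbuser -P \
    --table importtable \
    --target-dir hdfs://hadoopcluster/user/username/importtable \
    --delete-target-dir

Use this option with care: it deletes whatever is currently in the target directory, so it is only appropriate when each import is meant to fully replace the previous one.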