In the post Hive Job Submission failed with exception, I mentioned deleting the .Trash folder. If for any reason you are not able to delete that folder, you can go with Option B: change the scratch directory that Hive / Sqoop uses. Create a new folder under the Hadoop file system, for example /db/tmphive. Grant read write…
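A minimal sketch of Option B, assuming the /db/tmphive path from the excerpt and the standard hive.exec.scratchdir property; adjust the permissions to match your cluster's policy:

```shell
# Create the new scratch directory in HDFS (path from the post; adjust as needed)
hdfs dfs -mkdir -p /db/tmphive

# Grant read/write access so Hive/Sqoop jobs can use it
# (777 is a broad example; tighten per your security policy)
hdfs dfs -chmod 777 /db/tmphive

# Point Hive at the new scratch directory for a single session
hive --hiveconf hive.exec.scratchdir=/db/tmphive

# Or make it permanent by setting hive.exec.scratchdir in hive-site.xml:
#   <property>
#     <name>hive.exec.scratchdir</name>
#     <value>/db/tmphive</value>
#   </property>
```

Setting the property in hive-site.xml applies to all jobs, while --hiveconf only affects the current invocation.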
Sometimes when you import data from an RDBMS into Hadoop via Sqoop, you will see this error:

org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://hadoopcluster/user/username/importtable already exists

Solution:

$ hdfs dfs -rm -r -skipTrash hdfs://hadoopcluster/user/username/importtable

Reason: when Sqoop imports data, it creates temporary files under the home directory and later deletes them. Sometimes, due to some issue,…
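The cleanup above can be folded into the import itself: a sketch that removes the stale output directory only if it exists, then re-runs the import. The connection string, table name, and target directory here are placeholders, not values from the post:

```shell
# Placeholder target directory (substitute your own path)
TARGET=hdfs://hadoopcluster/user/username/importtable

# -test -d returns 0 only if the directory exists; remove it if so,
# bypassing .Trash just like the command in the post
hdfs dfs -test -d "$TARGET" && hdfs dfs -rm -r -skipTrash "$TARGET"

# Re-run the import (connection details are illustrative placeholders)
sqoop import \
  --connect jdbc:mysql://dbhost/mydb \
  --username dbuser -P \
  --table importtable \
  --target-dir "$TARGET"
```

Alternatively, Sqoop's --delete-target-dir option tells the import to drop an existing target directory itself, which avoids the manual check entirely.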