In the post "Hive Job Submission failed with exception" I mentioned deleting the .Trash folder. If for any reason you are not able to delete the folder, you can go with Option B: change the scratchdir that Hive/Sqoop uses. Create a new folder in the Hadoop file system, for example /db/tmphive. Grant read/write…
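A minimal sketch of Option B, assuming `hive.exec.scratchdir` is the property being changed (the /db/tmphive path comes from the post; the permission mode is an assumption — adjust to your cluster's policy):

```shell
# Create the new scratch directory under HDFS (path /db/tmphive from the post)
hdfs dfs -mkdir -p /db/tmphive
# Sticky-bit world-writable, like /tmp, so every tenant's jobs can write temp data
hdfs dfs -chmod 1777 /db/tmphive

# Point Hive at it for a single session...
hive --hiveconf hive.exec.scratchdir=/db/tmphive

# ...or permanently, in hive-site.xml:
# <property>
#   <name>hive.exec.scratchdir</name>
#   <value>/db/tmphive</value>
# </property>
```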

Hive Job Submission failed with exception
Job Submission failed with exception 'org.apache.hadoop.hdfs.protocol.DSQuotaExceededException(The DiskSpace quota of /user/username is exceeded: quota = xx'. This is a common problem when you are working in a multi-tenant environment with a limited quota. Reason: when a large quantity of data is processed via Hive/Pig, the temporary data gets stored in the .Trash folder, which causes /home…
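A quick way to confirm the quota is the problem and to reclaim the space held by .Trash (the /user/username path stands in for your own home directory):

```shell
# Show the configured quota and current usage for the home directory
hdfs dfs -count -q -h /user/username

# Force-expire everything sitting in the trash checkpoints...
hdfs dfs -expunge

# ...or delete the .Trash folder directly, bypassing the trash itself
hdfs dfs -rm -r -skipTrash /user/username/.Trash
```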

SQOOP: mapred.FileAlreadyExistsException: Output directory
Sometimes when you import data from an RDBMS into Hadoop via Sqoop you will see this error: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://hadoopcluster/user/username/importtable already exists. Solution: $ hdfs dfs -rm -r -skipTrash hdfs://hadoopcluster/user/username/importtable Reason: when Sqoop imports data, it creates temporary files under the home directory and later deletes them. Sometimes, due to some issue,…
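Two ways to handle it: the manual cleanup from the post, or letting Sqoop clear the directory itself with its `--delete-target-dir` option. The JDBC connection string, database, and credentials below are placeholders:

```shell
# Option 1: remove the stale output directory by hand, bypassing the trash
hdfs dfs -rm -r -skipTrash hdfs://hadoopcluster/user/username/importtable

# Option 2: have Sqoop delete the target directory before each import
# (connect string, dbuser and table name are placeholders)
sqoop import \
  --connect jdbc:mysql://dbhost/dbname \
  --username dbuser -P \
  --table importtable \
  --delete-target-dir \
  --target-dir /user/username/importtable
```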

Hive Error: SemanticException TOK_ALLCOLREF is not supported in current context. Reason: Select Distinct * From &lt;tablename&gt;. Solution: Hive doesn't support DISTINCT *, so list the column names explicitly: Select Distinct col1, col2, col3… From &lt;tablename&gt;…
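The rewrite side by side (this limitation applies to older Hive versions; the table and column names here are placeholders):

```shell
# Fails on affected Hive versions:
# SemanticException TOK_ALLCOLREF is not supported in current context
hive -e "SELECT DISTINCT * FROM mytable"

# Works: name every column explicitly
hive -e "SELECT DISTINCT col1, col2, col3 FROM mytable"

# Equivalent alternative: GROUP BY the full column list
hive -e "SELECT col1, col2, col3 FROM mytable GROUP BY col1, col2, col3"
```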

Recently I have been noticing some weird special characters on my PuTTY screen. There was no pattern to it; they would appear all of a sudden. It's so annoying when you are working on a script, e.g. ^[[28~. Reason: to keep my Windows machine from locking, I had installed an application called 'Caffeine'. Caffeine…