In the post Hive Job Submission failed with exception I mentioned deleting the .Trash folder. If for any reason you are not able to delete the folder, you can go with Option B: change the scratch directory that Hive / Sqoop uses. Create a new folder under the Hadoop file system, for example /db/tmphive, and grant read/write…
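A minimal sketch of Option B, assuming the new scratch location is `/db/tmphive` as in the excerpt and that you have rights to create directories in HDFS (`hive.exec.scratchdir` is the standard Hive property for this):

```shell
# Create the new scratch directory and open it to the users who need it
hadoop fs -mkdir -p /db/tmphive
hadoop fs -chmod -R 777 /db/tmphive

# Point Hive at it for a single session...
hive --hiveconf hive.exec.scratchdir=/db/tmphive

# ...or permanently, by adding this to hive-site.xml:
# <property>
#   <name>hive.exec.scratchdir</name>
#   <value>/db/tmphive</value>
# </property>
```

The chmod 777 above is only a sketch; in a real multi-tenant cluster you would grant narrower permissions to the relevant groups.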

Hive Job Submission failed with exception
Job Submission failed with exception ‘org.apache.hadoop.hdfs.protocol.DSQuotaExceededException(The DiskSpace quota of /user/username is exceeded: quota = xx’. This is a common problem when you are working in a multi-tenant environment with a limited quota. Reason: when a large quantity of data is processed via Hive / Pig, the temporary data gets stored in the .Trash folder, which causes /home…
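A quick sketch of how to inspect and reclaim the quota, assuming your HDFS home directory is `/user/username` as in the error above (these are standard `hadoop fs` subcommands):

```shell
# Show the directory's quota versus current usage
hadoop fs -count -q /user/username

# Empty .Trash immediately instead of waiting for the checkpoint interval
hadoop fs -expunge

# When deleting large intermediate data, bypass .Trash entirely so it
# never counts against the quota
hadoop fs -rm -r -skipTrash /user/username/old_staging_data
```

The `old_staging_data` path is a placeholder for whatever intermediate output is filling the quota.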

Hive Error: SemanticException TOK_ALLCOLREF is not supported in current context. Reason: Select Distinct * From &lt;tablename&gt;. Solution: Hive doesn’t support distinct *, so mention the column names: Select Distinct col1, col2, col3… From &lt;tablename&gt;
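The fix side by side, with `mytable` and the column names as placeholders for your own table:

```sql
-- Fails with "SemanticException TOK_ALLCOLREF is not supported in current context":
-- SELECT DISTINCT * FROM mytable;

-- Works: name the columns explicitly
SELECT DISTINCT col1, col2, col3 FROM mytable;
```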