
Hive Job Submission failed with exception


Job Submission failed with exception ‘org.apache.hadoop.hdfs.protocol.DSQuotaExceededException(The DiskSpace quota of /user/username is exceeded: quota = xx’

This is a common problem when you are working in a multi-tenant environment with a limited HDFS space quota.

Reasons:

  1. When a large quantity of data is processed via Hive / Pig, the temporary data gets stored in the .Trash folder, which causes the home directory (/user/username) to reach its quota limit.
  2. When users delete large files without the -skipTrash option, so the files are only moved into .Trash rather than freed (see the sketch after this list).
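
To make the difference concrete, here is a minimal sketch; /user/username/old_output is a hypothetical path:

# Moves the data into /user/username/.Trash, so the quota is still consumed
$ hdfs dfs -rm -r /user/username/old_output

# Deletes the data permanently and frees the quota immediately
$ hdfs dfs -rm -r -skipTrash /user/username/old_output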


Why it gets stored in .Trash:

Like the Recycle Bin on a desktop, /user/username/.Trash is the place where deleted files are kept. If someone deletes a file accidentally, it can be recovered from here; a sketch of the recovery is shown below.
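
For example, assuming a file was deleted from /user/username/data (a hypothetical path), recovery is just a move out of the current trash checkpoint, since deleted files keep their original path under .Trash/Current:

$ hdfs dfs -mv /user/username/.Trash/Current/user/username/data/part-00000 /user/username/data/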

Solution:

  1. View the contents of the /user/username folder and their sizes

$ hdfs dfs -ls /user/username
$ hdfs dfs -du -h /user/username
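
If you also want to see how much headroom is left, -count -q prints the name and space quotas together with what remains of each (-h makes the sizes human readable):

$ hdfs dfs -count -q -h /user/username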

2. Find the folder with the largest size (most often it will be .Trash). If it is something else, use that folder instead in the steps below.

3. Find the large folder inside .Trash

$ hdfs dfs -du -h /user/username/.Trash
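
If .Trash contains many entries, sorting the plain -du output numerically makes the biggest offenders easy to spot (raw byte counts sort more reliably than the human-readable -h output):

$ hdfs dfs -du /user/username/.Trash | sort -n -r | head -n 10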

4. After making a note of the folder to delete, remove it using the hdfs command with the -skipTrash option

$ hdfs dfs -rm -r -skipTrash /user/username/.Trash/foldername_1
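
Alternatively, if you would rather not pick folders by hand, -expunge empties the trash according to the configured retention: it checkpoints the current trash contents and deletes checkpoints older than fs.trash.interval, so it may not free everything immediately:

$ hdfs dfs -expunge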

5. Run the du command again to check the status of the /user/username folder.

$ hdfs dfs -du -h /user/username
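
To confirm the quota pressure is actually gone, re-check the remaining quota as well; the remaining space quota column should now show free headroom:

$ hdfs dfs -count -q -h /user/username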

Now that .Trash is cleared, Hive should work without issues.

Happy Hiving 🙂
