Tags: hive, out-of-memory, reduce, memory-efficient, hadoop-partitioning

Hive: GC Overhead or Heap space error - dynamic partitioned table


Could you please guide me on resolving this GC overhead / heap space error?

I am trying to insert into a partitioned table from another table (dynamic partitioning) using the query below:

INSERT OVERWRITE table tbl_part PARTITION(county)
SELECT  col1, col2.... col47, county FROM tbl;

I have set the following parameters:

export  HADOOP_CLIENT_OPTS=" -Xmx2048m"
set hive.exec.dynamic.partition=true;  
set hive.exec.dynamic.partition.mode=nonstrict; 
SET hive.exec.max.dynamic.partitions=2048;
SET hive.exec.max.dynamic.partitions.pernode=256;
set mapreduce.map.memory.mb=2048;
set yarn.scheduler.minimum-allocation-mb=2048;
set hive.exec.max.created.files=250000;
set hive.vectorized.execution.enabled=true;
set hive.merge.smallfiles.avgsize=283115520;
set hive.merge.size.per.task=209715200;
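
One more setting worth trying, if your Hive version supports it (the property name below is an assumption based on versions from roughly Hive 0.13 onward; verify it against your release):

```sql
-- Sort rows by the dynamic partition column before writing, so each task
-- keeps only one partition's file writer (and its buffers) open at a time
-- instead of one writer per county value it encounters.
set hive.optimize.sort.dynamic.partition=true;
```

This trades some sort time for a much smaller heap footprint during the dynamic partition insert.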

Also added in yarn-site.xml :

<property>
<name>yarn.nodemanager.vmem-check-enabled</name>
<value>false</value>
<description>Whether virtual memory limits will be enforced for containers</description>
</property>

<property>
<name>yarn.nodemanager.vmem-pmem-ratio</name>
<value>4</value>
<description>Ratio between virtual memory to physical memory when setting memory limits for containers</description>
</property>

Running free -m:

            total       used       free     shared    buffers     cached
Mem:         15347      11090       4256          0        174       6051
-/+ buffers/cache:       4864      10483
Swap:        15670         18      15652

It's a standalone cluster with 1 core. I am preparing test data to run my unit test cases in Spark.

Could you suggest what else I could do?

The source table has the below properties:

Table Parameters:       
    COLUMN_STATS_ACCURATE   true                
    numFiles                13                  
    numRows                 10509065            
    rawDataSize             3718599422          
    totalSize               3729108487          
    transient_lastDdlTime   1470909228          

Thank you.


Solution

  • Add DISTRIBUTE BY county to your query:

    INSERT OVERWRITE table tbl_part PARTITION(county)
    SELECT  col1, col2.... col47, county FROM tbl DISTRIBUTE BY county;

    Without DISTRIBUTE BY, each mapper keeps an open record writer (with its
    buffers) for every distinct county value it encounters, so a single task
    touching many partitions can exhaust its heap and trigger exactly the GC
    overhead / heap space errors you are seeing. DISTRIBUTE BY county routes
    all rows for a given county to the same reducer, so each task writes only
    a few partitions and holds far fewer writers in memory.
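
One caveat with DISTRIBUTE BY: if a handful of counties hold most of the rows, all of those rows land on a single reducer, which can then become the memory bottleneck instead. A quick skew check (sketched against the table and column names from the question) is:

```sql
-- Row count per partition value; a heavily skewed county means one
-- reducer will receive most of the data under DISTRIBUTE BY county.
SELECT county, COUNT(*) AS rows_per_county
FROM tbl
GROUP BY county
ORDER BY rows_per_county DESC;
```

If the distribution is roughly even across your ~10.5M rows, DISTRIBUTE BY county alone should be enough.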