YARN and MapReduce Memory Configuration Settings on AWS
I'm having a lot of trouble getting a single-node Hadoop install to work on AWS. I'm using an m1.large instance (2 CPU cores, 7.5 GB RAM, 1 disk). I have already tried the HDP 2.1 and HDP 2.0 stacks; with the default configuration the smoke tests pass, but Pig scripts don't run. I once solved a similar problem by changing the YARN memory settings, but this time that approach didn't work. Unfortunately, I didn't write down the values I used when it did work.
According to the YARN utility script recommended in:
I get the following results:
Using cores=2 memory=7GB disks=1 hbase=True
Profile: cores=2 memory=4096MB reserved=3GB usableMem=4GB disks=1
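For context, the published HDP sizing formula that the utility script is based on works roughly like this (my own sketch of the arithmetic, not the script itself; the function name and the 512 MB minimum-container figure for a ~4-8 GB machine are assumptions):

```python
import math

def yarn_settings(cores, ram_mb, disks, min_container_mb=512):
    # Sketch of the HDP sizing arithmetic (not the actual script):
    # number of containers = min(2 * cores, 1.8 * disks, usable RAM / min container size)
    containers = int(min(2 * cores, math.ceil(1.8 * disks), ram_mb // min_container_mb))
    # RAM per container = max(min container size, usable RAM / containers)
    ram_per_container = max(min_container_mb, ram_mb // containers)
    return {
        "yarn.nodemanager.resource.memory-mb": containers * ram_per_container,
        "yarn.scheduler.minimum-allocation-mb": ram_per_container,
        "yarn.scheduler.maximum-allocation-mb": containers * ram_per_container,
        "mapreduce.map.memory.mb": ram_per_container,
        "mapreduce.reduce.memory.mb": 2 * ram_per_container,
        "yarn.app.mapreduce.am.resource.mb": 2 * ram_per_container,
    }

# The profile above: 2 cores, 4096 MB usable after the 3 GB reservation, 1 disk
print(yarn_settings(cores=2, ram_mb=4096, disks=1))
```

With a single disk, the 1.8 * disks term dominates, so this works out to only 2 containers of 2048 MB each.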
I made the changes, but then the MapReduce and other smoke tests fail and the jobs still don't run. I also tried several other configurations (8 containers of 512 MB, etc.) with no luck.
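For reference, an attempt like "8 containers of 512 MB" corresponds to yarn-site.xml / mapred-site.xml properties along these lines (values illustrative of what I tried, not a recommendation; heap sizes set to ~80% of container size):

```xml
<!-- yarn-site.xml: illustrative values, assuming ~4 GB usable for YARN -->
<property>
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>4096</value>
</property>
<property>
  <name>yarn.scheduler.minimum-allocation-mb</name>
  <value>512</value>
</property>
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>4096</value>
</property>

<!-- mapred-site.xml -->
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>512</value>
</property>
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx410m</value>
</property>
<property>
  <name>mapreduce.reduce.memory.mb</name>
  <value>1024</value>
</property>
<property>
  <name>mapreduce.reduce.java.opts</name>
  <value>-Xmx820m</value>
</property>
```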
Can anybody suggest a correct YARN and MapReduce memory configuration for a single node on an AWS m1.large?
I'd highly appreciate any help.