Data type for precision – BigDecimal?

This topic contains 0 replies, has 1 voice, and was last updated by  Marco Shaw 7 months, 3 weeks ago.


    Marco Shaw

    I’m playing around with some pretty simple data, but I’m struggling a bit. I’m basically pulling out a text field and a dollar-amount field to work on sales calculations.

    My output ends up looking like this:

    San Jose 9936721.410000008
    Santa Ana 1.0050309929999959E7

    So I have a problem with precision and with how the data is output.
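    Both symptoms come from binary `double` arithmetic: amounts like 0.01 have no exact binary representation, so each addition rounds slightly, and Java's `Double.toString` switches to scientific notation at 10^7 and above (hence the `E7`). A minimal, Hadoop-free illustration (the amounts are made up):

    ```java
    import java.math.BigDecimal;

    public class PrecisionDemo {
        public static void main(String[] args) {
            double d = 0.0;
            BigDecimal b = BigDecimal.ZERO;
            for (int i = 0; i < 1000; i++) {
                d += 0.01;                          // rounds at every step
                b = b.add(new BigDecimal("0.01"));  // exact decimal arithmetic
            }
            System.out.println(d);  // close to, but not exactly, 10.0
            System.out.println(b);  // exactly 10.00
        }
    }
    ```

    Note the `BigDecimal` is built from the *string* `"0.01"`; `new BigDecimal(0.01)` would bake the binary rounding error right back in.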

    What I’ve read suggests that BigDecimal is the data type I should be using for currency, but I’m struggling a bit with how to convert that type into one of the Writable classes. Which Writable class should I be using? Should I do my calculations in BigDecimal and then write out the K,V as Text?

    My mapper:

    public class TotalSalesMapper extends
    	Mapper<LongWritable, Text, Text, DoubleWritable> {
    
    	@Override
    	public void map(LongWritable key, Text value, Context context)
        	throws IOException, InterruptedException {
    
    		String[] data = value.toString().split("\t");
    
    		if (data.length == 6) {
    			String store = data[2];
    			double cost = Double.parseDouble(data[4]);
    			//BigDecimal cost = new BigDecimal(data[4]);
    			// The output value type must match the reducer's input value
    			// type (DoubleWritable); LongWritable would also drop the cents.
    			context.write(new Text(store), new DoubleWritable(cost));
    		}
    	}
    }

    My reducer:

    public class TotalSalesReducer extends
    	Reducer<Text, DoubleWritable, Text, DoubleWritable> {
    
    	@Override
    	public void reduce(Text key, Iterable<DoubleWritable> values,
    			Context context)
    			throws IOException, InterruptedException {
    
    		// Start at zero: Double.MIN_VALUE is the smallest positive
    		// double, not a neutral element for addition.
    		double sum = 0.0;
    		for (DoubleWritable value : values) {
    			sum += value.get();
    		}
    		context.write(key, new DoubleWritable(sum));
    	}
    }
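    As far as I know, core Hadoop ships no BigDecimal Writable, so one common approach to the question above is exactly what you guessed: do the arithmetic in BigDecimal and ship the values as Text (the mapper emits the raw amount string, the reducer parses each one, sums exactly, and writes the total back as Text). A Hadoop-free sketch of the reducer-side logic (class and method names are made up for illustration):

    ```java
    import java.math.BigDecimal;
    import java.util.List;

    public class BigDecimalSum {
        // What the reducer's loop would do with the Text values it receives:
        // parse each amount string into a BigDecimal and accumulate exactly.
        public static String sum(List<String> amounts) {
            BigDecimal total = BigDecimal.ZERO;
            for (String amount : amounts) {
                total = total.add(new BigDecimal(amount)); // exact decimal add
            }
            // toPlainString() never uses scientific notation,
            // unlike Double.toString().
            return total.toPlainString();
        }

        public static void main(String[] args) {
            System.out.println(sum(List.of("9936721.41", "0.01")));
        }
    }
    ```

    In the actual reducer you would wrap the result with `new Text(total.toPlainString())` before `context.write`, and change the job's output value class to `Text` accordingly.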
