Walmart is currently running ads claiming that shoppers who spend more than $100 per week at the supermarket would save $650 a year by buying their groceries at the giant retailer instead.
That's quite a jumble of conditionals and varying metrics: you first have to meet the requirements of shopping at a supermarket and spending over $100 per week; the savings are then presented in a completely different timeframe of one year. That works out to $12.50 a week, or a still sizable 12.5% discount.
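The conversion is simple enough to check in a few lines (a throwaway sketch using only the ad's two numbers):

```python
WEEKLY_SPEND = 100.0    # the ad's minimum weekly supermarket spend
ANNUAL_SAVINGS = 650.0  # the ad's claimed yearly savings
WEEKS_PER_YEAR = 52

# Put both figures on the same weekly timeframe, then take the ratio.
weekly_savings = ANNUAL_SAVINGS / WEEKS_PER_YEAR
discount = weekly_savings / WEEKLY_SPEND

print(weekly_savings)  # 12.5
print(discount)        # 0.125, i.e. 12.5%
```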
Why not present it as 12.5%? The simple answer is that "$650" is a substantial figure, and the marketing folks wanted to make people feel like they were saving more; conversely, $5,200 a year on groceries sounds like a lot - better to restate that as $100 per week. Depressingly, it occurs to me that many Americans may not know what to do with percentages.
Another key point is found in the wording of the ad - why target shoppers who spend more than $100 a week? If Walmart's prices are really lower, then all shoppers should reap a benefit, not just the high rollers. Since I do not think Walmart is price discriminating (offering discounts only to people spending more than $100), I have to conclude that they restricted their dataset to increase the dollar value of the average person's savings. If every shopper saved 12.5%, then the average annual dollar savings per person might be, say, $250. But if we consider only people who spend more than $100, the average dollar savings jumps to $650 even though the percent savings remains 12.5%. I would guess that $100 was chosen as a cutoff because a) it's a round, friendly number which b) creates a relatively high average dollar savings while c) remaining low enough to be in reach of many American families. This, of course, is further evidence that Americans don't understand percentages well (or at least, that marketers think they can fool us by avoiding them).
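The truncation effect is easy to demonstrate with a made-up spread of weekly spends (Walmart publishes no such data, so the numbers below are purely illustrative): everyone gets the same 12.5% discount, yet dropping the small spenders inflates the average dollar savings.

```python
DISCOUNT = 0.125  # same percent discount for every shopper
WEEKS = 52

# Hypothetical weekly grocery spends for nine shoppers
weekly_spends = [30, 45, 60, 75, 90, 110, 130, 150, 180]

def avg_annual_savings(spends):
    """Average yearly dollar savings across a group of shoppers."""
    return sum(s * DISCOUNT * WEEKS for s in spends) / len(spends)

all_shoppers = avg_annual_savings(weekly_spends)
big_spenders = avg_annual_savings([s for s in weekly_spends if s > 100])

print(f"all shoppers:       ${all_shoppers:,.0f}/yr")
print(f"spend > $100/week:  ${big_spenders:,.0f}/yr")
```

The percent saved never changes; only the denominator of "who counts" does, and that alone is enough to move the headline dollar figure.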
Note also that all of my calculations pair the stated minimum spend of $100 against the average savings of $650 to get the 12.5% discount. That's not a real discount - someone spending exactly $100 wouldn't get $650 in savings, since $650 is the average across everyone spending more than $100. That person would realize a smaller dollar savings, and the real discount rate must therefore be less than 12.5%.
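To see the bound, note that the group's true average weekly spend is unknown but must exceed $100, so dividing the $650 by any plausible average gives a smaller rate (the spend figures tried below are hypothetical):

```python
ANNUAL_SAVINGS = 650.0  # the ad's average for shoppers spending > $100/week
WEEKS = 52

# The group's true average weekly spend is unknown, but it must be above $100.
# Each hypothetical average implies a smaller real discount rate.
for avg_weekly_spend in (100, 110, 125, 150):
    rate = ANNUAL_SAVINGS / (avg_weekly_spend * WEEKS)
    print(f"${avg_weekly_spend}/week -> {rate:.1%}")
# $100/week -> 12.5%
# $110/week -> 11.4%
# $125/week -> 10.0%
# $150/week -> 8.3%
```

The 12.5% figure is the best case, reached only at the impossible boundary where the group's average spend equals its minimum.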