At a certain company, each employee has a salary grade s that is at least 1 and at most 5. Each employee receives an hourly wage p, in dollars, determined by the formula p = 9.50 + 0.25(s – 1). An employee with a salary grade of 5 receives how many more dollars per hour than an employee with a salary grade of 1?
This seems like a very simple problem, but my answer doesn't match the OA. Could someone explain what I'm missing?
Looks simple to me too.
P5 − P1 = 0.25 × 4 − 0.25 × 0 = $1 (the 9.50 just cancels out when taking the difference).
Am I missing something?
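A quick sketch in Python to double-check the arithmetic (the function name `hourly_wage` is just for illustration, not from the problem):

```python
# Wage formula from the problem: p = 9.50 + 0.25 * (s - 1), for grades 1..5
def hourly_wage(s):
    return 9.50 + 0.25 * (s - 1)

# Difference between the top and bottom salary grades
diff = hourly_wage(5) - hourly_wage(1)
print(diff)  # 1.0
```

So grade 5 earns $10.50/hour, grade 1 earns $9.50/hour, and the difference is $1.00, matching the computation above.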