I have compared division vs. multiplication, where the reciprocal is computed only once, before the timed test.
Factor and divisor:
double factor = 1.0 / 32747.0;
double divisor = 32747.0;
Calculations:
double result1 = 28739.0 / divisor;
double result2 = 28739.0 * factor;
I repeated each calculation 1 billion times and measured the elapsed time around the loop, so I could see how long 1 billion divisions and 1 billion multiplications take.
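This is not my exact test program, just a minimal sketch of the kind of timing loop I mean, using System.Diagnostics.Stopwatch. The class name and the sink accumulator (which only exists to keep the JIT from removing the loop bodies) are illustrative assumptions, not part of the original code:

using System;
using System.Diagnostics;

class DivMulBenchmark
{
    static void Main()
    {
        const int iterations = 1000000000;       // 1 billion repetitions
        double divisor = 32747.0;
        double factor = 1.0 / 32747.0;            // reciprocal computed once, up front
        double sink = 0.0;                        // accumulate results so the work is not optimized away

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            sink += 28739.0 / divisor;            // division
        sw.Stop();
        Console.WriteLine("Division:       " + sw.Elapsed);

        sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            sink += 28739.0 * factor;             // multiplication by the reciprocal
        sw.Stop();
        Console.WriteLine("Multiplication: " + sw.Elapsed);

        Console.WriteLine(sink);                  // print the accumulator so it stays live
    }
}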
There really is a significant difference, but it is the division that is faster, not the multiplication: the multiplication is about 50% slower.
Here are 5 results of the test:
Division: 00:00:01.0106775
Multiplication: 00:00:01.5038100
Division: 00:00:01.0067715
Multiplication: 00:00:01.5067395
Division: 00:00:01.0077480
Multiplication: 00:00:01.5018570
Division: 00:00:01.0272780
Multiplication: 00:00:01.5653295
Division: 00:00:01.0575495
Multiplication: 00:00:01.5995070
Surprised?
P.S.: I googled a bit and found a site that also stated that multiplications are faster than divisions, and it was even about C#. But I double-checked my results, tested with float values instead of double values, and used a value with more decimals. The multiplications were always about 50% slower.
P.P.S.: A friend ran the test program and his results are quite different. On his computer, division and multiplication take virtually identical times. So maybe there are computers where the multiplication is faster than the division.