In programming languages, and specifically in **C#**, there are 4 arithmetic operations that can be performed: addition, subtraction, multiplication, and division.

From an outside perspective it may seem that all of them are similar in terms of performance, but it turns out one of them is much slower than the other 3.

Which one, you may ask?

**Division**.

According to this HP paper:

`Floating point division and square root take considerably longer to compute than addition and multiplication. The latter two are computed directly while the former are usually computed with an iterative algorithm. The most common approach is to use a division-free Newton-Raphson iteration to get an approximation to the reciprocal of the denominator (division) or the reciprocal square root, and then multiply by the numerator (division) or input argument (square root).`
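The reciprocal iteration the paper describes can be sketched in a few lines of C#. This is illustrative only: the `Divide` method, the fixed initial guess, and the iteration count are my own choices, and real hardware seeds the guess from a lookup table rather than a constant.

```csharp
using System;

class NewtonRaphsonDivision
{
    // Approximates a / b without a hardware divide: iterate
    // x = x * (2 - b * x) toward 1/b, then multiply by the numerator a.
    public static float Divide(float a, float b)
    {
        float x = 0.1f;                  // crude initial guess for 1/b (assumes b is roughly in 0..20)
        for (int i = 0; i < 20; i++)     // each step roughly doubles the number of correct digits
            x = x * (2f - b * x);        // division-free Newton-Raphson step
        return a * x;                    // numerator * reciprocal
    }

    static void Main()
    {
        Console.WriteLine(Divide(10f, 4f)); // ≈ 2.5
    }
}
```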

**To confirm it, I decided to run a simple test using the code below:**

```csharp
System.Diagnostics.Stopwatch watch = new System.Diagnostics.Stopwatch();
watch.Start();
//Addition
for (int i = 1; i < 1000000; i++)
{
    float tmp = i + i;
}
watch.Stop();
//Output
Debug.Log("Addition took: " + watch.Elapsed.TotalSeconds.ToString("0.000") + " seconds");

watch.Reset();
watch.Start();
//Subtraction
for (int i = 1; i < 1000000; i++)
{
    float tmp = i - i;
}
watch.Stop();
//Output
Debug.Log("Subtraction took: " + watch.Elapsed.TotalSeconds.ToString("0.000") + " seconds");

watch.Reset();
watch.Start();
//Multiplication
for (int i = 1; i < 1000000; i++)
{
    float tmp = i * i;
}
watch.Stop();
//Output
Debug.Log("Multiplication took: " + watch.Elapsed.TotalSeconds.ToString("0.000") + " seconds");

watch.Reset();
watch.Start();
//Division
for (int i = 1; i < 1000000; i++)
{
    float tmp = i / i;
}
watch.Stop();
//Output
Debug.Log("Division took: " + watch.Elapsed.TotalSeconds.ToString("0.000") + " seconds");
```
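Two caveats about a loop like this: in an optimized (Release) build the JIT is free to delete the unused `tmp` assignments, which would leave all four timings measuring an empty loop, and `i / i` is actually an integer divide since both operands are `int`. Below is a sketch of a variant that keeps the result live and casts one operand so the floating-point divider the paper discusses is really exercised; it uses plain `Console.WriteLine` in place of Unity's `Debug.Log`, and the `DivisionBenchmark`/`TimeDivision` names are my own.

```csharp
using System;
using System.Diagnostics;

class DivisionBenchmark
{
    // Times the division loop while accumulating every result into `sink`,
    // so the JIT cannot remove the loop body as dead code.
    public static double TimeDivision(int iterations)
    {
        var watch = Stopwatch.StartNew();
        float sink = 0f;
        for (int i = 1; i < iterations; i++)
            sink += (float)i / i;          // the cast makes this a floating-point divide
        watch.Stop();
        Console.WriteLine(sink);           // observing the sink keeps it from being optimized away
        return watch.Elapsed.TotalSeconds;
    }

    static void Main()
    {
        Console.WriteLine("Division took: " + TimeDivision(1000000).ToString("0.000") + " seconds");
    }
}
```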

**Basically, I ran a million additions, subtractions, multiplications, and divisions and measured the time each of them took to process.**

**And here is the result:**

`Addition took: 0.008 seconds`

`Subtraction took: 0.008 seconds`

`Multiplication took: 0.008 seconds`

`Division took: 0.010 seconds`

**As you can see, addition, subtraction, and multiplication took around the same time to execute, but division appears to be slightly slower.**

While the difference is quite minuscule, it's always better to use multiplication instead of division whenever possible.

For example, when you need to divide a number by 2, it's better to multiply it by 0.5 instead.
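The substitution above can be shown in a tiny sketch (the class and method names are my own). Halving is a best case: since 0.5 is an exact power of two, multiplying by it gives bit-identical results to dividing by 2; for constants like 1/3 the reciprocal is not exactly representable, so the two forms can differ in the last bit.

```csharp
using System;

class ReciprocalExample
{
    // Divide-by-constant replaced by multiply-by-reciprocal:
    // both expressions halve x, and for 2 they agree exactly.
    public static float HalfByDivision(float x) => x / 2f;
    public static float HalfByMultiplication(float x) => x * 0.5f;

    static void Main()
    {
        Console.WriteLine(HalfByDivision(7f));       // 3.5
        Console.WriteLine(HalfByMultiplication(7f)); // 3.5
    }
}
```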