Why is Division Slower Than Multiplication in C#?

In programming languages, and specifically in C#, there are four basic arithmetic operations: addition, subtraction, multiplication, and division.

From an outside perspective, it may seem that all of them are similar in terms of performance, but it turns out one of them is much slower than the other three.

Which one is slower, you may ask? Division.

According to this HP paper:

Floating point division and square root take considerably longer to compute than addition and multiplication. The latter two are computed directly while the former are usually computed with an iterative algorithm. The most common approach is to use a division-free Newton-Raphson iteration to get an approximation to the reciprocal of the denominator (division) or the reciprocal square root, and then multiply by the numerator (division) or input argument (square root).
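
To make that description more concrete, here is a minimal sketch in C# of the idea the paper describes: approximate the reciprocal of the denominator with a few Newton-Raphson steps, then multiply by the numerator. The method name, the fixed iteration count, and the assumption that the denominator has already been scaled into [0.5, 1] are simplifications for this demonstration; real hardware handles the exponent separately and seeds the iteration from a small lookup table.

        //Illustrative sketch only, not how the FPU is actually implemented
        //Computes n / d by refining an estimate of 1/d, assuming d is in [0.5, 1]
        static float DivideViaReciprocal(float n, float d)
        {
            //Classic linear starting guess for 1/d on [0.5, 1]
            float x = 48f / 17f - (32f / 17f) * d;
            for (int i = 0; i < 3; i++)
            {
                //Newton-Raphson step for f(x) = 1/x - d; each step roughly doubles the correct bits
                x = x * (2f - d * x);
            }
            //"Divide" by multiplying the numerator with the reciprocal
            return n * x;
        }

For example, DivideViaReciprocal(3f, 0.75f) comes out at approximately 4. Even in this toy version you can see why division costs more: a single division unfolds into a chain of dependent multiplications and subtractions, while a multiplication is one direct operation.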

To verify the statement above I decided to run a simple test using the code below:

        //Generate two random numbers
        var rand = new System.Random();
        float a = rand.Next();
        float b = rand.Next();

        Debug.Log("Number a: " + a + " Number b: " + b);

        System.Diagnostics.Stopwatch watch = new System.Diagnostics.Stopwatch();

        watch.Start();
        //Addition
        for (int i = 0; i < 1000000; i++)
        {
            float tmp = a + b;
        }
        watch.Stop();
        //Output
        Debug.Log("Addition took: " + watch.Elapsed.TotalSeconds.ToString("0.0000") + " seconds");

        watch.Reset();
        watch.Start();
        //Subtraction
        for (int i = 0; i < 1000000; i++)
        {
            float tmp = a - b;
        }
        watch.Stop();
        //Output
        Debug.Log("Subtraction took: " + watch.Elapsed.TotalSeconds.ToString("0.0000") + " seconds");

        watch.Reset();
        watch.Start();
        //Multiplication
        for (int i = 0; i < 1000000; i++)
        {
            float tmp = a * b;
        }
        watch.Stop();
        //Output
        Debug.Log("Multiplication took: " + watch.Elapsed.TotalSeconds.ToString("0.0000") + " seconds");

        watch.Reset();
        watch.Start();
        //Division
        for (int i = 0; i < 1000000; i++)
        {
            float tmp = a / b;
        }
        watch.Stop();
        //Output
        Debug.Log("Division took: " + watch.Elapsed.TotalSeconds.ToString("0.0000") + " seconds");

Basically, I ran a million additions, subtractions, multiplications, and divisions on the two random numbers and measured how long each batch took to process. The test was repeated 5 times, and here are the averaged results:

  • Addition on average took 0.0004 seconds
  • Subtraction on average took 0.0003 seconds
  • Multiplication on average took 0.0003 seconds
  • Division on average took 0.0044 seconds

The results showed that addition, subtraction, and multiplication are similar in terms of performance, while division appears to be roughly an order of magnitude slower: about 11–15 times the time of the other operations (0.0044 s vs. 0.0003–0.0004 s).

That is not a small difference, which leads to the conclusion that it's generally better to use multiplication instead of division whenever possible. For example, when you need to divide a number by 2, multiply it by 0.5 instead, as shown in the sketch below.
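
As a small illustration (the variable names here are made up for the example), the swap looks like this:

        float width = 1920f;

        //Division
        float half = width / 2f;

        //Same result via multiplication by the reciprocal, usually cheaper
        float halfFaster = width * 0.5f;

        //The same trick works for any constant divisor: precompute 1/x once
        const float inv3 = 1f / 3f;
        float third = width * inv3;

One caveat: for divisors that are not powers of two, multiplying by a precomputed reciprocal can differ from true division in the last bit due to floating-point rounding, so treat it as a performance trick rather than a bit-exact replacement.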
