Name: Anonymous 2006-09-25 16:25
In a lot of videogame code, whenever many numbers need to be divided by the same number, I usually see this:
inv = 1.0 / divisor;
...and then all the numbers * by inv
rather than all the numbers being / by divisor.
I'm guessing that, at the CPU level, division is slower and less efficient than multiplication? I can't really think of any other reason why someone would do this.