What is the most efficient way to find the minimum number of bits needed to represent a natural number (i.e., unsigned, no sign bit) in JavaScript? Is there a way to do it without using loops?
The code below, for example, works for every integer between 0 and 2^30 - 1 (it goes into an infinite loop for anything greater than that):
var x = 9001;
var bits = 1;
while ( x >= (1 << bits) )
    bits++;
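For instance, checking right at that boundary with the same snippet, just a different value (the results in the comments are ones I worked out by hand):
var x = Math.pow(2, 30) - 1;  // 1073741823, the last value for which the loop still terminates
var bits = 1;
while ( x >= (1 << bits) )
    bits++;
console.log(bits); // 30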
And this one works up to 2^53 (the largest integer guaranteed to be representable without loss of precision) and somewhat beyond:
var x = 9001;
var bits = 1;
var limite = 2;
while ( x >= limite ) {
    bits++;
    limite *= 2;
}
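The same kind of check at the precision boundary, again just the snippet above with a different value:
var x = Math.pow(2, 53) - 1;  // 9007199254740991, the largest integer that is still exact
var bits = 1;
var limite = 2;
while ( x >= limite ) {
    bits++;
    limite *= 2;
}
console.log(bits); // 53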
But both use loops, essentially asking: does it fit in 1 bit? does it fit in 2 bits? does it fit in 3 bits? and so on. I was curious to know whether there is a better way to do this.
Note: I'm only interested in knowing how many bits are needed, not in actually producing that representation. Especially since JavaScript doesn't have int, long, unsigned int, etc., but uses double for everything... (and when it does use integers internally, it doesn't expose that to the programmer).
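For reference, the closest I got to something without an explicit loop is the sketch below. It's just an idea, not something I'm presenting as the right answer: bitLength is a name I made up, Math.clz32 only exists from ES2015 onward, and toString(2) presumably still loops internally, there's just no loop in my own code.
function bitLength(x) {
    // For 0 < x < 2^32, Math.clz32 returns the number of leading zero bits
    // in the 32-bit representation, so 32 minus that is the bit length.
    if (x < 4294967296)             // 4294967296 = 2^32
        return 32 - Math.clz32(x);  // gives 0 for x = 0
    // Between 2^32 and 2^53, the length of the binary string gives the same count.
    return x.toString(2).length;
}
console.log(bitLength(9001)); // 14, matching the loops above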