I wrote a demo to get 2G of memory, but I ran into a problem I don't understand. What is the difference between num1, num2, and num3?
#include <stdio.h>

int main(void) {
    unsigned long num1 = 1024*1024*1024*2;
    unsigned long num2 = 1024*1024*1024*2.0;
    unsigned long num3 = 1024*1024*1024;
    num3 *= 2;
    printf("num1:%lu\n num2:%lu\n num3:%lu\n", num1, num2, num3);
    return 0;
}
output:
num1:18446744071562067968
num2:2147483648
num3:2147483648
Reposted from: https://stackoverflow.com/questions/53059040/why-1024102410242-1024102410242
The first line, unsigned long num1 = 1024*1024*1024*2;, initializes an unsigned long (num1) with an int. The computation 1024*1024*1024*2 has type int because all the operands are int literals, and it overflows: the mathematical result is 2^31, which exceeds INT_MAX (2^31 - 1) for a 32-bit int. Signed overflow is undefined behavior; on typical two's-complement machines the product wraps to -2147483648, and converting that negative value to a 64-bit unsigned long yields 2^64 - 2^31 = 18446744071562067968. Use the UL suffix on at least one of the literals so the arithmetic is done in unsigned long:
unsigned long num1 = 1024*1024*1024*2UL;
As for why the other two methods work: the second line initializes num2 with a double, because 2.0 is a double literal; the whole product is therefore computed in double, which can represent 2^31 exactly, so it converts back to unsigned long with the correct value. The third line computes 1024*1024*1024, which is representable in a 32-bit int (it is 2^30, less than 2^31 - 1); the subsequent num3 *= 2 is then performed on an unsigned long, so it works as expected.