I need to read from a file a binary number that is stored as a C short.
The number represents the size of an array, so after reading it I want to use it as a decimal value.
I tried to convert the binary number to decimal, but I get the wrong value.
For example: if the bits in the file are 0000000000000101, the size in decimal should be 5, but when I use the "classic" binary-to-decimal conversion I get 17.
The "classic" function:
#include <stdio.h>

int main(void)
{
    int num, binary_val, decimal_val = 0, base = 1, rem;

    printf("Enter a binary number (1s and 0s)\n");
    scanf("%d", &num);
    binary_val = num;

    /* Peel off decimal digits, treating each one as a binary digit */
    while (num > 0)
    {
        rem = num % 10;
        decimal_val = decimal_val + rem * base;
        num = num / 10;
        base = base * 2;
    }

    printf("The binary number is = %d\n", binary_val);
    printf("Its decimal equivalent is = %d\n", decimal_val);
    return 0;
}
When I enter just the last 3 digits (101) I get the value 5, but when I type the full number (0000000000000101) I get 17.
Any ideas how to solve this? What am I missing here?
Thanks
This example works as you wish. Note that the decimal representation of a 16-digit binary string can overflow a 32-bit int (INT_MAX has only 10 digits), so the value is read into a long long instead:
#include <stdio.h>

int convert(long long n);

int main(void) {
    long long n;
    printf("Enter a binary number: ");
    scanf("%lld", &n);
    printf("%lld in binary = %d in decimal\n", n, convert(n));
    return 0;
}

/* Treat each decimal digit of n as one binary digit. */
int convert(long long n) {
    int dec = 0, base = 1, rem;
    while (n != 0) {
        rem = n % 10;       /* next binary digit, lowest first */
        n /= 10;
        dec += rem * base;  /* weight of this bit position */
        base *= 2;          /* pure integer math avoids pow() rounding issues */
    }
    return dec;
}