I have the weirdest exercise I've ever seen: I have to read a year from the console and check whether it is a leap year.
I am only allowed to use + - / * % as arithmetic operators; no other operators or functions are permitted.
Here is what I have so far:
#include <stdio.h>
#include <stdbool.h>

int main(void) {
    int year = 0;
    bool b = false;
    printf("Type in a year: ");
    /* Read four digits, most significant first. */
    int helpVar = 1000;
    for (int i = 0; i < 4; i++) {
        year += (getchar() - '0') * helpVar;
        helpVar = helpVar / 10;
    }
    /* Meant to yield 1 exactly when the year is a leap year: */
    b = (((year % 4) + (year % 100) + (year % 400)) + 1) % 2;
    return 0;
}
So I don't understand what I am doing wrong here. It mostly works; the only case that's freaking me out is the year 1900. It shouldn't be a leap year, but according to my code it is.
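Tracing the expression by hand for 1900: 1900 % 4 is 0, 1900 % 100 is 0, and 1900 % 400 is 300, so b = (0 + 0 + 300 + 1) % 2 = 301 % 2 = 1, i.e. true.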
What am I missing here?
Here is one possibility (perhaps not the shortest -- obviously only works for the Gregorian calendar):
b = (((year-1)%4)+1)/4 - (((year-1)%100)+1)/100 + (((year-1)%400)+1)/400;
The idea is that ((year - 1) % n) + 1 equals n only if year is a multiple of n (for positive year), and is smaller than n otherwise. Thus, if you divide that expression by n, you get 1 if and only if year % n == 0.
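For example, for year = 1900 and n = 100, you get ((1899 % 100) + 1) / 100 = 100 / 100 = 1 (1900 is a multiple of 100), while for n = 400 you get ((1899 % 400) + 1) / 400 = 300 / 400 = 0 (1900 is not a multiple of 400).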
Since year % 100 == 0 cannot be true if year % 4 == 0 is not, you can subtract the second term from the first, and then add the year % 400 == 0 term back at the end.
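Here is a minimal, self-contained test of the formula (my own sketch; the hardcoded test years and the harness are not part of the exercise, which reads the year with getchar):

#include <stdio.h>

int main(void) {
    /* Known answers: 1900 and 2100 are not leap years; 1996 and 2000 are. */
    int tests[] = { 1900, 1996, 2000, 2023, 2100 };
    for (int i = 0; i < 5; i++) {
        int year = tests[i];
        /* b is 1 iff year is a leap year; uses only + - * / %. */
        int b = (((year - 1) % 4) + 1) / 4
              - (((year - 1) % 100) + 1) / 100
              + (((year - 1) % 400) + 1) / 400;
        printf("%d -> %d\n", year, b);
    }
    return 0;
}

This prints 1900 -> 0, 1996 -> 1, 2000 -> 1, 2023 -> 0 and 2100 -> 0, matching the Gregorian rules.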