I am writing the following sizeof macro, and I want to know the difference between these two versions:
#define my_sizeof(type) (char*)(&type+1)-(char*)(&type)
#define my_sizeof(type) (void*)(&type+1)-(void*)(&type)
My first question: why is the cast required? I know that if I don't cast, the macro always returns 1 (I checked this by running it). I want to know the significance of the cast, i.e. what it tells the compiler to do.
Secondly, what difference does it make whether I cast to char* or to void*?
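For reference, a minimal test of this behaviour (a sketch only; the macro names and the variable are illustrative, and the second number printed depends on sizeof(int) on the platform):

    #include <stdio.h>

    /* same idea as above, with and without the char* cast */
    #define my_sizeof_nocast(x) (&(x) + 1 - &(x))
    #define my_sizeof_char(x)   ((char *)(&(x) + 1) - (char *)&(x))

    int main(void)
    {
        int n;
        printf("%td\n", my_sizeof_nocast(n)); /* always prints 1 */
        printf("%td\n", my_sizeof_char(n));   /* prints sizeof(int), e.g. 4 */
        return 0;
    }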
For the second question: You cannot do that at all, since there is no pointer arithmetic for void pointers (or for pointers to incomplete types in general, for that matter).
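To illustrate (a sketch; the exact diagnostic is compiler-specific, and GCC in its default mode accepts void* arithmetic as an extension, treating the pointed-to size as 1):

    /* A constraint violation in standard C: pointer arithmetic is only
     * defined for pointers to complete object types, and void is incomplete.
     * gcc -std=c11 -pedantic-errors should reject any use of this macro. */
    #define my_sizeof_void(x) ((void *)(&(x) + 1) - (void *)&(x))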
For the first part: by definition sizeof(char) == 1, so by casting the pointers to char pointers you obtain the difference in units of 1 (bytes) rather than in units of sizeof(type); in other words, you obtain precisely the value of sizeof(type). That is also why the uncast version always returns 1: subtracting two pointers to the same type yields the distance measured in objects of that type, and &type + 1 points exactly one object past &type.
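A small self-check of that claim (a sketch; the struct is arbitrary, and the exact number printed depends on the implementation's layout and padding):

    #include <stdio.h>

    #define my_sizeof(x) ((char *)(&(x) + 1) - (char *)&(x))

    struct point { double x, y; int tag; };

    int main(void)
    {
        struct point p;
        /* the pointer difference in char units equals sizeof(struct point),
           padding included */
        printf("%td %zu\n", my_sizeof(p), sizeof p);
        return 0;
    }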