I am working on this challenge:
Write a function sort in C that takes one integer (int) as argument. It should return the integer with its digits rearranged in descending order. The algorithm must use recursion.
Example:
Input: 26354
Expected output: 65432
This is what I tried:
int sort(int n){
    if(n%10 == 0)
        return n;
    if(n%10 <= sort(n/10)%10)
        return n;
    int d1=n%10*10;
    int d2=n/10%10;
    return sort(n-(n%100)+d1+d2);
}
But when I run it, I get a "segmentation fault", which I guess means it's calling itself too many times or something like that.
What is my mistake, and how can I fix it?
The main issue is that after the comparison with the outcome of the recursive sort call, you throw away that result from the recursion: with the second return n; you just return the original n, not taking into account the work that was done to sort all but the last digit.
With sort(n-(n%100)+d1+d2) you swap the rightmost two digits of the original n, but that can lead to infinite recursion, overflowing the call stack (which is what the segmentation fault tells you). You need to perform this swap on the sorted prefix that you got back from the recursion.
Here is the corrected code:
int sort(int n) {
    if (n/10 == 0)
        return n;
    int sorted = sort(n/10);  // Don't lose the result from the recursive call
    n = sorted * 10 + n%10;   // Apply the result of the recursive call to n
    if (n%10 <= sorted%10)
        return n;
    int d1 = n%10*10;
    int d2 = n/10%10;
    return sort(n-(n%100)+d1+d2);
}
Note that the recursive call in the final statement could just as well leave the digit d2 out and append it after the call returns, since d2 is certainly the least of all the digits involved. Applying that observation and rearranging the code a bit leads to this version:
int sort(int n) {
    if (n < 10)
        return n;
    int d1 = n % 10;   // Detach the last digit
    n = sort(n / 10);  // Sort the remaining digits
    int d2 = n % 10;   // Least digit of the sorted prefix
    return d1 <= d2 ? n * 10 + d1 : sort(n - d2 + d1) * 10 + d2;
}