I want to globally intercept certain $http error scenarios, preventing controllers from handling the errors themselves. I think an HTTP interceptor is what I need, but I'm not sure how to stop my controllers from also handling the error.
I have a controller like this:
function HomeController($location, $http) {
  activate();

  function activate() {
    $http.get('non-existent-location')
      .then(function activateOk(response) {
        alert('Everything is ok');
      })
      .catch(function activateError(error) {
        alert('An error happened');
      });
  }
}
And an HTTP interceptor like this:
function HttpInterceptor($q, $location) {
  var service = {
    responseError: responseError
  };

  return service;

  function responseError(rejection) {
    if (rejection.status === 404) {
      $location.path('/error');
    }
    return $q.reject(rejection);
  }
}
This works, in as much as the browser redirects to the '/error' path. But the promise catch in HomeController is also executing, and I don't want that.

I know I could code HomeController such that it ignores a 404 error, but that's not maintainable. Say I modify HttpInterceptor to also handle 500 errors; I'd then have to modify HomeController again (as well as any other controllers that might have since been added that use $http). Is there a more elegant solution?
Option 1: break the promise chain

A small change in the HttpInterceptor can serve to break/cancel the promise chain, meaning that neither activateOk nor activateError on the controller will be executed.
function HttpInterceptor($q, $location) {
  var service = {
    responseError: responseError
  };

  return service;

  function responseError(rejection) {
    if (rejection.status === 404) {
      $location.path('/error');
      return $q(function () { return null; });
    }
    return $q.reject(rejection);
  }
}
The line return $q(function () { return null; }); "cancels" the promise: it returns a new promise whose constructor function never calls resolve or reject, so the promise never settles and no handlers further down the chain are ever invoked.
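The same technique can be sketched with native Promises in place of $q (a simplified stand-in, not the Angular implementation): returning a promise that never settles from a rejection handler means no downstream .then/.catch callbacks ever run.

```javascript
let downstreamRan = false;

Promise.reject({ status: 404 })
  .catch(function (rejection) {
    // Like the interceptor: swallow the rejection by returning a promise
    // whose executor never calls resolve or reject.
    return new Promise(function () { /* never settles */ });
  })
  .then(function () { downstreamRan = true; })
  .catch(function () { downstreamRan = true; });

// Give the microtask queue time to drain before checking.
setTimeout(function () {
  console.log('downstream handlers ran:', downstreamRan); // false
}, 10);
```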
Whether this is "ok" is a topic of debate. Kyle Simpson in "You don't know JS" states:
Many Promise abstraction libraries provide facilities to cancel Promises, but this is a terrible idea! Many developers wish Promises had natively been designed with external cancelation capability, but the problem is that it would let one consumer/observer of a Promise affect some other consumer's ability to observe that same Promise. This violates the future-value's trustability (external immutability), but moreover is the embodiment of the "action at a distance" anti-pattern...
Good? Bad? As I say, it's a topic of debate. I like the fact that it requires no change to any existing $http consumers.
Kyle's quite right when he says:
Many Promise abstraction libraries provide facilities to cancel Promises...
The Bluebird promise library for example has support for cancellation. From the documentation:
The new cancellation has "don't care" semantics while the old cancellation had abort semantics. Cancelling a promise simply means that its handler callbacks will not be called.
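Bluebird itself isn't shown here, but those "don't care" semantics can be sketched in plain JavaScript with a hypothetical cancellable wrapper (an illustration of the idea, not Bluebird's API): cancelling simply means the handler callbacks are never invoked.

```javascript
// Hypothetical wrapper: once cancelled, registered handlers are skipped.
function cancellable(promise) {
  let cancelled = false;
  return {
    cancel: function () { cancelled = true; },
    then: function (onFulfilled, onRejected) {
      promise.then(
        function (value) { if (!cancelled) onFulfilled(value); },
        function (err) { if (!cancelled && onRejected) onRejected(err); }
      );
    }
  };
}

const wrapped = cancellable(Promise.resolve('data'));
let handlerRan = false;

wrapped.then(function () { handlerRan = true; });
wrapped.cancel(); // cancel before the microtask runs

setTimeout(function () {
  console.log('handler ran:', handlerRan); // false
}, 10);
```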
Option 2: an HTTP wrapper

Promises are a relatively broad abstraction. From the Promises/A+ specification:

A promise represents the eventual result of an asynchronous operation.
The Angular $http service uses the $q implementation of promises to return a promise for the eventual result of an asynchronous HTTP request.
It's worth noting that $http has two deprecated functions, .success and .error, which decorate the returned promise. These functions were deprecated because they weren't chainable in the typical way promises are, and were deemed not to add much value as an "HTTP specific" set of functions.
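A rough sketch of that decoration, with a stubbed promise standing in for $http (the real signatures also passed headers and config, which are omitted here): .success registers a callback but returns the same promise, which is why it couldn't chain transformed values the way .then does.

```javascript
// Stand-in for $http's deprecated decoration (simplified illustration).
function decorate(promise) {
  promise.success = function (fn) {
    promise.then(function (response) { fn(response.data, response.status); });
    return promise; // the same promise, not a new chained one
  };
  promise.error = function (fn) {
    promise.then(null, function (rejection) { fn(rejection.data, rejection.status); });
    return promise;
  };
  return promise;
}

let observed = null;
const stub = decorate(Promise.resolve({ data: 'payload', status: 200 }));

stub.success(function (data, status) {
  observed = data + ':' + status;
  console.log('success callback got:', data, status);
});
```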
But that's not to say we can't make our own HTTP abstraction / wrapper that doesn't even expose the underlying promise used by $http. Like this:
function HttpWrapper($http, $location) {
  var service = {
    get: function (getUrl, successCallback, errorCallback) {
      $http.get(getUrl).then(function (response) {
        successCallback(response);
      }, function (rejection) {
        if (rejection.status === 404) {
          $location.path('/error');
        } else {
          errorCallback(rejection);
        }
      });
    }
  };

  return service;
}
Since this doesn't return a promise, its consumption needs to work a little differently too:
HttpWrapper.get('non-existent-location', getSuccess, getError);

function getSuccess(response) {
  alert('Everything is ok');
}

function getError(error) {
  alert('An error happened');
}
In the case of a 404, the location is changed to '/error', and neither the getSuccess nor the getError callback is executed.
This implementation means the ability to chain HTTP requests is no longer available. Is that an acceptable compromise? Results may vary...
Option 3: decorate the rejection

Credit to TJ for his comment:

if you need error handling in a particular controller, you will need conditions to check if an error has been handled in interceptor/service etc
The HTTP interceptor can decorate the promise rejection with a handled property to indicate whether it has handled the error.
function HttpInterceptor($q, $location) {
  var service = {
    responseError: responseError
  };

  return service;

  function responseError(rejection) {
    if (rejection.status === 404) {
      $location.path('/error');
      rejection.handled = true;
    }
    return $q.reject(rejection);
  }
}
Controller then looks like this:
$http.get('non-existent-location')
  .then(function activateOk(response) {
    alert('Everything is ok');
  })
  .catch(function activateError(error) {
    if (!error.handled) {
      alert('An error happened');
    }
  });
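The flag-checking flow can be sketched end to end with native Promises standing in for $q and the interceptor (a simplified illustration, not the Angular plumbing): the global handler marks the rejection as handled, and the consumer's catch stays quiet when it sees the flag.

```javascript
// Stand-in for the interceptor's responseError.
function globalErrorHandler(rejection) {
  if (rejection.status === 404) {
    rejection.handled = true; // e.g. after redirecting to /error
  }
  return Promise.reject(rejection);
}

let consumerAlerted = false;

// Stand-in for the controller's $http call.
Promise.reject({ status: 404 })
  .catch(globalErrorHandler)
  .catch(function (error) {
    if (!error.handled) {
      consumerAlerted = true; // would show the generic error message
    }
  });

setTimeout(function () {
  console.log('consumer alerted:', consumerAlerted); // false
}, 10);
```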
Unlike option 2, option 3 still leaves any $http consumer the option to chain promises, which is a positive in the sense that it's not eliminating functionality.

Both options 2 and 3 have less "action at a distance". In the case of option 2, the alternative abstraction makes it clear that things will behave differently than the usual $q implementation. And for option 3, the consumer still receives the promise to do with as it pleases.

All 3 options satisfy the maintainability criteria, as changes to the global error handler to handle more or fewer scenarios don't require changes to the consumers.