I've heard all the arguments in favour of using a CDN like Google's to host JavaScript libraries such as jQuery and Prototype for my web application: it's faster, it saves bandwidth, it lets scripts load in parallel, and so on. But I recently came across the following comment in Douglas Crockford's json2.js script:
USE YOUR OWN COPY. IT IS EXTREMELY UNWISE TO LOAD CODE FROM SERVERS YOU DO NOT CONTROL.
I'm curious what the argument behind this assertion might be, and whether it's specifically targeted at users of public CDNs like Google's or at something else.
Assuming he's talking about professionally hosted CDNs like Google's, the best bet is to do this:
<!-- Grab Google CDN's jQuery, with a protocol relative URL; fall back to local if necessary -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.5.1/jquery.js"></script>
<script>window.jQuery || document.write("<script src='js/libs/jquery-1.5.1.min.js'>\x3C/script>")</script>
(taken from http://html5boilerplate.com/)
That way, you get all the benefits, without the risk of your website breaking if Google's CDN goes down.
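The same trick works for any library the CDN hosts, as long as the library exposes a global you can test for. For example, here's a sketch of the equivalent fallback for Prototype (the CDN version path and the local filename are assumptions; check Google's library listing and your own directory layout for the version you actually use):

<!-- Grab Google CDN's Prototype; fall back to a local copy if the CDN copy didn't load -->
<script src="//ajax.googleapis.com/ajax/libs/prototype/1.7.0.0/prototype.js"></script>
<!-- Prototype defines a global "Prototype" object, so test for that instead of window.jQuery -->
<script>window.Prototype || document.write("<script src='js/libs/prototype-1.7.0.0.js'>\x3C/script>")</script>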
But he said:
USE YOUR OWN COPY. IT IS EXTREMELY UNWISE TO LOAD CODE FROM SERVERS YOU DO NOT CONTROL.
I don't actually think he's talking about CDNs. I think he's just saying "don't hotlink scripts from random websites".
You wouldn't want to do that, because the site might move the script to a different URL or change its contents entirely. A reputable CDN, by contrast, serves versioned URLs and doesn't change the files behind them.
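If you do end up loading a script from a server you don't control and you're worried about the file silently changing, one option in current browsers is Subresource Integrity: you put a hash of the expected file in the script tag, and the browser refuses to execute the script if its contents no longer match. A minimal sketch (the integrity value below is a placeholder, not the real digest of this file; generate it from the exact copy you expect to receive):

<!-- The integrity value is a placeholder; compute the real sha384 digest of the exact file you expect -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.5.1/jquery.min.js"
        integrity="sha384-REPLACE_WITH_REAL_DIGEST"
        crossorigin="anonymous"></script>

Note that this only works when the host serves the file with CORS headers, which the major public CDNs generally do; a random website you're hotlinking from probably doesn't.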