The problem is as follows.
I would like to detect files (mostly images) on my website that could be optimized. One way to do it is Google PageSpeed, but that only gives results for the exact page I enter (like bbc.com). I would like to get optimization suggestions for all of the subpages (like bbc.com/xyz or bbc.com/sdasdwe) at once, in one list. My site has Google Analytics code on it, if that helps.
Is it possible?
I can think of two approaches:
Web server log files: make sure you're logging requests for image files (that data is often thrown away) and write a script to pull out every unique image, optionally ordered by request frequency. This best satisfies the goal of working on the most-used images first; see the log-parsing sketch after this list.
Use or write a web crawler - fairly easy. Restrict it to your own domain, and maintain a list of visited pages so you don't revisit a resource you've already crawled. Collect a unique list of all images you've seen, and optionally count the number of distinct pages on which each image appears. This best answers the goal of working on the most frequently referenced images; a crawler sketch follows further down.
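If you go the log file route, a minimal sketch of the parsing script might look like this. It assumes a Node/TypeScript setup, an Apache/Nginx-style combined log named access.log, and a handful of image extensions; all of those are placeholders to adjust for your own server.

```typescript
// count-image-hits.ts - tally image requests in an access log
// (assumes the common Apache/Nginx "combined" format; adjust the regex otherwise).
import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

const IMAGE_RE = /"(?:GET|HEAD)\s+(\S+\.(?:png|jpe?g|gif|webp|svg))(?:\?\S*)?\s/i;

async function tallyImages(logPath: string): Promise<Map<string, number>> {
  const counts = new Map<string, number>();
  const rl = createInterface({ input: createReadStream(logPath) });
  for await (const line of rl) {
    const match = IMAGE_RE.exec(line);
    if (match) {
      const path = match[1];
      counts.set(path, (counts.get(path) ?? 0) + 1);
    }
  }
  return counts;
}

// Print image paths, most requested first.
tallyImages("access.log").then((counts) => {
  [...counts.entries()]
    .sort((a, b) => b[1] - a[1])
    .forEach(([path, hits]) => console.log(`${hits}\t${path}`));
});
```

Run it over a log (for example with ts-node, or compile it with tsc) and you get a hit count per image, highest first - a ready-made priority list.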
I don't know of any tools that do these specific jobs. I can write code, and it's usually quicker and easier to write code like this than to find a tool that does exactly what you want. If you can't code, I'd start with Screaming Frog; you can probably make it collect image file references.
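If you do write your own crawler, a rough sketch of its shape might be the following (TypeScript on Node 18+, so fetch is available globally). The start URL is hypothetical, it pulls out img and a tags with regexes rather than a proper HTML parser, and it skips robots.txt, rate limiting and the error handling you'd want for real use.

```typescript
// crawl-images.ts - same-domain crawl that records how many distinct pages
// reference each image. A rough sketch, not production code.
const START = new URL("https://www.example.com/"); // hypothetical start URL

const visited = new Set<string>();                 // pages already fetched
const imagePages = new Map<string, Set<string>>(); // image URL -> pages using it

async function crawl(pageUrl: URL): Promise<void> {
  if (visited.has(pageUrl.href) || pageUrl.hostname !== START.hostname) return;
  visited.add(pageUrl.href);

  const res = await fetch(pageUrl.href);
  if (!res.ok || !(res.headers.get("content-type") ?? "").includes("text/html")) return;
  const html = await res.text();

  // Record every <img src="..."> seen on this page.
  for (const [, src] of html.matchAll(/<img[^>]+src=["']([^"']+)["']/gi)) {
    const imgUrl = new URL(src, pageUrl).href;
    if (!imagePages.has(imgUrl)) imagePages.set(imgUrl, new Set());
    imagePages.get(imgUrl)!.add(pageUrl.href);
  }

  // Follow same-domain links.
  for (const [, href] of html.matchAll(/<a[^>]+href=["']([^"']+)["']/gi)) {
    try {
      await crawl(new URL(href, pageUrl));
    } catch {
      // skip malformed URLs and failed fetches
    }
  }
}

crawl(START).then(() => {
  // Images referenced from the most pages come out on top.
  [...imagePages.entries()]
    .sort((a, b) => b[1].size - a[1].size)
    .forEach(([img, pages]) => console.log(`${pages.size}\t${img}`));
});
```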
Google Analytics is not, IMO, a good place to start, except to generate a list of pages ordered by popularity or load time - tempting targets to optimise first. GA doesn't specifically track image usage; you'd have to extend the page tracking to do that, with a little piece of JavaScript that identifies each "img" tag and fires a pageview for each one. Personally, I'd send that to a separate Google Analytics account: it will make your standard analytics look really strange - pages per session will explode, bounce rates will become meaningless, funnels won't work well, etc.
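For completeness, that little piece of JavaScript could look something like the sketch below (written as TypeScript), assuming the classic analytics.js snippet is already loaded on the page. The property ID and the tracker name "imgTracker" are placeholders; sending the hits through a named secondary tracker tied to a separate property is how you'd keep them out of your main reports.

```typescript
// Sketch: send one pageview per <img> on the page to a SEPARATE tracker,
// so the image hits don't pollute the main property.
// Assumes analytics.js is already loaded; "UA-XXXXXXX-Y" is a placeholder ID.
declare function ga(...args: unknown[]): void;

ga("create", "UA-XXXXXXX-Y", "auto", "imgTracker");

document.querySelectorAll("img").forEach((img) => {
  const src = img.getAttribute("src");
  if (src) {
    // Record the image path as a pageview on the secondary tracker.
    ga("imgTracker.send", "pageview", new URL(src, location.href).pathname);
  }
});
```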