Tags: macos, applescript, automator, autosave, flashair

Apple Automator & AppleScript: saving pictures from a web page if they do not exist locally


I am currently trying to automate a task as an app, but I am having difficulty with the "save URLs" step and need a custom AppleScript to replace it.

To give you the context of my project: I bought a Toshiba FlashAir SD WiFi card for WiFi tethered picture shooting. I want to download all the files in real time from my camera (with the SD WiFi card installed in it) to my Mac. The Toshiba FlashAir runs its own network that you connect to, and you can browse the SD card's contents through a web browser (no FTP, no WebDAV... only an HTTP connection). Using the FlashAir API, I prepared an HTML/jQuery page and uploaded it to the SD card, so that when you hit http://flashair you get a page with all the image links, refreshed every second. It is nearly real-time and displays newly shot images in the browser.

I want to download those pictures to my computer so that iPhoto or any other photo app can "watch" the directory, as in tethered mode, and process them on the fly if I need to.

On the computer side, with Automator:

  1. I "get the specific URL" > http : // flashair ( to reach the SD card micro server).
  2. Then "get all the image URLs" from this specific URL and related ones
  3. Then Download ONLY the pictures that I don't already have on my computer (this is where I lack applescript knowledge)
  4. Then Loop for 240minutes... (to observe the remote page and download new files ONLY again and again.)

So everything works perfectly, but when I launch the Automator workflow, the same pictures are saved again and again with the filename suffixes -01, -02, ... . That means I get as many duplicates of each picture as there are loop iterations.

The Automator action "Download URLs" does not let me specify that only new or modified files should be downloaded from remote to local.

Can someone help me with this AppleScript step? I want to replace the Automator "Download URLs" action with a custom AppleScript step that checks whether each file already exists and, if not, downloads it on every loop iteration.

Thanks a lot for your precious answers; everything is flawless except for this duplicated-files issue, and I am stuck on it.

Damien


Solution

  • Here is the code you are looking for:

    on run {input, parameters}
        # Destination folder; must match where the Download URLs action saves to
        set dLocation to POSIX path of (path to downloads folder) & "test/"

        set fileList to {}
        set AppleScript's text item delimiters to {"/"}
        repeat with i from 1 to (count of input)
            # With the delimiter set to "/", the last text item is the filename!
            set urlStr to (get item i of input) as text
            set urlFile to last text item of urlStr

            set savedFile to (dLocation & urlFile) as POSIX file

            log "Saved file is: " & savedFile

            try
                # Coercing to alias succeeds only if the file already exists
                savedFile as alias
                log "File exists: " & savedFile
            on error errMsg
                set end of fileList to urlStr
                log "Adding URL: " & urlStr
            end try
            #       delay 5
        end repeat
        set AppleScript's text item delimiters to {""}
        return fileList
    end run
    

    All the above does is grab the filename at the end of each link, check whether that file already exists locally, and if not, add the link to the list of links to download. It then passes that list on.
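
    The existence check relies on a common AppleScript idiom: coercing a file reference to an alias throws an error when the target does not exist. A minimal standalone demonstration of the idiom (the path is just an example):

    try
        (POSIX file "/tmp/example.jpg") as alias -- succeeds only if the file exists
        log "File is already there"
    on error
        log "File is missing, so it is safe to download"
    end try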

    After the AppleScript action, add a Download URLs action and you should be done. Make sure it saves into the same ~/Downloads/test folder that the script checks; otherwise the existence test will never match.

    EDIT:

    So the Workflow will be like this:

    1. Define Site URL
    2. Action: Get URL Links from Webpages
    3. Action: Run AppleScript (code above)
    4. Action: Download URLs
    5. Action: Pause (optional)
    6. Action: Loop

    Note that the Loop action limits you to 1000 minutes. So you have two choices: 1) chain several Loop actions, or 2) launch the workflow from a script with an infinite loop, as sketched below.
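
    As a rough sketch of option 2, and assuming the workflow has been saved as a .workflow file (the path below is hypothetical), a plain AppleScript can relaunch it forever through the automator command-line tool:

    -- Minimal launcher sketch; adjust the workflow path to wherever you saved yours
    set wfPath to POSIX path of (path to home folder) & "Desktop/FlashAir.workflow"
    repeat -- infinite loop; stop the script to stop polling
        do shell script "/usr/bin/automator " & quoted form of wfPath
    end repeat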

    The other way is to write the whole thing as a single script and have it run continuously; a sketch of that follows as well.
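
    For completeness, here is a rough standalone sketch of that approach. It assumes the FlashAir page exposes plain links ending in .JPG; the grep pattern, the poll interval, and the test/ folder are all assumptions to adapt:

    -- Standalone polling sketch: fetch the page, extract .JPG links, and
    -- download only the files not already present in ~/Downloads/test/
    set dLocation to POSIX path of (path to downloads folder) & "test/"
    do shell script "mkdir -p " & quoted form of dLocation
    repeat -- poll indefinitely; stop the script to end the session
        try
            -- Extract image URLs from the page; pattern and extension are assumptions
            set pageText to do shell script "curl -s http://flashair/ | grep -oE 'http[^\"]+\\.JPG' | sort -u"
            set AppleScript's text item delimiters to {"/"}
            repeat with aLink in paragraphs of pageText
                set fName to last text item of (aLink as text)
                try
                    ((dLocation & fName) as POSIX file) as alias -- already downloaded
                on error
                    do shell script "curl -s -o " & quoted form of (dLocation & fName) & " " & quoted form of (aLink as text)
                end try
            end repeat
            set AppleScript's text item delimiters to {""}
        end try -- grep exits nonzero when no links are found; just wait and retry
        delay 5 -- seconds between polls
    end repeat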

    EDIT2:

    Oops, I just reread your question: you asked for 240 minutes, so a single Loop action will work just fine.