aTrack is an ITReference* object; value is an NSImage* initialized from a URL to a JPEG.
[[[[[aTrack artworks] data_] set] to:value] send];
I get the following message in GDB:
2010-03-09 16:59:42.860 Sandbox[2260:a0f] Can't pack object of class NSImage (unsupported type): <NSImage 0x10054a440 Size={0, 0} Reps=(
I then tried the following code:
NSData *imageData = [[NSData alloc] initWithData:[value TIFFRepresentation]];
[[[[[aTrack artworks] data_] set] to:imageData] send];
and get this message instead
2010-03-09 16:46:09.341 Sandbox[2193:a0f] Can't pack object of class NSConcreteData (unsupported type): <4d4d002a 00000000>
The appscript documentation says that the "data" property of the "artwork" item is of type PICTPicture.
How do I convert an NSImage to a PICT? Am I using appscript all wrong?
Appscript doesn't bridge NSImage, partly because NSImage lives in AppKit while appscript links against Foundation only, and partly because this area of Apple event usage is poorly specified. In theory, image-related AEDescs should contain a standard block of bitmap data, but ISTR various hassles with PICT headers when dealing with iTunes artwork. It's messy.
NSData isn't bridged because packing data into an AEDesc without a meaningful descriptor type is largely pointless. If you want to pack an NSData instance, use +[NSAppleEventDescriptor descriptorWithDescriptorType:data:].
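For example, you could wrap the TIFF bytes in a typed descriptor yourself. This is only a sketch: typeTIFF ('TIFF') is a standard descriptor type constant from AERegistry.h, but whether this particular iTunes property accepts TIFF data (rather than the PICT its dictionary advertises) is an assumption you'd need to test.

```objc
#import <Foundation/Foundation.h>
#import <Carbon/Carbon.h> // declares typeTIFF and other AE type constants

// Give the raw image bytes an explicit descriptor type so there is
// something meaningful to pack into the Apple event.
NSData *tiffData = [value TIFFRepresentation];
NSAppleEventDescriptor *desc =
    [NSAppleEventDescriptor descriptorWithDescriptorType:typeTIFF
                                                    data:tiffData];
// Same set command as before, but with a typed descriptor as the value.
[[[[[aTrack artworks] data_] set] to:desc] send];
```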
Also take a look at the artwork class's "raw data" property. That's a later addition and may contain image data in a more sensible form.
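Reading it might look something like this sketch. The accessor name (rawData_) is an assumption based on how appscript typically maps multi-word property names; check the glue generated for your copy of iTunes before relying on it.

```objc
// Hypothetical sketch: fetch the first artwork's raw image bytes.
// rawData_ is the assumed appscript accessor for the "raw data" property.
id result = [[[[[aTrack artworks] at:1] rawData_] get] send];
// If the get succeeds, result should hold the artwork bytes, which you
// could then hand to -[NSImage initWithData:] to inspect the format.
```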
Oh, and take a look at the EyeTunes framework; it's not as flexible or efficient as using AppleScript/appscript, but I believe it includes code for bridging NSImage to AEDescs.