I'm using Python 2.7 and a service account to copy a file in Google Drive into another folder based on its title. This requires me to execute five (5!) commands (sketched after the list):
1. files().list
2. files().list
3. files().copy
4. files().insert
5. parents().insert
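To make that concrete, here's roughly what the sequence looks like with the v2 Python client. SOURCE_TITLE, FOLDER_TITLE, and newtitle are placeholders, and the role I assign to files().insert (creating the folder when it's missing) is my guess, so treat this as a sketch rather than my actual code:

# Rough sketch of the five calls against the Drive v2 API (Python 2.7 era).
FOLDER_MIME = 'application/vnd.google-apps.folder'

src = service.files().list(                                   # 1. files().list
    q="title = '%s'" % SOURCE_TITLE).execute()['items'][0]
folder_hits = service.files().list(                           # 2. files().list
    q="title = '%s' and mimeType = '%s'" % (FOLDER_TITLE, FOLDER_MIME)
).execute()['items']
copy = service.files().copy(                                  # 3. files().copy
    fileId=src['id'], body={'title': newtitle}).execute()
if folder_hits:
    folder = folder_hits[0]
else:
    folder = service.files().insert(                          # 4. files().insert
        body={'title': FOLDER_TITLE, 'mimeType': FOLDER_MIME}).execute()
service.parents().insert(                                     # 5. parents().insert
    fileId=copy['id'], body={'id': folder['id']}).execute()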
This all works, but I'd like to reduce the number of calls. The first step is caching the IDs so I don't have to call files().list at all. The next thing I'm trying to do, and specifically where I'm stuck with this question, is setting the parent folder within the files().copy command. The documentation describes an optional parents parameter like so:
Collection of parent folders which contain this file. Setting this field will put the file in all of the provided folders. On insert, if no folders are provided, the file will be placed in the default root folder.
Although it doesn't say so, I know it means parent IDs, since that's what is used everywhere else. However, setting this array in the client library has no effect: no error is raised, and the file definitely doesn't end up in the right folder.
newfile = {'title': newtitle, 'parents': [parentFolderId]}  # silently ignored
service.files().copy(fileId=originalId, body=newfile).execute()
Has anyone had any luck with this? Is there something else I'm missing?
Bonus: is there a way to transfer ownership in the copy command? Perhaps my service account could impersonate me?
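If impersonation is the way to go, I imagine it would look something like this with oauth2client, assuming the domain admin has granted the service account domain-wide delegation for the Drive scope (the key file name and email address are placeholders, and I haven't tested this):

import httplib2
from apiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials

credentials = ServiceAccountCredentials.from_json_keyfile_name(
    'service-account-key.json',
    scopes=['https://www.googleapis.com/auth/drive'])
# Impersonate a regular account; only works with domain-wide delegation.
delegated = credentials.create_delegated('me@example.com')
service = build('drive', 'v2', http=delegated.authorize(httplib2.Http()))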
Ah ha! The parents array is a list of objects with the id field:
newfile = {'title': newtitle, 'parents': [{'id': parentFolderId}]}
service.files().copy(fileId=originalId, body=newfile).execute()
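(Side note for anyone on the newer v3 API: there, title becomes name and parents is a plain list of ID strings rather than objects, which makes this an easy thing to get wrong.)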
I'll update this if/when I figure out how to set permissions also.
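In the meantime, my working theory for the ownership part is a separate permissions() call after the copy: grant the target user access, then promote them with transferOwnership. The email address is a placeholder, and keep in mind ownership transfer is generally only allowed within the same domain:

# Sketch: grant access first, then promote the grantee to owner.
perm = service.permissions().insert(
    fileId=copy['id'],
    body={'role': 'writer', 'type': 'user', 'value': 'me@example.com'}
).execute()
service.permissions().update(
    fileId=copy['id'], permissionId=perm['id'],
    body={'role': 'owner'}, transferOwnership=True
).execute()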
One strange note: the file is still being copied to the Drive root as well as the parent folder(s) I specify.
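If that extra root parent is a problem, one workaround should be to detach it after the copy with parents().delete. I'm assuming the 'root' alias is accepted here; if not, the real root folder ID is available as rootFolderId from about().get():

# Detach the copy from My Drive's root, leaving only the target folder.
service.parents().delete(fileId=copy['id'], parentId='root').execute()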