Currently, the Smartsheet system where I work is rather disorganized and buggy, and I have just recently inherited the task of upgrading it. A big issue I keep running into with the Smartsheet API is that automation rules are effectively unavailable through the API (ours are all complex automations, no simple ones). This makes sheet overhauls hard: whenever I add new rows, new columns, or new automations, I have to repeat the work on every sheet in our system (about 80 or so right now). Adding new automations individually is a tedious exercise in mental fortitude, and therefore not the move. The only way I have found to duplicate automations is effective but still time-consuming:
1. Make a template with all of the new improvements on the old sheet
2. Make a new sheet from the template for each old sheet that needs to be replaced
3. Download all of the attachments and comments associated with the old sheets and upload them to the new sheets
4. Delete the old sheets
This is the only way I've found to upgrade old sheets, but it's quickly becoming too large a job to do manually. I'm looking for an API-based solution that will run this process automatically. I can code it either with the Python SDK or with raw JSON requests, and I will probably have more freedom with raw requests. So far I've mostly been playing with the SDK, and only recently have I started making raw JSON calls to pull attachments in order to save them.
I know from my last question that I can't necessarily use a get sheet and then just update another sheet with the data pulled from the get request, but what about copy sheet? Other than the automation issue, the main problem I'm encountering now is successfully copying or downloading the attachments and uploading them to the corresponding row in the new sheet via code. That is where I started testing with list_all_attachments. This may all be wishful thinking, but that is why I am reaching out: is there really a way around it?
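For the attachment piece specifically, here is the round trip I'm picturing, as an untested sketch. The function names are mine, `client` is assumed to be an authenticated `smartsheet.Smartsheet` client, and `row_id_map` (old row id to new row id) is assumed to exist already; only `rows_with_attachments` is pure logic, the rest is API calls:

```python
import os

def rows_with_attachments(attachments, row_id_map):
    # Pure helper: attachments is a list of (attachment_id, old_row_id, name)
    # tuples; pair each with its new row id, skipping rows we aren't keeping.
    return [(att_id, row_id_map[old_row], name)
            for att_id, old_row, name in attachments
            if old_row in row_id_map]

def copy_row_attachments(client, source_sheet_id, target_sheet_id, row_id_map):
    import requests  # third-party; used to download from the temporary url
    listing = client.Attachments.list_all_attachments(
        source_sheet_id, include_all=True)
    for att in listing.data:
        if att.parent_type != 'ROW' or att.parent_id not in row_id_map:
            continue
        # the listing omits the download url, so fetch the full attachment
        full = client.Attachments.get_attachment(source_sheet_id, att.id)
        path = os.path.join('/tmp', full.name)
        with open(path, 'wb') as f:
            f.write(requests.get(full.url).content)
        with open(path, 'rb') as f:
            client.Attachments.attach_file_to_row(
                target_sheet_id, row_id_map[att.parent_id],
                (full.name, f, full.mime_type))
        os.remove(path)
```

The download-then-reupload step seems unavoidable, since the API gives you a temporary download url per attachment rather than a server-side copy.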
Here is the (non-functioning) code I have so far; it's not really doing much yet, just planning and testing different API calls:
import requests
import smartsheet

# headers must carry the API token, e.g. {'Authorization': 'Bearer <token>'}
coolurl = 'https://api.smartsheet.com/2.0/sheets/################/attachments'
resp = requests.get(coolurl, headers=headers)  # GET the sheet's attachment listing
resp_data = resp.json()['data']

# the same listing through the SDK
response1 = smartsheet_client.Attachments.list_all_attachments(
    ################,  # sheet id
    include_all=True
)

# making a new sheet from a template
# note: the Sheet model has no 'attachments' field, so passing resp_data
# there did nothing -- attachments have to be uploaded separately
response = smartsheet_client.Folders.create_sheet_in_folder_from_template(
    ################,  # folder id to create the new sheet in
    smartsheet.models.Sheet({
        'name': 'newsheet',
        'from_id': ################  # sheet id of the template
    }),
    'all'
)
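If copy sheet turns out to be the route, my understanding from the docs is that it accepts an `include` list whose values are case-sensitive, so I've been keeping a whitelist around while testing. This is a sketch, not working code; the function names are mine and the client/ids are placeholders:

```python
# Values documented for the copy-sheet endpoint; they are case-sensitive.
VALID_COPY_INCLUDES = {'attachments', 'cellLinks', 'data', 'discussions',
                       'filters', 'forms', 'ruleRecipients', 'rules', 'shares'}

def build_include(*wanted):
    # Drop anything the endpoint wouldn't recognize (e.g. misspellings),
    # so a typo silently copies less instead of failing the whole request.
    return [w for w in wanted if w in VALID_COPY_INCLUDES]

def copy_sheet_with(client, source_sheet_id, dest_folder_id, new_name):
    import smartsheet  # third-party SDK
    return client.Sheets.copy_sheet(
        source_sheet_id,
        smartsheet.models.ContainerDestination({
            'destination_type': 'folder',
            'destination_id': dest_folder_id,
            'new_name': new_name,
        }),
        include=build_include('data', 'attachments', 'discussions',
                              'rules', 'ruleRecipients'))
```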
Please let me know if anyone has any ideas, thank you!
After almost a month of tooling around, I have finally written a program that does what I described in my question. At first glance Smartsheet seems to have limited API functionality, and while I still think some areas could offer more, I recognize that a lot can be accomplished when paired with a solid understanding of programming and lots of help from Stack Overflow. I am learning Python for the purpose of Smartsheet's API, and as a beginner in both of those subjects, there is a reason why I sat on this for a month.
Below is my code for this answer:
import time
import smartsheet

# assumed set up earlier: smartsheet_client (authenticated client),
# groundzeroid (template sheet id), folderzeroid (new folder id),
# testvalues (cell values marking the rows worth keeping)
oldsheetids = []

# grab all of the relevant sheets
folder = smartsheet_client.Folders.get_folder(
    ##################)  # folder_id of the project sheets
for sheets in folder.sheets:
    oldsheetids.append(str(sheets.id))
print(oldsheetids)

# this function returns the last element in newsheetids,
# i.e. the sheet most recently copied into the new folder
def linkup():
    return int(newsheetids[-1])

def bigupdate():
    # loop through the process for each sheet; the [0:3] was for testing the first 3 sheets
    for oldsheets in oldsheetids[0:3]:
        global newsheetids  # these are global because I was testing different approaches for linkup()
        newsheetids = []
        global newsheetnames
        newsheetnames = []
        # pull the old sheets from the folder, one at a time
        global pulled_sheet
        pulled_sheet = smartsheet_client.Sheets.get_sheet(
            int(oldsheets),  # sheet_id
            include='all'
        )
        # make a copy of the template sheet we're using for the update
        response1 = smartsheet_client.Sheets.copy_sheet(
            groundzeroid,  # sheet_id of the template
            smartsheet.models.ContainerDestination({
                'destination_type': 'folder',  # folder, workspace, or home
                'destination_id': folderzeroid,  # folder_id of the new folder
                'new_name': pulled_sheet.name
            }),
            include=['rules', 'ruleRecipients'],  # note the exact (case-sensitive) spelling
        )
        print('creating a new sheet for: ' + pulled_sheet.name)
        time.sleep(1)
        # grab the folder of the new sheets; we need to pull all those sheet ids
        folder2 = smartsheet_client.Folders.get_folder(
            ###################)  # folder_id of the new updates folder
        for sheets1 in folder2.sheets:
            newsheetids.append(str(sheets1.id))
            newsheetnames.append(str(sheets1.name))
        # here we iterate through to grab the row ids of the rows worth keeping
        goodrow_ids = []
        for row in pulled_sheet.rows:
            if row.cells[0].value in testvalues:
                goodrow_ids.append(str(row.id))
                print("another row prepped!")
        print(goodrow_ids)
        print('goodrow_ids printed!')
        result = linkup()
        print(result)
        # copy all the rows to the new sheet
        for rowid in goodrow_ids:
            response = smartsheet_client.Sheets.copy_rows(
                int(oldsheets),  # sheet id of the row we are copying
                smartsheet.models.CopyOrMoveRowDirective({
                    'row_ids': [int(rowid)],  # row_ids takes a list
                    'to': smartsheet.models.CopyOrMoveRowDestination({
                        'sheet_id': int(result)  # destination sheet id
                    })
                }),
                include=['all'], ignore_rows_not_found=True
            )
            print("rows loading in...")
        print(pulled_sheet.name + ' new sheet created!')
        time.sleep(5)

bigupdate()
This code satisfies the first three steps from my question. I did not include any deletion code because it seemed unnecessary given how easy that is to do on the site itself. I realize there are probably broken bits or redundancies, but as far as functionality goes, it's great for my needs. I tried to leave comments where necessary to explain what's happening in the code. I hope some of this helps!
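For completeness, if you did want the deletion step (step 4) in code as well, here is a cautious sketch. The names are mine and it's untested; the idea is to only delete old sheets whose name already exists in the new folder, so a sheet whose copy failed is never removed:

```python
def replaced_sheet_ids(old_sheets, new_sheet_names):
    # Pure helper: old_sheets is a list of (sheet_id, name) pairs; return the
    # ids whose name already appears among the new sheets, i.e. replaced ones.
    created = set(new_sheet_names)
    return [sheet_id for sheet_id, name in old_sheets if name in created]

def delete_replaced_sheets(client, old_folder_id, new_folder_id):
    old = client.Folders.get_folder(old_folder_id)
    new = client.Folders.get_folder(new_folder_id)
    new_names = [s.name for s in new.sheets]
    for sheet_id in replaced_sheet_ids(
            [(s.id, s.name) for s in old.sheets], new_names):
        client.Sheets.delete_sheet(sheet_id)
```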