I'm new to Django, and in my first app I would like to upload a CSV file with an unknown number of rows, parse its content and display the parsed data, then manipulate that data and save it in a table.
What is the best approach for achieving this goal?
Currently I have a view and a form for uploading the file.
My first attempt was to parse the uploaded file in that view and send the parsed data to a second view:
from django.http import HttpResponseRedirect
from django.shortcuts import render
from django.urls import reverse

def UploadFileView(request):
    if request.method == 'POST':
        form = UploadFileForm(request.POST, request.FILES)
        if form.is_valid():
            upload_list = handle_uploaded_file(request.FILES['file'])
            return HttpResponseRedirect(reverse('preview_file', ?????))
    else:
        form = UploadFileForm()
    return render(request, 'upload_file.html', {'form': form})
upload_list - a list of objects, each describing a row in the file.
????? - what do I put here in order to send the list to the second view?
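For reference, a minimal sketch of what handle_uploaded_file might look like, assuming the file has a header row and each row is turned into a plain dict:

import csv

def handle_uploaded_file(f):
    # Decode the uploaded file and let csv.DictReader map each row
    # to a dict keyed by the header columns.
    lines = f.read().decode('utf-8').splitlines()
    return [dict(row) for row in csv.DictReader(lines)]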
I understand that I need to serialize this list somehow in order to send it to the second view, i.e. preview_file.
The CSV file has an unknown number of lines; it can be 3, 10 or 500.
While posting this question, I thought of another option and would like to know whether it is possible and whether it is better:
not parsing the file in the first view, but sending the file to the second view and calling handle_uploaded_file there.
Save the list to the session:
import random
from django.shortcuts import redirect

# Inside the upload view, once handle_uploaded_file() has returned:
key = 'preview%s' % random.randint(0, 999999)
request.session[key] = upload_list
return redirect('preview_file', key)
And then in preview_file():
from django.http import Http404
from django.shortcuts import render

def preview_file(request, key):
    upload_list = request.session.get(key)
    if not upload_list:
        raise Http404
    # Render the parsed rows; the template name here is just an example.
    return render(request, 'preview_file.html', {'upload_list': upload_list})
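For the redirect to resolve, the URLconf needs a preview_file route that accepts the key. A sketch, assuming the views live in views.py and Django's path() syntax; the route names and URL prefixes are just examples:

from django.urls import path
from . import views

urlpatterns = [
    path('upload/', views.UploadFileView, name='upload_file'),
    path('preview/<str:key>/', views.preview_file, name='preview_file'),
]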
UPDATE: The session size limitation depends on the SESSION_ENGINE in use. The default django.contrib.sessions.backends.db
engine stores the session data in a TEXT column in the database.
In PostgreSQL/SQLite such a field can hold up to 1 GB of data; for MySQL the limit is 64 KB.
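If the parsed rows might not fit within that limit, the session backend can be switched in settings.py, for example to file-based sessions (a sketch; the cache and cached_db backends are other options):

# settings.py
SESSION_ENGINE = 'django.contrib.sessions.backends.file'
# Optional: directory for the session files (defaults to the system temp dir).
SESSION_FILE_PATH = '/var/tmp/django_sessions'  # example path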