javascript c# file-upload multipart

List<IFormFile> is null for large files. Reproducible at will


I have a C# Controller method that happily receives small files but when I try to upload a large file from the browser client, the "files" parameter is null.

I have searched the web and added a request size limit:

[HttpPost]
[RequestSizeLimit(107374182400)]
public async Task<IActionResult> PostFormData(List<IFormFile> files, CancellationToken cancellationToken = default)

I have also tried to configure the limit in appsettings.json:

"Kestrel": {
    "Limits": {
        "MaxRequestBodySize": 107374182400 // 100 GB example (1024*1024*1024)
    }
}

Is there something else I need to do?

This is the JS client:

  const abortControl = new AbortController();
  const signal = abortControl.signal;
  const fetchOpts = {signal:signal, method:'POST', body: fileHolder, cache:"no-cache"};
  const response = await fetch("https://localhost:7196/upload/PostFormData", fetchOpts).catch(
    err => {
        console.log("Upload failed: " + err.message);
        return
  })
  if (signal.aborted) {
    alert("Cancelled")
    return
  }

The test file, bigfile.txt, is 10,737,418,240 bytes (10 GB).

The request Content-Type is still multipart/form-data.

EDIT 1

I'd like to be able to process multiple files at the same time (full code below). The 10 GB test was just a proof that it could be done; it would be the rare exception, and the limit would be configurable.

If I can't use the [FromForm] binding, what do I need to change if I choose the "Set MultipartBodyLengthLimit" route?

[HttpPost]
[RequestSizeLimit(107374182400)]
public async Task<IActionResult> PostFormData(List<IFormFile> files, CancellationToken cancellationToken = default)
{
    // "!= true" also rejects requests with a null Content-Type
    if (Request.ContentType?.Contains("multipart/form-data") != true)
    {
        return BadRequest("Unsupported media type.");
    }

    if (files == null || files.Count == 0)
    {
        return BadRequest("The 'files' argument cannot be null or empty.");
    }

    long totalBytes = files.Sum(f => f.Length); // requires System.Linq
    if (totalBytes > maxTotalSize)
    {
        return BadRequest("Total file size exceeds limit.");
    }

    var copyResults = new List<FileCopyStatus>();        

    var loopOptions = new ParallelOptions { MaxDegreeOfParallelism = 3 };

    await Parallel.ForEachAsync(files, loopOptions, async (file, _) =>
    {
        if (file.Length <= 0 || file.Length > maxFileSize)
        {
            lock (_resultsLock)
            {
                copyResults.Add(new FileCopyStatus
                {
                    FileName = file.FileName,
                    UploadStatus = (int)CopyStatus.Invalid,
                    Message = "File Size"
                });
            }
            _logger.LogError("Invalid file copy for {file} prevented due to File Size", file.FileName);
        }
       
        // TODO: validate file.ContentType and file.FileName before saving

        try
        {
            var filePath = Path.Combine(folder, file.FileName);

            // Build the options per file: PreallocationSize differs for each
            // file, so one shared FileStreamOptions instance mutated inside
            // Parallel.ForEachAsync would be a data race.
            var fileOptions = new FileStreamOptions()
            {
                Mode = FileMode.Create,
                Access = FileAccess.ReadWrite,
                Options = FileOptions.Asynchronous,
                PreallocationSize = file.Length,
            };

            using (var stream = new FileStream(filePath, fileOptions))
            {
                await file.CopyToAsync(stream, cancellationToken);
            }
            lock (_resultsLock)
            {
                copyResults.Add(new FileCopyStatus
                {
                    FileName = file.FileName,
                    UploadStatus = (int)CopyStatus.Success
                });
            }
        }
        catch (OperationCanceledException) // covers TaskCanceledException too
        {
            lock (_resultsLock)
            {
                copyResults.Add(new FileCopyStatus
                {
                    FileName = file.FileName,
                    UploadStatus = (int)CopyStatus.Cancelled
                });
            }
            _logger.LogError("File copy for {file} cancelled.", file.FileName);
        }
        catch (Exception ex)
        {
            lock (_resultsLock)
            {
                copyResults.Add(new FileCopyStatus
                {
                    FileName = file.FileName,
                    UploadStatus = (int)CopyStatus.Error,
                    Message = ex.Message
                });
            }
            _logger.LogError(ex, "Error saving {file}", file.FileName);
        }
    });

    if (cancellationToken.IsCancellationRequested)
    {
        _logger.LogWarning("Upload request was cancelled; returning partial results.");
    }

    return Ok(copyResults);
}

EDIT 2

This behaviour is reproducible at will. Here is the client code I'm using. Put the server code in some controller and see what I mean.

STOP: My mistake. I'm working on a new version of the code that uses XHR instead of FETCH to get progress events. Unfortunately, the "working" code I was using to demo a large upload still had a CANCEL after 10 secs. Yes, I do feel stupid. My bad :-(

<!DOCTYPE HTML>
<html>
<head>
<style>
body {
  font-size: 14px;
}
#container {
  position: relative;
  display: flex;
  width: 10em;
  height: 10em;
  padding: 0.9em;
  border: 1px solid black;
}
#div1 {
  position: absolute;
  top: 0;
  left: 0;
  display: flex;
  align-items: center;
  justify-content: center;
  width: 100%;
  height: 100%;
  background-color: lightblue;
}
#div2 {
  --fade-time: 0;
  position: absolute;
  top: 0;
  left: 0;
  width: 100%;
  height: 100%;
  opacity: 0%;
  background-repeat: no-repeat;
  background-position: center;
  background-size: cover;
  background-image: url("data:image/svg+xml;utf8,<svg xmlns='http://www.w3.org/2000/svg' width='32' height='32' viewBox='0 0 300 300'><path fill='%23FFF' stroke='%23E50024' stroke-width='50' d='M150 25 a125 125 0 1 0 2 0z m 2 100 a25 25 0 1 1-2 0z'/></svg>");
  animation: pulseImage var(--fade-time) infinite;
}
body > p {
    color: red;
    font-size: 1.2em;
}
div > p {
    font-weight: bold;
}
@keyframes pulseImage {
  from {
    opacity: 0;
  }
  to {
    opacity: 0.7;
  } 
}
</style>
<script>
document.addEventListener("DOMContentLoaded", (e) => {
        document.addEventListener('drop', function(e){ e.preventDefault() })
        document.addEventListener('dragenter', function(e){ e.preventDefault(); e.dataTransfer.dropEffect = "none" })
        document.addEventListener('dragover', function(e){ e.preventDefault(); e.dataTransfer.dropEffect = "none" })

        const dropZone = document.getElementById("div2")
        dropZone.addEventListener("dragenter", (e) => {
                console.log("In enter " + e.target.id)
                e.dataTransfer.dropEffect = "copy"
                e.preventDefault()
                e.stopPropagation()
                document.getElementById("div2").style.setProperty("--fade-time","2.0s")
            }, {capture: true})
        dropZone.addEventListener("dragover", (e) => {
                console.log("In over " + dropZone.id)
                e.dataTransfer.dropEffect = "copy"
                e.stopPropagation()
                e.preventDefault()
            })
        dropZone.addEventListener("dragleave", (e) => {
                console.log("In leave") 
                e.dataTransfer.dropEffect = "none"              
                e.preventDefault();
                e.stopPropagation()
                document.getElementById("div2").style.removeProperty("--fade-time")
            }, {capture: true})
        dropZone.addEventListener("drop", catchFiles)
    })

async function catchFiles(e) {
  e.preventDefault()
  e.stopPropagation()
  document.getElementById("div2").style.removeProperty("--fade-time")   
  console.log("File(s) dropped");
  
  let fileHolder = new FormData()
  let fileCount = 0

  if (e.dataTransfer.items) {
    [...e.dataTransfer.items].forEach((item, i) => {
      if (item.kind === "file") {
        const file = item.getAsFile()
        console.log(`… file[${i}].name = ${file.name}`)
        fileHolder.append("files", file, file.name)
        fileCount++
      }
    });
  } else {
    [...e.dataTransfer.files].forEach((file, i) => {
      console.log(`… file[${i}].name = ${file.name}`);
      fileHolder.append("files", file, file.name)
      fileCount++
    });
  }
  if (fileCount == 0) {
    alert("Zero files received")
    return
  }
  alert("got " + fileCount + " files")
  const abortControl = new AbortController();
  const signal = abortControl.signal;
  // Demo-only timer: aborts the upload after 10 seconds.
  setTimeout(() => {
      abortControl.abort()
      console.log("$*")
  }, 10000)
  const fetchOpts = {signal:signal, method:'POST', body: fileHolder, cache:"no-cache"};
  const response = await fetch("https://localhost:7196/upload/PostFormData", fetchOpts).catch(
    err => {
        console.log("Upload failed: " + err.message);
        return
  });

  if (signal.aborted) {
    alert("Cancelled")
    return
  }
  if (!response) {
    return // fetch rejected for a non-abort reason; already logged above
  }


  let resp
  if (response.ok) {
    resp = await response.json()
  } else {
    resp = await response.text()
  }
  console.log(resp)
}

</script>
</head>
<body>

<h1>File Drop Upload Example</h1>

<p>Drag your file(s) into the Drop Zone</p>

<div id="container">
    <div id="div1">
        <p>File Drop Zone</p>
    </div>
    <div id="div2"></div>
</div>


</body>
</html>

Solution

  • ASP.NET Core can handle huge multipart/form-data requests just fine, but you cannot use [FromForm] binding to IFormFile (or FormFile) for requests larger than 128 MB, the default FormOptions.MultipartBodyLengthLimit:

    You can raise the FormOptions.MultipartBodyLengthLimit value, but I don't recommend it here because it sounds like you're dead-set on allowing multi-gigabyte uploads, which aren't going to buffer well.

    Note that FormFile does buffer to disk once the amount of data received exceeds the in-memory maximum (a paltry 64 KB by default), but I suspect you probably don't want these hundred-gigabyte-sized blobs filling up your system's local disks either.
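
    If you do go that route anyway, here is a minimal sketch of the Program.cs configuration (the properties live on Microsoft.AspNetCore.Http.Features.FormOptions; the 100 GB figure is just your test value):

    using Microsoft.AspNetCore.Http.Features;

    var builder = WebApplication.CreateBuilder(args);

    // Raise the form-binding cap: MultipartBodyLengthLimit defaults to 128 MB.
    // (MemoryBufferThreshold, default 64 KB, is where buffering spills to disk.)
    builder.Services.Configure<FormOptions>(options =>
    {
        options.MultipartBodyLengthLimit = 100L * 1024 * 1024 * 1024; // 100 GB
    });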

    Instead, I recommend you follow the instructions under Upload large files with streaming instead:

    private const Int64 HUNDRED_GIGABYTES = 100L * 1024 * 1024 * 1024; // 107,374,182,400 bytes; the L suffix avoids Int32 overflow in the constant expression
    
    // Requires: using Microsoft.AspNetCore.WebUtilities; and using Microsoft.Net.Http.Headers;
    [HttpPost]
    [RequestSizeLimit( HUNDRED_GIGABYTES )]
    public async Task<IActionResult> PostFormData( CancellationToken cancellationToken )
    {
        MediaTypeHeaderValue requestContentType = MediaTypeHeaderValue.Parse( this.Request.ContentType );
        String requestMultipartBoundary = HeaderUtilities.RemoveQuotes( requestContentType.Boundary ).Value;
    
        //
    
        MultipartReader reader = new MultipartReader( requestMultipartBoundary, this.Request.Body );
        
        while( true )
        {
            MultipartSection? section = await reader.ReadNextSectionAsync( cancellationToken );
            if( section is null ) break;
    
            ContentDispositionHeaderValue? cd = section.GetContentDispositionHeader();
            if( cd is null ) continue;
            if( !cd.IsFileDisposition() ) continue;
    
            //
    
            using( Stream outputStream = /* FileStream? SAN stream? Some stream-consumer adapted into a Sink-Stream? etc? */ )
            {
                await section.Body.CopyToAsync( outputStream, cancellationToken );
            }
        }

        return Ok(); // the method must return an IActionResult on every path
    }
    
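    For example, one way to realize the outputStream placeholder is a plain FileStream. This is a hypothetical sketch: the folder variable and the use of Path.GetRandomFileName are my choices, made because cd.FileName is client-controlled and must never be trusted as a path segment:

    // Generate a server-side name; never use cd.FileName directly in a path.
    String safeName = Path.GetRandomFileName();
    String filePath = Path.Combine( folder, safeName );

    using( Stream outputStream = new FileStream( filePath, FileMode.CreateNew, FileAccess.Write, FileShare.None, bufferSize: 81920, useAsync: true ) )
    {
        await section.Body.CopyToAsync( outputStream, cancellationToken );
    }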

    Further improvements are still possible: for example, use HttpRequest.BodyReader (a PipeReader) for lower-level access - though you'll need to parse the multipart/form-data separators yourself.
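
    A rough sketch of that lower-level loop (the PostRaw action name is hypothetical, and the boundary scanning that MultipartReader normally does for you is left as a TODO):

    // Requires: using System.Buffers; and using System.IO.Pipelines;
    [HttpPost]
    [RequestSizeLimit( HUNDRED_GIGABYTES )]
    public async Task<IActionResult> PostRaw( CancellationToken cancellationToken )
    {
        PipeReader reader = this.Request.BodyReader;

        while( true )
        {
            ReadResult result = await reader.ReadAsync( cancellationToken );
            ReadOnlySequence<Byte> buffer = result.Buffer;

            // TODO: scan buffer for the multipart boundary bytes and copy the
            // payload between boundaries to its destination stream.

            // Mark everything as consumed so the pipe can release the memory.
            reader.AdvanceTo( buffer.End );

            if( result.IsCompleted ) break;
        }

        await reader.CompleteAsync();
        return Ok();
    }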