I am trying to upload large files to YouTube with PHP, using the API V3. I just copied the official sample code.
It works perfectly for small uploads (< 100 MB). But when I try to upload large files, it returns:
Fatal error: Out of memory (allocated 102760448) (tried to allocate 1048577
bytes) in /homepages/40/d216486693/htdocs/uploadsys.php on line 143
The code I use is just below, and line 143 is $chunk = fread($handle, $chunkSizeBytes);
My server can allow some more memory, but what happens if I try to upload 1 GB files?
In fact, the main problem is that if I check memory usage during the chunk read & send loop, it never decreases! (A sketch of that check is below, after the code.)
It's as if each chunk stays in memory... What can I do? I tried to unset $chunk with no success.
$snippet = new Google_Service_YouTube_VideoSnippet();
$filtre1 = array("'","'","'","\'");
$snippet->setTitle(str_replace("\\","",str_replace($filtre1,"’",$title)));
$snippet->setDescription("Descr");
$snippet->setTags(explode(",",$keywords));
$snippet->setCategoryId($ytcat);
$status = new Google_Service_YouTube_VideoStatus();
$status->privacyStatus = "unlisted";
$video = new Google_Service_YouTube_Video();
$video->setSnippet($snippet);
$video->setStatus($status);
$chunkSizeBytes = 1 * 1024 * 1024;
$client->setDefer(true);
$insertRequest = $youtube->videos->insert("status,snippet", $video);
$media = new Google_Http_MediaFileUpload($client,$insertRequest,'video/*',null,true,$chunkSizeBytes);
$file_size = filesize($videofile);
$media->setFileSize($file_size);
// Read the media file and upload it chunk by chunk.
$status = false;
$handle = fopen($videofile, "rb");
$cc = 0;
while (!$status && !feof($handle)) {
    // Line 143 below
    $chunk = fread($handle, $chunkSizeBytes);
    $status = $media->nextChunk($chunk);
    $cc++;
}
fclose($handle);
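Here is roughly what the loop looks like when I print memory usage on each pass (just a sketch; memory_get_usage() is standard PHP and the echo line is only for debugging):

$cc = 0;
while (!$status && !feof($handle)) {
    $chunk = fread($handle, $chunkSizeBytes);
    $status = $media->nextChunk($chunk);
    unset($chunk); // in practice this does not seem to free anything
    // memory_get_usage(true) reports the real memory allocated from the system
    echo "chunk " . $cc . ": " . round(memory_get_usage(true) / 1048576, 1) . " MB\n";
    $cc++;
}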
We've faced the same problem and solved it by reducing the chunk size. The official code sample's $chunkSizeBytes = 1 * 1024 * 1024;
can cause the server to stop responding and report memory problems. Reduce the chunk size and try again.
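For example, something along these lines (256 KB is just an illustrative value; tune it against your server's memory_limit, and set it before constructing Google_Http_MediaFileUpload so the uploader uses the same size):

$chunkSizeBytes = 256 * 1024; // 256 KB instead of 1 MB

$status = false;
$handle = fopen($videofile, "rb");
while (!$status && !feof($handle)) {
    // each fread() now allocates a much smaller buffer
    $chunk = fread($handle, $chunkSizeBytes);
    $status = $media->nextChunk($chunk);
    unset($chunk); // release the buffer before the next iteration
}
fclose($handle);

The rest of the upload code stays the same; only the chunk size changes.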