I need some help. I have a .mov file in Firebase Storage. The file is 25 seconds long and 106 MB. I wrote a callable Firebase function that uses ffmpeg to convert the file to a .mp4 file and save it to Firebase Storage. When I test the function using the Functions emulator, it works without issue: the function returns successfully and I see the converted file appear in Storage. The converted video is about 6 MB and plays correctly when downloaded.
When I deploy this function and run it in production on the exact same video file, the function fails with:
'Memory limit of 256 MiB exceeded with 407 MiB used. Consider increasing the memory limit, see https://cloud.google.com/functions/docs/configuring/memory'
As a test, I edited the function and changed its allocated memory to 1 GiB, then tested it again in production. Now I receive the same error, just with larger numbers:
'Memory limit of 1024 MiB exceeded with 1029 MiB used. Consider increasing the memory limit, see https://cloud.google.com/functions/docs/configuring/memory'
This is my function code:
const {initializeApp} = require("firebase-admin/app");
const {onCall} = require("firebase-functions/v2/https");
const { getStorage, getDownloadURL } = require('firebase-admin/storage');
initializeApp();
exports.convertVideo = onCall((request) => {
const ffmpegPath = require('@ffmpeg-installer/ffmpeg').path;
const ffmpeg = require('fluent-ffmpeg');
ffmpeg.setFfmpegPath(ffmpegPath);
const originalLocation = request.data.originalLocation;
const convertedLocation = request.data.convertedLocation;
const originalVideoFile = getStorage().bucket().file(originalLocation);
const newVideoFile = getStorage().bucket().file(convertedLocation);
return new Promise(async (resolve, reject) => {
await originalVideoFile.download({destination: '/tmp/original'}).catch(console.error);
ffmpeg('/tmp/original')
.addOutputOptions('-movflags +frag_keyframe+separate_moof+omit_tfhd_offset+empty_moov')
.format('mp4')
.on('error', (err) => {
console.log(err);
})
.pipe(newVideoFile.createWriteStream())
.on('error', (err) => {
console.log(err);
})
.on('close', async () => {
fs.unlink('/tmp/original', (err) => {
if (err) throw err;
});
const convertedUrl = await getDownloadURL(newVideoFile);
resolve([convertedLocation, convertedUrl]);
});
});
});
I am sending a test request to the Functions emulator using curl:
curl -d '{"data": {"originalLocation": "customer_videos/original_video.mov", "convertedLocation": "customer_videos/converted/original_video.mp4"}}' -H "Content-Type: application/json" http://127.0.0.1:5001/foo/bar/convertVideo
This works correctly. When I send the same request to the deployed function, I receive the out-of-memory error:
curl -d '{"data": {"originalLocation": "customer_videos/original_video.mov", "convertedLocation": "customer_videos/converted/original_video.mp4"}}' -H "Content-Type: application/json" https://convertvideo-foobarbaz-uc.a.run.ap
Can somebody please help me understand why this is happening? I didn't think I was doing anything too memory intensive, especially since it works correctly using the emulator.
It stands to reason that you still haven't configured enough memory for this workload. If the workload needs more memory than Cloud Functions can be configured with, use a different compute product. If you think the workload shouldn't need as much memory as you've already allocated, consider asking a new question focused on how to use ffmpeg effectively in memory-constrained environments, rather than on Cloud Functions configuration.
Note that the local emulator doesn't enforce memory constraints, so changing the memory configuration locally won't reproduce the behavior you see in production after deploying. Your Cloud Functions configuration actually selects specific hardware for the deployment, with costs that vary based on the amount of memory and CPU available in each instance. See the documentation for details.
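For reference, here is a minimal sketch of raising the memory limit on a v2 callable function through its runtime options (your imports indicate the v2 API). The 2GiB figure, the timeout, and the handler body are illustrative placeholders, not a recommendation for your specific workload:

const {onCall} = require("firebase-functions/v2/https");

// In the v2 API, runtime options are passed as the first argument to onCall.
exports.convertVideo = onCall(
  {
    memory: "2GiB",       // placeholder: pick a size that fits the workload
    timeoutSeconds: 540,  // optional: allow extra time for longer transcodes
  },
  (request) => {
    // ... existing conversion logic ...
  },
);

The same options can also be applied project-wide with setGlobalOptions from firebase-functions/v2 if every function should get the larger allocation.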