I'm testing a trigger on MongoDB Atlas that runs a Realm function to add an object to an Algolia index whenever a document is inserted into the MongoDB collection. In my case the record gets uploaded to the Algolia index successfully, but the function doesn't stop there and ends up exceeding the time limit.
The docs mention that
Function runtime is limited to 120 seconds
and that's why the function times out.
Here is my Realm function
exports = function(changeEvent) {
  const algoliasearch = require('algoliasearch');
  const client = algoliasearch(context.values.get('algolia_app'), context.values.get('algolia_key'));
  const index = client.initIndex("movies");
  changeEvent.fullDocument.objectID = changeEvent.fullDocument._id;
  delete changeEvent.fullDocument._id;
  index.saveObject(changeEvent.fullDocument)
    .then(({objectID}) => {
      console.log('successfully inserted: ', objectID);
    })
    .catch(err => {
      console.log(err);
    });
};
Here is the result I get on the logs
Logs:
[
"successfully inserted: 61cf0a79c577393620dd8c80"
]
Error:
execution time limit exceeded
I even tried adding return statements after the console.log calls, but the issue persists.
What am I doing wrong?
Apparently this was fixed by the MongoDB team in early March, as noted in https://www.mongodb.com/community/forums/t/extremely-slow-execution-of-an-external-dependency-function/16919/27.
I tested with the code below and it worked perfectly, with no timeouts this time. I made the function async. According to the logs, the indexing didn't even take a second.
exports = async function(changeEvent) {
  const algoliasearch = require('algoliasearch');
  const client = algoliasearch(context.values.get('algolia_app'), context.values.get('algolia_key'));
  const index = client.initIndex("movies");
  changeEvent.fullDocument.objectID = changeEvent.fullDocument._id;
  delete changeEvent.fullDocument._id;
  try {
    const result = await index.saveObject(changeEvent.fullDocument);
    console.log(Date.now(), 'successfully updated: ', result);
  }
  catch(e) {
    console.error(e);
  }
};