I'm trying to create a WriteBatch to keep control of one of my dynamic references in my database. My app has a simple User-Follow-Post-Feed model where I want each user to see in his feed the posts of all the users he follows. Here is what I'm doing after researching Firebase examples (such as Firefeed) and a lot of posts on Stack Overflow.
The idea is to keep a path (a collection in this case) where I store the IDs of the posts that my user should see in his feed, which means copying and deleting the post IDs of every user he follows/unfollows.
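For illustration, here is a minimal sketch of what that fan-out write looks like on the Android SDK (the collection names and the function signature are assumptions for illustration, not my actual code):

import com.google.firebase.firestore.FirebaseFirestore

// Hypothetical fan-out: copy a new post's id into every follower's feed.
// Collection names ("privateUserData", "feed") are assumptions for illustration.
fun fanOutPost(firestore: FirebaseFirestore, followerIds: List<String>, postId: String, postData: Any) {
    followerIds.forEach { followerId ->
        firestore.collection("privateUserData")
            .document(followerId)
            .collection("feed")
            .document(postId)
            .set(postData)
    }
}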
I made Cloud Functions to keep this atomic, and everything works fine. But when I ran a massive test, adding more than 5000 posts to one user and then following him (to see how long the Cloud Function would take), I found out that batches have a limit of 500 operations. So I split my 5000 IDs into multiple smaller lists and executed one batch per list, never exceeding the 500 limit.
But even doing it this way, I still get an error saying that I can't perform more than 500 operations in a single commit. I don't know if it's because the batches execute at the same time. I think I could chain them one after another and avoid executing them all at once (see the sketch after my method below), but I'm still having trouble with it. That's the reason for my question.
Here is my method:
fun updateFeedAfterUserfollow(postIds: QuerySnapshot, userId: String) {
    // If there are no posts from the followed user, return
    if (postIds.isEmpty) return
    val listOfPostsId = postIds.map { it.id }
    val mapOfInfo = postIds.map { it.id to it.toObject(PublicUserData::class.java) }.toMap()
    // Get the user's feed reference
    val ref = firestore.collection(PRIVATE_USER_DATA).document(userId).collection(FEED)
    // Split the list into multiple lists to stay under the 500-operations-per-batch limit
    val idsPartition = Lists.partition(listOfPostsId, 400)
    // Create a batch of at most 400 operations per sublist and commit each one
    idsPartition.forEach { ids ->
        val batch = firestore.batch().also { batch ->
            ids.forEach { id -> batch.set(ref.document(id), mapOfInfo[id]!!) }
        }
        batch.commit().addOnCompleteListener {
            if (it.isSuccessful)
                Grove.d { "Commit updated successfully" }
            else Grove.d { "Commit failed" }
        }
    }
}
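For the chaining idea mentioned above, here is a minimal sketch of committing the batches one after another with the Play Services Tasks API (commitSequentially is a hypothetical helper, not part of my current code):

import com.google.android.gms.tasks.Task
import com.google.android.gms.tasks.Tasks
import com.google.firebase.firestore.WriteBatch

// Sketch: commit each batch only after the previous one has finished,
// by folding the commits into one chained Task.
fun commitSequentially(batches: List<WriteBatch>): Task<Void> =
    batches.fold(Tasks.forResult<Void>(null)) { previous, batch ->
        previous.continueWithTask { batch.commit() }
    }

With this, each commit starts only after the previous one completes, so the batches never run in parallel.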
In the end, the problem was that I was performing these batch operations inside a transaction, which also acts like a batch. That's why, even though I was generating a batch for every 400 references, those batches were created inside a transaction, so everything counted as one big transaction that exceeded the 500 limit.
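To illustrate the mistake, here is a sketch of the anti-pattern on the Android SDK (my real code was a Cloud Function, but the principle is the same): every write registered through the transaction belongs to one atomic commit, so chunking inside it does not help.

import com.google.firebase.firestore.DocumentReference
import com.google.firebase.firestore.FirebaseFirestore

// Anti-pattern sketch: all writes registered on the transaction end up in
// one atomic commit, so splitting them into chunks of 400 changes nothing;
// the single commit still exceeds the 500-operation limit.
fun wrongChunkedWrite(firestore: FirebaseFirestore, refs: List<DocumentReference>, data: Any) {
    firestore.runTransaction { transaction ->
        refs.chunked(400).forEach { chunk ->
            chunk.forEach { ref -> transaction.set(ref, data) }
        }
        null
    }
}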
I made a few changes and implemented them in a repository on my GitHub.
//Generate the right amount of batches (using lodash's chunk; MAX_BATCH_SIZE < 500)
const batches = _.chunk(updateReferences, MAX_BATCH_SIZE)
    .map(dataRefs => {
        const writeBatch = firestoreInstance.batch();
        dataRefs.forEach(ref => {
            writeBatch.update(ref, 'author', newAuthor);
        });
        // Each commit returns a Promise; the batches are now independent,
        // not wrapped in a single transaction
        return writeBatch.commit();
    });
// Wait for all the commits to complete (inside an async function)
await Promise.all(batches);
It is written in TypeScript, but you will understand it for sure: https://github.com/FrangSierra/firestore-cloud-functions-typescript/blob/master/functions/src/atomic-operations/index.ts