I have to run 1000 async calculations. Since the API has a limit of 50 requests/min, I have to split the work into chunks of 50 and wait for a minute after processing each chunk. Eventually I want to print the results.
var resultsArray = [Double]()

// chunked(into:) is an extension
points.chunked(into: 50).forEach { pointsChunk in
    pointsChunk.forEach { pointsPair in
        // this function is async
        service.calculate(pointsPair) { result in
            resultsArray.append(result)
        }
    }
    // <- here I want to wait a minute before continuing with the next chunk
}

// after all 1000 calculations are done, print the results
print(resultsArray)
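For context, chunked(into:) is the usual stride-based Array extension, roughly:

extension Array {
    // Split the array into consecutive slices of at most `size` elements
    func chunked(into size: Int) -> [[Element]] {
        return stride(from: 0, to: count, by: size).map {
            Array(self[$0 ..< Swift.min($0 + size, count)])
        }
    }
}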
I did try to find a solution using DispatchGroup, but struggled with how to incorporate a timer:
let queue = DispatchQueue(label: "MyQueue", attributes: .concurrent)
let chunkGroup = DispatchGroup()
let workGroup = DispatchGroup()

points.chunked(into: 50).forEach { pointsChunk in
    chunkGroup.enter()
    pointsChunk.forEach { routePointsPair in
        workGroup.enter()
        // do something async and in the callback:
        workGroup.leave()
    }
    workGroup.notify(queue: queue) {
        sleep(60)
        chunkGroup.leave()
    }
}

chunkGroup.notify(queue: .main) {
    print(resultsArray)
}
This just kicks off all chunks at once instead of spacing them out by 60 seconds.
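I suppose I could block between chunks with workGroup.wait() and sleep on a background queue, roughly like the sketch below (same points, service and chunked(into:) as above), but blocking a thread for a whole minute per chunk doesn't feel like the right approach:

// Rough sketch: block between chunks off the main thread
DispatchQueue.global(qos: .userInitiated).async {
    var resultsArray = [Double]()
    let lock = NSLock() // the calculate callbacks may arrive on different threads

    points.chunked(into: 50).forEach { pointsChunk in
        let workGroup = DispatchGroup()
        pointsChunk.forEach { pointsPair in
            workGroup.enter()
            service.calculate(pointsPair) { result in
                lock.lock()
                resultsArray.append(result)
                lock.unlock()
                workGroup.leave()
            }
        }
        workGroup.wait() // block this thread until all 50 callbacks have fired
        sleep(60)        // then pause for a minute before the next chunk
    }

    DispatchQueue.main.async {
        print(resultsArray)
    }
}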
What I have implemented in a similar situation is manually suspending and resuming a serial queue.
my queue reference:
public static let serialQueue = DispatchQueue(label: "com.queue.MyProvider.Serial")
func serialQueue() -> DispatchQueue {
    return MyProvider.serialQueue
}
suspend queue:
func suspendSerialQueue() {
    self.serialQueue().suspend()
}
resume queue after delay:
func resumeSerialQueueAfterDelay(seconds: Double) {
    DispatchQueue.global(qos: .userInitiated).asyncAfter(deadline: .now() + seconds) {
        self.serialQueue().resume()
    }
}
This way I have full control over when the queue is suspended and when it is resumed, so I can spread many API calls evenly over a longer period of time.
self.serialQueue().async {
    self.suspendSerialQueue()
    // make the API call; in its completion block:
    self.resumeSerialQueueAfterDelay(seconds: delay)
}
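Adapted to your chunked case, a rough sketch could look like the following. It assumes the same points, service.calculate and chunked(into:) from your question; the NSLock is only there because the callbacks may come back on arbitrary threads:

var resultsArray = [Double]()
let lock = NSLock()
let chunks = points.chunked(into: 50)
let lastChunkIndex = chunks.count - 1

for (index, pointsChunk) in chunks.enumerated() {
    self.serialQueue().async {
        // Suspend the serial queue so the next chunk cannot start yet
        self.suspendSerialQueue()

        let workGroup = DispatchGroup()
        pointsChunk.forEach { pointsPair in
            workGroup.enter()
            service.calculate(pointsPair) { result in
                lock.lock()
                resultsArray.append(result)
                lock.unlock()
                workGroup.leave()
            }
        }

        // Once the whole chunk has finished, print if it was the last one
        // and let the next chunk start a minute later
        workGroup.notify(queue: .global()) {
            if index == lastChunkIndex {
                print(resultsArray)
            }
            self.resumeSerialQueueAfterDelay(seconds: 60)
        }
    }
}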
Not sure if this is what you were looking for, but maybe you can adapt my example to your needs.