Tags: php, laravel, queue, jobs, laravel-horizon

Laravel 11: rate-limiting jobs so they don't exceed an external API's rate limit


Since Laravel 11, queues can also have a rate limit (https://laravel.com/docs/11.x/queues#rate-limiting). My Laravel application makes requests to the Shopify API to fetch new products, add notes to some orders, and attach shipment information, such as a tracking number, to an order.

I am on the Shopify Basic plan, which allows a maximum of 2 requests per second and no more than 40 requests per minute.

Now I want to write a universal job that I can use for all calls to the Shopify API. Whether I am fetching or updating products, I want a single job class that takes care of the request, so I can make sure I am not exceeding the Shopify API rate limit.

However, I am not able to make this work. I have defined a rate limiter in my AppServiceProvider.php as described in the Laravel documentation:

public function boot(): void
{
    // Rate limit Shopify API requests set to 2 per second and 40 per minute
    RateLimiter::for('shopify-api-requests', function (object $job) {
        return [
            Limit::perSecond(2),
            Limit::perMinute(40),
        ];
    });
}

This is my reusable job class, which I want to reuse for every request I make to the Shopify API:

class ShopifyApiRequestJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public $endpoint;
    public $method;
    public $data;
    
    public function __construct(string $endpoint, string $method = 'GET', ?array $data = null)
    {
        $this->endpoint = $endpoint;
        $this->method = $method;
        $this->data = $data;
    }
    
    public function backoff(): array
    {
        return [1, 5, 10];
    }
    
    public function tries(): int
    {
        return 3;
    }
    
    public function middleware(): array
    {
        return [
            new RateLimited('shopify-api-requests'),
            //new WithoutOverlapping('shopify-api-requests')
        ];
    }
    
    public function handle()
    {
        // Construct the full URL
        $url = 'https://' . config('settings.SHOPIFY_API_DOMAIN') . '/admin/api/' . config('settings.SHOPIFY_API_VERSION') . '/' . $this->endpoint;

        $response = Http::withHeaders([
            'X-Shopify-Access-Token' => config('settings.SHOPIFY_API_KEY'),
            'Content-Type' => 'application/json',
        ])->{$this->method}($url, $this->data);

        // Handle the response as needed (e.g., log it, store it, etc.)
        if ($response->failed()) {
            // Handle failure (e.g., retry the job, log the error, etc.)
            Log::error("Shopify API Request Failed (" . $response->status() . "): " . $response->body() . " " . $url);
        } else {
            // Handle success (e.g., process the response, store it, etc.)
            Log::info("Shopify API Request Successful");
        }
    }
}

When I test my job class, it does not behave as expected. I dispatched the job 10 times in a foreach loop.
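A minimal sketch of such a test loop (the endpoint name is only a placeholder):

```php
// Dispatch the same job 10 times in quick succession; the rate limiter
// should then spread their execution out over time.
foreach (range(1, 10) as $i) {
    ShopifyApiRequestJob::dispatch('products.json', 'GET');
}
```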

The expected result I am trying to achieve is that no dispatched job overlaps with another job of the same class (ShopifyApiRequestJob), and that at most 2 jobs are processed per second and at most 40 per minute.

However, I end up with a log like this:

[2024-08-24 19:29:11] local.INFO: Shopify API Request Successful  
[2024-08-24 19:29:17] local.INFO: Shopify API Request Successful  
[2024-08-24 19:29:20] local.INFO: Shopify API Request Successful  
[2024-08-24 19:29:26] local.ERROR: App\Jobs\ShopifyApiRequestJob has been attempted too many times. {"exception":"[object] (Illuminate\\Queue\\MaxAttemptsExceededException(code: 0): [...]
[2024-08-24 19:29:26] local.ERROR: App\Jobs\ShopifyApiRequestJob has been attempted too many times. {"exception":"[object] (Illuminate\\Queue\\MaxAttemptsExceededException(code: 0): [...]
[2024-08-24 19:29:26] local.ERROR: App\Jobs\ShopifyApiRequestJob has been attempted too many times. {"exception":"[object] (Illuminate\\Queue\\MaxAttemptsExceededException(code: 0): [...]
[2024-08-24 19:29:26] local.ERROR: App\Jobs\ShopifyApiRequestJob has been attempted too many times. {"exception":"[object] (Illuminate\\Queue\\MaxAttemptsExceededException(code: 0): [...]
[2024-08-24 19:29:26] local.ERROR: App\Jobs\ShopifyApiRequestJob has been attempted too many times. {"exception":"[object] (Illuminate\\Queue\\MaxAttemptsExceededException(code: 0): [...]
[2024-08-24 19:29:26] local.ERROR: App\Jobs\ShopifyApiRequestJob has been attempted too many times. {"exception":"[object] (Illuminate\\Queue\\MaxAttemptsExceededException(code: 0): [...]
[2024-08-24 19:29:26] local.ERROR: App\Jobs\ShopifyApiRequestJob has been attempted too many times. {"exception":"[object] (Illuminate\\Queue\\MaxAttemptsExceededException(code: 0): [...]

Three jobs are processed successfully, but the other 7 jobs fail with a MaxAttemptsExceededException. I increased the backoff times on purpose while debugging, but without success.

I don't understand what I did wrong when configuring my job; I followed the documentation. Can anybody give me advice on how to solve this problem?

Furthermore, I would like to receive a notification only when all retries of a job have failed, not one for every failed attempt.

Does anybody know how to achieve this behavior?

Kind regards


Solution

  • I've faced a similar issue with rate limiting an external API.

    Try removing the tries method and adding a retryUntil method instead. When the rate limiter cannot obtain a slot, it releases the job back onto the queue, and every release still increments the job's attempt count, so with tries set to 3 the job can exceed its maximum attempts before it is ever allowed to run. retryUntil defines a time window instead of an attempt count:

    public function retryUntil(): \DateTime
    {
        return now()->addMinutes(\Illuminate\Support\Carbon::MINUTES_PER_HOUR * 2);
    }
    

    Here is my job implementation:

    <?php
    
    namespace App\Jobs\Report\PrtgReport;
    
    use App\Jobs\Middleware\APIRateLimiterMiddleware;
    use App\Models\User;
    use DateTime;
    use Illuminate\Bus\Batchable;
    use Illuminate\Bus\Queueable;
    use Illuminate\Contracts\Queue\ShouldQueue;
    use Illuminate\Foundation\Bus\Dispatchable;
    use Illuminate\Queue\InteractsWithQueue;
    use Illuminate\Queue\Middleware\WithoutOverlapping;
    use Illuminate\Queue\SerializesModels;
    use Illuminate\Support\Carbon;
    use Illuminate\Support\Facades\Cache;
    use Illuminate\Support\Facades\Http;
    
    class SendExternalRequest implements ShouldQueue
    {
        use Batchable;
        use Dispatchable;
        use InteractsWithQueue;
        use Queueable;
        use SerializesModels;
    
        /**
         * The number of seconds the job can run before timing out.
         *
         * @var int
         */
        public $timeout = 120;
    
        /**
         * Create a new job instance.
         *
         * @return void
         */
        public function __construct(public User $user)
        {
            //
        }
    
        /**
         * Execute the job.
         *
         * @return void
         */
        public function handle()
        {
            $response = Http::baseUrl('mybaseurl.com')
                ->get('test.php', []);
    
            if ($response->successful()) {
                //logic Goes here
    
            } else {
                $this->release(10);
            }
        }
    
        /**
         * Get the middleware the job should pass through.
         *
         * @return array<int, object>
         */
        public function middleware(): array
        {
            return [
                new APIRateLimiterMiddleware(),
    
                (new WithoutOverlapping(User::class.':HIT:'.$this->user->id))
                    ->releaseAfter(10),
            ];
        }
    
        /**
         * Determine the time at which the job should stop being retried.
         */
        public function retryUntil(): DateTime
        {
            return now()->addMinutes(Carbon::MINUTES_PER_HOUR * 2);
        }
    }
    

    And here is the middleware that I use:

    <?php
    
    namespace App\Jobs\Middleware;
    
    use Closure;
    use Illuminate\Support\Facades\Redis;
    
    class APIRateLimiterMiddleware
    {
        /**
         * Process the queued job.
         *
         * @param  \Closure(object): void  $next
         */
        public function handle(object $job, Closure $next): void
        {
        Redis::throttle('API:Screenshot')
                ->block(0)
                ->allow(1)
                ->every(3) // Allow 1 job every 3 seconds
                ->then(function () use ($job, $next) {
                    // Lock obtained...
                    $next($job);
                }, function () use ($job) {
                    // Could not obtain lock...
                    $job->release(3); // Release the job back to the queue after 3 seconds
                });
        }
    }
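
    Adapted to the Shopify Basic limits from the question (at most 2 requests per second), the same throttle pattern might look like the sketch below; the key name 'shopify-api-requests' is arbitrary:

    ```php
    // Hypothetical adaptation: allow at most 2 jobs to obtain the lock
    // per second; anything beyond that is released back onto the queue.
    Redis::throttle('shopify-api-requests')
        ->block(0)
        ->allow(2)
        ->every(1)
        ->then(function () use ($job, $next) {
            // Lock obtained...
            $next($job);
        }, function () use ($job) {
            // Could not obtain lock; retry in 3 seconds...
            $job->release(3);
        });
    ```

    Note that this throttle only covers the per-second limit; the per-minute limit would still need the named rate limiter or a second throttle key.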
    

    The code I've provided is just a sample from my implementation; please make sure to adjust it to your requirements.
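
    Regarding the notification question: Laravel calls a failed() method on the job once, after the job has permanently failed (by exceeding tries or passing retryUntil), so that is the place for an "all retries failed" notification rather than the error path inside handle(). A minimal sketch, where JobFailedNotification and the recipient address are placeholders:

    ```php
    use Illuminate\Support\Facades\Notification;
    use Throwable;

    // Add this method to the job class. It runs once after the job has
    // permanently failed, not on every retry.
    public function failed(Throwable $exception): void
    {
        // JobFailedNotification is a hypothetical notification class.
        Notification::route('mail', 'admin@example.com')
            ->notify(new JobFailedNotification($exception->getMessage()));
    }
    ```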