Add API rate limit handler #371


Open · wants to merge 9 commits into base: main

Conversation

@NathalieCharbel (Contributor) commented Jun 25, 2025

Description

  • New RateLimitHandler interface
  • Default behavior: retry with exponential backoff using Tenacity
  • Provider-specific errors are converted to RateLimitError
  • Basic users get rate limiting out-of-the-box
  • Advanced users can implement custom rate limiting strategies via the RateLimitHandler interface
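For readers skimming the description, the interface might look roughly like the sketch below. The method names `handle_sync`/`handle_async` and the `RateLimitError` type are from this PR; the exact signatures are assumptions, not the actual implementation:

```python
from abc import ABC, abstractmethod
from typing import Any, Callable

# Hypothetical sketch of the RateLimitHandler interface described in this PR;
# real signatures may differ.
class RateLimitHandler(ABC):
    @abstractmethod
    def handle_sync(self, func: Callable[..., Any]) -> Callable[..., Any]:
        """Wrap a sync function with rate limit handling."""

    @abstractmethod
    def handle_async(self, func: Callable[..., Any]) -> Callable[..., Any]:
        """Wrap an async function with rate limit handling."""
```

Basic users would never touch this class directly; advanced users subclass it and pass their handler into the LLM constructor.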

Type of Change

  • New feature
  • Bug fix
  • Breaking change
  • Documentation update
  • Project configuration change

Complexity

Complexity: Medium

How Has This Been Tested?

  • Unit tests
  • E2E tests
  • Manual tests

Checklist

The following requirements should have been met (depending on the changes in the branch):

  • Documentation has been updated
  • Unit tests have been updated
  • E2E tests have been updated
  • Examples have been updated
  • New files have copyright header
  • CLA (https://neo4j.com/developer/cla/) has been signed
  • CHANGELOG.md updated if appropriate

@NathalieCharbel NathalieCharbel requested a review from a team as a code owner June 25, 2025 09:08
            raise convert_to_rate_limit_error(e)
        raise

    return active_handler.handle_sync(inner_func)()
Contributor
Do we need to call the returned function here?

Contributor Author

I'm not so familiar with defining decorators, but don't we still need this to return the retried version of the inner function?

Contributor
The decorator should return a function; that's why I tend to think the final () isn't needed, but I haven't tested.

Contributor Author

If we omit the trailing (), the decorator would return the wrapped function object instead of the function’s result.
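A toy stand-in for the behaviour being discussed (this is not the PR's code; `retrying` below is a hypothetical minimal decorator illustrating why the trailing () is needed):

```python
import functools

def retrying(func):
    """Toy decorator standing in for tenacity's @retry:
    it returns a wrapped function, not the function's result."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # a real handler would catch RateLimitError and retry here
        return func(*args, **kwargs)
    return wrapper

def inner_func():
    return "result"

wrapped = retrying(inner_func)  # applying the decorator yields a function object
value = wrapped()               # the trailing () actually invokes it
```

So `handle_sync(inner_func)` produces the retry-wrapped function, and the final `()` runs it and returns the call's result to the caller.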

before_sleep=before_sleep_log(logger, logging.WARNING),
)
@functools.wraps(func)
async def wrapper(*args: Any, **kwargs: Any) -> Any:
Contributor

Did you manage to test it in async mode? I'm worried we'll get some weird behaviour in concurrent mode: if retries use await asyncio.sleep(..), other calls can be "enqueued" while one is sleeping, so almost all of the following calls will have to wait, and they'll still fail pretty quickly since they are all waiting and retrying at the same time.

Contributor Author

I haven't thought about that, but we should definitely avoid it. One solution could be adding randomisation to delay retries, something like the below:

    def handle_async(self, func: AF) -> AF:
        @retry(
            retry=retry_if_exception_type(RateLimitError),
            stop=stop_after_attempt(self.max_attempts),
            wait=wait_exponential(
                multiplier=self.multiplier,
                min=self.min_wait,
                max=self.max_wait,
            ) + (wait_random(0, 1) if self.jitter else wait_fixed(0)),  # add jitter
            before_sleep=before_sleep_log(logger, logging.WARNING),
        )

wdyt?

Contributor

To be tested ;)

Contributor Author

Looks like Tenacity already has a wait strategy that does exponential backoff + jitter (wait_random_exponential), so I ended up using it: 8cfacb4
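For readers following along, the effect of backoff with "full jitter" can be sketched without the library. This is an illustrative approximation of what a strategy like Tenacity's wait_random_exponential computes, with assumed default parameters, not the PR's configuration:

```python
import random

def backoff_delays(attempts: int, multiplier: float = 1.0, max_wait: float = 60.0):
    """Exponential backoff with full jitter: each delay is drawn uniformly
    from [0, min(max_wait, multiplier * 2**attempt)].  The randomness
    decorrelates concurrent retriers so they don't all wake and retry
    at the same instant."""
    for attempt in range(attempts):
        cap = min(max_wait, multiplier * 2 ** attempt)
        yield random.uniform(0, cap)

delays = list(backoff_delays(5))  # five jittered, roughly-doubling delays
```

The jitter is what addresses the concern above about concurrent async calls: without it, every queued call would sleep the same deterministic duration and hit the provider simultaneously again.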


    class CustomRateLimitHandler(RateLimitHandler):
        """Implement your custom rate limiting strategy."""
        # Implement required methods: handle_sync, handle_async
Contributor

Which parameters are these methods getting as input?

Contributor Author

They get a func parameter: the sync/async function that needs rate limit handling applied to it, as is the case for the current RetryRateLimitHandler (the default one).

Contributor

Do you think this interface could be used to limit the number of requests per second for instance?

Contributor Author

If we'd like to extend our rate limit handling strategy to do throttling, we can define a new subclass:

    class ThrottlingRateLimitHandler(RateLimitHandler):

        def __init__(self, requests_per_window: int = 10, window_size: float = 1.0):
            self.requests_per_window = requests_per_window
            self.window_size = window_size
            ...

        def handle_sync(self, func: F) -> F:
            @functools.wraps(func)
            def wrapper(*args: Any, **kwargs: Any) -> Any:
                # handle throttling here
                return func(*args, **kwargs)
            return wrapper

        ...

Then we can basically use the same decorators with the invoke functions of the LLM classes; we just need to ensure we pass the right rate limit handler in their constructors.
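A self-contained version of that idea might look like this. It is a sketch only: the sliding-window logic, the standalone class (no base class, to keep it runnable), and the parameter defaults are assumptions mirroring the snippet above, not code from this PR:

```python
import functools
import time
from collections import deque
from typing import Any, Callable

class ThrottlingRateLimitHandler:
    """Sketch: allow at most requests_per_window calls per window_size
    seconds, blocking (rather than retrying) when the budget is spent."""

    def __init__(self, requests_per_window: int = 10, window_size: float = 1.0):
        self.requests_per_window = requests_per_window
        self.window_size = window_size
        self._timestamps: deque = deque()  # monotonic times of recent calls

    def handle_sync(self, func: Callable[..., Any]) -> Callable[..., Any]:
        @functools.wraps(func)
        def wrapper(*args: Any, **kwargs: Any) -> Any:
            now = time.monotonic()
            # drop timestamps that have fallen out of the window
            while self._timestamps and now - self._timestamps[0] >= self.window_size:
                self._timestamps.popleft()
            if len(self._timestamps) >= self.requests_per_window:
                # sleep until the oldest call ages out of the window
                time.sleep(self.window_size - (now - self._timestamps[0]))
                self._timestamps.popleft()
            self._timestamps.append(time.monotonic())
            return func(*args, **kwargs)
        return wrapper
```

An async counterpart (handle_async) would be the same shape with `await asyncio.sleep(..)`; a thread-safe version would need a lock around the deque.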

2 participants