
Remove usages of RateLimiter #16

Closed

sbordet opened this issue Jul 27, 2017 · 6 comments

Comments

@sbordet
Member

sbordet commented Jul 27, 2017

@olamy, RateLimiter blocks and therefore we don't want to use it in this load generator.

I'm not sure why RateLimiter is used at all. The load generator runs at the given rate by design, so why would you want to limit a rate that won't be exceeded anyway?
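For context, Guava's RateLimiter hands out permits at a fixed rate and its acquire() call sleeps the calling thread until a permit is available, which is exactly the blocking behavior that would distort the generator's own timing. A minimal standalone sketch of that behavior (not code from this project):

```java
import com.google.common.util.concurrent.RateLimiter;

public class BlockingExample
{
    public static void main(String[] args)
    {
        // 2 permits per second: acquire() blocks roughly 500 ms between calls.
        RateLimiter limiter = RateLimiter.create(2.0);
        for (int i = 0; i < 5; i++)
        {
            // acquire() returns the time (in seconds) spent sleeping for the permit.
            double waited = limiter.acquire();
            System.out.printf("request %d sent after waiting %.3f s%n", i, waited);
        }
    }
}
```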

@olamy
Member

olamy commented Jul 27, 2017

It's optional and not used by default. I was able to generate traffic with fewer spikes because the limit is set at a lower level. As I understand it, the current rate limiting happens before requests are queued, so with a big queue (e.g. when the server takes a long time to answer), dequeuing can suddenly run faster than the configured rate once the server responds again.
Anyway, if you don't like this optional component I don't mind moving it somewhere else.

@sbordet
Member Author

sbordet commented Jul 27, 2017

@olamy this RateLimiter limits the request sending. If there is queueing in sending requests, the load generator is not configured properly because it cannot sustain its own load.
Yes, please remove it and the associated classes.

@olamy
Member

olamy commented Jul 27, 2017

Yup, the goal is to limit to a given rate.
Again, it's optional and not used by default. And it can really help users.

@olamy
Member

olamy commented Jul 27, 2017

Maybe in the starter module (out of the core part)?

@sbordet
Member Author

sbordet commented Jul 27, 2017

@olamy I don't understand what it is for. We already have a way to specify the rate. Why do we need two ways of specifying the rate?

@olamy
Member

olamy commented Jul 28, 2017

As I understand it, we currently "limit" the queueing and not the sending.
If for any reason the target server responds slowly, the queue size can grow, and then suddenly (when the server is back) all the queued requests are dequeued, which generates a burst we don't really want.
Anyway, I removed the classes :-)
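To illustrate the burst concern: a limiter applied at the point of dequeuing would need to be non-blocking, i.e. instead of sleeping like Guava's RateLimiter it would only answer "may I send now?", so a backed-up queue drains at the configured rate instead of all at once when the server recovers. The class below is a hypothetical sketch for illustration only; its name and API are not part of the load generator:

```java
import java.util.concurrent.TimeUnit;

// Hypothetical non-blocking pacer: tryAcquire() never sleeps, it just tells the
// caller whether enough time has passed since the last permitted send.
public class DequeuePacer
{
    private final long intervalNanos;
    private long nextSendNanos = System.nanoTime();

    public DequeuePacer(double maxRatePerSecond)
    {
        this.intervalNanos = (long)(TimeUnit.SECONDS.toNanos(1) / maxRatePerSecond);
    }

    public synchronized boolean tryAcquire()
    {
        long now = System.nanoTime();
        if (now < nextSendNanos)
            return false; // too early: the caller keeps the request queued
        // Schedule the next permit from "now", so a long idle period does not
        // turn into a burst of catch-up sends.
        nextSendNanos = now + intervalNanos;
        return true;
    }
}
```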

olamy added a commit that referenced this issue Jul 28, 2017
Signed-off-by: olivier lamy <[email protected]>
olamy closed this as completed Jul 28, 2017
sbordet added a commit that referenced this issue Jul 28, 2017