Update configuration.md +performance, batch size #1967

Merged
merged 2 commits into knadh:master from the patch-2 branch
Aug 2, 2024

Conversation

MaximilianKohler
Contributor

@MaximilianKohler MaximilianKohler commented Aug 1, 2024

I don't know if this is correct, but there's no info in the docs about these kinds of settings. All I could find was one comment here #172 with some info.

I'm trying to figure out a solution to #1717. I'm hoping that the larger the batch size, the longer the delay between retries will be, but I'm not sure if that's true.

I saw another "performance" section https://listmonk.app/docs/maintenance/performance/, but it's under "maintenance" so this seems more appropriate under "configuration".

@knadh
Owner

knadh commented Aug 1, 2024

> The campaign stats use this metric to show an approximation of progress. By setting a lower number, the progress accuracy can increase.

This is incorrect. As of the last version, every message is tracked granularly, showing accurate progress. Batch size has no effect on the rate calculation.

> Under Settings -> SMTP -> retries, listmonk will attempt to retry any failed deliveries after the full batch has been sent. So increasing the batch size will increase the time between retries. This can be helpful if you're getting 4xx sending errors.

This is also not correct. When a message is sent, if it fails, it's retried on a new connection then and there, before being marked as an error (if the retry also fails). There is no batching in retries.

@MaximilianKohler MaximilianKohler deleted the patch-2 branch August 1, 2024 07:42
@MaximilianKohler
Contributor Author

So batch size had a minor role in previous versions but is largely meaningless now?

@knadh
Owner

knadh commented Aug 1, 2024

It comes in handy when working with a large number of subscribers. Pulling bigger batches (which will use more memory but reduce the number of DB queries) helps achieve more throughput.
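The memory-vs-queries trade-off can be made concrete with a bit of arithmetic: the number of database round-trips needed to pull all subscribers is the subscriber count divided by the batch size, rounded up. The `numQueries` helper below is a hypothetical illustration of that relationship, not a listmonk function:

```go
package main

import "fmt"

// numQueries returns how many DB round-trips are needed to fetch
// totalSubscribers rows when pulling batchSize rows per query
// (ceiling division). Larger batches mean fewer queries but more
// rows held in memory at once.
func numQueries(totalSubscribers, batchSize int) int {
	if batchSize <= 0 {
		return 0
	}
	return (totalSubscribers + batchSize - 1) / batchSize
}

func main() {
	// For 1,000,000 subscribers, a 10x larger batch size cuts the
	// query count by 10x at the cost of 10x the rows in memory.
	fmt.Println(numQueries(1_000_000, 1000))  // 1000 queries
	fmt.Println(numQueries(1_000_000, 10000)) // 100 queries
}
```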

@MaximilianKohler MaximilianKohler restored the patch-2 branch August 1, 2024 19:58
@knadh knadh merged commit d284e35 into knadh:master Aug 2, 2024
@knadh knadh added the documentation Improvements or additions to documentation label Aug 2, 2024
@MaximilianKohler MaximilianKohler deleted the patch-2 branch August 2, 2024 19:40