We can use https://github.com/open-telemetry/opentelemetry-collector-contrib/tree/main/cmd/oteltestbedcol to compare the performance of an OTLP pipeline that uses the batch processor against an OTLP pipeline with the batcher enabled in the OTLP exporter.
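For context, a rough sketch of the two Collector configurations being compared could look like the following. The batch processor keys (`timeout`, `send_batch_size`) are standard; the `batcher` keys on the exporter follow the experimental exporterhelper batcher and are an assumption here, since the exact key names vary between Collector versions. Endpoints and batch sizes are placeholders, not recommendations.

```yaml
# Variant A: batching done by the batch processor.
receivers:
  otlp:
    protocols:
      grpc:

processors:
  batch:
    timeout: 200ms
    send_batch_size: 8192

exporters:
  otlp:
    endpoint: localhost:4317  # placeholder backend endpoint
    tls:
      insecure: true

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp]
```

```yaml
# Variant B: no batch processor; batching handled inside the OTLP exporter.
# The `batcher` keys below are assumed from the experimental exporterhelper
# batcher config and may differ between Collector versions.
receivers:
  otlp:
    protocols:
      grpc:

exporters:
  otlp:
    endpoint: localhost:4317  # placeholder backend endpoint
    tls:
      insecure: true
    batcher:
      enabled: true
      flush_timeout: 200ms
      min_size_items: 8192

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: []
      exporters: [otlp]
```

The testbed would then drive the same load through both variants so that throughput and resource usage can be compared with the only difference being where batching happens.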
@dmitryax I can take this on, can you assign it to me?
@swiatekm, done. Thank you!
Can we reopen? I'd like to add benchmarks for more processing-heavy scenarios, as suggested by @sfc-gh-sili and @jmacd in the PR.