This repository has been archived by the owner on Nov 7, 2022. It is now read-only.

Add processor queue #197

Merged
merged 6 commits into census-instrumentation:master from add-processor-queue on Nov 17, 2018

Conversation

pjanotti

Adding a pipeline to the collector

This is the basis for the work of actually implementing the OC collector. At first it enables the following receivers:

1. OC Receiver
2. Jaeger
3. Zipkin

and connects them to the exporters.

The queue processor is added but not connected to the exporters by default, although there are
options to do so. It is not connected because the exporters do their own buffering, and the
queue is intended to work with senders that do no buffering of their own.
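
For illustration only, here is a minimal sketch of that idea in Go, assuming simplified placeholder types (Span, sender, queuedProcessor) rather than the actual collector code: a bounded channel buffers incoming batches, and a small worker pool drains it into a sender that does no buffering of its own.

package main

import (
	"fmt"
	"sync"
)

// Span is a placeholder for the span type flowing through the pipeline;
// the real collector uses tracepb.Span. It only keeps the sketch self-contained.
type Span struct {
	Name string
}

// sender models an exporter-facing component that does no buffering of its own,
// which is the case the queue processor is meant for.
type sender interface {
	Send(spans []*Span)
}

// queuedProcessor buffers incoming batches on a bounded channel and forwards
// them to the sender from a fixed pool of worker goroutines.
type queuedProcessor struct {
	queue chan []*Span
	next  sender
	wg    sync.WaitGroup
}

func newQueuedProcessor(next sender, queueSize, numWorkers int) *queuedProcessor {
	qp := &queuedProcessor{
		queue: make(chan []*Span, queueSize),
		next:  next,
	}
	for i := 0; i < numWorkers; i++ {
		qp.wg.Add(1)
		go func() {
			defer qp.wg.Done()
			for batch := range qp.queue {
				qp.next.Send(batch)
			}
		}()
	}
	return qp
}

// ProcessSpans enqueues a batch without blocking; if the queue is full the
// batch is dropped (a real implementation would record the drop in metrics).
func (qp *queuedProcessor) ProcessSpans(batch []*Span) bool {
	select {
	case qp.queue <- batch:
		return true
	default:
		return false
	}
}

// Stop closes the queue and waits for in-flight batches to drain.
func (qp *queuedProcessor) Stop() {
	close(qp.queue)
	qp.wg.Wait()
}

type printSender struct{}

func (printSender) Send(spans []*Span) { fmt.Printf("sent %d spans\n", len(spans)) }

func main() {
	qp := newQueuedProcessor(printSender{}, 100, 4)
	qp.ProcessSpans([]*Span{{Name: "a"}, {Name: "b"}})
	qp.Stop()
}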

As previously indicated, there is some duplication of code with the agent; this should be
reduced as we implement the interfaces and the project coalesces around certain patterns.

The queue is still missing its metrics and more tests.

Close #169
Close #174

Paulo Janotti added 2 commits November 14, 2018 18:25
@pjanotti pjanotti self-assigned this Nov 15, 2018
@pjanotti pjanotti requested review from odeke-em and maynk November 15, 2018 03:02
README.md
internal/collector/jaeger/receiver.go
cmd/occollector/app/collector/collector.go
internal/collector/opencensus/receiver.go
internal/collector/processor/queued_processor.go
@odeke-em (Member)

@pjanotti sorry for the late review responses. From our discussions today after the meeting, and also with @bogdandrutu, I've done some internal reflection and will be updating some interfaces and structure later this weekend and early next week. For now, though, I'll do a review of this code, but as we discussed it will need some changes in follow-up PRs.

@odeke-em (Member) left a comment

Thanks for the updates @pjanotti, I've added some suggestions; once those are addressed, LGTM!

CONTRIBUTING.md
internal/collector/processor/multi_processor.go
internal/collector/processor/options.go
internal/collector/processor/processor.go
internal/collector/processor/queued_processor.go

spans := []*tracepb.Span{{}}
wantBatches := 10
wg.Add(wantBatches)
Member

Does this test ever terminate? I see wg.Add(n) but never wg.Done() called n times.

Author

😄 Yes, it terminates (Travis agrees). The wait group is passed to the mock and the mock is in charge of calling wg.Done().
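
To make the pattern concrete, here is a minimal, self-contained sketch of how such a test can terminate, assuming a simplified mock and a placeholder span type (the real test uses tracepb.Span and a richer mockConcurrentSpanProcessor): the test calls wg.Add up front, and the mock balances it with one wg.Done per batch.

package processor

import (
	"sync"
	"testing"
)

// span is a placeholder so the sketch compiles on its own; the real test
// uses tracepb.Span.
type span struct{}

// mockSpanProcessor is a simplified stand-in for the mock in this PR:
// the WaitGroup is handed in by the test and the mock calls Done for
// every batch it processes.
type mockSpanProcessor struct {
	wg *sync.WaitGroup
}

func (m *mockSpanProcessor) ProcessSpans(batch []*span) {
	// One Done per batch balances the wg.Add(wantBatches) done by the test.
	m.wg.Done()
}

func TestAllBatchesProcessed(t *testing.T) {
	var wg sync.WaitGroup
	wantBatches := 10
	wg.Add(wantBatches)

	m := &mockSpanProcessor{wg: &wg}
	for i := 0; i < wantBatches; i++ {
		go m.ProcessSpans([]*span{{}})
	}

	// Returns once the mock has called Done for every batch.
	wg.Wait()
}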

Member

LOL gotcha, that confused me a bit

Member

Later, perhaps it would be nice to make mockConcurrentSpanProcessor handle all the synchronization logic; otherwise we are leaking the abstraction, with wg.Add and wg.Wait in the test but no wg.Done.

Perhaps methods like the following (a rough sketch is included after this list):

  • (mockSpanProcessor) runConcurrently(fn func()), which handles doing wg.Add and then wg.Done
  • (mockSpanProcessor) awaitAsyncProcessing(), which invokes wg.Wait

This ensures that we don't have to expose such internals or ever have to worry about termination.
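
A rough sketch of what those two helpers could look like, assuming a simplified mockSpanProcessor that owns its own WaitGroup (the names follow the suggestion above; this is not the code that actually landed in the PR):

package processor

import "sync"

// mockSpanProcessor hides the WaitGroup so tests never touch
// wg.Add, wg.Done, or wg.Wait directly.
type mockSpanProcessor struct {
	wg sync.WaitGroup
}

// runConcurrently pairs the Add with the Done, so callers cannot forget
// either half of the bookkeeping.
func (m *mockSpanProcessor) runConcurrently(fn func()) {
	m.wg.Add(1)
	go func() {
		defer m.wg.Done()
		fn()
	}()
}

// awaitAsyncProcessing blocks until every function started via
// runConcurrently has finished.
func (m *mockSpanProcessor) awaitAsyncProcessing() {
	m.wg.Wait()
}

A test would then call m.runConcurrently(func() { ... }) once per batch and finish with m.awaitAsyncProcessing().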

Author

good suggestion, done.

}
wantSpans += len(spans)
spans = append(spans, &tracepb.Span{})
go goFn(batch)
Member

We never know when this goroutine terminates; it seems like you actually meant to use the sync.WaitGroup with this goroutine.

Author

See above.

Member

Gotcha gotcha, thanks!

}

// Wait until all batches received
wg.Wait()
Member

I don't see wg.Done being called n times; it looks like something is missing, and I'd be surprised if we ever get past this point.

Author

Ditto

@odeke-em (Member) left a comment

Thank you @pjanotti, LGTM with just one last suggestion to avoid a leaky abstraction with the mockSpanProcessor in the test.

@pjanotti (Author)

Thanks for the review @odeke-em and @maynk! I will look at the suggestion in about 45 minutes; if it is something small we may get it done right away.

@pjanotti pjanotti merged commit 4f3bb2c into census-instrumentation:master Nov 17, 2018
@pjanotti pjanotti deleted the add-processor-queue branch November 17, 2018 00:13
fivesheep pushed a commit to fivesheep/opencensus-service that referenced this pull request Jun 12, 2019