Kafka-Python Serialization Metrics (#628)
* Add more metrics, consumer, and producer tests

Co-authored-by: Timothy Pansino <[email protected]>
Co-authored-by: Hannah Stepanek <[email protected]>

* Remove breakpoint

* Add DT accepted validator

* Fix issue with holdover transaction

Co-authored-by: Lalleh Rafeei <[email protected]>
Co-authored-by: Hannah Stepanek <[email protected]>

* [Mega-Linter] Apply linters fixes

* Fix broken producer error test

* [Mega-Linter] Apply linters fixes

* Working on serializer instrumentation

* Working on testing

* Squashed commit of the following:

commit 0b7b56b
Author: Tim Pansino <[email protected]>
Date:   Tue Sep 20 15:26:49 2022 -0700

    Assert records counts for consumer

commit c0d32bb
Author: Tim Pansino <[email protected]>
Date:   Tue Sep 20 15:05:20 2022 -0700

    Add producer requirement to consumers

    Co-authored-by: Lalleh Rafeei <[email protected]>
    Co-authored-by: Hannah Stepanek <[email protected]>
    Co-authored-by: Uma Annamalai <[email protected]>

commit 9e94920
Author: Tim Pansino <[email protected]>
Date:   Tue Sep 20 14:49:57 2022 -0700

    Remove commented out code

commit b2f1257
Author: Tim Pansino <[email protected]>
Date:   Tue Sep 20 14:49:47 2022 -0700

    Fix exception tracebacks for py27

commit 686c9ae
Author: Tim Pansino <[email protected]>
Date:   Tue Sep 20 14:27:40 2022 -0700

    Fix errors in test and tox matrix

    Co-authored-by: Hannah Stepanek <[email protected]>
    Co-authored-by: Uma Annamalai <[email protected]>
    Co-authored-by: Lalleh Rafeei <[email protected]>

commit 7f92c6d
Author: Hannah Stepanek <[email protected]>
Date:   Tue Sep 20 13:54:48 2022 -0700

    Fix Py2.7 kafka consumer first access issue

commit 4245201
Author: Hannah Stepanek <[email protected]>
Date:   Tue Sep 20 12:34:35 2022 -0700

    Add sudo to tz info

commit 2251a0a
Author: Hannah Stepanek <[email protected]>
Date:   Tue Sep 20 12:26:36 2022 -0700

    Use ubuntu-latest

commit 1ca1175
Author: Hannah Stepanek <[email protected]>
Date:   Tue Sep 20 12:15:59 2022 -0700

    Grab librdkafka from confluent

commit bb0a192
Author: Tim Pansino <[email protected]>
Date:   Tue Sep 20 10:37:33 2022 -0700

    Fixed cutting release from lsb

commit 3cf3852
Author: Hannah Stepanek <[email protected]>
Date:   Tue Sep 20 10:19:24 2022 -0700

    Fixup: librdkafka installed from universe

commit bf20359
Author: Tim Pansino <[email protected]>
Date:   Mon Sep 19 16:46:23 2022 -0700

    Use lsb to install librdkafka-dev

commit a85e3fd
Merge: 7fc2b89 d5cf9a0
Author: Timothy Pansino <[email protected]>
Date:   Mon Sep 19 16:34:19 2022 -0700

    Merge branch 'develop-kafka' into feature-confluent-kafka

commit 7fc2b89
Author: Tim Pansino <[email protected]>
Date:   Mon Sep 19 16:30:19 2022 -0700

    Fix package name

commit f25c59d
Author: Hannah Stepanek <[email protected]>
Date:   Mon Sep 19 16:00:07 2022 -0700

    Specify later version of librdkafka

commit 6be9b43
Author: Hannah Stepanek <[email protected]>
Date:   Mon Sep 19 09:26:39 2022 -0700

    Fix removing client_id from incorrect kafka

commit d658ef1
Author: Hannah Stepanek <[email protected]>
Date:   Fri Sep 16 10:50:04 2022 -0700

    Add install of librdkafka-dev for kafka

commit 940f9f4
Author: Tim Pansino <[email protected]>
Date:   Fri Sep 9 16:09:21 2022 -0700

    Clean up names

commit 8bbed46
Author: Tim Pansino <[email protected]>
Date:   Fri Sep 9 14:58:21 2022 -0700

    Serialization timing

commit 761c753
Merge: bccb321 7897a99
Author: Hannah Stepanek <[email protected]>
Date:   Fri Sep 9 12:51:29 2022 -0700

    Merge branch 'feature-confluent-kafka' of github.com:newrelic/newrelic-python-agent into feature-confluent-kafka

commit bccb321
Author: Hannah Stepanek <[email protected]>
Date:   Fri Sep 9 12:51:09 2022 -0700

    Fix test_consumer_errors tests

commit 7897a99
Author: hmstepanek <[email protected]>
Date:   Fri Sep 9 19:04:58 2022 +0000

    [Mega-Linter] Apply linters fixes

commit 34c084c
Author: Hannah Stepanek <[email protected]>
Date:   Fri Sep 9 12:01:18 2022 -0700

    Fix consumer tests & reorg fixtures

commit ff77d90
Author: Tim Pansino <[email protected]>
Date:   Thu Sep 8 14:53:28 2022 -0700

    Consumer testing setup

commit 9f1451e
Author: Tim Pansino <[email protected]>
Date:   Thu Sep 8 13:25:15 2022 -0700

    Confluent kafka test setup

commit 74c443c
Author: Tim Pansino <[email protected]>
Date:   Thu Sep 8 11:49:03 2022 -0700

    Starting work on confluent kafka

* Clean up tests to refactor out fixtures

* Refactor and test serialization tracing

* Finish fixing testing for serialization.

Co-authored-by: Lalleh Rafeei <[email protected]>
Co-authored-by: Hannah Stepanek <[email protected]>
Co-authored-by: Uma Annamalai <[email protected]>

* Add serializer object test

* Remove unused test files

* Starting merge of confluent kafka to kafka python

* Make message trace terminal optional

* Clean up confluent_kafka tests

* Update kafkapython tests

* Fix failing tests

* Finish kafkapython serialization

Co-authored-by: Lalleh Rafeei <[email protected]>
Co-authored-by: Hannah Stepanek <[email protected]>
Co-authored-by: Uma Annamalai <[email protected]>

* Clean up tests

* Remove kafkapython deserialization metrics

* Fix py2 testing differences

* Add multiple transaction test

* Rename fixtures

* Fix multiple transaction consumer failure

Co-authored-by: Lalleh Rafeei <[email protected]>
Co-authored-by: Timothy Pansino <[email protected]>
Co-authored-by: Hannah Stepanek <[email protected]>
Co-authored-by: Lalleh Rafeei <[email protected]>
Co-authored-by: Uma Annamalai <[email protected]>
Co-authored-by: Hannah Stepanek <[email protected]>
7 people committed Sep 27, 2022
1 parent 75a8c6e commit 8beb0cc
Showing 8 changed files with 456 additions and 156 deletions.
17 changes: 10 additions & 7 deletions newrelic/api/message_trace.py
@@ -28,14 +28,16 @@ class MessageTrace(CatHeaderMixin, TimeTrace):
cat_appdata_key = "NewRelicAppData"
cat_synthetics_key = "NewRelicSynthetics"

-def __init__(self, library, operation, destination_type, destination_name, params=None, **kwargs):
+def __init__(self, library, operation, destination_type, destination_name, params=None, terminal=True, **kwargs):
parent = kwargs.pop("parent", None)
source = kwargs.pop("source", None)
if kwargs:
raise TypeError("Invalid keyword arguments:", kwargs)

super(MessageTrace, self).__init__(parent=parent, source=source)

self.terminal = terminal

self.library = library
self.operation = operation

@@ -69,7 +71,7 @@ def __repr__(self):
)

def terminal_node(self):
-return True
+return self.terminal

def create_node(self):
return MessageNode(
@@ -89,7 +91,7 @@ def create_node(self):
)


-def MessageTraceWrapper(wrapped, library, operation, destination_type, destination_name, params={}):
+def MessageTraceWrapper(wrapped, library, operation, destination_type, destination_name, params={}, terminal=True):
def _nr_message_trace_wrapper_(wrapped, instance, args, kwargs):
wrapper = async_wrapper(wrapped)
if not wrapper:
@@ -131,7 +133,7 @@ def _nr_message_trace_wrapper_(wrapped, instance, args, kwargs):
else:
_destination_name = destination_name

-trace = MessageTrace(_library, _operation, _destination_type, _destination_name, params={}, parent=parent, source=wrapped)
+trace = MessageTrace(_library, _operation, _destination_type, _destination_name, params={}, terminal=terminal, parent=parent, source=wrapped)

if wrapper: # pylint: disable=W0125,W0126
return wrapper(wrapped, trace)(*args, **kwargs)
@@ -142,18 +144,19 @@ def _nr_message_trace_wrapper_(wrapped, instance, args, kwargs):
return FunctionWrapper(wrapped, _nr_message_trace_wrapper_)


-def message_trace(library, operation, destination_type, destination_name, params={}):
+def message_trace(library, operation, destination_type, destination_name, params={}, terminal=True):
return functools.partial(
MessageTraceWrapper,
library=library,
operation=operation,
destination_type=destination_type,
destination_name=destination_name,
params=params,
terminal=terminal,
)


-def wrap_message_trace(module, object_path, library, operation, destination_type, destination_name, params={}):
+def wrap_message_trace(module, object_path, library, operation, destination_type, destination_name, params={}, terminal=True):
wrap_object(
-module, object_path, MessageTraceWrapper, (library, operation, destination_type, destination_name, params)
+module, object_path, MessageTraceWrapper, (library, operation, destination_type, destination_name, params, terminal)
)
6 changes: 6 additions & 0 deletions newrelic/core/message_node.py
@@ -51,6 +51,12 @@ def time_metrics(self, stats, root, parent):
yield TimeMetric(name=name, scope=root.path,
duration=self.duration, exclusive=self.exclusive)

# Now for the children, if the trace is not terminal.

for child in self.children:
for metric in child.time_metrics(stats, root, self):
yield metric

def trace_node(self, stats, root, connections):
name = root.string_table.cache(self.name)

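The two hunks above work together: the message trace's `terminal_node()` becomes configurable, and `MessageNode.time_metrics` now also yields metrics for child nodes. A minimal, hypothetical model of this behavior (the `Node` class below is an illustration, not the agent's actual `MessageTrace`/`MessageNode` classes):

```python
# Hypothetical sketch: a terminal trace hides its children from metric
# harvesting, while a non-terminal trace reports child metrics too.
class Node:
    def __init__(self, name, terminal=True, children=None):
        self.name = name
        self.terminal = terminal
        self.children = children or []

    def terminal_node(self):
        return self.terminal

    def time_metrics(self):
        yield self.name
        # Now for the children, if the trace is not terminal.
        if not self.terminal_node():
            for child in self.children:
                yield from child.time_metrics()

# With terminal=False (as wrap_KafkaProducer_send now passes), serializer
# traces nested under the produce trace are reported; with the old
# hard-coded True they would be dropped.
producer_trace = Node(
    "MessageBroker/Kafka/Topic/Produce/Named/my-topic",
    terminal=False,
    children=[Node("Serialization/Key"), Node("Serialization/Value")],
)
metrics = list(producer_trace.time_metrics())
```
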
179 changes: 132 additions & 47 deletions newrelic/hooks/messagebroker_kafkapython.py
@@ -13,12 +13,19 @@
# limitations under the License.
import sys

from kafka.serializer import Serializer

from newrelic.api.application import application_instance
from newrelic.api.function_trace import FunctionTraceWrapper
from newrelic.api.message_trace import MessageTrace
from newrelic.api.message_transaction import MessageTransaction
-from newrelic.api.time_trace import notice_error
+from newrelic.api.time_trace import current_trace, notice_error
from newrelic.api.transaction import current_transaction
-from newrelic.common.object_wrapper import wrap_function_wrapper
+from newrelic.common.object_wrapper import (
+    ObjectProxy,
+    function_wrapper,
+    wrap_function_wrapper,
+)

HEARTBEAT_POLL = "MessageBroker/Kafka/Heartbeat/Poll"
HEARTBEAT_SENT = "MessageBroker/Kafka/Heartbeat/Sent"
@@ -47,6 +54,7 @@ def wrap_KafkaProducer_send(wrapped, instance, args, kwargs):
destination_type="Topic",
destination_name=topic or "Default",
source=wrapped,
terminal=False,
) as trace:
dt_headers = [(k, v.encode("utf-8")) for k, v in trace.generate_request_headers(transaction)]
headers.extend(dt_headers)
@@ -57,49 +65,6 @@ def wrap_KafkaProducer_send(wrapped, instance, args, kwargs):
raise


def metric_wrapper(metric_name, check_result=False):
def _metric_wrapper(wrapped, instance, args, kwargs):
result = wrapped(*args, **kwargs)

application = application_instance(activate=False)
if application:
if not check_result or check_result and result:
# If the result does not need to be validated, send the metric.
# If it does need to be validated, only send the metric when it is True.
application.record_custom_metric(metric_name, 1)

return result

return _metric_wrapper


def instrument_kafka_heartbeat(module):
if hasattr(module, "Heartbeat"):
if hasattr(module.Heartbeat, "poll"):
wrap_function_wrapper(module, "Heartbeat.poll", metric_wrapper(HEARTBEAT_POLL))

if hasattr(module.Heartbeat, "fail_heartbeat"):
wrap_function_wrapper(module, "Heartbeat.fail_heartbeat", metric_wrapper(HEARTBEAT_FAIL))

if hasattr(module.Heartbeat, "sent_heartbeat"):
wrap_function_wrapper(module, "Heartbeat.sent_heartbeat", metric_wrapper(HEARTBEAT_SENT))

if hasattr(module.Heartbeat, "received_heartbeat"):
wrap_function_wrapper(module, "Heartbeat.received_heartbeat", metric_wrapper(HEARTBEAT_RECEIVE))

if hasattr(module.Heartbeat, "session_timeout_expired"):
wrap_function_wrapper(
module,
"Heartbeat.session_timeout_expired",
metric_wrapper(HEARTBEAT_SESSION_TIMEOUT, check_result=True),
)

if hasattr(module.Heartbeat, "poll_timeout_expired"):
wrap_function_wrapper(
module, "Heartbeat.poll_timeout_expired", metric_wrapper(HEARTBEAT_POLL_TIMEOUT, check_result=True)
)


def wrap_kafkaconsumer_next(wrapped, instance, args, kwargs):
if hasattr(instance, "_nr_transaction") and not instance._nr_transaction.stopped:
instance._nr_transaction.__exit__(*sys.exc_info())
@@ -110,7 +75,12 @@ def wrap_kafkaconsumer_next(wrapped, instance, args, kwargs):
# StopIteration is an expected error, indicating the end of an iterable,
# that should not be captured.
if not isinstance(e, StopIteration):
-notice_error()
+if current_transaction():
+    # Report error on existing transaction if there is one
+    notice_error()
+else:
+    # Report error on application
+    notice_error(application=application_instance(activate=False))
raise

if record:
@@ -177,11 +147,126 @@ def wrap_kafkaconsumer_next(wrapped, instance, args, kwargs):
return record


def wrap_KafkaProducer_init(wrapped, instance, args, kwargs):
get_config_key = lambda key: kwargs.get(key, instance.DEFAULT_CONFIG[key]) # noqa: E731

kwargs["key_serializer"] = wrap_serializer(
instance, "Serialization/Key", "MessageBroker", get_config_key("key_serializer")
)
kwargs["value_serializer"] = wrap_serializer(
instance, "Serialization/Value", "MessageBroker", get_config_key("value_serializer")
)

return wrapped(*args, **kwargs)


class NewRelicSerializerWrapper(ObjectProxy):
def __init__(self, wrapped, serializer_name, group_prefix):
ObjectProxy.__init__.__get__(self)(wrapped)

self._nr_serializer_name = serializer_name
self._nr_group_prefix = group_prefix

def serialize(self, topic, object):
wrapped = self.__wrapped__.serialize
args = (topic, object)
kwargs = {}

if not current_transaction():
return wrapped(*args, **kwargs)

group = "%s/Kafka/Topic" % self._nr_group_prefix
name = "Named/%s/%s" % (topic, self._nr_serializer_name)

return FunctionTraceWrapper(wrapped, name=name, group=group)(*args, **kwargs)


def wrap_serializer(client, serializer_name, group_prefix, serializer):
@function_wrapper
def _wrap_serializer(wrapped, instance, args, kwargs):
transaction = current_transaction()
if not transaction:
return wrapped(*args, **kwargs)

topic = "Unknown"
if isinstance(transaction, MessageTransaction):
topic = transaction.destination_name
else:
# Find parent message trace to retrieve topic
message_trace = current_trace()
while message_trace is not None and not isinstance(message_trace, MessageTrace):
message_trace = message_trace.parent
if message_trace:
topic = message_trace.destination_name

group = "%s/Kafka/Topic" % group_prefix
name = "Named/%s/%s" % (topic, serializer_name)

return FunctionTraceWrapper(wrapped, name=name, group=group)(*args, **kwargs)

try:
# Apply wrapper to serializer
if serializer is None:
# Do nothing
return serializer
elif isinstance(serializer, Serializer):
return NewRelicSerializerWrapper(serializer, group_prefix=group_prefix, serializer_name=serializer_name)
else:
# Wrap callable in wrapper
return _wrap_serializer(serializer)
except Exception:
return serializer # Avoid crashes from immutable serializers
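The callable branch of `wrap_serializer` above times each serializer call under a topic-scoped metric name. A pure-Python sketch of that wrapping pattern, with a plain dict standing in for the agent's function-trace recorder (all names here are illustrative, not the agent's API):

```python
import functools
import time

def timed_serializer(serializer, name, timings):
    # Sketch of wrap_serializer's callable branch: leave None untouched
    # (kafka-python treats a None serializer as a no-op), otherwise wrap
    # the callable and record elapsed time under a metric-style name.
    if serializer is None:
        return None

    @functools.wraps(serializer)
    def wrapper(value):
        start = time.monotonic()
        try:
            return serializer(value)
        finally:
            timings.setdefault(name, []).append(time.monotonic() - start)

    return wrapper

timings = {}
encode = timed_serializer(
    lambda v: v.encode("utf-8"),
    "MessageBroker/Kafka/Topic/Named/my-topic/Serialization/Value",
    timings,
)
payload = encode("hello")
```

The real instrumentation additionally resolves the topic at call time from the current `MessageTransaction` or an enclosing `MessageTrace`, since a producer may send to many topics.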


def metric_wrapper(metric_name, check_result=False):
def _metric_wrapper(wrapped, instance, args, kwargs):
result = wrapped(*args, **kwargs)

application = application_instance(activate=False)
if application:
if not check_result or check_result and result:
# If the result does not need to be validated, send the metric.
# If it does need to be validated, only send the metric when it is True.
application.record_custom_metric(metric_name, 1)

return result

return _metric_wrapper
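The guard in `metric_wrapper` relies on Python operator precedence: `and` binds tighter than `or`, so `not check_result or check_result and result` parses as `(not check_result) or (check_result and result)` — record unconditionally when `check_result` is False, and only on a truthy result when it is True. A small check of that behavior:

```python
# Demonstrates the precedence of the guard used in metric_wrapper.
def should_record(check_result, result):
    return not check_result or check_result and result

cases = {
    (False, False): True,   # result ignored when check_result is False
    (False, True): True,
    (True, False): False,   # result must be truthy when check_result is True
    (True, True): True,
}
for (check, res), expected in cases.items():
    assert bool(should_record(check, res)) == expected
```
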


def instrument_kafka_producer(module):
if hasattr(module, "KafkaProducer"):
wrap_function_wrapper(module, "KafkaProducer.__init__", wrap_KafkaProducer_init)
wrap_function_wrapper(module, "KafkaProducer.send", wrap_KafkaProducer_send)


def instrument_kafka_consumer_group(module):
if hasattr(module, "KafkaConsumer"):
-wrap_function_wrapper(module.KafkaConsumer, "__next__", wrap_kafkaconsumer_next)
+wrap_function_wrapper(module, "KafkaConsumer.__next__", wrap_kafkaconsumer_next)


def instrument_kafka_heartbeat(module):
if hasattr(module, "Heartbeat"):
if hasattr(module.Heartbeat, "poll"):
wrap_function_wrapper(module, "Heartbeat.poll", metric_wrapper(HEARTBEAT_POLL))

if hasattr(module.Heartbeat, "fail_heartbeat"):
wrap_function_wrapper(module, "Heartbeat.fail_heartbeat", metric_wrapper(HEARTBEAT_FAIL))

if hasattr(module.Heartbeat, "sent_heartbeat"):
wrap_function_wrapper(module, "Heartbeat.sent_heartbeat", metric_wrapper(HEARTBEAT_SENT))

if hasattr(module.Heartbeat, "received_heartbeat"):
wrap_function_wrapper(module, "Heartbeat.received_heartbeat", metric_wrapper(HEARTBEAT_RECEIVE))

if hasattr(module.Heartbeat, "session_timeout_expired"):
wrap_function_wrapper(
module,
"Heartbeat.session_timeout_expired",
metric_wrapper(HEARTBEAT_SESSION_TIMEOUT, check_result=True),
)

if hasattr(module.Heartbeat, "poll_timeout_expired"):
wrap_function_wrapper(
module, "Heartbeat.poll_timeout_expired", metric_wrapper(HEARTBEAT_POLL_TIMEOUT, check_result=True)
)
