
ext/system-metrics: adding instrumentation to collect system metrics #652

Merged · 21 commits · May 29, 2020

Conversation

@codeboten (Contributor) commented May 5, 2020

Adding an extension to provide users an easy mechanism to collect metrics for their system. Some thoughts around this PR:

  • we could make which metrics are observed configurable
  • the semantic conventions around metric naming are still being defined so they are likely to change
  • the tests currently check the aggregators directly; I would like to change this to use the in-memory metrics exporter

This PR includes changes that I pulled out into the separate PR #653. I'll resolve conflicts if that one gets merged first.
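The observer-callback pattern the extension is built around can be sketched as follows. Both psutil and the OpenTelemetry metrics API are stubbed with invented stand-ins (`FakeObserver`, `fake_virtual_memory`) so the sketch is self-contained, not the extension's actual code:

```python
class FakeObserver:
    """Stand-in for an OpenTelemetry observer instrument."""

    def __init__(self):
        self.observations = []

    def observe(self, value, labels):
        self.observations.append((value, labels))


def fake_virtual_memory():
    # Stand-in for psutil.virtual_memory(); returns used/free bytes.
    return {"used": 4_000_000_000, "free": 2_000_000_000}


def update_memory(observer):
    """Callback the metrics SDK would invoke on each collection interval."""
    system_memory = fake_virtual_memory()
    for metric in ("used", "free"):
        observer.observe(system_memory[metric], {"type": metric})


observer = FakeObserver()
update_memory(observer)
print(observer.observations)
# [(4000000000, {'type': 'used'}), (2000000000, {'type': 'free'})]
```

The SDK pulls values by invoking the callback, rather than the extension pushing measurements itself.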

@codeboten codeboten added WIP Work in progress ext labels May 5, 2020
Adding an extension to provide users an easy mechanism to collect metrics for their system. As part of this change, I also added an InMemoryMetricsExporter, and ended up adding meter_provider and memory_metrics_exporter to TestBase.
@codeboten codeboten marked this pull request as ready for review May 6, 2020 15:49
@codeboten codeboten requested a review from a team May 6, 2020 15:49
@codeboten codeboten removed the WIP Work in progress label May 6, 2020
@codeboten codeboten changed the title [WIP] ext/system-metrics: adding instrumentation to collect system metrics ext/system-metrics: adding instrumentation to collect system metrics May 6, 2020
@mauriciovasquezbernal (Member) left a comment:

Looks pretty good. Minor comments. I didn't review the pieces ported to #653.

@jmacd commented May 7, 2020

> we could make which metrics are observed configurable

I am hoping this becomes a general function of the SDK: through the Views API you should be able to disable or configure any instrument. Ideally, each plugin shouldn't expose its own way to configure the instruments.
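The idea can be illustrated with a toy sketch (the `Views` class and `is_enabled` method here are invented for illustration, not the actual SDK Views API): the SDK-level view, rather than each plugin, decides which instruments reach the exporter.

```python
class Views:
    """Toy view registry: the SDK decides which instruments are enabled."""

    def __init__(self, disabled=()):
        self._disabled = set(disabled)

    def is_enabled(self, instrument_name):
        return instrument_name not in self._disabled


# The user disables one instrument centrally; the plugin stays unchanged.
views = Views(disabled={"system.memory.free"})

readings = {
    "system.memory.used": 4_000_000_000,
    "system.memory.free": 2_000_000_000,
}

# Only enabled instruments survive the export path.
exported = {name: value for name, value in readings.items()
            if views.is_enabled(name)}
print(exported)  # {'system.memory.used': 4000000000}
```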

@codeboten codeboten added the needs reviewers PRs with this label are ready for review and needs people to review to move forward. label May 14, 2020
@ocelotl (Contributor) left a comment:

Looking good! Comments are non-blocking but worth taking a look 👍

description="System memory",
unit="bytes",
value_type=int,
label_keys=self._labels.keys(),
Contributor commented:

I don't think you need to supply label_keys. Originally, these served as "recommended keys": if a metric was recorded with labels not in label_keys, those labels would be dropped. Since you are adding the "type" label in each of the observes, it would theoretically be dropped here. However, this functionality is going to be done in views (so label_keys here doesn't actually do anything), and per the specs we are most likely going to get rid of this field anyway.

Contributor Author replied:

Removed

observer: the observer to update
"""
system_memory = psutil.virtual_memory()
_metrics = [
Contributor commented:

I would define these outside of the callback. We don't want to keep instantiating these.
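The suggestion amounts to hoisting the constant list out of the callback so it is built once rather than on every collection cycle. A minimal sketch, with invented names (`MemoryObserverCallback`, `ListObserver`) standing in for the PR's actual classes:

```python
class MemoryObserverCallback:
    # Built once at class-definition time, not per invocation.
    _MEMORY_FIELDS = ("used", "free")

    def __call__(self, observer, readings):
        # `readings` stands in for psutil.virtual_memory(); labels use
        # the "type" key as in the PR.
        for field in self._MEMORY_FIELDS:
            observer.observe(readings[field], {"type": field})


class ListObserver:
    """Minimal observer that records every observation."""

    def __init__(self):
        self.seen = []

    def observe(self, value, labels):
        self.seen.append((value, labels))


callback = MemoryObserverCallback()
obs = ListObserver()
callback(obs, {"used": 100, "free": 50})
print(obs.seen)  # [(100, {'type': 'used'}), (50, {'type': 'free'})]
```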

Contributor commented:

I think we should add an API to modify the _metrics for each system metric. This way, users that use this library will be able to customize the types of metrics for each system metric they export.

Contributor Author replied:

I agree, I've updated the constructor to support a config parameter that allows other metrics to be configured. I think it can be improved, but maybe it's enough to get started?
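A minimal sketch of that constructor behavior, assuming a hypothetical default/override shape (`DEFAULT_CONFIG` and `resolve_config` are invented for illustration, not the PR's actual schema):

```python
# Hypothetical defaults: which sub-metrics to observe per system metric.
DEFAULT_CONFIG = {
    "system_memory": ["total", "available", "used", "free"],
    "system_cpu": ["user", "system", "idle"],
}


def resolve_config(config=None):
    """Return the user-supplied config if given, else a copy of the defaults."""
    if config is None:
        return {key: list(values) for key, values in DEFAULT_CONFIG.items()}
    return config


print(resolve_config()["system_memory"])
# ['total', 'available', 'used', 'free']
print(resolve_config({"system_memory": ["used"]}))
# {'system_memory': ['used']}
```

Copying the defaults keeps callers from mutating the shared `DEFAULT_CONFIG` by accident.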

"used",
"free",
]
for metric in _metrics:
Contributor commented:

I would use "type" instead of "metrics".

Contributor Author replied:

updated

@codeboten commented:

@lzchen, I addressed your comments, if you get a chance to review.

self.controller = PushController(
meter=self.meter, exporter=exporter, interval=interval
)
if config is None:
Contributor commented:

Is there somewhere that we can let the user know that passing in their own config is possible? Maybe an example file?
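A hypothetical example file along those lines; `SystemMetrics` and the config keys are assumptions about the extension's interface, not verbatim from this PR:

```python
# example_system_metrics_config.py (hypothetical)
# Shows that users can pass their own config instead of the defaults.

config = {
    # Only observe the memory types this deployment cares about.
    "system_memory": ["used", "free"],
}

# Hypothetical wiring, commented out since the exact import path and
# constructor signature are assumptions:
# from opentelemetry.ext.system_metrics import SystemMetrics
# SystemMetrics(exporter, interval=30, config=config)
```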

@lzchen (Contributor) left a comment:

LGTM. Just one comment about an example file.

@codeboten codeboten merged commit bfc54bf into open-telemetry:master May 29, 2020
srikanthccv pushed a commit to srikanthccv/opentelemetry-python that referenced this pull request Nov 1, 2020
…lemetry#652)

* feat: port mongodb-core plugin to mongodb open-telemetry#622

* chore: address PR comments