
Commit c3705ea: Perf tests for monitor
rakshith91 committed Mar 3, 2021 (1 parent: 303ff1f)
Showing 3 changed files with 95 additions and 0 deletions.
@@ -0,0 +1,47 @@
# Monitor Exporter Performance Tests

In order to run the performance tests, the `azure-devtools` package must be installed. This is done as part of the `dev_requirements`.
Start by creating a new virtual environment for your perf tests. This will need to be a Python 3 environment, preferably >=3.7.

### Setup for test resources

These tests will run against a pre-configured Application Insights resource. The following environment variable will need to be set for the tests to access the live resources:
```
APPLICATIONINSIGHTS_CONNECTION_STRING=<connection string for app insights>
```
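The committed test reads this variable through the perf framework's `get_from_env` helper (see the test source further down). As a quick pre-flight check before running `perfstress`, here is a minimal sketch, assuming nothing beyond the standard library, to confirm the variable is visible to the process:

```python
import os

# Fail fast if the connection string is missing; the perf test's get_from_env
# helper performs an equivalent check when the test starts up.
connection_string = os.environ.get("APPLICATIONINSIGHTS_CONNECTION_STRING")
if not connection_string:
    raise RuntimeError("APPLICATIONINSIGHTS_CONNECTION_STRING is not set")
print("Connection string found.")
```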

### Setup for perf test runs

```cmd
(env) ~/azure-monitor-opentelemetry-exporter> pip install -r dev_requirements.txt
(env) ~/azure-monitor-opentelemetry-exporter> pip install -e .
```

## Test commands

```cmd
(env) ~/azure-monitor-opentelemetry-exporter> cd tests
(env) ~/azure-monitor-opentelemetry-exporter/tests> perfstress
```

### Common perf command line options
These options are available for all perf tests:
- `--duration=10` Number of seconds to run as many operations (the "run" function) as possible. Default is 10.
- `--iterations=1` Number of test iterations to run. Default is 1.
- `--parallel=1` Number of tests to run in parallel. Default is 1.
- `--warm-up=5` Number of seconds to spend warming up the connection before measuring begins. Default is 5.
- `--sync` Whether to run the tests in sync or async. Default is False (async); pass this flag explicitly to run the synchronous test variants.
- `--no-cleanup` Whether to keep newly created resources after test run. Default is False (resources will be deleted).

### MonitorExporter Test options
These options are available for all monitor exporter perf tests:
- `--num-traces` Number of traces to be collected. Defaults to 10. The sketch below shows how this option surfaces inside the test.
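The option is plumbed through the perf framework's argparse integration: `add_arguments` registers the flag, and the parsed value comes back on `self.args`. A minimal sketch of that pattern (illustrative only; the option name mirrors the test committed below, and exposing parsed options as `self.args` is how the framework's perf tests consume them):

```python
from azure_devtools.perfstress_tests import PerfStressTest

class ExampleOptionTest(PerfStressTest):
    """Illustrative only: shows how a CLI option reaches the test body."""

    def run_sync(self):
        # Parsed command-line options are exposed on self.args by the framework.
        for _ in range(self.args.num_traces):
            pass  # one measured operation per trace in the real test

    @staticmethod
    def add_arguments(parser):
        super(ExampleOptionTest, ExampleOptionTest).add_arguments(parser)
        parser.add_argument('--num-traces', type=int, default=10,
                            help='Number of traces to be collected. Defaults to 10')
```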

### T2 Tests
The tests currently written for the T2 SDK:
- `MonitorExporterPerfTest` Collects sample traces and exports them to Application Insights.

## Example command
```cmd
(env) ~/azure-monitor-opentelemetry-exporter/tests> perfstress MonitorExporterPerfTest --sync --num-traces=10
```
Empty file.
@@ -0,0 +1,48 @@
#-------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#--------------------------------------------------------------------------

from azure_devtools.perfstress_tests import PerfStressTest

from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchExportSpanProcessor

from azure.monitor.opentelemetry.exporter import AzureMonitorTraceExporter

class MonitorExporterPerfTest(PerfStressTest):
    def __init__(self, arguments):
        super().__init__(arguments)

        # Auth configuration: the connection string is read once, at setup time.
        connection_string = self.get_from_env("APPLICATIONINSIGHTS_CONNECTION_STRING")

        # Create the exporter and wire it into the OpenTelemetry tracer pipeline.
        exporter = AzureMonitorTraceExporter.from_connection_string(connection_string)

        trace.set_tracer_provider(TracerProvider())
        self.tracer = trace.get_tracer(__name__)
        span_processor = BatchExportSpanProcessor(exporter)
        trace.get_tracer_provider().add_span_processor(span_processor)

    def run_sync(self):
        """The synchronous perf test.

        Try to keep this minimal and focused, using only a single client API.
        Avoid putting any ancillary logic here (e.g. generating UUIDs); do that in
        setup/init instead, so that we're only measuring the client API call.
        """
        # Emit the requested number of spans; each is handed to the batch processor
        # for export to Application Insights.
        for _ in range(self.args.num_traces):
            with self.tracer.start_as_current_span("hello"):
                print("Hello, World!")

    @staticmethod
    def add_arguments(parser):
        super(MonitorExporterPerfTest, MonitorExporterPerfTest).add_arguments(parser)
        parser.add_argument('-n', '--num-traces', nargs='?', type=int, default=10,
                            help='Number of traces to be collected. Defaults to 10')

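The perf framework runs tests asynchronously unless `--sync` is passed, and this commit only implements `run_sync` (hence the `--sync` flag in the example command above). A minimal async counterpart, shown here only as a sketch and not part of this commit, could mirror the same span workload, since the tracing calls themselves are synchronous:

```python
    async def run_async(self):
        """Async variant of the perf test; reuses the synchronous tracing calls."""
        # Hypothetical addition to MonitorExporterPerfTest above, so the test can also
        # run in the framework's default (async) mode without the --sync flag.
        for _ in range(self.args.num_traces):
            with self.tracer.start_as_current_span("hello"):
                print("Hello, World!")
```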