add perf tests #17519

Merged: 4 commits, Apr 14, 2021. Changes from 1 commit.
@@ -0,0 +1,44 @@
# Metrics Advisor Performance Tests

To run the performance tests, the `azure-devtools` package must be installed; it is installed as part of the `dev_requirements`.
Start by creating a new virtual environment for your perf tests. This needs to be a Python 3 environment, preferably >=3.7.

### Setup for test resources

These tests will run against a pre-configured Metrics Advisor service. The following environment variable will need to be set for the tests to access the live resources:
```
AZURE_APP_CONFIG_CONNECTION_STRING=<app config connection string>
```

> Review comment (Member): app config?
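For context on that comment: the test files added later in this diff do not read an App Configuration connection string; they read Metrics Advisor settings. A minimal, illustrative sketch of the client setup they perform (the environment variable names are taken from the test code below; this snippet is not part of the PR):

```python
import os

from azure.ai.metricsadvisor import MetricsAdvisorClient, MetricsAdvisorKeyCredential

# Environment variables expected by the perf tests in this PR.
service_endpoint = os.getenv("AZURE_METRICS_ADVISOR_ENDPOINT")
subscription_key = os.getenv("AZURE_METRICS_ADVISOR_SUBSCRIPTION_KEY")
api_key = os.getenv("AZURE_METRICS_ADVISOR_API_KEY")

# Each test builds its clients from the same key credential.
client = MetricsAdvisorClient(
    service_endpoint,
    MetricsAdvisorKeyCredential(subscription_key, api_key),
)
```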

### Setup for perf test runs

```cmd
(env) ~/azure-ai-metricsadvisor> pip install -r dev_requirements.txt
(env) ~/azure-ai-metricsadvisor> pip install -e .
```

## Test commands

Once `azure-devtools` is installed, you will have access to the `perfstress` command line tool, which scans the current module for runnable perf tests. Only a specific test can be run at a time (i.e. there is no "run all" feature).

```cmd
(env) ~/azure-ai-metricsadvisor> cd tests
(env) ~/azure-ai-metricsadvisor/tests> perfstress
```
Using the `perfstress` command alone will list the available perf tests found.
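Each of the tests added in this PR is a class deriving from `PerfStressTest` that implements `run_sync` and `run_async`; that is the shape the tool discovers. A bare-bones sketch of it (the class name here is hypothetical and for illustration only, the method names mirror the real test files below):

```python
from azure_devtools.perfstress_tests import PerfStressTest


class ExamplePerfTest(PerfStressTest):  # hypothetical name, used only for illustration
    def run_sync(self):
        # Called repeatedly by the framework when --sync is passed.
        pass

    async def run_async(self):
        # Called repeatedly in the default async mode.
        pass
```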

### Common perf command line options
These options are available for all perf tests:
- `--duration=10` Number of seconds to run as many operations (the "run" function) as possible. Default is 10.
- `--iterations=1` Number of test iterations to run. Default is 1.
- `--parallel=1` Number of tests to run in parallel. Default is 1.
- `--warmup=5` Number of seconds to spend warming up the connection before measuring begins. Default is 5.
- `--sync` Whether to run the tests in sync or async. Default is False (async). This flag must be used for Storage legacy tests, which do not support async.
- `--no-cleanup` Whether to keep newly created resources after test run. Default is False (resources will be deleted).

## Example commands
```cmd
(env) ~/azure-ai-metricsadvisor/tests> perfstress ListAnomaliesTest
(env) ~/azure-ai-metricsadvisor/tests> perfstress ListIncidentsTest
(env) ~/azure-ai-metricsadvisor/tests> perfstress ListRootCausesTest
```
@@ -0,0 +1,45 @@
# --------------------------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# --------------------------------------------------------------------------------------------

import os

from azure_devtools.perfstress_tests import PerfStressTest

from azure.ai.metricsadvisor import MetricsAdvisorClient as SyncClient, MetricsAdvisorKeyCredential
from azure.ai.metricsadvisor.aio import MetricsAdvisorClient as AsyncClient


class ListAnomaliesTest(PerfStressTest):
    def __init__(self, arguments):
        super().__init__(arguments)
        service_endpoint = os.getenv("AZURE_METRICS_ADVISOR_ENDPOINT")
        subscription_key = os.getenv("AZURE_METRICS_ADVISOR_SUBSCRIPTION_KEY")
        api_key = os.getenv("AZURE_METRICS_ADVISOR_API_KEY")
        self.anomaly_alert_configuration_id = os.getenv("AZURE_METRICS_ADVISOR_ANOMALY_ALERT_CONFIGURATION_ID")
        self.alert_id = os.getenv("AZURE_METRICS_ADVISOR_ALERT_ID")
        self.service_client = SyncClient(service_endpoint, MetricsAdvisorKeyCredential(subscription_key, api_key))
        self.async_service_client = AsyncClient(service_endpoint, MetricsAdvisorKeyCredential(subscription_key, api_key))

    async def global_setup(self):
        await super().global_setup()

    async def close(self):
        await self.async_service_client.close()
        await super().close()

    def run_sync(self):
        # Drain the pager so the full list_anomalies call, including paging, is timed.
        ret = list(self.service_client.list_anomalies(
            alert_configuration_id=self.anomaly_alert_configuration_id,
            alert_id=self.alert_id,
        ))

    async def run_async(self):
        # Same operation through the async client; iterate the async pager to completion.
        results = self.async_service_client.list_anomalies(
            alert_configuration_id=self.anomaly_alert_configuration_id,
            alert_id=self.alert_id,
        )
        tolist = []
        async for result in results:
            tolist.append(result)
@@ -0,0 +1,46 @@
# --------------------------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# --------------------------------------------------------------------------------------------

import os
import datetime
from azure_devtools.perfstress_tests import PerfStressTest

from azure.ai.metricsadvisor import MetricsAdvisorClient as SyncClient, MetricsAdvisorKeyCredential
from azure.ai.metricsadvisor.aio import MetricsAdvisorClient as AsyncClient


class ListIncidentsTest(PerfStressTest):
    def __init__(self, arguments):
        super().__init__(arguments)
        service_endpoint = os.getenv("AZURE_METRICS_ADVISOR_ENDPOINT")
        subscription_key = os.getenv("AZURE_METRICS_ADVISOR_SUBSCRIPTION_KEY")
        api_key = os.getenv("AZURE_METRICS_ADVISOR_API_KEY")
        self.anomaly_detection_configuration_id = os.getenv("AZURE_METRICS_ADVISOR_ANOMALY_DETECTION_CONFIGURATION_ID")
        self.service_client = SyncClient(service_endpoint, MetricsAdvisorKeyCredential(subscription_key, api_key))
        self.async_service_client = AsyncClient(service_endpoint, MetricsAdvisorKeyCredential(subscription_key, api_key))

    async def global_setup(self):
        await super().global_setup()

    async def close(self):
        await self.async_service_client.close()
        await super().close()

    def run_sync(self):
        ret = list(self.service_client.list_incidents(
            detection_configuration_id=self.anomaly_detection_configuration_id,
            start_time=datetime.datetime(2020, 1, 1, tzinfo=datetime.timezone.utc),
            end_time=datetime.datetime(2020, 10, 21, tzinfo=datetime.timezone.utc),
        ))

    async def run_async(self):
        results = self.async_service_client.list_incidents(
            detection_configuration_id=self.anomaly_detection_configuration_id,
            start_time=datetime.datetime(2020, 1, 1, tzinfo=datetime.timezone.utc),
            end_time=datetime.datetime(2020, 10, 21, tzinfo=datetime.timezone.utc),
        )
        tolist = []
        async for result in results:
            tolist.append(result)
@@ -0,0 +1,45 @@
# --------------------------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# --------------------------------------------------------------------------------------------

import os

from azure_devtools.perfstress_tests import PerfStressTest

from azure.ai.metricsadvisor import MetricsAdvisorClient as SyncClient, MetricsAdvisorKeyCredential
from azure.ai.metricsadvisor.aio import MetricsAdvisorClient as AsyncClient


class ListRootCausesTest(PerfStressTest):
    def __init__(self, arguments):
        super().__init__(arguments)
        service_endpoint = os.getenv("AZURE_METRICS_ADVISOR_ENDPOINT")
        subscription_key = os.getenv("AZURE_METRICS_ADVISOR_SUBSCRIPTION_KEY")
        api_key = os.getenv("AZURE_METRICS_ADVISOR_API_KEY")
        self.anomaly_detection_configuration_id = os.getenv("AZURE_METRICS_ADVISOR_ANOMALY_DETECTION_CONFIGURATION_ID")
        self.incident_id = os.getenv("AZURE_METRICS_ADVISOR_INCIDENT_ID")
        self.service_client = SyncClient(service_endpoint, MetricsAdvisorKeyCredential(subscription_key, api_key))
        self.async_service_client = AsyncClient(service_endpoint, MetricsAdvisorKeyCredential(subscription_key, api_key))

    async def global_setup(self):
        await super().global_setup()

    async def close(self):
        await self.async_service_client.close()
        await super().close()

    def run_sync(self):
        ret = list(self.service_client.list_incident_root_causes(
            detection_configuration_id=self.anomaly_detection_configuration_id,
            incident_id=self.incident_id,
        ))

    async def run_async(self):
        results = self.async_service_client.list_incident_root_causes(
            detection_configuration_id=self.anomaly_detection_configuration_id,
            incident_id=self.incident_id,
        )
        tolist = []
        async for result in results:
            tolist.append(result)