Use consistent file name for data-model.md
We use (mostly) lowercase, dash-separated (kebab-case) file names in the repository.
I renamed datamodel.md to data-model.md to be consistent.
tigrannajaryan committed May 27, 2022
1 parent b13c164 commit e035558
Showing 12 changed files with 37 additions and 37 deletions.
2 changes: 1 addition & 1 deletion specification/README.md
@@ -22,7 +22,7 @@
- Data Specification
- [Semantic Conventions](overview.md#semantic-conventions)
- [Protocol](protocol/README.md)
- [Metrics](metrics/datamodel.md)
- [Metrics](metrics/data-model.md)
- [Logs](logs/data-model.md)

## Notation Conventions and Compliance
2 changes: 1 addition & 1 deletion specification/common/README.md
@@ -119,7 +119,7 @@ at this time, as discussed in
## Attribute Collections

[Resources](../resource/sdk.md), Metrics
[data points](../metrics/datamodel.md#metric-points),
[data points](../metrics/data-model.md#metric-points),
[Spans](../trace/api.md#set-attributes), Span
[Events](../trace/api.md#add-events), Span
[Links](../trace/api.md#specifying-links) and
2 changes: 1 addition & 1 deletion specification/metrics/README.md
@@ -94,7 +94,7 @@ SDK](../overview.md#sdk) concept for more information.

* [Metrics API](./api.md)
* [Metrics SDK](./sdk.md)
* [Metrics Data Model and Protocol](./datamodel.md)
* [Metrics Data Model and Protocol](./data-model.md)
* [Semantic Conventions](./semantic_conventions/README.md)

## References
4 changes: 2 additions & 2 deletions specification/metrics/api.md
@@ -213,7 +213,7 @@ implementation MUST create a valid Instrument in every case. Here,
"valid" means an instrument that is functional and can be expected to
export data, despite potentially creating a [semantic error in the
data
model](datamodel.md#opentelemetry-protocol-data-model-producer-recommendations).
model](data-model.md#opentelemetry-protocol-data-model-producer-recommendations).

It is unspecified whether or under which conditions the same or
different Instrument instance will be returned as a result of
@@ -229,7 +229,7 @@ to the user informing them of duplicate registration conflict(s).
__Note the warning about duplicate Instrument registration conflicts
is meant to help avoid the semantic error state described in the
[OpenTelemetry Metrics data
model](datamodel.md#opentelemetry-protocol-data-model-producer-recommendations)
model](data-model.md#opentelemetry-protocol-data-model-producer-recommendations)
when more than one `Metric` is written for a given instrument `name`
and Meter identity by the same MeterProvider.

specification/metrics/datamodel.md → specification/metrics/data-model.md: File renamed without changes.
34 changes: 17 additions & 17 deletions specification/metrics/sdk.md
@@ -179,9 +179,9 @@ are the inputs:
general guidance.
* The `name` of the View (optional). If not provided, the Instrument `name`
MUST be used by default. This will be used as the name of the [metrics
stream](./datamodel.md#events--data-stream--timeseries).
stream](./data-model.md#events--data-stream--timeseries).
* The configuration for the resulting [metrics
stream](./datamodel.md#events--data-stream--timeseries):
stream](./data-model.md#events--data-stream--timeseries):
* The `description`. If not provided, the Instrument `description` MUST be
used by default.
* A list of `attribute keys` (optional). If provided, the attributes that are
@@ -218,7 +218,7 @@ made with an Instrument:
* For each View, if the Instrument could match the instrument selection
criteria:
* Try to apply the View configuration. If applying the View results
in [conflicting metric identities](./datamodel.md#opentelemetry-protocol-data-model-producer-recommendations)
in [conflicting metric identities](./data-model.md#opentelemetry-protocol-data-model-producer-recommendations)
the implementation SHOULD apply the View and emit a warning. If it is not
possible to apply the View without producing semantic errors (e.g. the
View sets an asynchronous instrument to use
@@ -289,7 +289,7 @@ meter_provider

An `Aggregation`, as configured via the [View](./sdk.md#view),
informs the SDK on the ways and means to compute
[Aggregated Metrics](./datamodel.md#opentelemetry-protocol-data-model)
[Aggregated Metrics](./data-model.md#opentelemetry-protocol-data-model)
from incoming Instrument [Measurements](./api.md#measurement).

Note: the term _aggregation_ is used instead of _aggregator_. It is recommended
@@ -336,8 +336,8 @@ we will explore how to allow configuring custom
[ExemplarReservoir](#exemplarreservoir)s with the [View](#view) API.

The SDK MUST provide the following `Aggregation` to support the
[Metric Points](./datamodel.md#metric-points) in the
[Metrics Data Model](./datamodel.md).
[Metric Points](./data-model.md#metric-points) in the
[Metrics Data Model](./data-model.md).

- [Drop](./sdk.md#drop-aggregation)
- [Default](./sdk.md#default-aggregation)
@@ -376,7 +376,7 @@ This Aggregation does not have any configuration parameters.
#### Sum Aggregation

The Sum Aggregation informs the SDK to collect data for the
[Sum Metric Point](./datamodel.md#sums).
[Sum Metric Point](./data-model.md#sums).

The monotonicity of the aggregation is determined by the instrument type:

@@ -398,7 +398,7 @@ This Aggregation informs the SDK to collect:
#### Last Value Aggregation

The Last Value Aggregation informs the SDK to collect data for the
[Gauge Metric Point](./datamodel.md#gauge).
[Gauge Metric Point](./data-model.md#gauge).

This Aggregation does not have any configuration parameters.

@@ -420,7 +420,7 @@ instruments that record negative measurements (e.g. `UpDownCounter` or `Observab
#### Explicit Bucket Histogram Aggregation

The Explicit Bucket Histogram Aggregation informs the SDK to collect data for
the [Histogram Metric Point](./datamodel.md#histogram) using a set of
the [Histogram Metric Point](./data-model.md#histogram) using a set of
explicit boundary values for histogram bucketing.

This Aggregation honors the following configuration parameters:
@@ -440,7 +440,7 @@ or equal to the measurement.

The Exponential Histogram Aggregation informs the SDK to collect data
for the [Exponential Histogram Metric
Point](./datamodel.md#exponentialhistogram), which uses an exponential
Point](./data-model.md#exponentialhistogram), which uses an exponential
formula to determine bucket boundaries and an integer `scale`
parameter to control resolution.
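As a quick illustration of the `scale` parameter described in this hunk, here is a minimal sketch of the exponential bucket mapping (a hypothetical helper, not the SDK's actual implementation; exact powers of the base need extra care in production code):

```python
import math

def exponential_bucket_index(value: float, scale: int) -> int:
    """Map a positive measurement to an exponential-histogram bucket index.

    Bucket `i` covers (base**i, base**(i + 1)], where base = 2**(2**-scale),
    so a larger scale means a smaller base and finer resolution.
    """
    if value <= 0:
        raise ValueError("only positive values map to exponential buckets")
    # log_base(value) == log2(value) * 2**scale
    scaled_log = math.log2(value) * (2 ** scale)
    # Upper-inclusive buckets: ceil(log_base(value)) - 1
    return math.ceil(scaled_log) - 1

# scale=0 gives base=2: 4.0 falls in bucket 1, i.e. (2, 4]; 5.0 in bucket 2, i.e. (4, 8].
assert exponential_bucket_index(4.0, scale=0) == 1
assert exponential_bucket_index(5.0, scale=0) == 2
```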

@@ -545,7 +545,7 @@ specification](api.md#instrument-type-conflict-detection),
implementations are REQUIRED to create valid instruments in case of
duplicate instrument registration, and the [data model includes
RECOMMENDATIONS on how to treat the consequent duplicate
conflicting](datamodel.md#opentelemetry-protocol-data-model-producer-recommendations)
conflicting](data-model.md#opentelemetry-protocol-data-model-producer-recommendations)
`Metric` definitions.

The implementation MUST aggregate data from identical Instruments
@@ -585,7 +585,7 @@ aggregated metric data and the original API calls where measurements are
recorded. Exemplars work for trace-metric correlation across any metric, not
just those that can also be derived from `Span`s.

An [Exemplar](./datamodel.md#exemplars) is a recorded
An [Exemplar](./data-model.md#exemplars) is a recorded
[Measurement](./api.md#measurement) that exposes the following pieces of
information:

@@ -776,7 +776,7 @@ The SDK MUST support multiple `MetricReader` instances to be registered on the
same `MeterProvider`, and the [MetricReader.Collect](#collect) invocation on one
`MetricReader` instance SHOULD NOT introduce side-effects to other `MetricReader`
instances. For example, if a `MetricReader` instance is receiving metric data
points that have [delta temporality](./datamodel.md#temporality), it is expected
points that have [delta temporality](./data-model.md#temporality), it is expected
that SDK will update the time range - e.g. from (T<sub>n</sub>, T<sub>n+1</sub>]
to (T<sub>n+1</sub>, T<sub>n+2</sub>] - **ONLY** for this particular
`MetricReader` instance.
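One way to picture the per-reader requirement stated above is a sketch where each reader tracks its own delta window; the `DeltaWindowTracker` name is hypothetical and not part of any SDK:

```python
import time

class DeltaWindowTracker:
    """Tracks the (start, end] delta collection window for one MetricReader."""

    def __init__(self) -> None:
        self._window_start = time.time_ns()

    def collect(self) -> tuple[int, int]:
        # Advancing the window affects only this tracker (i.e. this reader).
        end = time.time_ns()
        start, self._window_start = self._window_start, end
        return start, end

# Two readers on the same MeterProvider: collecting on one
# leaves the other reader's window untouched.
reader_a, reader_b = DeltaWindowTracker(), DeltaWindowTracker()
reader_a.collect()  # reader A moves from (T0, T1] to (T1, T2]
reader_b.collect()  # reader B still starts its window at its own T0
```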
@@ -890,7 +890,7 @@ protocol-dependent telemetry exporters. The protocol exporter is expected to be
primarily a simple telemetry data encoder and transmitter.

Metric Exporter has access to the [aggregated metrics
data](./datamodel.md#timeseries-model). Metric Exporters SHOULD
data](./data-model.md#timeseries-model). Metric Exporters SHOULD
report an error condition for data output by the `MetricReader` with
unsupported Aggregation or Aggregation Temporality, as this condition
can be corrected by a change of `MetricReader` configuration.
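To make the error-reporting guidance concrete, a sketch of a hypothetical cumulative-only exporter (not a real OTLP or Prometheus exporter class) that flags unsupported temporality instead of dropping data silently:

```python
from enum import Enum

class Temporality(Enum):
    DELTA = "delta"
    CUMULATIVE = "cumulative"

class CumulativeOnlyExporter:
    """Hypothetical exporter that only understands cumulative streams."""

    SUPPORTED = {Temporality.CUMULATIVE}

    def export(self, batch):
        errors = []
        for metric in batch:
            if metric["temporality"] not in self.SUPPORTED:
                # Report rather than drop: the user can fix this by
                # changing the MetricReader's temporality configuration.
                errors.append(f"unsupported temporality for {metric['name']}")
                continue
            self._send(metric)
        return errors

    def _send(self, metric):
        print("exporting", metric["name"])

problems = CumulativeOnlyExporter().export([
    {"name": "http.server.duration", "temporality": Temporality.DELTA},
    {"name": "process.uptime", "temporality": Temporality.CUMULATIVE},
])
# problems == ["unsupported temporality for http.server.duration"]
```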
@@ -939,7 +939,7 @@ A Push Metric Exporter MUST support the following functions:

##### Export(batch)

Exports a batch of [Metric points](./datamodel.md#metric-points). Protocol
Exports a batch of [Metric points](./data-model.md#metric-points). Protocol
exporters that will implement this function are typically expected to serialize
and transmit the data to the destination.
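As a sketch of that serialize-and-transmit shape, a hypothetical console exporter whose `export` returns a plain success/failure flag (the class and result convention are assumptions, not a language SDK's real API):

```python
import json
import sys

class ConsolePushExporter:
    """Hypothetical push exporter: serialize each metric point, then write it out."""

    def export(self, batch) -> bool:
        try:
            for metric in batch:
                # "Transmit" here is simply a line written to stdout.
                sys.stdout.write(json.dumps(metric) + "\n")
            return True   # success
        except (TypeError, OSError):
            return False  # failure; the caller decides whether to retry or drop

ok = ConsolePushExporter().export([
    {"name": "queue.length", "value": 7, "attributes": {"queue": "jobs"}},
])
```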

@@ -978,11 +978,11 @@ Batch: | Metric | | Metric | ... | Metric |
+--> timestamps, attributes, value (or buckets), exemplars, ...
```

Refer to the [Metric points](./datamodel.md#metric-points) section from the
Refer to the [Metric points](./data-model.md#metric-points) section from the
Metrics Data Model specification for more details.

Note: it is highly recommended that implementors design the `Metric` data type
_based on_ the [Data Model](./datamodel.md), rather than directly use the data
_based on_ the [Data Model](./data-model.md), rather than directly use the data
types generated from the [proto
files](https://github.com/open-telemetry/opentelemetry-proto/blob/main/opentelemetry/proto/metrics/v1/metrics.proto)
(because the types generated from proto files are not guaranteed to be backward
2 changes: 1 addition & 1 deletion specification/metrics/sdk_exporters/in-memory.md
@@ -7,7 +7,7 @@ Exporter](../sdk.md#push-metric-exporter) which accumulates metrics data in the
local memory and allows to inspect it (useful for e.g. unit tests).

In-memory Metrics Exporter MUST support both Cumulative and Delta
[Temporality](../datamodel.md#temporality).
[Temporality](../data-model.md#temporality).
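A minimal sketch of that idea, assuming a hypothetical class rather than any SDK's actual in-memory exporter: exported batches are appended to a list so a unit test can inspect them, whichever temporality the points carry:

```python
class InMemoryExporterSketch:
    """Hypothetical in-memory exporter: keeps exported batches for later inspection."""

    def __init__(self) -> None:
        self.batches = []

    def export(self, batch) -> bool:
        # Identical handling for delta and cumulative points; they are just stored.
        self.batches.append(list(batch))
        return True

    def get_finished_metrics(self):
        return [point for batch in self.batches for point in batch]

# In a test: export, then assert on what was captured.
exporter = InMemoryExporterSketch()
exporter.export([{"name": "http.server.request.count", "value": 3}])
assert exporter.get_finished_metrics()[0]["value"] == 3
```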

If a language provides a mechanism to automatically configure a
[MetricReader](../sdk.md#metricreader) to pair with the associated
2 changes: 1 addition & 1 deletion specification/metrics/sdk_exporters/otlp.md
@@ -7,7 +7,7 @@ Exporter](../sdk.md#push-metric-exporter) which sends metrics via the
[OpenTelemetry Protocol](../../protocol/README.md).

OTLP Metrics Exporter MUST support both Cumulative and Delta
[Aggregation Temporality](../datamodel.md#temporality).
[Aggregation Temporality](../data-model.md#temporality).

The exporter MUST provide configuration according to the [OpenTelemetry Protocol
Exporter](../../protocol/exporter.md) specification.
2 changes: 1 addition & 1 deletion specification/metrics/sdk_exporters/prometheus.md
@@ -12,7 +12,7 @@ A Prometheus Exporter MUST NOT support [Push
mode](../sdk.md#push-metric-exporter).

A Prometheus Exporter MUST only support [Cumulative
Temporality](../datamodel.md#temporality).
Temporality](../data-model.md#temporality).

A Prometheus Exporter MUST support version `0.0.4` of the [Text-based
format](https://github.com/prometheus/docs/blob/main/content/docs/instrumenting/exposition_formats.md#text-based-format).
2 changes: 1 addition & 1 deletion specification/metrics/sdk_exporters/stdout.md
@@ -11,7 +11,7 @@ name for their language. For example, ConsoleExporter, StdoutExporter,
StreamExporter, etc.

"Standard output" Metrics Exporter MUST support both Cumulative and Delta
[Temporality](../datamodel.md#temporality).
[Temporality](../data-model.md#temporality).

If a language provides a mechanism to automatically configure a
[MetricReader](../sdk.md#metricreader) to pair with the associated
18 changes: 9 additions & 9 deletions specification/metrics/supplementary-guidelines.md
@@ -217,7 +217,7 @@ backend might have trouble handling subnormal numbers.

### Monotonicity property

In the OpenTelemetry Metrics [Data Model](./datamodel.md) and [API](./api.md)
In the OpenTelemetry Metrics [Data Model](./data-model.md) and [API](./api.md)
specifications, the word `monotonic` has been used frequently.

It is important to understand that different
@@ -302,9 +302,9 @@ Conventions`, rather than inventing your own semantics.

#### Synchronous example

The OpenTelemetry Metrics [Data Model](./datamodel.md) and [SDK](./sdk.md) are
The OpenTelemetry Metrics [Data Model](./data-model.md) and [SDK](./sdk.md) are
designed to support both Cumulative and Delta
[Temporality](./datamodel.md#temporality). It is important to understand that
[Temporality](./data-model.md#temporality). It is important to understand that
temporality will impact how the SDK could manage memory usage. Let's take the
following HTTP requests example:

@@ -331,7 +331,7 @@ API with specified Delta aggregation temporality.

##### Synchronous example: Delta aggregation temporality

Let's imagine we export the metrics as [Histogram](./datamodel.md#histogram),
Let's imagine we export the metrics as [Histogram](./data-model.md#histogram),
and to simplify the story we will only have one histogram bucket `(-Inf, +Inf)`:

If we export the metrics using **Delta Temporality**:
@@ -411,7 +411,7 @@ So here are some suggestions that we encourage SDK implementers to consider:
things that are no longer needed**.
* You probably don't want to keep exporting the same thing over and over again,
if there is no updates. You might want to consider [Resets and
Gaps](./datamodel.md#resets-and-gaps). For example, if a Cumulative metrics
Gaps](./data-model.md#resets-and-gaps). For example, if a Cumulative metrics
stream hasn't received any updates for a long period of time, would it be okay
to reset the start time?

@@ -473,7 +473,7 @@ and send them.

The data model prescribes several valid behaviors at T<sub>5</sub> in
this case, where one stream dies and another starts. The [Resets and
Gaps](./datamodel.md#resets-and-gaps) section describes how start
Gaps](./data-model.md#resets-and-gaps) section describes how start
timestamps and staleness markers can be used to increase the
receiver's understanding of these events.

@@ -546,7 +546,7 @@ So here are some suggestions that we encourage SDK implementers to consider:

* If you have to do Cumulative->Delta conversion, and you encountered min/max,
rather than drop the data on the floor, you might want to convert them to
something useful - e.g. [Gauge](./datamodel.md#gauge).
something useful - e.g. [Gauge](./data-model.md#gauge).

##### Asynchronous example: attribute removal in a view

@@ -577,9 +577,9 @@ As discussed in the asynchronous cumulative temporality example above,
there are various treatments available for detecting resets. Even if
the first course is taken, which means doing nothing, a receiver that
follows the data model's rules for [unknown start
time](datamodel.md#cumulative-streams-handling-unknown-start-time) and
time](data-model.md#cumulative-streams-handling-unknown-start-time) and
[inserting true start
times](datamodel.md#cumulative-streams-inserting-true-reset-points)
times](data-model.md#cumulative-streams-inserting-true-reset-points)
will calculate a correct rate in this case. The "58" received at
T<sub>5</sub> resets the stream - the change from "107" to "58" will
register as a gap and rate calculations will resume correctly at
4 changes: 2 additions & 2 deletions specification/overview.md
@@ -270,7 +270,7 @@ supports both - push and pull model of setting the `Metric` value.

### Metrics data model and SDK

Metrics data model is [specified here](metrics/datamodel.md) and is based on
Metrics data model is [specified here](metrics/data-model.md) and is based on
[metrics.proto](https://github.com/open-telemetry/opentelemetry-proto/blob/master/opentelemetry/proto/metrics/v1/metrics.proto).
This data model defines three semantics: An Event model used by the API, an
in-flight data model used by the SDK and OTLP, and a TimeSeries model which
@@ -288,7 +288,7 @@ validation and sanitization of the Metrics data. Instead, pass the data to the
backend, rely on the backend to perform validation, and pass back any errors
from the backend.

See [Metrics Data Model Specification](metrics/datamodel.md) for more
See [Metrics Data Model Specification](metrics/data-model.md) for more
information.

## Log Signal
