feat(aiplatform): update the api
#### aiplatform:v1

The following keys were added:
- resources.projects.resources.locations.resources.featureOnlineStores.methods.getIamPolicy (Total Keys: 14)
- resources.projects.resources.locations.resources.featureOnlineStores.methods.setIamPolicy (Total Keys: 12)
- resources.projects.resources.locations.resources.featureOnlineStores.methods.testIamPermissions (Total Keys: 14)
- resources.projects.resources.locations.resources.featureOnlineStores.resources.featureViews.methods.getIamPolicy (Total Keys: 14)
- resources.projects.resources.locations.resources.featureOnlineStores.resources.featureViews.methods.setIamPolicy (Total Keys: 12)
- resources.projects.resources.locations.resources.featureOnlineStores.resources.featureViews.methods.testIamPermissions (Total Keys: 14)
- schemas.GoogleCloudAiplatformV1FeatureGroupBigQuery.properties.dense.type (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1FeatureGroupBigQuery.properties.staticDataSource.type (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1FeatureView.properties.vertexRagSource.$ref (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1FeatureViewVertexRagSource (Total Keys: 5)
- schemas.GoogleCloudAiplatformV1FunctionDeclaration.properties.response.$ref (Total Keys: 1)

#### aiplatform:v1beta1

The following keys were added:
- resources.projects.resources.locations.resources.ragCorpora.methods.patch (Total Keys: 12)
- schemas.GoogleCloudAiplatformV1beta1ApiAuth (Total Keys: 3)
- schemas.GoogleCloudAiplatformV1beta1CachedContent.properties.usageMetadata (Total Keys: 2)
- schemas.GoogleCloudAiplatformV1beta1CachedContentUsageMetadata (Total Keys: 12)
- schemas.GoogleCloudAiplatformV1beta1CorpusStatus (Total Keys: 6)
- schemas.GoogleCloudAiplatformV1beta1FeatureGroupBigQuery.properties.dense.type (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1beta1FeatureGroupBigQuery.properties.staticDataSource.type (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1beta1FeatureView.properties.vertexRagSource.$ref (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1beta1FeatureViewVertexRagSource (Total Keys: 5)
- schemas.GoogleCloudAiplatformV1beta1FileStatus (Total Keys: 6)
- schemas.GoogleCloudAiplatformV1beta1GenerateContentResponseUsageMetadata.properties.cachedContentTokenCount (Total Keys: 3)
- schemas.GoogleCloudAiplatformV1beta1RagCorpus.properties.corpusStatus (Total Keys: 2)
- schemas.GoogleCloudAiplatformV1beta1RagCorpus.properties.ragVectorDbConfig.$ref (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1beta1RagFile.properties.fileStatus (Total Keys: 2)
- schemas.GoogleCloudAiplatformV1beta1RagVectorDbConfig (Total Keys: 15)

The following keys were changed:
- resources.projects.resources.locations.resources.extensions.methods.list.scopes (Total Keys: 1)
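The new `ragCorpora.patch` method presumably follows the usual partial-update convention (AIP-134), in which an `updateMask` query parameter names the fields being modified. A minimal sketch, assuming illustrative field names — this is not the commit's own code:

```python
def build_update_mask(patch_body):
    """Derive a FieldMask string from the top-level keys of a patch body,
    per the common AIP-134 partial-update convention."""
    return ",".join(sorted(patch_body))

# Illustrative field values for a hypothetical RagCorpus patch.
patch_body = {
    "displayName": "support-transcripts",
    "description": "RAG corpus for conversation exports",
}
# `mask` would be passed as the updateMask query parameter alongside the body.
mask = build_update_mask(patch_body)
```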
yoshi-automation committed Sep 10, 2024
1 parent 7f2675c commit b0e9b70
Showing 24 changed files with 1,507 additions and 209 deletions.
116 changes: 100 additions & 16 deletions docs/dyn/aiplatform_v1.endpoints.html

Large diffs are not rendered by default.

@@ -84,7 +84,7 @@ <h2>Instance Methods</h2>
 <p class="firstline">Close httplib2 connections.</p>
 <p class="toc_element">
 <code><a href="#list">list(parent, filter=None, orderBy=None, pageSize=None, pageToken=None, readMask=None, x__xgafv=None)</a></code></p>
-<p class="firstline">Lists Annotations belongs to a dataitem</p>
+<p class="firstline">Lists Annotations belongs to a dataitem This RPC is only available in InternalDatasetService. It is only used for exporting conversation data to CCAI Insights.</p>
 <p class="toc_element">
 <code><a href="#list_next">list_next()</a></code></p>
 <p class="firstline">Retrieves the next page of results.</p>
@@ -96,7 +96,7 @@ <h3>Method Details</h3>
 
 <div class="method">
 <code class="details" id="list">list(parent, filter=None, orderBy=None, pageSize=None, pageToken=None, readMask=None, x__xgafv=None)</code>
-<pre>Lists Annotations belongs to a dataitem
+<pre>Lists Annotations belongs to a dataitem This RPC is only available in InternalDatasetService. It is only used for exporting conversation data to CCAI Insights.
 
 Args:
 parent: string, Required. The resource name of the DataItem to list Annotations from. Format: `projects/{project}/locations/{location}/datasets/{dataset}/dataItems/{data_item}` (required)
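The `list`/`list_next` pair documented above is the generated client's standard pagination idiom: execute a request, then ask the collection for the next page's request until it returns `None`. The loop can be sketched generically — the fake request class below is a stand-in for the real HTTP-backed objects, used so the control flow runs without credentials or network access:

```python
def iterate_items(request, list_next, items_key):
    """Drain all pages of a paginated list method.

    `request` is the initial list request; `list_next(request, response)`
    returns the next page's request or None, mirroring the `list_next()`
    helper in the generated docs above.
    """
    while request is not None:
        response = request.execute()
        yield from response.get(items_key, [])
        request = list_next(request, response)


class FakeRequest:
    """Minimal stand-in for a client-library request object."""
    def __init__(self, pages, i=0):
        self.pages, self.i = pages, i

    def execute(self):
        return self.pages[self.i]


def fake_list_next(request, response):
    # Real list_next() inspects nextPageToken the same way.
    if "nextPageToken" in response:
        return FakeRequest(request.pages, request.i + 1)
    return None


pages = [
    {"annotations": [{"name": "a1"}], "nextPageToken": "t"},
    {"annotations": [{"name": "a2"}]},
]
names = [a["name"] for a in iterate_items(FakeRequest(pages), fake_list_next, "annotations")]
# names == ["a1", "a2"]
```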
116 changes: 100 additions & 16 deletions docs/dyn/aiplatform_v1.projects.locations.endpoints.html

Large diffs are not rendered by default.

8 changes: 8 additions & 0 deletions docs/dyn/aiplatform_v1.projects.locations.featureGroups.html
@@ -125,9 +125,11 @@ <h3>Method Details</h3>
 &quot;bigQuerySource&quot;: { # The BigQuery location for the input content. # Required. Immutable. The BigQuery source URI that points to either a BigQuery Table or View.
 &quot;inputUri&quot;: &quot;A String&quot;, # Required. BigQuery URI to a table, up to 2000 characters long. Accepted forms: * BigQuery path. For example: `bq://projectId.bqDatasetId.bqTableId`.
 },
+&quot;dense&quot;: True or False, # Optional. If set, all feature values will be fetched from a single row per unique entityId including nulls. If not set, will collapse all rows for each unique entityId into a singe row with any non-null values if present, if no non-null values are present will sync null. ex: If source has schema (entity_id, feature_timestamp, f0, f1) and values (e1, 2020-01-01T10:00:00.123Z, 10, 15) (e1, 2020-02-01T10:00:00.123Z, 20, null) If dense is set, (e1, 20, null) is synced to online stores. If dense is not set, (e1, 20, 15) is synced to online stores.
 &quot;entityIdColumns&quot;: [ # Optional. Columns to construct entity_id / row keys. If not provided defaults to `entity_id`.
 &quot;A String&quot;,
 ],
+&quot;staticDataSource&quot;: True or False, # Optional. Set if the data source is not a time-series.
 &quot;timeSeries&quot;: { # Optional. If the source is a time-series source, this can be set to control how downstream sources (ex: FeatureView ) will treat time-series sources. If not set, will treat the source as a time-series source with `feature_timestamp` as timestamp column and no scan boundary.
 &quot;timestampColumn&quot;: &quot;A String&quot;, # Optional. Column hosting timestamp values for a time-series source. Will be used to determine the latest `feature_values` for each entity. Optional. If not provided, column named `feature_timestamp` of type `TIMESTAMP` will be used.
 },
@@ -227,9 +229,11 @@ <h3>Method Details</h3>
 &quot;bigQuerySource&quot;: { # The BigQuery location for the input content. # Required. Immutable. The BigQuery source URI that points to either a BigQuery Table or View.
 &quot;inputUri&quot;: &quot;A String&quot;, # Required. BigQuery URI to a table, up to 2000 characters long. Accepted forms: * BigQuery path. For example: `bq://projectId.bqDatasetId.bqTableId`.
 },
+&quot;dense&quot;: True or False, # Optional. If set, all feature values will be fetched from a single row per unique entityId including nulls. If not set, will collapse all rows for each unique entityId into a singe row with any non-null values if present, if no non-null values are present will sync null. ex: If source has schema (entity_id, feature_timestamp, f0, f1) and values (e1, 2020-01-01T10:00:00.123Z, 10, 15) (e1, 2020-02-01T10:00:00.123Z, 20, null) If dense is set, (e1, 20, null) is synced to online stores. If dense is not set, (e1, 20, 15) is synced to online stores.
 &quot;entityIdColumns&quot;: [ # Optional. Columns to construct entity_id / row keys. If not provided defaults to `entity_id`.
 &quot;A String&quot;,
 ],
+&quot;staticDataSource&quot;: True or False, # Optional. Set if the data source is not a time-series.
 &quot;timeSeries&quot;: { # Optional. If the source is a time-series source, this can be set to control how downstream sources (ex: FeatureView ) will treat time-series sources. If not set, will treat the source as a time-series source with `feature_timestamp` as timestamp column and no scan boundary.
 &quot;timestampColumn&quot;: &quot;A String&quot;, # Optional. Column hosting timestamp values for a time-series source. Will be used to determine the latest `feature_values` for each entity. Optional. If not provided, column named `feature_timestamp` of type `TIMESTAMP` will be used.
 },
@@ -270,9 +274,11 @@ <h3>Method Details</h3>
 &quot;bigQuerySource&quot;: { # The BigQuery location for the input content. # Required. Immutable. The BigQuery source URI that points to either a BigQuery Table or View.
 &quot;inputUri&quot;: &quot;A String&quot;, # Required. BigQuery URI to a table, up to 2000 characters long. Accepted forms: * BigQuery path. For example: `bq://projectId.bqDatasetId.bqTableId`.
 },
+&quot;dense&quot;: True or False, # Optional. If set, all feature values will be fetched from a single row per unique entityId including nulls. If not set, will collapse all rows for each unique entityId into a singe row with any non-null values if present, if no non-null values are present will sync null. ex: If source has schema (entity_id, feature_timestamp, f0, f1) and values (e1, 2020-01-01T10:00:00.123Z, 10, 15) (e1, 2020-02-01T10:00:00.123Z, 20, null) If dense is set, (e1, 20, null) is synced to online stores. If dense is not set, (e1, 20, 15) is synced to online stores.
 &quot;entityIdColumns&quot;: [ # Optional. Columns to construct entity_id / row keys. If not provided defaults to `entity_id`.
 &quot;A String&quot;,
 ],
+&quot;staticDataSource&quot;: True or False, # Optional. Set if the data source is not a time-series.
 &quot;timeSeries&quot;: { # Optional. If the source is a time-series source, this can be set to control how downstream sources (ex: FeatureView ) will treat time-series sources. If not set, will treat the source as a time-series source with `feature_timestamp` as timestamp column and no scan boundary.
 &quot;timestampColumn&quot;: &quot;A String&quot;, # Optional. Column hosting timestamp values for a time-series source. Will be used to determine the latest `feature_values` for each entity. Optional. If not provided, column named `feature_timestamp` of type `TIMESTAMP` will be used.
 },
@@ -319,9 +325,11 @@ <h3>Method Details</h3>
 &quot;bigQuerySource&quot;: { # The BigQuery location for the input content. # Required. Immutable. The BigQuery source URI that points to either a BigQuery Table or View.
 &quot;inputUri&quot;: &quot;A String&quot;, # Required. BigQuery URI to a table, up to 2000 characters long. Accepted forms: * BigQuery path. For example: `bq://projectId.bqDatasetId.bqTableId`.
 },
+&quot;dense&quot;: True or False, # Optional. If set, all feature values will be fetched from a single row per unique entityId including nulls. If not set, will collapse all rows for each unique entityId into a singe row with any non-null values if present, if no non-null values are present will sync null. ex: If source has schema (entity_id, feature_timestamp, f0, f1) and values (e1, 2020-01-01T10:00:00.123Z, 10, 15) (e1, 2020-02-01T10:00:00.123Z, 20, null) If dense is set, (e1, 20, null) is synced to online stores. If dense is not set, (e1, 20, 15) is synced to online stores.
 &quot;entityIdColumns&quot;: [ # Optional. Columns to construct entity_id / row keys. If not provided defaults to `entity_id`.
 &quot;A String&quot;,
 ],
+&quot;staticDataSource&quot;: True or False, # Optional. Set if the data source is not a time-series.
 &quot;timeSeries&quot;: { # Optional. If the source is a time-series source, this can be set to control how downstream sources (ex: FeatureView ) will treat time-series sources. If not set, will treat the source as a time-series source with `feature_timestamp` as timestamp column and no scan boundary.
 &quot;timestampColumn&quot;: &quot;A String&quot;, # Optional. Column hosting timestamp values for a time-series source. Will be used to determine the latest `feature_values` for each entity. Optional. If not provided, column named `feature_timestamp` of type `TIMESTAMP` will be used.
 },
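The collapsing behavior described for the new `dense` flag can be reproduced in isolation. The sketch below is not library code; it just replays the worked example from the field description (rows `(e1, …, 10, 15)` and `(e1, …, 20, null)`), under the assumption that "latest" means the row with the greatest `feature_timestamp`:

```python
def latest_row(rows):
    """dense=True: sync the single latest row per entity, nulls included."""
    latest = {}
    for entity_id, ts, *features in rows:
        if entity_id not in latest or ts > latest[entity_id][0]:
            latest[entity_id] = (ts, features)
    return {e: feats for e, (ts, feats) in latest.items()}

def collapse_non_null(rows):
    """dense=False (default): per feature, keep the latest non-null value."""
    result = {}
    for entity_id, ts, *features in sorted(rows, key=lambda r: r[1]):
        slot = result.setdefault(entity_id, [None] * len(features))
        for i, value in enumerate(features):
            if value is not None:
                slot[i] = value
    return result

# The example from the `dense` field description: schema
# (entity_id, feature_timestamp, f0, f1).
rows = [
    ("e1", "2020-01-01T10:00:00.123Z", 10, 15),
    ("e1", "2020-02-01T10:00:00.123Z", 20, None),
]
# dense set   -> {'e1': [20, None]}
# dense unset -> {'e1': [20, 15]}
```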
